Article written by Tijn van der Zant Ph.D., CEO Datamaister

Just a decade ago, the cloud was a mysterious beast. It was poorly defined, and cloud providers were not yet the trusted partners they are now. People have only recently realized that their own data security is most likely no better than that of the large cloud providers.

Thirty-five years ago, people connected to the internet to find information on bulletin boards. News took ages to get from A to B, and telex was still an excellent way to send data between companies.

About twenty-five years ago, the internet was poorly defined. There were some protocols, people started to use email more often, and the first websites popped up. Search engine indexes were compiled by people rather than by algorithms.

Networked computing started in the ’60s, when researchers in the UK and France developed time sharing. Multiple users could share the big mainframes, and humans no longer had to be physically near these giant machines, which were quite often programmed by women.

The ENIAC computer, with nearly 18,000 vacuum tubes

Alan Turing, pioneer of Edge AI  

The first computers had no network at all. They were big machines and challenging to use. With a bit of imagination, we could call them Edge Devices, although that would be a stretch. The key message is that current edge devices share something with those first computers: little to no network connectivity and limited computation power.

Alan Turing probably created the first Artificial Neural Network in the late ’40s on this kind of machine. In that sense, this computer, which can be called an Edge Device, already ran Edge AI.

The usefulness of Edge AI
More modern versions would be computing devices with intelligent sensors and some form of AI running on them. The Edge Device does not require the cloud to make a decision. An example would be a smart speaker. A wake word or phrase (such as “Alexa” or “OK Google”) has been trained into a machine learning model that is stored locally on the speaker. Whenever the smart speaker hears the wake word or phrase, it begins “listening.”
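As a minimal sketch of that idea in Python, the snippet below runs a hypothetical wake-word model with the TensorFlow Lite runtime that is commonly used on small devices. The model file name and the start_cloud_session helper are placeholders, not a real product API; the point is that audio stays on the device until the wake word is recognized.

    # Minimal sketch, assuming a pre-trained wake-word model exported to
    # TensorFlow Lite. "wake_word.tflite" and start_cloud_session() are
    # placeholder names for illustration.
    import numpy as np
    import tflite_runtime.interpreter as tflite  # lightweight interpreter for small devices

    interpreter = tflite.Interpreter(model_path="wake_word.tflite")
    interpreter.allocate_tensors()
    input_index = interpreter.get_input_details()[0]["index"]
    output_index = interpreter.get_output_details()[0]["index"]

    def heard_wake_word(audio_frame: np.ndarray, threshold: float = 0.9) -> bool:
        """Score one audio frame with the local model; no data leaves the device."""
        interpreter.set_tensor(input_index, audio_frame.astype(np.float32))
        interpreter.invoke()
        score = float(interpreter.get_tensor(output_index)[0][0])
        return score > threshold

    def start_cloud_session() -> None:
        print("wake word detected, only now does audio go to the cloud")

    def listen(frames) -> None:
        # frames: an iterable of microphone frames, already shaped as the model expects
        for frame in frames:
            if heard_wake_word(frame):
                start_cloud_session()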

Edge AI is useful when the processing must be performed locally. For example, when:

  • The speed of data processing is crucial, and we cannot afford even a millisecond of data transmission delay
  • There is no stable access to the network, for example in sparsely populated areas or in rooms that deliberately block the signal
  • You work with sensitive data that should not leave the device

Smart edge: from sensors to sensing
More often than not, the information in sensor data lies in the readings and patterns recorded over time rather than in what the sensor is currently reading. One of the essential differences between rule-based systems and Machine Learning (ML) models is that the latter can detect and recognize patterns. For instance, this allows you to distinguish a regular ‘normal’ pattern from an abnormal one, something that can be used in cybersecurity to spot a malicious attack, or to recognize that a machine is not functioning as desired, which can indicate a defect or deterioration.
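As an illustration of such pattern-based detection, here is a minimal sketch using scikit-learn’s IsolationForest on synthetic sensor windows. The model that actually ships on a device would likely be smaller and tuned to the sensor, but the idea is the same: learn what ‘normal’ looks like from history, then flag deviations.

    # Minimal sketch: learn "normal" behaviour from historical sensor windows,
    # then flag windows that deviate from it. The data here is synthetic.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Each row is one window of readings (e.g. one second of vibration samples).
    normal_windows = rng.normal(loc=0.0, scale=1.0, size=(500, 50))
    model = IsolationForest(random_state=0).fit(normal_windows)

    new_window = rng.normal(loc=3.0, scale=2.0, size=(1, 50))  # a drifted machine
    if model.predict(new_window)[0] == -1:                     # -1 means "abnormal"
        print("pattern deviates from normal operation, flag for inspection")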

A second important feature is that Machine Learning models are generally good at making predictions based on events that have happened in the past. Using ML models, you can detect patterns in data that engineered rules would have missed and then make forecasts based on those patterns. For instance, this would allow you to notice a machine’s deterioration and then predict when this will result in an actual failure.
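A deliberately simple sketch of that forecasting idea: fit a trend to a degrading health indicator and extrapolate when it will cross a failure threshold. The numbers and threshold below are made up; a real remaining-useful-life model would be more sophisticated.

    # Minimal sketch: estimate a deterioration trend and predict when it
    # crosses a failure threshold. All values are illustrative.
    import numpy as np

    hours = np.arange(0, 200, 10.0)
    wear = 0.02 * hours + np.random.default_rng(1).normal(0, 0.05, hours.size)

    slope, intercept = np.polyfit(hours, wear, deg=1)   # learned deterioration rate
    FAILURE_THRESHOLD = 5.0                             # wear level at which the part fails

    predicted_failure_hour = (FAILURE_THRESHOLD - intercept) / slope
    print(f"predicted failure after ~{predicted_failure_hour:.0f} operating hours")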

Putting such models on your sensors elevates the sensors from merely registering the environment to sensing the environment.

Smart Edge has several advantages:

  • It reduces bandwidth requirements
  • It speeds up processes

Reducing bandwidth requirements
Recognizing and counting different types of patterns can be very useful. Locally on the machine, you can measure the frequency, duration, and so on of the different patterns and then send only these summary values to the cloud for further processing. This is very useful if you have limited bandwidth between the device and the cloud, and even more so if the data is extensive, such as high-definition camera data.
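A minimal sketch of that trade-off: the device summarizes the raw signal locally and uploads only a handful of numbers. The payload format and the send_to_cloud helper are illustrative, not a specific product API.

    # Minimal sketch: detect pattern hits locally and ship only a small summary
    # instead of the raw signal. Payload fields are illustrative.
    import json
    import numpy as np

    raw_signal = np.random.default_rng(2).normal(size=100_000)   # e.g. one minute of samples
    events = np.flatnonzero(np.abs(raw_signal) > 3.0)            # locally detected pattern hits

    summary = {
        "window_s": 60,
        "event_count": int(events.size),
        "mean": float(raw_signal.mean()),
        "std": float(raw_signal.std()),
    }

    def send_to_cloud(payload: dict) -> None:
        print("uploading", len(json.dumps(payload)), "bytes instead of",
              raw_signal.nbytes, "bytes of raw data")

    send_to_cloud(summary)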

Speeding up processes
If you want to take action based on the gathered data, there are cases where sending the data to the cloud for processing is too slow. If a dangerous situation is detected, you may want to respond as soon as possible. Two examples are autonomous cars and production lines. A fast response can be the difference between life and death, or between normal operation and a massive loss of production. Having something on the spot that can process the data fast makes all the difference.
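To make the latency argument concrete, here is a small sketch in which the safety decision is taken on the device itself and the cloud is only informed afterwards. The names stop_production_line and log_to_cloud stand in for real actuator and telemetry calls.

    # Minimal sketch: the time-critical decision happens locally; the cloud
    # upload is not on the critical path. Helper names are placeholders.
    def stop_production_line() -> None:
        print("emergency stop triggered locally")

    def log_to_cloud(reading: float) -> None:
        print("reported incident to the cloud (not time-critical)")

    def on_new_reading(temperature_c: float, limit_c: float = 90.0) -> None:
        if temperature_c > limit_c:
            stop_production_line()   # milliseconds, no network round trip
            log_to_cloud(temperature_c)

    on_new_reading(95.3)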

Small differences
Depending on the hardware’s computation power at the edge, the models used can differ a lot. In some cases, you have just a small microprocessor and a simple sensor; in other cases, you may have a camera with an ARM chipset. These hardware differences place essential constraints on the types of algorithms you can use and, consequently, on the jobs the resulting models can perform. If you have a decent amount of computation power at the edge, you can likely use many algorithms, such as deep learning, with just a few adaptations. This is often the case for smart cameras and microphones. Especially with cameras, you gain a lot with local processing, as camera images put a strain on the bandwidth.

But it gets even more exciting when low power consumption, limited computation power, and scarce storage space are the critical constraints. Models then need to be small and energy-efficient. Because of those constraints, the AI that runs on these devices is often very different from the AI running in the cloud, and such Edge AI requires specific implementations to make it work. Also, edge devices support only a limited range of programming languages and tools, and the code has to be optimized, again something that is usually less critical in the cloud. Finally, in our connected society we are used to updating devices on the fly; that, too, might be tricky with edge devices.
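One common way to make a model fit such constraints is post-training quantization. The sketch below uses TensorFlow Lite’s converter on a tiny placeholder network; the layer sizes and the output file name are illustrative only, not a recipe for any particular device.

    # Minimal sketch: shrink a trained model for a constrained device with
    # TensorFlow Lite post-training quantization. The network is a placeholder.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(50,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]   # enables weight quantization
    tflite_bytes = converter.convert()

    with open("sensor_model.tflite", "wb") as f:           # file name is illustrative
        f.write(tflite_bytes)
    print(f"quantized model size: {len(tflite_bytes)} bytes")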

To summarize, AI at the edge is something special. It has to deal with very different constraints than AI running on your local computer or in the cloud. Putting AI on the edge certainly has its uses. It is particularly applicable in situations where durability (battery life), speed (of decision making), and safety (security) are crucial.