The Case for Computing at the Edge
Mitchell Gracie posted on May 24, 2019

The last thing you want to see in a self-driving car is an hourglass or a spinning pinwheel icon while the car decides if the object in its path is a rabbit, a plastic bag or a human being. Sadly, this is what would happen if the artificial intelligence (AI) that controls an autonomous vehicle resided entirely in the cloud.

Communicating back and forth with the cloud takes time. Autonomous vehicles aren’t the only applications that can’t wait. Health care, with its life-monitoring medical devices, is another field where milliseconds are critical.

Relaying data to the cloud and having them analyzed there can cost time, money and energy. Another concern as data flies around the Internet: security. (Image courtesy of Avnet.)

This phenomenon of signals being delayed by having to travel long distances is known as latency. The cloud’s infrastructure, the data centers themselves, can be thousands of miles from the self-driving car. Regardless of the ingenuity of engineers and network developers, there are too many cases where the window between decision and action is shorter than the latency the cloud imposes. Light travels at 186,000 miles per second in a vacuum, but it slows to roughly 122,000 miles per second in a fiber-optic cable, adding a one-way delay of about 0.82 milliseconds per 100 miles between source and receiver.
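The arithmetic behind that figure is simple to check. A short sketch, using the fiber-optic speed quoted above (the 2,000-mile distance below is purely illustrative):

```python
# Light travels ~122,000 miles per second in fiber-optic cable (per the text).
FIBER_SPEED_MI_PER_S = 122_000

def one_way_delay_ms(distance_miles: float) -> float:
    """One-way propagation delay (milliseconds) over a fiber link."""
    return distance_miles / FIBER_SPEED_MI_PER_S * 1000

print(round(one_way_delay_ms(100), 2))        # ~0.82 ms per 100 miles
# A data center 2,000 miles away adds ~33 ms for the round trip alone,
# before any queuing, routing or server-side processing:
print(round(2 * one_way_delay_ms(2000), 1))
```

Note that this is propagation delay only; real-world round trips also pay for switching, congestion and compute time at the far end, so the practical latency is higher still.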

So, why not avoid latency altogether?

Enter: edge computing.

Latency can be reduced—or, for all intents and purposes, just about eliminated—by moving computation closer to where the data is generated. Edge computing is an architecture that puts the power to process and analyze data at a network’s edge—right into the device itself. For autonomous vehicles, this means having the AI built into the car. Processing at the edge offers advantages in situations where milliseconds—sometimes nanoseconds—are of the utmost importance. In an autonomous vehicle with edge computing, signals need only travel from sensors to processor to vehicle control system (such as brakes or steering).

Industries That Need Edge Computing

Most of today’s workloads can be partitioned by their processing needs. Centralized processing in the cloud is great for workloads where latency is negligible to the end-user experience, like data storage and the back end of web services. Localized processing, as offered by edge computing, suits workloads where the time budget is imperative. Autonomous vehicles have already been presented as an example, but situations such as those encountered in emergency rooms or on manufacturing floors also fit into the edge’s hopper.

When it comes to deciding if a use case needs the cloud or its processing ought to be done at the network edge, it helps to consider the task’s margins for latency, need for bandwidth and level of autonomy. (Image courtesy of Avnet.)
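Those three criteria—latency margin, bandwidth need and level of autonomy—can be captured as a rough decision rule. A minimal sketch, with thresholds that are purely illustrative rather than industry standards:

```python
def suggest_placement(latency_budget_ms: float,
                      bandwidth_mbps: float,
                      needs_autonomy: bool) -> str:
    """Rule of thumb for where a workload should run.

    Thresholds are illustrative only, not industry standards.
    """
    if needs_autonomy or latency_budget_ms < 10:
        return "edge"    # hard real-time, or must keep working offline
    if bandwidth_mbps > 100:
        return "edge"    # too costly to stream everything upstream
    return "cloud"       # latency-tolerant, low-bandwidth workloads

# A vehicle braking decision vs. nightly log archiving:
print(suggest_placement(latency_budget_ms=5, bandwidth_mbps=50, needs_autonomy=True))    # edge
print(suggest_placement(latency_budget_ms=200, bandwidth_mbps=1, needs_autonomy=False))  # cloud
```

In practice the decision is rarely binary; many deployments split a pipeline, running the time-critical stage at the edge and the rest in the cloud.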

Take ABB’s Predictive Emission Monitoring System (PEMS) as an example. For one of the world’s leaders in power grids and industrial transport infrastructure, tracking emissions from its plants is necessary for regulatory compliance.

However, for one of the largest gas processing plants in the world, directly measuring emissions can be a challenge and come with high costs. Instead, by integrating AI-driven Internet of Things (IoT) solutions at the network’s edge, ABB’s PEMS takes in data from various fuel flows, pressure sensors and ambient air temperature; runs that data through an empirical model; and can then predict the emissions of its gas processing plant. There is no need to send those streams of data to expensive analytic services in the cloud or worry about malicious agents gaining access to those data streams; only intelligent analytics at the edge.
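ABB’s actual model is proprietary, but the shape of a PEMS-style empirical model can be sketched: routine process readings are fed into a function fitted against historical measured emissions, so the prediction runs locally with no cloud round trip. The coefficients and variable names below are invented for illustration:

```python
# Hypothetical stand-in for a PEMS-style empirical model: a linear fit
# relating process readings to a measured emission value. The coefficients
# are made up, as if obtained from a regression on historical plant data.

def predict_nox_ppm(fuel_flow_kg_s: float,
                    pressure_bar: float,
                    ambient_temp_c: float) -> float:
    """Predict NOx concentration (ppm) from routine process readings."""
    return 12.0 + 3.5 * fuel_flow_kg_s + 0.8 * pressure_bar - 0.05 * ambient_temp_c

# Runs entirely on the edge device; only the predicted value need be reported.
reading = predict_nox_ppm(fuel_flow_kg_s=4.2, pressure_bar=18.0, ambient_temp_c=31.0)
print(round(reading, 1))
```

Real predictive emission models are typically nonlinear and validated against periodic reference measurements, but the principle is the same: sensors the plant already has stand in for a dedicated analyzer.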

Hardware Meets Software

The proliferation of technologies that combine software and hardware is at the heart of the viability of edge processing. One example is Avnet’s SmartEdge Agile, which pairs Bluetooth Low Energy (BLE) electronics with Octonion’s Brainium zero-coding AI platform, hosted on Microsoft Azure Sphere. With its on-board AI software stacks, the SmartEdge Agile supports the integration of multiple sensors for supervised and unsupervised machine learning.

Avnet’s SmartEdge Agile is an example of where AI, IoT and the cloud come together to help build better, smarter and more efficient workflows. (Image courtesy of Avnet.)

Field-programmable gate arrays (FPGAs) are another class of electronics that allow for easier implementation of AI-enabled IoT solutions at the edge. According to All About Circuits, FPGAs are much more than just a simple array of logic gates, as the name would imply. Instead, an FPGA “is an array of carefully designed and interconnected digital sub-circuits that efficiently implement common functions while also offering very high levels of flexibility.” The fact that they are “field programmable” means that, for a specific use case’s needs, the FPGA can be configured and finely tuned to meet the case’s necessary specifications. That’s what makes FPGAs so useful for edge processing.

For example, a sensor on the manufacturing floor (or a gas processing plant) does not need access to the full suite of AI-driven services in the cloud. Instead, if an FPGA is being used, the very specific purposes of the sensor and the AI that drives the analysis of its data streams can be configured right there into the sensor itself.

Security of Networks That Utilize Edge Computing

Changing the topologies and structures of networks that harvest and process data introduces questions about the privacy and security of that data. It is often assumed that remote servers in the cloud offer the best security, but this assumption can be flawed. While great for storing and backing up data, the cloud represents a sequence of touchpoints and nodes between sensor and server, each with its own privacy and security protocols. Each of those touchpoints represents an opportunity—an entry point—for malicious agents to breach and access data streams. The localization of edge processing avoids this vulnerability altogether.

Data streams to the cloud are vulnerable to attacks and leaks due to the number of touchpoints the stream can meet along the way to its destination. Edge processing helps avoid this vulnerability altogether. (Image courtesy of Pixabay.)

By the very nature of edge processing, many sensors and devices combine software and hardware in the device itself. This means that raw data can be processed natively, on board.

Take visual sensors at the edge as an example. Without relying on AI in the cloud, a visual sensor can process the images it captures, one-way converting each image into an array of values that its local AI then analyzes. The only output from the device is the result of that analysis.
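That one-way conversion can be sketched in a few lines. In this toy example, raw pixels are collapsed into a coarse brightness histogram that cannot be turned back into the image, and the on-device “model” (here just a threshold check, standing in for real AI) emits only a label:

```python
# Toy sketch of one-way conversion at the edge: the raw frame is reduced to
# an irreversible histogram, and only the classification leaves the device.

def to_histogram(pixels: list[int], bins: int = 8) -> list[int]:
    """Collapse 0-255 grayscale pixels into a small histogram (irreversible)."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    return hist

def classify(hist: list[int]) -> str:
    """Stand-in for an on-device model: flag mostly-dark frames."""
    dark = sum(hist[: len(hist) // 2])
    return "dark" if dark > sum(hist) / 2 else "bright"

frame = [30, 45, 12, 200, 18, 22, 240, 9]   # fake 8-pixel "frame"
features = to_histogram(frame)              # raw pixels never stored
print(classify(features))                   # prints "dark"
```

A real deployment would use a trained model and richer features, but the privacy property is the same: what leaves the sensor is a derived result, not the scene it saw.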

Because the raw data is never stored, any breach of the network—or even of the device itself—would reveal only the converted data. Moreover, layers of encryption can ensure that, even if a malicious agent gains access to a sensor or edge processing unit, any raw or processed data remains protected.

Power and Energy

Beyond solving problems of time and security, edge computing also addresses the cost of transmitting data to and from the cloud. Moving data to the cloud doesn’t just cost time; it also consumes more power and energy than processing the data at the edge. If the point of analytics is to help decrease operational costs, why should so much of it depend on costly cloud services and high bandwidth?

The integration of software with hardware has allowed edge processing to flourish while also driving down the power appetites of edge sensors. The case for edge computing is thus also a case for burning less carbon on shuttling data to the cloud for processing. Moreover, progress in batteries and AI-driven optimization of power use is showing that complete dependence on the cloud can be more expensive than necessary for many use cases.

Bringing the Edge and the Cloud Together

The proliferation of edge processing does not signal an end to the cloud; rather, it reduces utter dependence on it. Processing at the edge relieves the burden and delay of using the cloud exclusively for all data processing. The cloud and the edge will, and should, be used in conjunction. Edge processing brings together AI, the cloud and IoT to increase efficiency and decrease costs. Once intelligent analytics have processed raw data at the edge, nothing prevents the results and decisions from being relayed to the cloud for further use.

There’s a lot of room for businesses to deploy technologies such as AI and IoT into their supply chain and product lifecycle management schemes. (Image courtesy of Microsoft and the Economist.)

In a survey within a Microsoft-sponsored white paper written by The Economist, 97 percent of respondents considered the cloud as important to “the effective collection, analysis and use of supply-chain data,” whereas only 71 percent thought the same for IoT and 63 percent for AI.

Regardless of a business’s requirements, if there is a place for IoT anywhere in its workflow, then there is likely a case for AI-enabled edge processing that can be rolled out alongside already established cloud services.

To learn more, visit Avnet.


Avnet has sponsored this post.  All opinions are mine.  –Mitchell Gracie

