Edge computing is a distributed system approach that shifts some storage and computing capacity from the primary data center closer to the data source. Gathered data is processed locally (e.g., on a factory floor, in a store, or throughout a city) rather than sent to a centralized data center for processing and analysis. These local processing units or devices represent the system’s edge, whereas the data center is its center. The output computed at the edge is then sent back to the primary data center for further processing. Examples of edge computing include wrist gadgets and computers that analyze traffic flow.
Problem it addresses
Over the past decade, we’ve seen an increasing number of edge devices (e.g., mobile phones, smartwatches, or sensors). In some cases, real-time data processing is not merely a nice-to-have but vital. Think of self-driving cars. Now imagine the data from the car’s sensors had to be transferred to a data center for processing before being sent back to the vehicle so it could react appropriately. The inherent network latency could be fatal. While this is an extreme example, most users wouldn’t want to use a smart device that can’t provide instant feedback.
How it helps
As described above, for edge devices to be useful, they must perform at least part of the processing and analysis locally to provide near-real-time feedback to users. This is achieved by shifting some storage and processing resources from the data center to where the data is generated: the edge device. Processed and unprocessed data is subsequently sent to the data center for further processing and storage. In short, efficiency and speed are the primary drivers of edge computing.
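The pattern above can be sketched in a few lines of Python. This is a minimal illustration, not a real edge framework: every name here (`EdgeDevice`, `process_locally`, `send_to_data_center`) is hypothetical, and the "network call" is simulated by appending to a list. The point is the shape of the flow: raw readings stay on the device, and only a compact summary travels to the center.

```python
# Hedged sketch of the edge-computing flow: process raw sensor data
# locally on the device, then forward only a small summary upstream.
# All names here are illustrative assumptions, not a real API.

from statistics import mean


class EdgeDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.raw_readings = []

    def ingest(self, reading):
        # Raw data stays on the device, so feedback can be near real-time.
        self.raw_readings.append(reading)

    def process_locally(self):
        # Local analysis: reduce many raw readings to one summary record.
        return {
            "device": self.device_id,
            "count": len(self.raw_readings),
            "avg": mean(self.raw_readings),
            "max": max(self.raw_readings),
        }


def send_to_data_center(summary, store):
    # Stand-in for a network call: only the small summary travels upstream,
    # where the data center can aggregate and store it for further analysis.
    store.append(summary)


# Example: a traffic sensor summarizes five speed readings locally.
data_center_store = []
device = EdgeDevice("traffic-sensor-01")
for speed_kmh in [48, 52, 61, 57, 49]:
    device.ingest(speed_kmh)

send_to_data_center(device.process_locally(), data_center_store)
print(data_center_store[0])
```

Note the trade-off this models: the device answers locally with no round trip, while the data center still receives enough (the summary) for centralized processing and storage.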