What is Edge Computing?

Edge computing is a computing model in which data is processed and stored at the edge of a network, distributed and cached across multiple locations in concert with cloud computing resources that typically reside in one or more data centers. For Internet devices, the network edge is where the device, or the local network containing the device, communicates with the Internet. Edge computing does not replace a centralized cloud computing ecosystem but augments its computing power.

Edge computing enables smart applications and devices to process data locally at the edge of the network instead of waiting for a response from cloud-based computing and storage resources. This locality, which can incorporate machine learning algorithms, eliminates round-trip network time for application requests, reducing both application latency and network bandwidth requirements. Lower bandwidth translates to cost savings, while lower latency is critical for IoT devices, time-sensitive cloud services, self-driving cars, and real-time manufacturing processes. Edge computing also benefits many other industries, including financial services, healthcare, and communications.
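The bandwidth savings described above come from handling routine data at the edge and forwarding only what the cloud actually needs. The sketch below illustrates the idea with a hypothetical sensor pipeline: the edge device summarizes readings locally and uploads only anomalous ones. The threshold, data shape, and function name are illustrative assumptions, not any particular platform's API.

```python
def process_at_edge(readings, threshold=79.0):
    """Aggregate routine sensor readings locally; forward only anomalies.

    Returns (local_summary, to_cloud): the summary stays on the edge
    device, while to_cloud is the small subset uploaded for deeper
    analysis. This models the latency/bandwidth win of edge locality.
    """
    anomalies = [r for r in readings if r > threshold]
    local_summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "anomalies_forwarded": len(anomalies),
    }
    return local_summary, anomalies

# 1,000 simulated temperature readings between 70.0 and 79.9 degrees;
# only the few above the threshold would cross the network.
readings = [70.0 + (i % 100) / 10 for i in range(1000)]
summary, to_cloud = process_at_edge(readings)
print(summary["count"], len(to_cloud))  # prints "1000 90"
```

Here 1,000 readings in yield only 90 uploads, so both the round trips per reading and the upstream bandwidth shrink accordingly.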

What is the difference between edge computing and cloud computing?

Edge computing pushes computing applications away from centralized nodes to the outer boundaries of the network, where end-user edge devices and edge servers live.

Cloud computing, on the other hand, requires that all things connect to centralized data storage in data centers, where vast volumes of information are processed to integrate with other IT systems or make business decisions. Typically, cloud computing is associated with complex data processing operations requiring significant computational resources and power.

Edge computing solutions make sense for many application use cases, including deployment scenarios where IoT devices have insufficient bandwidth or cannot connect to the cloud reliably. High-latency network environments and the lack of guaranteed wireless connectivity continue to challenge the scalability and resiliency of cloud-only computing frameworks. These factors will drive a shift toward augmenting cloud environments with edge infrastructure.
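The trade-off between processing at the edge and offloading to the cloud can be framed as a simple timing budget: offloading only pays off when the transfer time plus the cloud round trip still meets the application's deadline. The sketch below is a deliberately simplified model of that decision; the function and parameter names are illustrative assumptions, not a real scheduler.

```python
def choose_execution_site(payload_mb, uplink_mbps, cloud_rtt_ms,
                          deadline_ms, connected=True):
    """Return 'edge' or 'cloud' for one processing request.

    Offloading is viable only when the device is connected and the
    serialization delay plus the cloud round trip fits the deadline.
    Real systems would also weigh compute capacity and energy cost.
    """
    if not connected:
        return "edge"  # no guaranteed connectivity: process locally
    transfer_ms = payload_mb * 8 / uplink_mbps * 1000  # time to push the payload
    return "cloud" if transfer_ms + cloud_rtt_ms <= deadline_ms else "edge"

# A 5 MB payload over a 2 Mbps uplink takes ~20 s to transfer,
# so a 1-second deadline forces local processing.
print(choose_execution_site(payload_mb=5, uplink_mbps=2,
                            cloud_rtt_ms=80, deadline_ms=1000))   # prints "edge"

# A small payload on a fast link comfortably meets its deadline.
print(choose_execution_site(payload_mb=0.1, uplink_mbps=10,
                            cloud_rtt_ms=80, deadline_ms=500))    # prints "cloud"
```

The same arithmetic explains why bandwidth-starved or disconnected IoT deployments gravitate to the edge: the deadline is blown before the data even reaches the data center.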

Monitoring Edge Applications

IoT-generated data volumes will place new demands on cloud resources and network topologies. Applications that need near real-time responses to data processing requests, with very low latency, will be a crucial driver of technology transformation. New applications embrace IoT, artificial intelligence, and virtual reality as the basis for innovative edge-centric solutions. Edge computing at these locations will eventually enable such solutions in conjunction with cloud resources, opening the way for hybrid edge/cloud applications and data sources that significantly improve an application's response time.

ThousandEyes synthetic application monitoring can play a critical role in addressing the complexities of monitoring edge applications. Network operations need detailed and accurate network path visibility, along with routing and application layer data, to accurately monitor hybrid network performance. For information on how ThousandEyes has helped a company monitor IoT environments, read this case study.
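At its core, synthetic monitoring means repeatedly running a scripted transaction against an application and recording latency and success from a given vantage point. The minimal sketch below illustrates that loop; it is a generic model, not the ThousandEyes API, and the function names and result fields are assumptions made for illustration.

```python
import time

def run_synthetic_probe(check, samples=5, interval_s=0.0):
    """Run a user-supplied availability check repeatedly, recording latency.

    `check` is any callable that performs one end-to-end transaction
    (an HTTP GET, a DNS lookup, a TCP connect, ...) and returns True
    on success. Returns (availability, per-sample results).
    """
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        ok = check()
        latency_ms = (time.perf_counter() - start) * 1000
        results.append({"ok": ok, "latency_ms": latency_ms})
        time.sleep(interval_s)
    availability = sum(r["ok"] for r in results) / samples
    return availability, results

# Stand-in check for demonstration; replace with a real transaction,
# e.g. an HTTP request to the edge application under test.
availability, results = run_synthetic_probe(lambda: True, samples=3)
print(availability)  # prints "1.0"
```

Running probes like this from vantage points near the edge, as well as from within the cloud, is what makes it possible to separate edge-side latency from cloud-side latency in a hybrid deployment.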
