
The traditional cloud model consists of one or more hyper-scaled data centers serving compute resources from a centralized node. As is the case with most cloud providers, multiple hyper-scaled data centers are spread across multiple locations. Use cases for cloud computing include workloads such as big data analytics, scheduled processing jobs, and distributing software updates, where time sensitivity and processing response time are not the most critical factors. While hosting the single ‘core’ of an application for processing inside a cloud data center helps with elasticity and scalability, distance often presents itself as an obstacle for time-sensitive workloads. There are use cases where the response time of compute resources is critical, such as autonomous vehicles computing maneuvering decisions based on real-time sensor data. In instances like these, waiting on responses from servers in a distant data center is no longer a fitting solution, and approaches such as edge computing become appropriate.
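The trade-off above can be made concrete with a simple latency budget: the total response time is the network round trip plus server-side processing, and a time-sensitive workload only works if that sum fits within its deadline. The sketch below illustrates this in Python; all of the millisecond figures and the 20 ms deadline are assumptions chosen for illustration, not measurements from any real network or provider.

```python
# Illustrative latency budget for a time-sensitive control loop.
# Every figure below is a hypothetical assumption, not a measurement.

def total_response_ms(network_rtt_ms: float, processing_ms: float) -> float:
    """Round-trip network latency plus server-side processing time."""
    return network_rtt_ms + processing_ms

DEADLINE_MS = 20.0  # hypothetical budget for, e.g., a maneuvering decision

# Distant data center: long round trip, fast servers.
cloud_ms = total_response_ms(network_rtt_ms=80.0, processing_ms=5.0)
# Nearby edge server: short round trip, modest hardware.
edge_ms = total_response_ms(network_rtt_ms=2.0, processing_ms=8.0)

print(f"cloud: {cloud_ms:.1f} ms, meets deadline: {cloud_ms <= DEADLINE_MS}")
print(f"edge:  {edge_ms:.1f} ms, meets deadline: {edge_ms <= DEADLINE_MS}")
```

Note that the edge server wins here despite slower processing: once the round trip dominates the budget, proximity matters more than raw compute.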
Edge computing uses hardware outside of a data center to process data in closer proximity to the user or the sensors generating the input data. The bottleneck of remote computing, and the delay that comes with it, no longer hinders the business requirements of the workload. Edge computing can take the form of small on-board devices in a vehicle, compute devices stationed in telecommunications towers, or computer chips integrated into traffic lights to process local traffic data.

Edge resources in practical applications can be split into two categories: edge servers and edge devices. Edge devices supply the input data, which can include camera feeds, inventory monitors, and hospital patient-monitoring data. Edge devices supply this data to an edge server: a smaller compute resource than those of a data center, but one that can process and analyze the input streams closer to the source, delivering faster processing response times.
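The device/server split described above can be sketched in a few lines of Python. This is a minimal illustration under assumed names and thresholds: the sample readings, the `alert_threshold`, and the summary fields are all hypothetical stand-ins for a real sensor feed and processing pipeline.

```python
# Minimal sketch of the edge-device / edge-server split.
# All names, readings, and thresholds are hypothetical.

from statistics import mean

def edge_device_readings():
    """Stands in for an edge device's sensor feed
    (e.g., heart-rate samples from a patient monitor)."""
    return [72, 75, 74, 118, 73]  # one anomalous spike

def edge_server_process(samples, alert_threshold=100):
    """Runs near the data source: flags anomalies immediately and
    forwards only a compact summary toward the central cloud."""
    alerts = [s for s in samples if s > alert_threshold]
    summary = {
        "count": len(samples),
        "mean": mean(samples),
        "alerts": len(alerts),
    }
    return alerts, summary

alerts, summary = edge_server_process(edge_device_readings())
print(alerts)   # handled locally, with low latency
print(summary)  # the only payload that travels to the data center
```

The design point is that the raw stream never leaves the edge: anomalies are acted on locally, while the central cloud receives only a small aggregate for longer-term analytics.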
Edge computing extends the traditional cloud model into an open hybrid cloud, where each class of compute resource is leveraged for its strengths against the business requirements. Edge computing resources are best leveraged in cases that require real-time data, yet they will still interact with resources such as centralized cloud data centers for analytics workloads.
Edge computing is the result of the extensive technology innovations driven by the public cloud. As these innovations mature, the future of cloud infrastructure lies in the adoption of both edge and centralized cloud technology to form an open hybrid model of infrastructure that better serves the needs of enterprises and small businesses globally.
“Right now we have been and are building the future of computing, and what it means to be connected to the internet, for the vast majority of human beings.”
– Dieter Bohn