It’s rewarding to be part of a story whose headline reads “Verizon Gets Serious About Edge Computing.” In this Light Reading article, dated January 31st, Mike Dano, Editorial Director for 5G and Mobile Strategies, summarizes the outcomes of Verizon’s tests of edge computing functions on its 5G network in Houston. Among the findings: Verizon was able to cut latency in half through the use of edge computing technology.
Verizon’s public announcement of the findings from its real-world 5G testing marks an important new phase for edge computing. It validates one of the fundamental promises of edge computing and 5G: that placing compute at the network edge, close to users, delivers measurable value in the form of latency reduction. In this case, Verizon’s use of edge computing resulted in a 50% reduction in latency for the use case tested. Increasingly, the reduction of latency, and the related distribution of compute away from the central cloud and toward the network edge, is becoming essential to making applications work.
Qwilt has been working with this mindset for years and, as a partner with Verizon, we put edge computing to work in their network every day, especially at peak hours when our open caching technology delivers high quality streaming video to consumers’ homes in the evening. As Mike Dano observes:
Verizon inked an agreement with Qwilt in 2017 to deploy that vendor’s “open caching” into its network for video delivery. The move allowed Verizon to essentially store video content in physical locations that are closer to end users, thus reducing the amount of traffic traveling over the operator’s network. Verizon reported a 20% reduction in traffic traveling over its network via the move. (1)
So, at Qwilt, we are witnesses to the value of edge computing each day in the over 100 commercial deployments of our software and cloud services. Today the use cases are focused on content delivery – live streaming, video on demand, software downloads and the like. Tomorrow, the use cases will expand to include AR/VR, autonomous cars, IoT devices and myriad industry-specific applications.
Edge Computing at Work – Content Delivery at the Network Edge
In keeping with the edge computing theme, Intel and Qwilt have published a solutions brief today entitled, “Qwilt’s Open Edge Cloud Puts Content Delivery at the Network Edge.” Intel’s summary of the market dynamics and the opportunity for service providers to use edge computing as a content delivery mechanism is worth reading:
While streaming video and access to cloud-based services are currently driving up demand for mobile network bandwidth, more applications are poised to add to the data demand onslaught facing MNOs. Virtual reality, augmented reality gaming, connected cars, and new consumer live video streaming systems such as nanny cams, pet cams, and home security cameras will soon add to the need for high network throughput. In addition to the potential for these applications to result in significant data flows to mobile data networks, all of these next-generation applications need very low network latency—as low as 10 milliseconds in the case of augmented reality gaming.²

Serving these applications from origin servers located in central data centers may result in additional latency because packets must travel farther on the transport network. Optical fiber transport networks have a latency of 5 microseconds per kilometer of fiber-optic cable.³ That assumes a direct cable connection with no network congestion and no network hops. Congestion and network hops are expected in communications service providers’ (CommSPs’) networks and will add to latency.

Multi-access edge computing (MEC) is one solution to minimize transport network latency. This new cloud computing architecture consists of virtualized servers delivering network services and applications from physical locations that are at the edge of the network, including points of presence, customer premises, and cellular base stations. Since the servers are at the edge of the network and closer to the point of consumption, transport network latency can be reduced to a minimum. MEC is optimized for 4G/5G network access, but wireline CommSPs and internet service providers can utilize edge networking for the same functionality. (2)
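The fiber-latency figure in the brief lends itself to a quick back-of-the-envelope check. A minimal sketch, where the distances are illustrative assumptions rather than figures from the brief:

```python
# Back-of-the-envelope fiber propagation latency, based on the
# 5-microseconds-per-kilometer figure quoted above (ideal fiber,
# no congestion, no network hops). Distances are illustrative.

FIBER_LATENCY_US_PER_KM = 5  # one-way propagation delay per km

def round_trip_ms(distance_km: float) -> float:
    """Ideal round-trip propagation delay in milliseconds."""
    return 2 * distance_km * FIBER_LATENCY_US_PER_KM / 1000

for label, km in [("edge cache, ~10 km", 10),
                  ("regional data center, ~500 km", 500),
                  ("distant origin, ~2000 km", 2000)]:
    print(f"{label}: {round_trip_ms(km):.2f} ms round trip")
```

Even under these ideal assumptions, an origin a couple of thousand kilometers away spends 20 milliseconds on propagation alone, double an AR-class 10-millisecond budget, before congestion or network hops add anything.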
Five Architectural Principles That Drive Qwilt’s Edge Computing Solutions
Edge computing does not arise from one decision. Rather, in our view, it is the result of thoughtful consideration of a set of architectural principles. Taken together, these principles yield the right balance among many design considerations when creating an effective edge computing environment inside a service provider network. A summary of these principles is worth sharing to frame our approach:
1. Common Compute and Storage
The edge cloud is a distributed layer of commercial off-the-shelf (COTS) compute and storage resources enabling content delivery from the closest possible location to subscribers. We built the software that can unlock the potential of this layer based on our experience with hundreds of deployments at the true edge of the network – software that is 100% cloud managed, that packs maximum performance in a small form factor, and that is elastic and resilient.
It’s here that our partnership with Intel comes to the forefront. The Qwilt solution is specified for Intel® architecture processors, specifically Intel® Xeon® and Intel Atom® processor-based systems. Qwilt has written the software to take advantage of the common architecture, which allows the software to run unmodified in servers at the edge and in the core of the network.
There is a simple but powerful economic principle at work here when looking at edge computing in service provider networks. In many cases, creating content delivery capacity from common compute resources at the edge of the network costs far less than building out the network transport capacity (routers, switches, ports and links) needed to haul that content across the network from peering points to the access layer. Once the organization has aligned on this point, there is a strong incentive to use common compute, whenever possible, to create network capacity. What’s more, edge delivery also reduces internet network congestion and improves the quality of experience for consumers. So, the entire ecosystem benefits from this delivery approach.
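The economics above can be sketched with a toy model; all prices and ratios below are hypothetical placeholders, not Qwilt or Verizon figures:

```python
# Toy comparison: cost of serving peak demand from edge caches vs. hauling
# all of it across the transport network from peering points.
# All prices are hypothetical placeholders for illustration only.

EDGE_COST_PER_GBPS = 40.0        # assumed monthly cost of edge capacity
TRANSPORT_COST_PER_GBPS = 100.0  # assumed monthly cost of transport capacity

def monthly_cost(peak_gbps: float, cache_hit_ratio: float) -> float:
    """Offloaded traffic is served at edge cost; misses still cross the network."""
    edge = peak_gbps * cache_hit_ratio * EDGE_COST_PER_GBPS
    transport = peak_gbps * (1 - cache_hit_ratio) * TRANSPORT_COST_PER_GBPS
    return edge + transport

baseline = monthly_cost(100, 0.0)   # everything hauled from peering points
with_edge = monthly_cost(100, 0.5)  # half the traffic served from the edge
print(f"baseline: {baseline:.0f}, with edge caching: {with_edge:.0f}")
```

Under these assumed prices, offloading half of peak traffic to the edge cuts the monthly cost from 10,000 to 7,000; the direction of the result holds whenever edge capacity is cheaper per gigabit than transport capacity.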
2. Massively Distributed
The edge cloud delivers unmatched distributed performance enabling apps and content to be streamed from a distance of a few miles to several feet from consumers, thus reducing latency to levels that make future applications possible.
Intel’s solution brief rightly states:
MNOs can establish a massively distributed layer of content caching resources on virtualized Intel® processor-based servers that enable content delivery from the edge location that is closest to the subscriber. (2)
Massively Distributed Edge Computing Layer In Last Mile ISP Networks
3. Open API to Publishers
Publishers and CDNs alike can make use of the edge cloud, unlocking new content delivery capabilities that only in-network proximity can deliver. Capturing this opportunity requires an open and comprehensive set of interfaces. Qwilt already offers such an ecosystem-friendly set of APIs and is helping drive standardization activities within the Streaming Video Alliance to benefit the entire ecosystem.
The use of Open Cache APIs together with the broad CDN coverage developed for each MNO customer base makes it possible for MNOs to develop “CDN as a service” offerings that become a new revenue stream. If the MNO builds out an Open Edge Cloud CDN that completely covers its service area, it can provide the same access as today’s commercial CDN offerings, but with the advantage of being closer to the customer for outstanding quality of experience (QoE).
One Integration to Qwilt’s APIs Allows Access to Global Content Delivery Edge Cloud
4. Match Resource to Application
Some applications, like video (especially 4K), require a lot of bandwidth and a lot of storage (so that the videos people watch can be cached). Others, like gaming and VR, require split-second response times for in-game actions, but not much storage. The edge cloud must have the flexibility to understand which application needs which resource – and the ability to adapt accordingly. Furthermore, each Qwilt edge server can be utilized by other virtual network function (VNF)-based services, which makes it cost-effective enough to distribute throughout the MNO’s serving area.
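One way to picture this matching is a table of per-application resource profiles. A minimal sketch, where the application classes and figures are illustrative assumptions rather than Qwilt product behavior:

```python
# Illustrative per-application resource profiles. The classes and numbers
# are hypothetical, chosen only to show the bandwidth/storage/latency
# trade-offs described above.

PROFILES = {
    "4k_vod":    {"storage_gb": 5000, "max_latency_ms": 1000},  # cache-heavy
    "vr_gaming": {"storage_gb": 50,   "max_latency_ms": 10},    # latency-bound
    "downloads": {"storage_gb": 2000, "max_latency_ms": 500},
}

def placement(app: str, edge_rtt_ms: float) -> dict:
    """Reserve storage per the app's profile and flag whether the edge
    node's round-trip time meets the app's latency bound."""
    p = PROFILES[app]
    return {
        "reserve_storage_gb": p["storage_gb"],
        "meets_latency": edge_rtt_ms <= p["max_latency_ms"],
    }

print(placement("vr_gaming", edge_rtt_ms=5))
```

The point of the sketch: a VR workload reserves little storage but only works when the edge round trip fits its tight budget, while 4K video inverts both constraints.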
5. Complements Centralized Cloud
The centralized cloud is still of massive importance. The edge cloud creates a new layer that can work alongside the centralized cloud and, when used smartly, establishes a superior application and content delivery infrastructure – the magic lies in knowing when to use which cloud to get the best performance. Think of it like a hybrid car, which needs to know when to engage the electric motor and when to use the gas engine. At Qwilt we created an end-to-end solution with the smarts to form one secure fabric across the edge and centralized clouds.
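The “which cloud” decision can be sketched as a simple policy; the thresholds and the policy itself are hypothetical illustrations, not Qwilt’s actual logic:

```python
# Hypothetical placement policy: serve from the edge cloud when the
# application's latency budget demands it or the content is already cached
# nearby; otherwise fall back to the centralized cloud. All numbers assumed.

EDGE_LATENCY_MS = 5       # assumed round trip to the nearest edge node
CENTRAL_LATENCY_MS = 60   # assumed round trip to the central cloud

def choose_cloud(latency_budget_ms: float, cached_at_edge: bool) -> str:
    if latency_budget_ms < CENTRAL_LATENCY_MS:
        return "edge"  # only the edge can meet a tight budget
    # Latency-tolerant traffic: prefer a nearby cache hit, else go central.
    return "edge" if cached_at_edge else "central"

print(choose_cloud(10, cached_at_edge=False))    # AR gaming
print(choose_cloud(5000, cached_at_edge=True))   # popular VoD title
print(choose_cloud(5000, cached_at_edge=False))  # long-tail content
```

Like the hybrid-engine analogy, the value is not in either cloud alone but in a policy that picks the right one per request.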
MEC Balances Use of Centralized and Edge Cloud to Optimize Content Delivery
Follow Us to the Edge
We are thrilled to see the work we do recognized by our customers and to see the industry “getting serious about edge computing.” We’re proud to partner with Verizon and play a part in the transformation of their network architecture as they embrace edge computing. This same story is taking place in our customer networks around the world, and we look forward to telling those stories soon.
And look for us in the Intel stand (Hall 3 #3E31) at MWC 2019 where we’ll be demonstrating several applications of the Open Edge Cloud.
(1) Verizon Gets Serious About the Edge – Light Reading – https://www.lightreading.com/the-edge/verizon-gets-serious-about-edge-computing/d/d-id/749185
(2) Qwilt’s Open Edge Cloud Puts Content Delivery at the Network Edge – Intel Network Builders – https://builders.intel.com/docs/networkbuilders/qwilts-open-edge-cloud-puts-content-delivery-at-the-network-edge.pdf