Live events are a hugely popular component of traditional broadcaster models. Live sports, news, red carpet events, and concerts now make up a significant portion of the OTT traffic across a service provider’s network. One of the most exciting aspects of live programming is its unpredictability. Yet, live events that suddenly attract massive audiences – for example, the US Open last year where UK player Emma Raducanu’s performance captivated the tennis world – may lead to unexpected surges in demand that can strain a network and compromise quality.
Service providers everywhere are optimizing their networks to deliver the great live streaming experiences that consumers expect. Given my focus on establishing service provider partnerships for Qwilt, I am excited to report that service providers are moving in large numbers toward Open Caching technologies to more efficiently meet peak demand without sacrificing quality of experience. The longer-term value of Open Caching is also clear: it serves as the platform for the edge computing use cases that will inevitably come to service provider networks and their edge cloud offerings.
Handling massive spikes in demand
During large-scale events, in which many millions of users want to watch the same unicast stream, the traffic peaks that ensue can be formidable. Building a network that can perform flawlessly during traffic spikes is difficult and expensive for service providers. Predicting streaming demand and traffic becomes even more challenging as service providers simultaneously move their own content services to an all-IP architecture. So, the service provider must contend with live and on-demand OTT traffic, as well as the surge of traffic coming from their own streaming services. A daunting task indeed.
Streaming services already have an interest in getting closer to the service provider edge to better serve consumers. Yet, logistically, having multiple proprietary caches inside the network can shift from a benefit to a burden as the numbers grow. These ‘black box’ caches from third-party content providers offer the service provider no visibility and further fragment function in the network.
This is where Open Caching really steps up to the challenge, giving service providers visibility, via rich analytics, and control, through open APIs, in a unified platform that can tackle streaming at scale.
What’s possible with live OTT delivery at the edge?
Live streaming at scale is hard and we are just at the beginning of managing mass live events. A million soccer fans each watching a live 6Mbps HD stream can quickly impact network performance. So, imagine how the problem would be magnified if they all switched to 16Mbps 4K streams. This is not out of the question as consumers seek content that will match the 4K quality of their new big screen 4K TVs.
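To put rough numbers on that scenario, here is a quick back-of-the-envelope calculation (illustrative figures only, taken from the example above):

```python
# Aggregate unicast bandwidth for a live event, Mbps per viewer -> Tbps total.
# Illustrative figures only: 1 million viewers at 6 Mbps HD vs 16 Mbps 4K.

def aggregate_bandwidth_tbps(viewers: int, bitrate_mbps: float) -> float:
    """Total unicast bandwidth in terabits per second."""
    return viewers * bitrate_mbps / 1_000_000  # 1 Tbps = 1,000,000 Mbps

hd_load = aggregate_bandwidth_tbps(1_000_000, 6)    # 6.0 Tbps of HD streams
uhd_load = aggregate_bandwidth_tbps(1_000_000, 16)  # 16.0 Tbps if all go 4K
print(f"HD: {hd_load} Tbps, 4K: {uhd_load} Tbps")
```

Nearly tripling the per-viewer bitrate nearly triples the aggregate load, which is why the shift to 4K magnifies the problem so quickly.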
The Super Bowl is a prime example of streaming at scale for service providers. This year, around 8 million US viewers streamed the game – but the total US audience extends to roughly 110 million. If we are to enable a future where that entire audience can stream the Super Bowl – how will service providers rise to the occasion and meet the moment?
This is where having an Open Edge Cloud can really help by pushing the content caching and delivery as far out to the edge of the network as possible. With Open Caching deeply embedded in the service provider network, the millions of unicast streams needed for a massive live event originate from edge servers that are close to end users, downstream of peering and exchange points and the core. A single live seed stream is fed to the edge server, which then delivers a unicast stream for every end user who wants to watch the event. The service provider benefits from the significant traffic offload, and because the stream originates closer to the end user, quality of experience goes up. Everyone wins.
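The fan-out pattern described above can be sketched in a few lines. This is a conceptual illustration only – the `EdgeCache` class and its methods are hypothetical names, not a real Open Caching API – but it shows why upstream load stays flat while the audience grows:

```python
# Conceptual sketch of edge fan-out: one seed stream per edge cache,
# one unicast stream per viewer, served from the edge instead of the core.
# Class and function names are illustrative, not any real Open Caching API.

from dataclasses import dataclass, field

@dataclass
class EdgeCache:
    region: str
    viewers: list = field(default_factory=list)

    def attach(self, viewer_id: str) -> None:
        """Register a viewer whose unicast stream this edge cache will serve."""
        self.viewers.append(viewer_id)

def upstream_streams(caches: list) -> int:
    # One seed stream crosses the core per edge cache, however large the audience.
    return len(caches)

def unicast_streams(caches: list) -> int:
    # Every viewer still gets an individual unicast stream,
    # but it originates at the nearby edge server.
    return sum(len(cache.viewers) for cache in caches)
```

With two edge caches and a million viewers attached, `upstream_streams` is still 2 while `unicast_streams` is a million – the core sees seed streams, not the audience.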
Deliver today, build for tomorrow
As the world’s leading sports events and competitions become commonplace on streaming platforms, we are witnessing a massive demand for high-quality live OTT streaming. Delivering the best viewing experience is increasingly in the hands of the service providers, and the Open Edge Cloud offers a path to meet viewer expectations today and a roadmap for future innovation.
Open Caching is the gateway to realizing live streaming at scale. Streaming from the edge suddenly becomes a reality once you deeply embed caches within service providers’ networks. And this is great news for NFL fans. Because while next year’s Super Bowl may stream to an audience of up to 10 million – the opportunity to stream to an audience 10 times greater is now closer than ever.