A transformation is underway – from Broadcast TV to Online Video
We are in the midst of a transformation that is at once profound and subtle. Broadcast TV is being replaced by online video, streamed across the Internet to our homes. In a few years, kids will look at their parents in amazement and say, “So, are you telling me you actually had to wait until the day and hour your favorite TV show was broadcast in order to watch it?”
On the one hand, this change is welcome as it places the consumer in complete control of their experience. You and I can now choose the content, time, place and viewing device. The consumption model has turned upside down. We are spoiled for choice.
On the other hand, this newfound choice may come at a subtle but real cost. The myriad combinations of content, network, device and place mean the stream that reaches you at any given moment may be traversing uncharted territory. Or, at the very least, the quality of experience may change based on time of day or day of week. And, if you happen to be watching an HBO Game of Thrones Season Four episode on the same day that the latest iOS software update is released, all bets are off for a predictable quality of experience.
It is difficult, therefore, when you see the insidious rebuffering screen, to know with certainty whom to blame and how to fix the problem. Again, imagine watching the HBO Game of Thrones finale when you see the screen below. Who do you call for help? Most consumers might pick up the phone and call their cable company. But in the case of online video, accountability can often be murky. As a starting point, therefore, let’s look more closely at the metrics that define quality of experience (QoE) for streaming video and at the online QoE report cards being provided by some content providers.
Rebuffering – The Menace of Streaming Video
Streaming Video – Complex interrelationships among a vast and diverse ecosystem
For online video streaming, a thorough inspection of quality of experience must examine a number of points along a diverse and complicated value chain where the handoffs are many and accountability is unclear. Today, quality of experience is derived from an ecosystem which encompasses all the following elements:
- Content – Resolution and quality of the video delivered from the studio and the encoding process
- Content Distribution Network (CDN) – Performance and optimization of video cache
- Network Operator – Capacity, congestion and performance along the streaming path from CDN to the consumer’s device
- Device and Application – Quality of the viewing device and client application
- User Behavior – Consumer interaction with the device and client app
This end-to-end view must be vastly simplified in order to reduce the problem to a small set of metrics that can be measured and reported regularly. To this end, most streaming video experts have concluded that three streaming metrics have the most impact on the quality of experience for viewers. These metrics are:
- Start Time – the elapsed time from when ‘play’ is pushed to when video starts on the screen
- Rebuffer Rate – the number of times a rebuffering event occurs during viewing
- Average Bit Rate – the average rate of the video streams, measured in megabits per second (Mbps)
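To make these three metrics concrete, here is a minimal sketch of how they might be computed from player session logs. The session records and field layout are hypothetical, invented purely for illustration; real players report far richer telemetry than this.

```python
# Hypothetical session records: (start_time_s, rebuffer_events,
# bits_streamed, viewing_duration_s). Values are made up for illustration.
sessions = [
    (1.2, 0, 3_600_000_000, 1200),
    (4.5, 3, 1_100_000_000, 900),
    (0.8, 1, 5_250_000_000, 1500),
]

n = len(sessions)

# Start Time: average elapsed time from 'play' to first frame
avg_start_time = sum(s[0] for s in sessions) / n

# Rebuffer Rate: average number of rebuffering events per session
avg_rebuffers = sum(s[1] for s in sessions) / n

# Average Bit Rate in Mbps: total bits delivered over total viewing seconds
avg_bitrate_mbps = sum(s[2] for s in sessions) / sum(s[3] for s in sessions) / 1e6

print(f"Start time:   {avg_start_time:.1f} s")
print(f"Rebuffers:    {avg_rebuffers:.1f} per session")
print(f"Avg bit rate: {avg_bitrate_mbps:.2f} Mbps")
```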
A Simple Speed Test – Falls Short of Measuring QoE
With these metrics in mind, you can quickly see the shortcoming of using the popular Speed Test from Ookla to determine quality of experience for streaming video. This speed test gives you a snapshot of the overall download and upload speeds from your home or office through your ISP to pre-designated measurement points nearby. While this may tell you something about the local performance of your ISP network, it will tell you very little about the key streaming metrics that define video quality of experience. So, using this test alone is of little value when it comes to evaluating the online video experience.
The Ookla Speed Test http://www.speedtest.net/
The One Metric to Watch – Average Bit Rate
In a further simplification, some major content providers, such as YouTube and Netflix, as well as some major ISPs, have focused on Average Bit Rate as the most telling metric for video quality of experience. Average Bit Rate is the average bandwidth consumed by a video stream from the origin server to the client viewing the content. The bit rate of each stream may vary based on the resolution of the content, the quality of the viewing screen, the bandwidth available along the network path and congestion in the network. In the case of Adaptive Bit Rate (ABR) streaming, the bit rate may also vary in real time during the stream in response to network congestion.
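To illustrate how ABR responds to congestion, here is a toy sketch of rendition selection: the player measures its throughput and picks the highest rung of a bit rate ladder it can safely sustain. The ladder values and the safety margin are illustrative assumptions, not any provider’s actual encoding profile or selection algorithm.

```python
# Illustrative bit rate ladder in Mbps, from low definition up to UltraHD.
# Real encoding ladders differ per title and provider.
LADDER_MBPS = [0.4, 0.7, 1.5, 2.5, 5.0, 8.0]

def pick_rendition(throughput_mbps, safety=0.8):
    """Choose the top ladder rung at or below a safety fraction of
    measured throughput; fall back to the lowest rung otherwise."""
    budget = throughput_mbps * safety
    candidates = [r for r in LADDER_MBPS if r <= budget]
    return candidates[-1] if candidates else LADDER_MBPS[0]

# As congestion rises and measured throughput drops, the stream steps down:
for tput in (10.0, 4.0, 1.0):
    print(f"{tput} Mbps measured -> {pick_rendition(tput)} Mbps rendition")
```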
So, over a period in which many videos are being streamed, the average bit rate across all streams, for a specific content provider or a specific operator, will be the best indicator of the quality of viewing experience for users.
There is logic in this approach – of the three core metrics, Average Bit Rate is the best single indicator of video quality. A higher average bit rate means higher-resolution streams (more HD and UltraHD, less SD, for example) and, in general, fewer rebuffering events. Further, this metric can be readily measured and reported in a generally consistent manner.
The Video QoE Report Cards – Same Metric, Two Very Different Formats
So, with this context, let’s take a look at two popular video streaming report cards, the Google Video Quality Report for YouTube and the Netflix ISP Speed Index. While each focuses on Average Bit Rate as the key metric to be measured and reported, the two reports are very different in format and options.
Google Video Quality Report – A friendly, consumer focused report card
Google presents a user-friendly report of streaming quality that is personalized to your location and ISP. Further, Google charts the quality of streams you can expect throughout the day, including peak viewing hours. Google goes further by stamping the report with its “HD Verified” seal of approval, which lets the consumer know the likelihood of getting an HD YouTube stream throughout the day. I ran the test from our offices at Qwilt and got an instant profile of performance, which assured me with 91% confidence that our Comcast connection would deliver YouTube in HD even at the peak hours of the day.
Google Video Quality Report https://www.google.com/get/videoqualityreport/
The Google Report also allows me to change locations in order to see performance in my hometown or another city on the Comcast network. I can also easily compare Comcast performance with other ISPs who serve the same city in case I want to consider switching providers.
The Google methodology, published on the site, is simple and easy to understand. Google simplifies the Average Bit Rate methodology by using ranges for HD (>2.5 Mbps), SD (0.7 to 2.5 Mbps) and LD (<0.7 Mbps). The actual measured bit rate is compared to these pre-defined ranges to determine the “HD Verified” rating.
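Using the published ranges, the classification step can be sketched in a few lines. The thresholds are the ones from the methodology above; the function itself is my own illustration, not Google’s code, and the full “HD Verified” determination involves more than a single measurement.

```python
def classify(avg_bitrate_mbps):
    """Map a measured average bit rate onto Google's published ranges:
    HD above 2.5 Mbps, SD from 0.7 to 2.5 Mbps, LD below 0.7 Mbps."""
    if avg_bitrate_mbps > 2.5:
        return "HD"
    if avg_bitrate_mbps >= 0.7:
        return "SD"
    return "LD"

print(classify(3.2))  # HD
print(classify(1.4))  # SD
print(classify(0.5))  # LD
```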
I found Google’s approach to be clear, simple and easy for a consumer to comprehend. Unfortunately, I watch very few YouTube videos so the value of the report for me is low.
Netflix ISP Speed Index – Focused on ISP QoE Rankings
At first glance, it seems clear the Netflix ISP Speed Index has a very different purpose in mind. Seen below for the US, Netflix provides a very clear ranking of all ISPs based on the Average Bit Rate of Netflix video streams measured across those ISP networks at peak time. There is also an indication of how the ranking of an individual ISP has changed since the last report. Unfortunately, this is where the value of the Netflix report card ends. Unlike the YouTube report, the user cannot see data for a particular city or region, so the consumer is left with an average for the entire ISP. In the case of large ISPs, if there is regional variation in performance, the consumer cannot see it.
Netflix does publish results for all markets in which it is selling services. So, one can access country specific data which shows the ranking of ISPs in that country based on Average Bit Rate.
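To see how such a ranking could be derived, here is a small sketch: group peak-time bit rate measurements by ISP, average them, and sort in descending order. The ISP names and numbers are made up for illustration; Netflix’s actual methodology aggregates measurements from the enormous volume of real streams on each network.

```python
from collections import defaultdict

# Hypothetical peak-time measurements: (isp, average_bitrate_mbps)
measurements = [
    ("ISP-A", 3.1), ("ISP-A", 2.9),
    ("ISP-B", 2.2), ("ISP-B", 2.6),
    ("ISP-C", 3.6), ("ISP-C", 3.4),
]

# Group samples per ISP
samples = defaultdict(list)
for isp, mbps in measurements:
    samples[isp].append(mbps)

# Rank ISPs by their average peak-time bit rate, highest first
ranking = sorted(
    ((sum(v) / len(v), isp) for isp, v in samples.items()),
    reverse=True,
)
for rank, (avg, isp) in enumerate(ranking, start=1):
    print(f"{rank}. {isp}: {avg:.2f} Mbps")
```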
Netflix ISP Speed Index – http://ispspeedindex.netflix.com/usa
It seems quite clear that the Netflix agenda, as noted in many articles covering the report when it was first published, is to encourage ISPs to take actions that will improve their rankings. So, while the relative ranking of one ISP to another may have little meaning to a consumer at home, the Netflix ISP Speed Index has routinely been used to highlight the changes in ISP QoE rankings in every market where Netflix operates.
The consumer does not get much useful information from the Netflix ISP Speed Index. If a consumer is having a problem with streaming quality, it is highly unlikely that the report would be of any value. Of course, Netflix does have very helpful tools in their player that can be of value in troubleshooting a problem in real time. And Netflix has been very vocal about their use of data from the billions of hours streamed each month to improve QoE. However, in terms of reporting, there is room for improvement.
Where Do We Go From Here?
We are still at the beginning of this transformation from broadcast to streaming video. Although there is broad agreement on the important metrics, there is still an overall lack of reporting and accountability for the Quality of Experience for Streaming Video. I may be able to get some indication of QoE for YouTube and Netflix but what about all the other online video content providers?
There is much more work to do on this front. We can expect that, someday soon, an online tool will tell us in real time exactly what experience any content source should deliver when streaming across a broadband or mobile connection. Such a report would also provide trends, so we can quickly see whether our service providers are moving in the right direction. We already have broad agreement on the metrics. The challenge is to make sense of the myriad combinations of content providers, ISPs and devices involved and create a consumer-friendly QoE report that gives us confidence this transformation from broadcast to streaming video will end well for everyone involved.