A joint VO and Broadpeak solution enables broadcasters and operators to start offering their customers low latency live events.
This is a joint blog post on low latency streaming written with colleagues at Broadpeak, following on from news of our joint solution recently deployed at Cellcom.
With the increasing amount of live sport and other premium content carried over streaming networks worldwide, latency has become a pressing issue for broadcasters and operators alike. The extra processes in the streaming chain can delay the signal by between 30 and 40 seconds, compared with around four seconds for traditional broadcast. And given that viewers rarely watch content in isolation nowadays - with social media in particular linking them across countries and across platforms - the result can be significant viewer dissatisfaction with their streaming service.
During the 2022 FIFA World Cup, VO and Broadpeak cooperated to install a system at Israeli operator Cellcom that brought signal latency for the matches it streamed down to as low as three seconds. By combining our technologies with an optimised CDN and the CMAF video format, we believe most operators can now achieve a latency of only a few seconds, and can even match broadcast with reasonable effort.
The result will both please viewers and satisfy rights holders, making the issue of streaming latency effectively a thing of the past. But what are the issues here and how have we addressed them? Read on to find out more.
The causes of latency
There are three processes in the streaming chain that add latency: encoding, packaging, and delivery. The first two account for about three seconds each, adding six seconds to the signal, whereas delivery can take anything up to 25 seconds in the worst case. This accumulated delay can have a significant impact on viewer enjoyment, so the goal is to bring it down as far as possible without compromising the viewing experience.
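The latency budget above comes down to simple arithmetic. A minimal sketch, using the approximate figures quoted in this post (all values are illustrative, not measurements from any particular deployment):

```python
# Approximate latency budget for a live streaming chain,
# using the rough per-stage figures quoted above (seconds).
ENCODING = 3.0         # encoder look-ahead and GOP buffering
PACKAGING = 3.0        # segmenter waiting to close a segment
DELIVERY_WORST = 25.0  # CDN plus player buffering, worst case
BROADCAST = 4.0        # typical traditional broadcast delay

total_worst = ENCODING + PACKAGING + DELIVERY_WORST
print(f"Worst-case streaming latency: ~{total_worst:.0f} s")
print(f"Delta vs. broadcast: ~{total_worst - BROADCAST:.0f} s")
```

Since delivery dominates the total, it is the natural first target for optimisation, as discussed next.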
As expected, the biggest savings come from tackling the largest cause of latency: the variability of the public HTTP network. This has to be addressed on two fronts: first by relying on CDN technology to secure delivery as far as possible and, second, by implementing measures in the player to properly handle the remaining effects of that variability.
One key element we use to reduce latency is converting the video segments to CMAF, the Common Media Application Format. CMAF breaks the video data down into smaller chunks of a set duration, each of which can be published as soon as it is encoded and forwarded as soon as it is received. Combined with chunked transfer encoding (CTE), which applies the same sub-segmentation principle at the HTTP level, CMAF avoids each network element having to download a full video segment (typically 2 to 6 seconds) before processing and passing it on, saving precious time all along the delivery chain. Delivery takes place in near real time while later chunks are still being produced.
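The head-of-line wait that chunking removes can be sketched with a rough model. This is an illustrative simplification, not a measurement: it assumes each hop (packager, CDN edge, player) must fully receive one delivery unit before forwarding it, and it ignores network transit time; the segment and chunk durations are typical examples, not figures from the Cellcom deployment.

```python
# Rough model of the per-hop wait CMAF chunking removes.
# With classic segment delivery, each hop buffers a full segment
# before forwarding; with CMAF + chunked transfer encoding (CTE),
# each hop forwards every small chunk as soon as it arrives.

def forwarding_delay(unit_duration: float, hops: int) -> float:
    """Delay added by waiting for one delivery unit at each hop,
    ignoring network transit time. Durations in seconds."""
    return unit_duration * hops

SEGMENT = 4.0  # a typical segment in the 2-6 s range
CHUNK = 0.5    # a typical sub-second CMAF chunk
HOPS = 3       # packager -> CDN edge -> player

print(forwarding_delay(SEGMENT, HOPS))  # full-segment delivery
print(forwarding_delay(CHUNK, HOPS))    # chunked CMAF delivery
```

Even in this crude model, shrinking the forwarding unit from a whole segment to a sub-second chunk cuts the accumulated wait from many seconds to well under two.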
Looking forward, VO and Broadpeak aim to improve latency at Cellcom and elsewhere even further thanks to one particularly efficient technology: multicast ABR (MABR). MABR distributes a single physical copy of the video content to all users via a reserved multicast network path rather than over individual HTTP connections. It provides live event streaming with infinite scalability and guaranteed delivery and, combined with CMAF, can completely erase the latency induced by adaptive bitrate streaming, matching the level of broadcast.
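The scalability argument behind MABR is easy to illustrate with back-of-the-envelope arithmetic. The bitrate and audience figures below are hypothetical examples chosen for illustration, not data from any deployment:

```python
# Why multicast ABR scales: unicast ABR sends one copy of the
# stream per viewer, while multicast sends a single copy onto
# the network regardless of audience size. Figures are
# illustrative only.

BITRATE_MBPS = 8.0   # one HD ABR rendition, in Mbps
VIEWERS = 1_000_000  # a large live-event audience

unicast_load = BITRATE_MBPS * VIEWERS  # total Mbps injected
multicast_load = BITRATE_MBPS          # one shared copy

print(f"Unicast:   {unicast_load / 1e6:.1f} Tbps")
print(f"Multicast: {multicast_load:.1f} Mbps")
```

The unicast load grows linearly with the audience, while the multicast load stays constant - which is why a reserved multicast path can offer guaranteed delivery at live-event scale.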
Broadpeak’s Multicast ABR has already been integrated into VO’s ecosystem in another commercial deployment, and it is worth noting that this operator was able to reduce the latency delta with broadcast to just five seconds using this technology alone, even before implementing CMAF.
The key takeaway is that streaming service providers can draw on different types of technology to improve latency, operating at the level of the video format, the delivery network, or the player. With tight collaboration among their suppliers, they can efficiently hit their immediate target by choosing the option that best fits their particular environment, and further down the line can even reach latency as good as broadcast by combining them.
Achieving low latency streaming now
We estimate that, using this joint VO-Broadpeak solution, the majority of operators can immediately reduce latency to just a few seconds: the CDN investment required is usually minimal, and the optimisations to the video format and the player are today very well understood.
Any effort to reduce latency is worthwhile, and it is advisable to start as soon as possible given some of the challenges the industry faces. Internet-based video traffic is growing 20 to 35% year on year, which will inevitably put further strain on networks that are not growing at anywhere near the same rate. Viewer expectations are also rising, and we are putting ever more into the video signal, with growing demands for data from sporting events, targeted ad insertion, and more.
The resulting equation is a complex one to balance, but by optimising the CDN and utilising CMAF and MABR we believe we can rival the glass-to-glass speed of the traditional broadcast signal, while providing all the end-user advantages and quality of experience that have made streaming such a popular choice for live events.