Audiences for sports, concerts and other major events are increasingly turning to online video to watch these events. Today, viewers' expectations for the online streaming experience go beyond a broadcast-quality picture and freedom from rebuffering. They are embracing watch-together apps in which some participants watch on mobile devices while others watch the traditional broadcast. This multi-screen environment requires that the latency of online streams closely match that of the broadcast, so everyone sees the live action at the same time. Typical streaming latency is far too long to provide the viewing experience users expect, resulting in unhappy viewers and lost revenue. Fortunately, new live streaming technology makes it possible to deliver live streams with the low latency of broadcast, enabling exciting new experiences that engage viewers in multiple ways and increase stream monetization.
To address the limitations of the internet for video delivery, HTTP-based live streaming formats such as HLS and MPEG-DASH were developed to enable live streaming over TCP/IP. Video streams are encoded as segments (or chunks) that are delivered to the receiving application and buffered before being played, which allows the player to ride out the inherently variable transmission delay of live video over the internet. The typical stream latency of 30 seconds to over one minute comes from the amount of video that is buffered, a big challenge when broadcast latency is typically 5-7 seconds. While it's possible to reduce segment duration to minimize the delay, making segments too small increases the chance viewers will experience rebuffering and other playback issues.
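The relationship between segment duration, buffering and latency can be sketched with some simple arithmetic. This is an illustrative back-of-envelope model with hypothetical numbers, not the buffering policy of any particular player:

```python
# Illustrative sketch: player-side buffering contributes roughly
# (segment duration) x (segments buffered) to glass-to-glass latency.
# Numbers below are hypothetical, not from any specific player.

def buffer_latency(segment_duration_s: float, buffered_segments: int) -> float:
    """Approximate latency contributed by player-side buffering, in seconds."""
    return segment_duration_s * buffered_segments

# A common legacy configuration: 6-second segments, 3 segments buffered.
legacy = buffer_latency(6.0, 3)   # 18 seconds from buffering alone

# Shrinking segments lowers latency but raises the risk of rebuffering.
smaller = buffer_latency(2.0, 3)  # 6 seconds
```

This is why simply shrinking segments only goes so far: the buffer that protects playback from network jitter is the same buffer that adds latency.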
The two most popular HTTP-based packaging protocols, HLS and DASH, use different formats, forcing content distributors to encode and store video files twice to reach viewers across iOS devices, smart TVs and PCs. This is both costly and inefficient. To address this issue and reduce the complexity of delivering online video, Apple and Microsoft worked with MPEG to establish a new standard called the Common Media Application Format (CMAF). CMAF defines a container that holds audio and video content; it is not a media presentation format like Dynamic Adaptive Streaming over HTTP (DASH) or HTTP Live Streaming (HLS). Because it is a container, CMAF can serve as the basis for interoperability between the two different formats. Apple and Microsoft agreed to move forward with fragmented MP4 (fMP4) as the common container format, which can be deployed using either HLS or DASH for improved CDN efficiency. By itself this doesn't have much impact on latency, but chunked-transfer CMAF does.
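To make the container idea concrete, the sketch below walks the top-level box structure of a CMAF/fMP4 fragment. Every ISO-BMFF box begins with a 4-byte big-endian size and a 4-byte type; a CMAF chunk is essentially a `moof` (movie fragment) box followed by an `mdat` (media data) box. This is a minimal illustration on synthetic bytes, not a full MP4 parser (64-bit and to-end-of-file sizes are not handled):

```python
import struct

def iter_boxes(data: bytes):
    """Yield (type, size) for each top-level ISO-BMFF box in the buffer."""
    offset = 0
    while offset + 8 <= len(data):
        size, = struct.unpack(">I", data[offset:offset + 4])
        box_type = data[offset + 4:offset + 8].decode("ascii")
        yield box_type, size
        if size < 8:   # size values 0 and 1 (to-EOF / 64-bit) not handled here
            break
        offset += size

# Synthetic fragment: an empty moof box followed by an mdat with 4 payload bytes.
fragment = (struct.pack(">I4s", 8, b"moof")
            + struct.pack(">I4s", 12, b"mdat") + b"\x00" * 4)
boxes = list(iter_boxes(fragment))   # [("moof", 8), ("mdat", 12)]
```

Because both HLS and DASH can reference the same fMP4 boxes, a distributor can encode and store one set of media files and describe it with two lightweight manifests.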
Chunked Transfer Encoding (CTE) makes it possible to break each segment into smaller chunks of a set duration that are published the moment they are encoded. Players can then start rendering early chunks while later chunks are still being encoded, bringing stream latency down toward broadcast latency (5-7 seconds). This requires that the encoders, CDNs and players in the streaming ecosystem are all optimized to support CTE, and end-to-end testing among partners is needed to ensure the compatibility and robustness of the solution. The workflow below shows two options for distributing low-latency HLS and DASH streams: pull from origin, and live push ingest to the CDN.
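On the wire, CTE uses standard HTTP/1.1 chunked framing: each chunk is sent as a hexadecimal length line, the payload bytes, and a CRLF, with a zero-length chunk marking the end of the response. The sketch below shows that framing with placeholder payload bytes standing in for real CMAF `moof`+`mdat` data:

```python
# Sketch of HTTP/1.1 chunked transfer framing. Each CMAF chunk can be
# framed and flushed to the CDN/player as soon as the encoder emits it,
# without waiting for the full segment. Payload bytes are placeholders.

def frame_chunk(payload: bytes) -> bytes:
    """Wrap one chunk in HTTP/1.1 chunked-encoding framing."""
    return f"{len(payload):x}\r\n".encode("ascii") + payload + b"\r\n"

def end_of_stream() -> bytes:
    """A zero-length chunk terminates the chunked response body."""
    return b"0\r\n\r\n"

# A response body carrying one placeholder CMAF chunk, then end-of-stream.
body = frame_chunk(b"moof+mdat...") + end_of_stream()
```

The key property is that the total body length never needs to be known up front, so the origin can keep appending chunks to an open response for as long as the segment (or the live event) continues.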
Figure: Low-latency streaming workflow of live video for delivery to any device
Both of these formats promise to deliver streams to supporting devices with latencies in the range of 3-7 seconds. This low-latency delivery lets you match broadcast latency, making these formats ideal for streaming popular live sports and other major live events to large audiences.
Live push ingest with Chunked Transfer Encoding (CTE) protects your live origin from being overwhelmed by requests during large live events. It also gives you control of bandwidth to CDN ingest: only a single copy of a stream needs to be pushed to the CDN, no matter how many viewers need to receive it.
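The bandwidth argument for push ingest can be sketched with hypothetical numbers. With direct pulls, origin egress scales with the audience; with push ingest, it scales only with the number of CDN ingest points:

```python
# Back-of-envelope comparison (hypothetical bitrate and audience size)
# of origin egress for direct-from-origin pulls vs. push ingest to a CDN.

def origin_egress_mbps(bitrate_mbps: float, streams: int) -> float:
    """Total egress if the origin serves the given number of stream copies."""
    return bitrate_mbps * streams

viewers = 1_000_000
pull_direct = origin_egress_mbps(5.0, viewers)  # one copy per viewer: unworkable
push_to_cdn = origin_egress_mbps(5.0, 1)        # one copy per CDN ingest point
```

The CDN then handles fan-out to viewers at its edge, which is exactly what it is built for.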
It's important to differentiate the use cases for Low-Latency CMAF and WebRTC, another low-latency technology, which was designed for real-time communication such as interactivity between stream providers and viewers. Low-Latency CMAF, delivered as Low-Latency HLS and DASH, is built for delivery at scale, making these formats ideal for live sporting events. By matching broadcast latency, fans watching the broadcast and those viewing online see all the action at the same time, enabling multi-screen experiences such as watch-together applications.