Frequently Asked Questions

Video Delivery

Did you know that 4K video can consume up to 75x more bandwidth than mobile video? As our hunger for online video grows, so does the demand for bandwidth. To learn more about online video, OTT, and video streaming, read on.

[Chart: typical streaming bitrates for different video formats]

What is OTT?

OTT stands for over-the-top and refers to the delivery of content via the internet, bypassing (or going “over the top” of) traditional cable and broadcast providers. Examples of OTT video service providers include Amazon Prime Video, Disney+, Vudu, HBO Max, and NBC Peacock. Video streaming is used to deliver video to OTT viewers.

What is video streaming?

Video streaming is the continuous presentation of video information to a remote user. Using video streaming technology, users can watch video over the internet without having to download a file to their device. Video streaming can be broadly divided into video on-demand (VOD) and live streaming.

What is Video-on-Demand (VOD)?

VOD, or video on-demand, allows a user to view video content over the internet whenever they choose.

What is live video streaming?

Live video streaming involves streaming video content over the internet while an event is happening.

How does streaming work?

Traditional HTTP streaming divides a video file into multiple smaller segments and then transmits those segments from the server to the client as the viewer plays them. This way, viewers don’t have to wait for the entire video file to transfer; they can begin watching as soon as the first segments arrive. It also limits the amount of data that needs to be sent to the viewer: if a viewer stops watching a video, the remaining segments never need to be transferred.
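The segment-by-segment delivery described above can be sketched in a few lines of Python. This is a toy illustration, not a real protocol implementation; the segment size and function names are invented for the example.

```python
# Minimal sketch of segment-based HTTP streaming (illustrative only): the
# server splits the video into small segments, and the client fetches and
# plays them one at a time.

SEGMENT_SIZE = 4  # bytes per segment here; real segments are a few seconds of video


def split_into_segments(video_bytes, segment_size=SEGMENT_SIZE):
    """Divide the full video file into fixed-size segments."""
    return [video_bytes[i:i + segment_size]
            for i in range(0, len(video_bytes), segment_size)]


def play(segments, stop_after=None):
    """Simulate a viewer: play segments in order, possibly stopping early.
    Returns the bytes actually transferred."""
    transferred = b""
    for index, segment in enumerate(segments):
        if stop_after is not None and index >= stop_after:
            break  # viewer stopped watching; remaining segments never sent
        transferred += segment
    return transferred


video = b"0123456789ABCDEF"
segments = split_into_segments(video)
print(len(segments))                 # 4 segments
print(play(segments))                # full playback transfers everything
print(play(segments, stop_after=2))  # early stop transfers only 2 segments
```

Note how stopping after two segments transfers only half the file, which is exactly the bandwidth saving the paragraph above describes.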

What is buffering?

One of the problems of streaming video over the internet is that many things can interfere with the transmission of the data: the internet connection might be unreliable, internet speeds might be slow, or there may be network contention or congestion. To shield users from these problems, the viewer’s device stores a portion of the video in a “buffer” in advance of the user watching it. This process is known as buffering. If the transmission of the video is interrupted, the viewer’s device can read video segments from the buffer and playback will not be disrupted. If, however, the transmission is so severely interrupted that the buffer is exhausted, the dreaded “Buffering” spinning-wheel icon will appear as the device tries to repopulate the buffer.
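The fill-and-drain behavior of a playback buffer can be simulated with a toy model. The numbers and the one-second-per-tick drain rate below are assumptions chosen for illustration, not how any real player is tuned.

```python
# Toy playback-buffer simulation (illustrative): each tick the network
# delivers some seconds of video into the buffer, and playback drains one
# second. When the buffer is empty, playback stalls ("buffering" wheel).

def simulate_playback(arrivals):
    """arrivals[i] = seconds of video received during tick i.
    Returns the number of ticks on which playback stalled."""
    buffered = 0.0
    stalls = 0
    for received in arrivals:
        buffered += received    # network fills the buffer
        if buffered >= 1.0:
            buffered -= 1.0     # playback drains 1 second per tick
        else:
            stalls += 1         # buffer exhausted: show the spinner
    return stalls


# A steady connection never stalls:
print(simulate_playback([1, 1, 1, 1]))     # 0 stalls
# A fast start builds up a buffer that absorbs part of a later gap:
print(simulate_playback([2, 2, 0, 0, 0]))  # 1 stall
```

The second run shows why buffering helps: the two seconds banked early cover the first two ticks of the outage, and only the third causes a stall.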


What is Adaptive Bitrate Streaming (ABR)?

ABR, or Adaptive Bitrate Streaming, is a form of streaming that dynamically adjusts video quality sent to each viewer based on the speed of the viewer’s internet connection. For example, if the viewer’s internet connection and device are not able to sustain a higher speed needed for HD video, the sender will drop the quality (and bitrate) of the stream. ABR is designed to deliver the best possible video quality for a specific viewer’s connection.
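The core decision in ABR can be sketched as picking the highest rendition that fits the measured throughput. The bitrate ladder and the safety margin below are hypothetical; real players use more sophisticated heuristics.

```python
# Sketch of the client-side decision in adaptive bitrate streaming
# (illustrative ladder and safety margin, not any specific player's logic).

# Hypothetical bitrate ladder, in megabits per second (Mbps).
LADDER = [
    {"name": "480p",  "mbps": 1.5},
    {"name": "720p",  "mbps": 3.0},
    {"name": "1080p", "mbps": 6.0},
    {"name": "4K",    "mbps": 16.0},
]


def choose_rendition(measured_mbps, safety=0.8):
    """Select the highest rendition that fits within a fraction of the
    measured throughput, falling back to the lowest tier if none fit."""
    budget = measured_mbps * safety
    best = LADDER[0]
    for rendition in LADDER:
        if rendition["mbps"] <= budget:
            best = rendition
    return best["name"]


print(choose_rendition(25.0))  # fast connection -> "4K"
print(choose_rendition(5.0))   # budget of 4.0 Mbps -> "720p"
print(choose_rendition(1.0))   # too slow for any tier -> lowest, "480p"
```

The safety margin models the "sustain" requirement in the paragraph above: a player that fills its whole pipe with video has no headroom when throughput dips.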

The most popular adaptive streaming formats are HLS (HTTP Live Streaming), used on Apple products and the Safari browser, and DASH (Dynamic Adaptive Streaming over HTTP), used mainly on Android devices. Older formats also exist, such as MSS (Microsoft Smooth Streaming), which has been discontinued but is still used on some Microsoft devices and the Internet Explorer browser, and HDS (HTTP Dynamic Streaming), a format supported by Adobe’s Flash player.

What is bitrate?

Bitrate refers to the amount of data that is processed or transferred in a unit of time. In the world of video streaming this is usually expressed in megabits per second (Mbps). Higher bitrates generally mean better video quality or resolution. Unfortunately, higher bitrates also consume more internet bandwidth.
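The bandwidth cost of a bitrate is simple arithmetic: bitrate times duration gives total data transferred. The 5 Mbps figure below is an assumed example value, not a quoted standard.

```python
# Back-of-the-envelope: data consumed by a stream at a constant bitrate.
# A bitrate of B Mbps over T seconds transfers B * T megabits; divide by 8
# for megabytes, and by a further 1000 for gigabytes (decimal units).

def data_consumed_gb(bitrate_mbps, duration_seconds):
    megabits = bitrate_mbps * duration_seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes


# A two-hour movie at an assumed 5 Mbps:
print(data_consumed_gb(5, 2 * 3600))  # 4.5 GB
```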

How does video resolution affect bitrate?

Video resolution is the number of pixels that make up each frame of video. The higher the resolution, the better the perceived video quality. Common video resolutions include:

  • Standard Definition or SD, also sometimes referred to as 480p (480 being the number of vertical pixels, and the “p” standing for progressive scan, which refers to how the image is drawn on a screen).
  • High Definition or HD, also referred to as either 720p or 1080p (the latter offering more pixels in an image).
  • Ultra High Definition or UHD 4K, occasionally referred to as 2160p.


When it comes to streaming video, higher resolution formats will require more data to be transferred, which will result in higher bitrates.

Another factor that affects bitrate is the frame rate, expressed in frames per second or FPS. Sports, and especially eSports, benefit from higher frame rates. However, higher frame rates also require more data to be transferred, which results in higher bitrates.

Compression also affects bitrate. Streaming video is compressed in order to reduce the amount of data that has to be transferred, and to preserve internet bandwidth. Different compression technologies are used by different providers, making it difficult to quote standard video streaming bitrates. The chart at the top of this page summarizes typical bitrate ranges for different video formats.
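A quick calculation shows why compression is essential: the raw bitrate of uncompressed video is resolution times bits-per-pixel times frame rate. The 24 bits per pixel and the 5 Mbps compressed figure below are assumed illustrative values.

```python
# Why compression matters (illustrative numbers): uncompressed video is far
# beyond what a typical internet connection can carry.

def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Raw bitrate of uncompressed video: pixels x bit depth x frame rate."""
    bits_per_second = width * height * bits_per_pixel * fps
    return bits_per_second / 1_000_000


raw = raw_bitrate_mbps(1920, 1080, 30)  # uncompressed 1080p at 30 FPS
compressed = 5.0                        # an assumed typical 1080p streaming bitrate

print(round(raw))               # ~1493 Mbps uncompressed
print(round(raw / compressed))  # compression ratio on the order of 300:1
```

The same formula also shows the frame-rate effect from the previous paragraph: doubling FPS from 30 to 60 doubles the raw bitrate.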

What is transmuxing?

Transmuxing is the process of repackaging video content into different streaming formats such as HLS and DASH without modifying the codec or bitrate of the content itself. This is important for streaming video where viewers watch on a variety of devices. For example, with transmuxing you could repackage an MP4 video file into HLS or DASH for delivery to Apple or Android devices.
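With a common tool like ffmpeg, transmuxing is expressed as a stream copy: the codec and bitrate are untouched and only the packaging changes. The sketch below just builds such a command line (the file names are illustrative, and it does not execute anything).

```python
# Transmuxing an MP4 into HLS without re-encoding, as an ffmpeg command.
# "-c copy" copies the audio/video streams as-is, so only the container
# changes; no transcoding happens. File names here are illustrative.

def transmux_to_hls_command(source, playlist, segment_seconds=6):
    return [
        "ffmpeg",
        "-i", source,                       # input file (e.g. an MP4)
        "-c", "copy",                       # stream copy: no re-encoding
        "-f", "hls",                        # repackage into HLS
        "-hls_time", str(segment_seconds),  # target segment duration
        playlist,                           # output .m3u8 playlist
    ]


cmd = transmux_to_hls_command("movie.mp4", "movie.m3u8")
print(" ".join(cmd))
```

Because no re-encoding occurs, transmuxing is cheap and fast compared with transcoding, which is why it is the preferred way to reach both HLS and DASH devices from one encode.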

What is transcoding?

Transcoding refers to the process of altering pre-encoded video content in some way and then re-encoding it. Usually the bitrate or resolution is changed (also referred to as transrating and transsizing, respectively). Transcoding is important in video streaming because there is no universal video format today: different devices use different formats and may be better suited to different resolutions, or even to different bitrates in different environments. Transcoding lets you take a high-resolution master video file and convert it into multiple bitrates, resolutions, and formats to reach the broadest array of devices with the best possible video quality.
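The set of renditions a transcoder produces from one master is often called a bitrate ladder. This sketch generates such a ladder from a 4K master; the rung heights and target bitrates are hypothetical, and real ladders are tuned per codec and content.

```python
# Transcoding a high-resolution master into an ABR ladder (illustrative
# rungs; real ladders are tuned per content and codec).

MASTER = {"width": 3840, "height": 2160, "mbps": 40.0}

# Hypothetical rung definitions: (height, target bitrate in Mbps).
RUNGS = [(2160, 16.0), (1080, 6.0), (720, 3.0), (480, 1.5)]


def build_ladder(master, rungs):
    """List the renditions a transcoder would generate, skipping any rung
    taller than the master itself (never upscale)."""
    ladder = []
    for height, mbps in rungs:
        if height > master["height"]:
            continue
        # Scale width to keep the aspect ratio, rounded to an even number
        # (most codecs require even dimensions).
        width = round(master["width"] * height / master["height"] / 2) * 2
        ladder.append({"width": width, "height": height, "mbps": mbps})
    return ladder


for rung in build_ladder(MASTER, RUNGS):
    print(rung)
```

Each rung is then encoded once and packaged for every target format, so a single master serves everything from a 4K TV down to a phone on a weak connection.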

What is low latency live video streaming?

Latency in video streaming is the delay between when a camera captures the live video and when that video is displayed on a viewer’s device. Latency can vary widely when it comes to live video streaming. Latencies of 30-45 seconds are typical with live streaming protocols like HLS and DASH. By way of contrast, latencies around 5 seconds are more common in traditional broadcast networks. Low latency streaming usually refers to reducing the latency found in typical streaming protocols closer to broadcast levels.
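A rough latency budget shows why segment-based protocols sit so far behind broadcast: the player typically buffers several segments before starting, so segment duration dominates. Every component value below is an assumption for illustration.

```python
# Rough glass-to-glass latency budget for segmented streaming (all the
# component values here are illustrative assumptions).

def estimated_latency_s(segment_seconds, buffered_segments,
                        encode_s=2.0, network_s=1.0):
    """Encoding delay + network/CDN delay + the video the player holds in
    its buffer before playback begins."""
    return encode_s + network_s + segment_seconds * buffered_segments


# Classic HLS-style delivery: 6-second segments, 3 buffered.
print(estimated_latency_s(6, 3))  # 21.0 seconds
# Low-latency tuning: 2-second segments, 2 buffered.
print(estimated_latency_s(2, 2))  # 7.0 seconds
```

Shrinking segments and the startup buffer is exactly how low-latency variants of HLS and DASH push toward broadcast-like delays.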

What is realtime streaming?

In certain live streaming cases, even latencies of only a few seconds might be too long a delay. This is the case for applications that require some sort of interactivity, such as video conferencing, trivia game shows, live sports betting, or auctions. In cases like these, latency should ideally be minimized to the point of being imperceptible. An emerging set of technologies, known as realtime or near-realtime streaming, provides sub-second latencies and is well suited to extremely latency-sensitive applications like these.