Live streams are not live

Video latency simply explained

Live streams aren't actually live: there is usually a delay of a few seconds between the live event and playback on viewers' devices. This delay is called video latency, the time that elapses between capturing the video signal on site and displaying it to the viewer.

Latency occurs during live streams because processing and transmitting live video takes time: the incoming stream has to be decoded, re-encoded into the individual formats (renditions) for adaptive streaming, delivered, and finally decoded again on the viewer's device.
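The overall delay is simply the sum of the delays of each stage in the pipeline. A minimal sketch of this idea, using hypothetical round-number stage durations (not measurements of any particular platform):

```python
# Hypothetical per-stage delays in seconds; real values vary by setup.
PIPELINE_STAGES_S = {
    "capture_and_encode": 0.5,    # camera and on-site encoder
    "ingest_transport": 0.3,      # RTMP/SRT upload to the server
    "transcode_renditions": 1.0,  # decode and re-encode adaptive renditions
    "segment_and_package": 2.0,   # wait for a full segment before publishing
    "cdn_delivery": 0.2,          # edge caching and download
    "player_buffer_decode": 2.0,  # client-side buffering and decoding
}

def glass_to_glass_latency(stages: dict[str, float]) -> float:
    """End-to-end ('glass-to-glass') latency is the sum of all stage delays."""
    return sum(stages.values())

print(f"Estimated latency: {glass_to_glass_latency(PIPELINE_STAGES_S):.1f} s")
```

The point of the sketch: no single stage dominates by itself, so shaving latency means attacking several stages at once, which is exactly what the optimizations below do.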

How does the coding of a live stream work?

A live stream is sent from a source where the video is recorded (for example, an encoder behind the vision mixer) via a streaming protocol, usually SRT or RTMP, to a server where the video feed is then processed further.

Depending on the configuration, the video signal is either segmented directly into smaller chunks for an HTTP-based streaming protocol, or first transcoded into several codecs and resolutions to form an adaptive multi-bitrate stream. The latter is necessary to cover as many devices and network conditions as possible.
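In HLS, such an adaptive multi-bitrate stream is described by a master playlist that lists the available renditions, from which the player picks one to match the device and current network conditions. An illustrative example (the rendition ladder, bandwidths, and paths are made up for this sketch):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
1080p/index.m3u8
```

If the measured throughput drops below a rendition's declared bandwidth, the player switches down to a lower entry instead of stalling.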

The compressed and segmented video data is distributed via a content delivery network (CDN).

Effects of high latency

Viewers want to feel as if they are watching the broadcast in real time, as if they were present in person. High latency degrades the viewing experience and can cause viewers to abandon the live stream. Such viewers are then significantly less likely to tune in to other streams from the same source and are usually lost to the content provider.

Low Latency Graph

How does HLS streaming work?

Most HLS (HTTP Live Streaming) workflows start with an RTMP ingest, which the video platform automatically converts to HLS for delivery. This provides viewers with high-quality HTTP live streaming, but can result in a latency of 30 seconds or more, meaning a huge delay in the live stream.

RTMP is still used for ingest rather than HTTP live streaming because ingesting via HLS would add much more latency. The combination of RTMP ingest and HLS delivery enables playback in an HTML5 video player on all devices.

To optimize live stream latency, segment durations in the HLS protocol have been shortened significantly over time, and encoding settings have been improved, for example by shortening keyframe intervals. These optimizations reduced latency from 18–30 seconds or more to 4–8 seconds.

3Q enables less than 5 seconds of latency

To reduce latency even further while maintaining the same quality, we are introducing the Low-Latency HTTP Live Streaming protocol, LL-HLS for short. If you send us the signal via RTMP, the new low-latency option reduces streaming latency to 3–5 seconds. With an SRT encoder, you can reduce the delay even further.
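LL-HLS achieves this by splitting each segment into much smaller partial segments that are published as soon as they are encoded, so the player no longer has to wait for a full segment. An illustrative media playlist fragment (segment names and durations are made up; the tags are from the LL-HLS extension of the HLS specification):

```
#EXTM3U
#EXT-X-VERSION:9
#EXT-X-TARGETDURATION:4
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
#EXT-X-PART-INF:PART-TARGET=0.333
#EXT-X-MEDIA-SEQUENCE:20
#EXTINF:4.0,
segment20.mp4
#EXT-X-PART:DURATION=0.333,URI="segment21.part0.mp4",INDEPENDENT=YES
#EXT-X-PART:DURATION=0.333,URI="segment21.part1.mp4"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment21.part2.mp4"
```

Instead of buffering several 2–6 second segments, the player can start near the live edge and request each sub-second part as it becomes available.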

The big advantage of our solution: low-latency mode continues to deliver your streams in very high quality and in multiple quality levels (adaptive streaming).

We're constantly working to further reduce latency so that our customers can produce live streams that are broadcast in near real time.

Ready to get started?

Try it free for 7 days
Flexible pricing model