Federico Mengozzi

Streaming of stored video


Streaming of stored video can be classified into three categories: UDP streaming, HTTP streaming, and adaptive HTTP streaming (DASH). All these techniques require extensive client-side buffering and processing to mitigate the effects of end-to-end delay.

UDP streaming

The server transmits the video in chunks at a rate matching the client's consumption rate; it is free to do so since UDP has no congestion control. The client usually maintains a buffer holding around a second of video. Inside the UDP segment, each chunk is usually encapsulated in an RTP packet.
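The pacing idea above can be sketched as follows. This is a minimal illustration, not a real streaming server: the chunk size, consumption rate, and address are assumed values, and the RTP encapsulation is omitted.

```python
import socket
import time

CHUNK_SIZE = 4096           # bytes of video per chunk (assumed value)
CONSUMPTION_RATE = 500_000  # client playout rate in bytes/s (assumed value)

def stream_video(data: bytes, addr=("127.0.0.1", 9999)) -> int:
    """Send video chunks over UDP, paced to the client's consumption rate.

    Returns the number of chunks sent. In a real system each chunk would
    first be wrapped in an RTP packet before going into the UDP segment.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = CHUNK_SIZE / CONSUMPTION_RATE  # seconds between chunks
    sent = 0
    for i in range(0, len(data), CHUNK_SIZE):
        sock.sendto(data[i:i + CHUNK_SIZE], addr)
        sent += 1
        time.sleep(interval)  # pace transmission to match playout
    sock.close()
    return sent
```

Because UDP provides no feedback, the pacing must be done entirely by the server's own clock, which is exactly why a constant rate can fail to keep up with the playout under varying network conditions.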

The server also needs to maintain a separate control connection to handle client commands such as pausing, resuming and skipping.

There are, however, three main drawbacks to streaming over UDP: the constant transmission rate might not be enough to provide continuous playback; an RTSP server is required to process client-side activity; and UDP segments might be dropped by firewalls.

HTTP streaming

HTTP streaming works on top of TCP, whose congestion control can make continuous playout problematic. There are ways to mitigate the issue.


The client starts downloading the video at the highest possible rate, prefetching frames even if they will be played far in the future.

Client buffering

With $b$ the buffer size, $r$ the server's transmission rate and $c$ the client's consumption rate, and assuming $r > c$, the buffer becomes full after $b / (r - c)$ seconds, causing back pressure that forces the server to slow its transmission. For this reason, in practice, the transmission rate cannot stay higher than the client's consumption rate.
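A quick worked example of the fill-time formula, with $b$, $r$ and $c$ as defined above (the helper name and the sample numbers are illustrative):

```python
def buffer_fill_time(b: float, r: float, c: float) -> float:
    """Seconds until a buffer of size b fills, with server transmission
    rate r and client consumption rate c (same units, e.g. Mbit and Mbit/s).
    """
    if r <= c:
        return float("inf")  # the buffer drains as fast as it fills: never full
    return b / (r - c)

# e.g. an 8 Mbit buffer, 2 Mbit/s transmission, 1 Mbit/s playout:
# net inflow is 2 - 1 = 1 Mbit/s, so the buffer fills after 8 / 1 = 8 s
print(buffer_fill_time(8, 2, 1))  # → 8.0
```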

Early Termination and Repositioning

HTTP GET requests can include a byte-range header to pull a specific portion of the file; this is used when repositioning within the video. When repositioning happens, the already buffered data is discarded and the bandwidth spent fetching it is wasted. For this reason it is best not to keep the prefetch buffer too large.
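A byte-range GET looks like the following. This is a sketch that only builds the raw request text (the host, path and byte offsets are made up), to show the `Range` header the client sends after repositioning:

```python
def range_get(path: str, start: int, end: int,
              host: str = "video.example.com") -> str:
    """Build a raw HTTP GET with a byte-range header, as sent when the
    player repositions and only bytes from the new position are needed."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Range: bytes={start}-{end}\r\n"
        "\r\n"
    )

# Repositioning to byte 1_000_000 of the file and fetching ~0.5 MB:
print(range_get("/movie.mp4", 1000000, 1499999))
```

A server that supports range requests answers with `206 Partial Content` and only the requested bytes, so everything buffered before the old position can simply be dropped.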
