Published on September 14, 2022 (over 2 years ago)

An HLS.js cautionary tale: QoE and video player memory

By Scott Kellicker · 5 min read · Video education

We recently worked with a customer whose viewers were experiencing rebuffering and jerky video during live streams. When we looked in Mux Data, we found lower-than-expected Smoothness scores. Digging a little deeper, we noticed the Rebuffering Percentage was higher than expected, but only on some devices.

We started by analyzing CDN logs to rule out Mux Video and our CDNs as contributors to the issue, and we confirmed that the live ingest was stable. Then we turned our focus to the client player.

We compared and contrasted different factors in Mux Data to look for patterns. Here’s what we observed:

  • Browser-based (HTML5) playback was substantially worse than native
  • Rebuffering occurred with both VOD and live streams
  • Rebuffering often started 30 minutes to 90 minutes into the playback session
  • Rebuffering seemed worse and happened sooner on lower-end devices

These clues led us to suspect some type of resource issue. After a bit of trial and error, we realized that the player was not releasing video memory, which was leading to memory starvation. Specifically, the hls.js player engine was not automatically evicting media it had already played from its buffer.

Video player buffers

A player has a buffer for the media it plays (actually, it typically has more than a single buffer — at least one for audio and one for video — but let’s keep things simple). Segments are downloaded into the buffer, decoded, and then played. The position in the media that is currently playing is called the playhead. Conceptually, the buffer can be thought of as having two parts: a forward buffer and a back buffer. The back buffer is the media data that has already been played and is behind the playhead. The forward buffer is the media that has not yet been played. To have high-quality playback, we want the forward buffer to be filled with lots of happy media frames before the decoder is ready for them.

But why have a back buffer? It has two purposes. First, the player may need frames behind the playhead to decode the current media. A small amount of media is required for this. The second purpose is seeking backward in time. When a viewer wants to jump back to rewatch an earlier part of the video, if the media is in the buffer, the seek can occur without redownloading the media.

Back to the investigation

During our customer analysis, we found that the default behavior in hls.js is an infinite back buffer. This caused memory usage to increase over time, which is why things went awry well into the playback: the player was memory starved.

But why were we not seeing this in all browsers that used hls.js? To complicate the picture, it turns out that some browsers will evict back buffer memory behind the scenes. The lesson: configure the player to always evict the back buffer explicitly, rather than relying on browser-specific behavior.

Evicting the back buffer

In general, it’s important to understand how the buffer is managed in your player. Let’s take a closer look at how this works in hls.js (and players such as video.js that use hls.js as an engine).

HLS.js has two primary parameters to control the buffer length. maxBufferLength controls the forward buffer, and backBufferLength controls the back buffer. The default for maxBufferLength is 30 seconds. For most use cases, this is a reasonable value.

However, the hls.js default for backBufferLength is Infinity, which means hls.js will not explicitly free video memory that has been played. This has a detrimental effect on your application, especially for long events.

This parameter should be explicitly set to avoid memory issues. But what to set it to?
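As a sketch, both parameters can be passed to the hls.js constructor. `maxBufferLength` and `backBufferLength` are real hls.js config options; the values below are illustrative, not one-size-fits-all:

```javascript
// Illustrative hls.js configuration: cap both buffers explicitly.
// The option names are documented hls.js config fields; the values
// here are examples to adapt to your use case.
const playerConfig = {
  maxBufferLength: 30,   // forward buffer target, in seconds (the hls.js default)
  backBufferLength: 90,  // seconds of played media to retain; the default is Infinity
};

// In a browser application this config would be passed to the constructor:
//   const hls = new Hls(playerConfig);
//   hls.loadSource('https://example.com/stream.m3u8');
//   hls.attachMedia(videoElement);
```

The key point is that `backBufferLength` is set to a finite number, so hls.js frees played media instead of holding it for the life of the session.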

Back buffer guidelines

If you don’t want to think about it, a good default back buffer for many use cases is 30 seconds. Sleep well tonight and skip to the next section.

If the stream is a live event with no backward seek or DVR functionality, set the back buffer to 10 seconds. This ensures that you’ll keep at least one segment in memory for decoding purposes while drastically reducing memory usage.

For use cases such as VOD or DVR, the answer to how much of a back buffer you need is “it depends.” It’s a trade-off between fast seeking behavior and memory consumption. Think about your content and use case. How far back in time is a typical seek? How frequently will backward seeks occur? What are your viewers’ expectations for a short seek versus a long seek? Generally, viewers expect a short seek to be quick and are used to a longer seek having some latency. Try starting with a short backBufferLength — say, 60 seconds — and see how it goes. You can always lengthen it later if you need to.
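The guidelines above can be sketched as a small helper that picks a starting `backBufferLength` by use case. This is a hypothetical convenience function, not part of hls.js, and the values are the starting points suggested above:

```javascript
// Hypothetical helper: choose a starting backBufferLength (in seconds)
// based on the guidelines above. Not part of the hls.js API.
function chooseBackBufferLength(useCase) {
  switch (useCase) {
    case 'live-no-dvr':
      return 10; // keeps at least one segment for decoding while minimizing memory
    case 'vod':
    case 'dvr':
      return 60; // short starting point; lengthen later if backward seeks feel slow
    default:
      return 30; // reasonable default for many use cases
  }
}

// Usage sketch with the hls.js constructor:
//   const hls = new Hls({ backBufferLength: chooseBackBufferLength('vod') });
```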

Reevaluate

  • Once you’ve made your changes, use Mux Data to monitor how your viewers’ experience is affected. Keep an eye on your Rebuffering Percentage and Seek Latency.
  • Consider using the experiments feature in Mux Data and running an A/B test.
  • Memory profiling your client application in real-world situations, such as a long-running event, is also a good idea.

Mux Player

Ultimately, you shouldn’t have to be thinking about back buffers and micromanaging your QoE. At Mux, our vision is to make video as easy as possible. This is why we’re developing Mux Player, which integrates easily with Mux Video and Mux Data. Mux Player takes care of the painful details, like optimizing memory usage, so you can focus on what you want to build with video.


