Updated 04/23/24: SRT live ingest protocol support is now GA. You can read more about it here.
The perpetual challenge with live streaming is right there in the name - live. It's only live once: get it wrong and your viewers have missed the winning penalty or Ryan Gosling performing "I'm Just Ken" at the Oscars. We've built a lot of resilience into Mux Video, but it doesn't matter how reliable our transcode, packaging, and delivery chains are: if you can't get a reliable stream into Mux, you can't get a reliable stream back out to your viewers. Since we launched live streaming on Mux Video over 5 years ago, we've relied on the RTMP protocol for getting that stream in.
I've been in video streaming for well over a decade at this point, and for most of that time, RTMP has been the only universally accessible protocol for live stream contribution. It's over 20 years old and started to creak at the seams a while ago. The core problem with RTMP is that it's ultimately a connection-oriented protocol built on TCP. On the big scary World Wide Web, it's easy for that connection to drop when you hit network connectivity problems, and even if you're following all of our best practices (including our awesome slates technology), this still results in an interruption to your stream. Sadly, most of the time the action doesn't stop when your stream does, so your viewers might have missed the big goal.
As live streaming became accessible to the general public (and was therefore used more widely on public networks), it became clear that a replacement for RTMP was overdue. Not only was it hard to manage on imperfect networks, it wasn't future-proof: it didn't support modern video codecs or features like multi-track audio. Around 5 years ago, the winds of change started to gust, and we saw more options coming to market to fill the gap: contribution protocols with more reliability built in, such as Zixi, SRT, RIST, and WHIP, all of which promised to be better.
As much as we'd have loved to jump on the bandwagon early, the challenge was adoption among encoders - it takes time for new contribution protocols to be adopted. Mux also sees one of the widest varieties of source encoders you're likely to find in the wild, and initially many of those encoder vendors invested in different ecosystems. Over the last couple of years, though, we started to see much more consistent availability of SRT among the tools that our customers were using.
SRT is an open source contribution protocol developed by Haivision, and it works differently from RTMP at its core. SRT is a UDP-based protocol (so no more worries about dropped TCP connections); instead, it relies on protocol-level acknowledgment to handle data loss and re-ordering. SRT implements this reliability by keeping a buffer on each end of the connection, giving the protocol time to compensate for lost or delayed data. Depending on how you use it, this can mean that SRT has a higher point-to-point latency than RTMP generally has today; however, SRT allows you to configure this buffer so you can make the right tradeoff between latency and reliability for your use case.
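To make that tradeoff concrete, here's a purely hypothetical sketch: many SRT tools expose that buffer as a latency option that can be passed as a URL query parameter. The host and port below are placeholders, and the unit of the latency value varies by tool, so check your encoder's documentation.

```
# Hypothetical SRT URL - host and port are placeholders, not real endpoints.
# A larger latency value gives SRT more time to retransmit lost packets,
# at the cost of a longer end-to-end delay.
srt://example-host:6000?latency=2000
```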
I won't go into all the technical details of SRT in this blog post - that's one for another day (and another author). But by the way, SRT stands for Secure Reliable Transport. Does the blog title make more sense now?
Introducing the Mux Video SRT public beta
Over the last few months, the team has been busy implementing SRT for Mux Video, and we're finally ready to take the covers off: SRT support for Mux Video is now in public beta. You can push video to any existing or new live stream on Mux over either RTMP or SRT.
What's even more exciting is that you can also choose to push your stream to Mux as HEVC. This should allow you to reduce the bitrate you're encoding locally by 30-40% at a comparable quality, which in turn increases the reliability of your incoming stream.
Getting started with SRT
You don't need to do anything to enable your account or live streams for SRT ingest. However, authentication for SRT is a little different than it is with RTMP, and requires two pieces of information:
- streamid. This is the same stream_key attribute you know & love from RTMP.
- passphrase. This is a new piece of information exposed in the Live Streams API called srt_passphrase. You'll need to use this when your encoder asks you for a passphrase. All new and existing live streams now expose the srt_passphrase field.
You can get this information through the API using the Get Live Stream API call:
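For instance, here's a minimal sketch using curl and jq - MUX_TOKEN_ID and MUX_TOKEN_SECRET are a standard Mux API access token pair, and LIVE_STREAM_ID is the ID of the stream you want to push to:

```
# Fetch a live stream and pull out the two values needed for SRT authentication.
curl -s "https://api.mux.com/video/v1/live-streams/${LIVE_STREAM_ID}" \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  | jq '{stream_key: .data.stream_key, srt_passphrase: .data.srt_passphrase}'
```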
You can also see the SRT connection details from the dashboard of any live stream:
Depending on the encoder you're using, the exact path to setting the SRT configuration will vary. Many encoders accept all configuration parameters in the form of an SRT URL, in which case you'll need to construct an SRT URL as below, substituting the stream_key and srt_passphrase.
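As a rough sketch, such a URL looks something like this - the host and port below are placeholders, so check the Mux SRT documentation for the current ingest endpoint:

```
# Hypothetical SRT ingest URL - substitute Mux's documented SRT host and port,
# plus your own stream_key and srt_passphrase.
srt://SRT_INGEST_HOST:SRT_INGEST_PORT?streamid={stream_key}&passphrase={srt_passphrase}
```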
For example, OBS accepts the SRT endpoint as a single URL, which you configure in the OBS Stream Settings panel:
Stream Key should be left empty as both the Stream ID and the Passphrase are being set in the URL field.
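Put together, the relevant OBS fields end up looking roughly like this (using the same hypothetical URL sketched above):

```
# OBS: Settings -> Stream
Service:    Custom...
Server:     srt://SRT_INGEST_HOST:SRT_INGEST_PORT?streamid={stream_key}&passphrase={srt_passphrase}
Stream Key: (leave empty)
```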
You can find lots more configurations for encoders including OBS, Videon, Larix Broadcaster, FFmpeg, and Gstreamer in the documentation for the SRT feature, along with more details on common optional configuration parameters.
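As one illustrative example (not the exact command from the docs), pushing a local file over SRT with FFmpeg might look something like this, again using the hypothetical endpoint from above:

```
# Hypothetical FFmpeg push over SRT - the srt:// host and port are placeholders.
# -re paces the input in real time; the stream is sent as MPEG-TS over SRT.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -b:v 3000k \
  -c:a aac -b:a 128k \
  -f mpegts "srt://SRT_INGEST_HOST:SRT_INGEST_PORT?streamid={stream_key}&passphrase={srt_passphrase}"
```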
If you've configured your encoder correctly, you should be all set to connect and start streaming. You can then check the live stream in your Mux Dashboard, where you'll see all the usual state transitions, events, and webhooks you'd expect when connecting from an RTMP source.
Try it out today
All Mux Video accounts have access to SRT today at no extra cost, so why not give it a try?
We'd love to hear your thoughts and feedback on SRT support, and as always, let us know if there are other ingest protocols you'd like to see us explore.
SRT support is in beta: there may be rough edges or bugs, and we may not be able to support this feature under our SLAs. Keep this in mind when deciding whether to use it for production traffic.