The New York Times is one of the best examples of how traditional media can make the transition to an online world, and has been rewarded in recent years with strong growth in both its profits and its digital subscriber base. The company has invested heavily in online video for years and has been a pioneer in the use of new technologies like 360 video.
Recently, The New York Times adopted the Mux QoE analytics platform. We sat down with Flavio Ribeiro, lead software engineer at The New York Times, and asked him about his experience with Mux.
Mux: Hi Flavio - tell us more about yourself and your work.
Flavio Ribeiro: I work as the lead engineer in the video technology group at The New York Times. I've been working with online video technology for 7 years, including transcoding and publishing services, live streaming infrastructure and player implementations. I've also worked with CDN integrations, on-the-fly packaging and peer-to-peer technologies.
Mux: Great. How did you get into video technology?
Flavio: While doing my bachelor's degree back in Brazil, I worked at a spin-off company from the research laboratory that was responsible for designing part of the Brazilian digital television system. My first contact with video delivery and video broadcast was on this project.
Mux: Cool. What do you like about working with video?
Flavio: I guess I'm passionate about video because it's not only about software development. Video is a whole ecosystem in itself: a mix of engineering, math and physics, together with programming languages, browsers, networking and the web. It's also a world that keeps improving, so the challenges in this field are endless.
Mux: Tell us more about the role of video at the New York Times.
Flavio: Video is one of the verticals that supports our visual journalism strategy. We have a wide variety of employees in the video department, including journalists, editors, engineers, product managers, project managers, business managers and others. The technology part of the group has a goal of keeping the video experience as good as the quality of our journalism: we want to offer playback at the same level as the content we publish.
Mux: You recently blogged about improvements you've made to your video platform. Can you tell us more about what you did?
Flavio: As the end of the year approaches, I like to do a review of the projects I've worked on. So this year I decided to write down some of the projects I had a chance to collaborate on with the rest of the engineering team. We are planning to write three different blog posts.
The first post was about the changes we made to the on-demand stack. We just released the second one, about the enhancements we made to our live streaming stack. We plan to write a final one about the improvements we've made to accessibility: closed captions and a bunch of tweaks we made in the player to make it more accessible.
Mux: What role did Mux play in these projects?
Flavio: We decided to use Mux to make sure we were improving delivery quality after changing all the pieces that we were planning to change. We wanted to make sure we weren't hurting our users.
Mux: How specifically did you use Mux data to do that?
Flavio: First, we measured the playback experience of the old way of serving videos. We wanted to have a good picture of what was going on at that time.
After that, we built the new delivery stack with on-the-fly packaging. When sending users to the new approach, we wanted to make sure we could maintain or improve the playback experience. I think that was the biggest goal.
We also used Mux for doing some A/B tests, like the use of HLS.js on Android phones.
Mux: Tell us more about that specific use.
Flavio: So for our closed captions support, we decided to go with segmented WebVTT in the HLS protocol. We could see that native playback of those closed captions on Android phones was not consistent across all the different phones and Android versions. So instead of using native HLS playback, we decided to go with HLS.js. But again, we wanted to make sure we would be maintaining or improving the experience our Android users already had, and the results were interesting.
We basically sent 20% of our Android users to render HLS using HLS.js, with the rest using native playback. At the end of the test, we could see that the startup time was not as good as with the native player, but the rebuffering score was much better. In the end, targeting the bigger goal, which was serving captions across all platforms, we decided to go with HLS.js.
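A split like this can be implemented with a deterministic bucketing function, so each user consistently lands in the same arm of the test. Here is a minimal sketch; the function names and the hash are hypothetical, not the Times' actual code:

```javascript
// Hash a stable per-user identifier into a number between 0 and 99.
function bucketOf(userId) {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep it unsigned 32-bit
  }
  return hash % 100;
}

// Send 20% of Android users to the HLS.js arm; everyone else
// (including all non-Android users) keeps native HLS playback.
function shouldUseHlsJs(userId, isAndroid) {
  return isAndroid && bucketOf(userId) < 20;
}
```

In a player, the flag would then pick the engine: attach HLS.js (`hls.loadSource(manifestUrl); hls.attachMedia(video)`) when it returns true, and fall back to setting `video.src` to the manifest URL otherwise.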
Mux: Do you know why the startup time was not as good but the rebuffering was better?
Flavio: That's one area we'll be investing some time in in the near future. We already have some ideas to test. I have a suspicion it's related to how the internal buffer is managed in the native implementation versus HLS.js. I'd also like to test HLS with fragmented MP4s, which would avoid having the user's browser do the transmuxing in the JavaScript layer. I believe we have plenty of room for improvement here, and I'm excited to start experimenting.
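For context, a fragmented-MP4 HLS media playlist (per RFC 8216) declares its initialization segment with `EXT-X-MAP`, which lets a player feed the fMP4 segments to Media Source Extensions directly instead of transmuxing MPEG-TS in JavaScript. A hypothetical playlist might look like:

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.000,
segment0.m4s
#EXTINF:6.000,
segment1.m4s
#EXT-X-ENDLIST
```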
Mux: What were the most important metrics you used to make sure the quality of experience was getting better and not worse?
Flavio: For the most part, I trusted the Viewer Experience Score in Mux. When we saw outliers for rebuffering events and startup time, we would dig into the details.
Mux: Were you able to see performance improvements?
Flavio: Yes. As of right now, we see that people are facing fewer errors, and our rebuffering score is much better. We still need to work on improving the startup time, which is something Mux was able to show us: we are building part of our roadmap based on Mux's findings.
Mux: You touched on this earlier, but I'd love to hear more about why performance is important to you.
Flavio: As I said, one of the main goals we have as the technology group is to have the technology stack be as good as the quality of our journalism. In my opinion, when it comes to video playback, performance is everything. We want to make sure that users are able to play and watch the content we publish seamlessly.
Mux: What was your overall experience like working with the Mux product and the Mux team?
Flavio: It’s really great to be able to talk to the developers as I'm trying to solve problems and make improvements. The other day, we were talking about the granularity of the Mux graphs in the Mux channel on video-dev Slack, and it’s nice to have this kind of direct communication with the Mux team. I've seen some problems that were fixed in like 3 or 5 minutes. It’s been really easy to deal with the Mux integration, the Mux product, and Mux engineers.
Mux: What would you like to see from Mux going forward?
Flavio: I'd like to have more real-time information about what's going on with our video delivery experience, and more support for our live streaming events. Ultimately I'd love to have a TV mode so I can just look at the TV and make sure we're doing great.
Mux: What is next for the New York Times (that you can talk about)?
Flavio: We just released closed captions support for our videos. It is a big deal for us. It was a nice joint effort between the newsroom, product, design and engineering groups. This feature had been on our wishlist since 2014, and it opens up some interesting possibilities. It will allow us to get closer to industry standards, enable more accessibility for our viewers and allow us to do muted autoplay. It also puts us just one step away from subtitles.
For the future, we want to keep improving the experience. Smart thumbnails, fragmented MP4s, VP9/HEVC, metadata extraction and recommendation systems, among others, are on the list that will shape our next backlogs.
We've also been working a lot to modernize the player and use newer technologies. Down the road, we want to extract video metadata to feed our recommendation systems and our search engine. Hopefully a lot of this will land next year.
All the decisions and projects we take on will be data driven. We like how we've managed to make decisions based on A/B tests and other analytics, so we'll keep this approach when defining clear OKRs for our next goals.
Mux: What advice would you have for Mux users, or for someone who is considering Mux?
Flavio: First, the player integration was not a hard task. It was actually really straightforward. We have our own player implementation but we didn't have any issues integrating with Mux at all.
My advice is to set up a trial and play with all the graphs and see if they make sense for your case. For example, for us, Mux surfaced a bunch of gaps that we have, such as the startup time problem I mentioned before. This is pretty helpful, as it will drive some pieces of our roadmap.
Mux: What advice would you have for other publishers who want to optimize their video platforms?
Flavio: One thing that we've been trying to do, and that in my opinion is working pretty well, is relying on open source projects. Not only relying on them, but also publishing our own software as open source. Every time we start a new project, we think about whether some components, or even the whole project, could be open sourced.
In my opinion, most publishing and media companies are still black boxes; you don't actually know what's going on in their technology stacks. I'd love to see more of what they're doing available on the internet.
Mux: Great. Last question: tell us about video-dev slack.
Flavio: The video-dev Slack has been around for less than a year, and it's growing really fast. It's a community where we discuss all aspects of video delivery, from open source projects to products and business. We have engineers, but also product and project people. I've been looking at the adoption rate, and people have been joining in droves over the last few weeks. I'm really happy to see video people getting together in one place.
I’d also like to take this opportunity to invite you, if you’re reading this, to join us and talk to us. To sign up, you can just go to video-dev.org and invite yourself.