How to Start Playing a Video at a Specific Point in Time

Quick tip. How do you start playing a video from a specific point in time with HLS? Use the EXT-X-START tag. It’s optional, but if it’s present in a playlist it specifies the preferred point in the video to start playback.

The following attributes are defined in the spec for the EXT-X-START tag:

  • TIME-OFFSET – (required) The offset, in seconds, at which to start playing the video. A negative value specifies an offset from the end of the playlist. The absolute value of the offset should not be larger than the playlist duration.
  • PRECISE – (optional) Valid string values: YES and NO. If YES, playback starts exactly at the point specified by TIME-OFFSET, and frames in the segment prior to the offset are not rendered. The default is NO.

Let’s look at an example. Assume that we want to start playback precisely 25 seconds from the beginning of the video. You would add the following tag to the playlist:

#EXT-X-START:TIME-OFFSET=25,PRECISE=YES

If we want to start 15 seconds from the end of the playlist and we aren’t concerned with precision, the tag will look like this:

#EXT-X-START:TIME-OFFSET=-15

To see how this works in practice, let’s look at some examples. (I recommend watching the videos with Safari on an iOS device. I had mixed results with Safari on the desktop.) The first example starts playing at the beginning of the video, the second one – same video, different playlist – starts playing 25 seconds in.

There’s one more thing to consider if you want to use the EXT-X-START tag: it first appeared in version 6 of the protocol, so you need to set the EXT-X-VERSION tag in the playlist accordingly.

Here’s the complete playlist for the example with the EXT-X-START tag:

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-START:TIME-OFFSET=25,PRECISE=YES
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:10.00000,
sintel0.ts
#EXTINF:10.00000,
sintel1.ts
#EXTINF:10.00000,
sintel2.ts
#EXTINF:10.00000,
sintel3.ts
#EXTINF:10.00000,
sintel4.ts
#EXTINF:10.00000,
sintel5.ts
#EXTINF:10.00000,
sintel6.ts
#EXT-X-ENDLIST

If you want to play the videos in a browser other than Safari, you may want to check out an earlier post that describes how to do this. If it doesn’t work as expected, this may be because the client doesn’t support the appropriate version of HLS. Your mileage may vary.

Media Playback on Apple TV

I’ve just watched a tech talk session about media playback on Apple TV.

There were some interesting uses of HLS. The video player has a sliding thumbnail view, built on I-frame playlists, that lets viewers seek to a specific point in the video, and there are additional properties for displaying metadata about the video, such as chapter markers and interstitial content. What was particularly interesting, though, was the integration with Siri. In the presentation, the presenter paused the video and asked Siri what the character in the video had just said. The video then skipped back 15 seconds and the subtitles were temporarily enabled for that part of the video; after 15 seconds they were turned off again. A novel use of subtitles!

It’s given me a few ideas for some new content for the book. (Looks like I’ll have to get myself an Apple TV now!)

Check it out if you have the time. It’s worth watching.

Playing HLS Video in the Browser

HLS video works in all browsers on iOS and in Chrome Mobile on Android, but on the desktop the only browser that supports HLS natively is Safari. The good news is that it is possible to play HLS video in other desktop browsers, such as Chrome or Firefox, but you need a little help from some third-party tools. There are two options available:

  • A Flash player with a plugin that handles HLS video
  • The HTML5 video element with Media Source Extensions (MSE)

We’ll look at an example of each in turn and discuss how they work in more detail.

Flash Player with Plugin

Flash works across practically all desktop browsers (and some mobile ones), so you can be confident that your HLS videos will be viewable across a variety of browsers and platforms (Windows, Mac, Linux).

For the example, I’m going to use the Flashls plugin. It works with a number of different Flash players and supports a variety of HLS features such as adaptive streaming, encryption, and so on.

Given that Flash is being phased out, this solution may not be the best one going forward. Having said that, Flash is unlikely to disappear completely in the next year or two, but it is on the way out. The future is HTML5 video, and that’s what we’ll look at next.

HTML5 + Media Source Extensions

The introduction of the video element in HTML5 means that it’s now possible to play video on the web without having to install plugins. However, not all browsers support the same video formats. Media Source Extensions (MSE) allow JavaScript applications to generate media streams dynamically for playback, which opens the door for third parties to add support for different formats, as we’ll see shortly.

To demonstrate how to use the HTML5 video element to play HLS video, I’m going to use the hls.js library. Check out the getting started guide for more information.
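
To give you an idea of what this looks like in practice, here’s a minimal sketch of how hls.js is typically wired up to a video element. The playlist URL is a placeholder, the CDN link is just one way to load the library (pin a specific version in production), and I’ve left out error handling:

<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById('video');
  var source = 'https://example.com/vod/prog_index.m3u8'; // placeholder URL

  if (Hls.isSupported()) {
    // MSE is available, so let hls.js fetch the playlist and feed the segments
    var hls = new Hls();
    hls.loadSource(source);
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari (and iOS) can play HLS natively, no library required
    video.src = source;
  }
</script>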

Try viewing the example in a compatible browser.

One thing you need to be aware of is browser compatibility. Although the MSE specification provides some guidelines on how the API should behave, behaviour does vary across browsers. You also need a relatively recent version of the browser, both on the desktop and on mobile devices. (On iOS, MSE is not supported at all!)
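
If you want to check for MSE support yourself before deciding which playback path to take, a simple feature test is enough. This is just a sketch; the MIME type is only an example of what you might probe for:

<script>
  var mseSupported = 'MediaSource' in window &&
    MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E,mp4a.40.2"');

  if (mseSupported) {
    // safe to use an MSE-based player such as hls.js
  } else {
    // fall back to native HLS playback (Safari, iOS) or Flash
  }
</script>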

Conclusion

As the approach outlined above shows, it is possible to play HLS video across different browsers using only the HTML5 video element and MSE. Hopefully, in the not-so-distant future, browsers will support HLS natively and there will be no need for third-party JavaScript libraries.

How to Add Subtitles to a Live HLS Stream

In this post I’ll describe how to add subtitles to a live HLS stream.

Subtitles can be added to a live video stream by creating a live subtitle playlist. Before I delve into the details, let’s recap how the playlist for a live video stream works. A live playlist contains a fixed number of entries; as time progresses, new entries are added to the end and old ones are removed from the front. Let’s look at an example:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:2
#EXT-X-TARGETDURATION:10
#EXTINF:10,
fileSequence2.ts
#EXTINF:10,
fileSequence3.ts
#EXTINF:10,
fileSequence4.ts
#EXTINF:10,
fileSequence5.ts
#EXTINF:10,
fileSequence6.ts

In this instance there are 5 entries in the playlist, each segment being 10 seconds in duration. Every 10 seconds or so, the first segment is removed and a new one is added to the end of the playlist. The target duration tells the client how often it should go back to the server to retrieve new content. The client will continue to fetch the playlist until the stream ends. This occurs when the #EXT-X-ENDLIST tag is added to the playlist. (There’s more detail in the book.)
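
To make the sliding window concrete, here’s roughly what the same playlist might look like after the next update. The media sequence number has gone up by one, fileSequence2.ts has dropped off the front, and a new segment has been appended (the file names simply follow the pattern above):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:3
#EXT-X-TARGETDURATION:10
#EXTINF:10,
fileSequence3.ts
#EXTINF:10,
fileSequence4.ts
#EXTINF:10,
fileSequence5.ts
#EXTINF:10,
fileSequence6.ts
#EXTINF:10,
fileSequence7.ts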

Segmenting Video with ffmpeg

To stream video with HLS, you need to divide your video into segments of a fixed duration and add them to a playlist. In the book I use Apple’s HTTP Live Streaming tools to do this. Here’s an example using mediafilesegmenter:

$ mediafilesegmenter -f /Library/WebServer/Documents/vod sample.mov

This command takes the video (sample.mov) and writes out the segments and the playlist to the /Library/WebServer/Documents/vod directory. Unfortunately, Apple’s tools will only work on a Mac.

However, recent versions of ffmpeg can also output HLS compatible files. Given a video as input, it will divide it into segments and create a playlist for us.

Here’s the equivalent of the command above using ffmpeg:

$ ffmpeg -y \
 -i sample.mov \
 -codec copy \
 -bsf h264_mp4toannexb \
 -map 0 \
 -f segment \
 -segment_time 10 \
 -segment_format mpegts \
 -segment_list "/Library/WebServer/Documents/vod/prog_index.m3u8" \
 -segment_list_type m3u8 \
 "/Library/WebServer/Documents/vod/fileSequence%d.ts"

We use ffmpeg’s segment muxer to segment the video. We can specify the segment duration with the -segment_time option. The last argument passed to ffmpeg is the path to where the segments should be written; it contains a format specifier (%d) similar to those supported by the printf function in C. The %d will be replaced with the current sequence number. In this example, the segments will be named fileSequence0.ts, fileSequence1.ts, and so on.
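
The exact contents depend on your ffmpeg version and your source material, but the generated prog_index.m3u8 will look something like this (the durations below are made up; with -codec copy, segments are cut on keyframes, so they won’t be exactly 10 seconds):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-TARGETDURATION:11
#EXTINF:10.427422,
fileSequence0.ts
#EXTINF:9.968300,
fileSequence1.ts
#EXTINF:10.010000,
fileSequence2.ts
#EXT-X-ENDLIST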

And that’s how you process a video for streaming with HLS using ffmpeg. There are other examples in the book, including how to use ffmpeg to segment a live video stream, so if you want to learn how, buy your copy today.

In part 2 we’ll look at how to segment video using ffmpeg’s hls muxer.

Adaptive Streaming with HLS

The following article is an excerpt from Chapter 4 of HTTP Live Streaming: A Practical Guide. This is the introductory section of the chapter and is meant to give you the background you need to understand how adaptive streaming works and how to do it with HLS.

In previous chapters, we’ve seen how to stream a video with HLS for both on-demand and live video. However, a single video stream at a particular bit rate isn’t going to work on all the devices that people watch video on today. Some devices are more powerful than others, screen sizes are different, some support different H.264 profiles, and so on. Connection speeds can also vary depending on how you are connected to the internet.

For example, you can watch an HD video stream on your flat screen TV at home, but you won’t get the same experience watching the same stream on a mobile device over a cellular network. Playback will stall because the device won’t be able to download the video data fast enough. In some cases, the video may not play at all because the device isn’t powerful enough to decode it. Then there’s the wasted bandwidth: most mobile screens don’t have the same resolution as a TV, so all those extra pixels are wasted.

The amount of bandwidth available also plays a part. Your internet provider may promise you super-fast connection speeds, but what you actually get may fluctuate at any given time. If everybody in your neighbourhood starts streaming the latest movie from Netflix or watching YouTube videos, you can be fairly certain that your connection speed will drop off. The result? More stalls and buffering.

We need a solution that allows us to deliver a video stream that’s optimal for the device the stream is being watched on, and that is where adaptive streaming comes into play.
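
As a quick preview of where this is going: with HLS, adaptive streaming is driven by a master playlist that lists several variants of the same video, each encoded at a different bit rate, and the player switches between them as conditions change. A minimal master playlist (the bandwidths, resolutions, and paths below are just placeholders) looks something like this:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
mid/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/prog_index.m3u8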

Here’s the plan for what we’ll cover in this chapter:

  • First we’ll look at what adaptive streaming is, how it works, and why it’s useful.
  • Then you’ll learn how to take a video and stream it on-demand using adaptive streaming.

We’ll be doing some video encoding in this chapter so if you come across any terms you aren’t familiar with, refer to Appendix A, An Encoding Primer.
