How to implement HTTP Live Streaming server on Unix?

alex · Dec 12, 2011 · Viewed 19.1k times

I just realized that Apple requires HTTP Live Streaming in order to view videos in iPhone apps. I was not aware of this before... I am now trying to understand what this involves so I can decide whether to do the work and make the videos available over 3G, or limit video playback to users who are connected to Wi-Fi.

I read the overview provided by Apple, and now understand that my server needs to segment and index my media files. I also understand that I don't have to host the content to be able to stream it (I can point to a video hosted somewhere else, right?).

What's not clear to me at this point is what to implement on my server (Ubuntu Hardy) to do the actual segmenting and indexing on the fly (once again, I do not host the videos I want to serve).

I found a link explaining how to install FFmpeg and x264, but I don't know if this is the best solution (since I have an Ubuntu server, I can't use Apple's Live Streaming tools, is that correct?). Also, I do not understand at what point my server knows that a video needs to be converted and starts the job...

Any feedback that could help me understand exactly what to do on the server side to be able to stream videos to my iPhone app over 3G would be greatly appreciated! (Oh, and in case it makes any difference, my app back-end is in Rails.)

Answer

Roman Gaufman · Sep 5, 2012

There are several competing technologies, but today, if you want your files to be streamable to Apple devices (iPhones, iPads, etc.), then HLS is the way to go. Incidentally, it is also supported by most browsers and by Android, so it's not a bad place to start. Note, however, that despite the name it introduces significant latency, so it is poorly suited to low-latency live streaming.

Unless you want live video, you really DON'T need Red5 or Wowza or FMS or anything like that. HLS is basically a set of short video segments (typically around 10 seconds each) encoded at different bitrates, plus an m3u8 playlist you give to your Flash- or HTML5-based player in the browser. The segment length and encoding settings are up to you.

This is the best article I've seen about how to pick resolutions, bitrates, segment sizes, etc: http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Adaptive-Streaming-in-the-Field-73017.aspx

From there you just create a directory structure, e.g.

/data/video/video_id/original.mp4
/data/video/video_id/quality1/chunk1.mp4
/data/video/video_id/quality1/chunk2.mp4
/data/video/video_id/quality2/chunk1.mp4
etc..
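To illustrate the segmenting step, here is a minimal Ruby sketch (assuming a Rails-style backend) that builds the ffmpeg command for producing one quality level of HLS segments from an original file. The flags are standard options of ffmpeg's built-in HLS muxer; the paths, bitrate, and method name are placeholders, not part of the original answer.

```ruby
# Build the argument list for an ffmpeg HLS transcoding job.
# One job per quality level; segments and the media playlist
# land in out_dir. Bitrate and segment length are examples.
def hls_ffmpeg_args(input, out_dir, bitrate: "800k", segment_seconds: 10)
  [
    "ffmpeg", "-i", input,
    "-c:v", "libx264", "-b:v", bitrate, # re-encode video at this bitrate
    "-c:a", "aac",                      # AAC audio, as Apple devices expect
    "-f", "hls",                        # ffmpeg's built-in HLS muxer
    "-hls_time", segment_seconds.to_s,  # target duration of each segment
    "-hls_list_size", "0",              # list every segment (VOD, not live)
    File.join(out_dir, "index.m3u8")    # media playlist; segments go alongside
  ]
end

# e.g. system(*hls_ffmpeg_args("/data/video/123/original.mp4",
#                              "/data/video/123/quality1", bitrate: "800k"))
```

You would kick this off from a background job (e.g. when a video record is created), once per quality level.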

Then you need to generate an m3u8 playlist referencing all the chunks and qualities, and it's up to the player itself to handle switching between qualities and playing the next file (which most modern players already do).
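A sketch of generating the top-level (variant) playlist that ties the quality levels together. The `#EXTM3U` and `#EXT-X-STREAM-INF` tags come from the HLS spec; the directory names follow the example layout above, and the helper name is made up for illustration.

```ruby
# Emit a variant playlist: one BANDWIDTH/RESOLUTION entry per quality,
# each pointing at that quality's own media playlist.
def master_playlist(variants)
  lines = ["#EXTM3U"]
  variants.each do |v|
    lines << "#EXT-X-STREAM-INF:BANDWIDTH=#{v[:bandwidth]},RESOLUTION=#{v[:resolution]}"
    lines << v[:uri] # relative path to that quality's media playlist
  end
  lines.join("\n") + "\n"
end

puts master_playlist([
  { bandwidth: 800_000,   resolution: "640x360",  uri: "quality1/index.m3u8" },
  { bandwidth: 1_400_000, resolution: "1280x720", uri: "quality2/index.m3u8" }
])
```

The player fetches this file first, picks a variant based on measured bandwidth, and then requests segments from that variant's playlist.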

I also highly recommend checking out https://developer.apple.com/streaming/ - Apple provides a bunch of free tools for preparing videos and playlists for HTTP Live Streaming.