OK, I have been trying to wrap my head around HTTP Live Streaming. I just don't understand it, and yes, I have read all the Apple docs and watched the WWDC videos, but I'm still super confused, so please help a wannabe programmer out!
Does the code you write go on the server rather than in Xcode? If so, how do I set this up? Do I need something special on my server, like PHP? And how do I use the tools supplied by Apple, like the segmenters?
Please help me. Thanks!
HTTP Live Streaming is a streaming standard proposed by Apple; see the latest draft standard (it was later published as RFC 8216).
The files involved are:
.m4a for audio (if you want a stream of audio only).
.ts for video. This is an MPEG-2 transport stream, usually with an H.264/AAC payload. Each one contains about 10 seconds of video, and they are created by splitting your original video file or by converting live video.
.m3u8 for the playlist. This is a UTF-8 version of the WinAmp format. Even though it's called live streaming, there is usually a delay of a minute or so while the video is converted, the .ts and .m3u8 files are written, and your client refreshes the .m3u8 file.
All of these are static files on your server. During live events, though, more .ts files are added and the .m3u8 file is updated.
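For reference, here is what a minimal video-on-demand .m3u8 looks like (the segment names here are illustrative, not something the tools guarantee):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXT-X-ENDLIST

In a live playlist the #EXT-X-ENDLIST tag is absent, which is what tells the client to keep re-fetching the playlist for new segments.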
Since you tagged this question iOS, it is relevant to mention the related App Store rule: apps that stream video over cellular networks for more than 10 minutes must use HTTP Live Streaming and must offer at least one stream at 64 kbps or lower (which may be audio-only).
To download the HTTP Live Streaming Tools, log in to the Apple Developer site and search the downloads area for "HTTP Live Streaming Tools".
Command line tools installed:
/usr/bin/mediastreamsegmenter
/usr/bin/mediafilesegmenter
/usr/bin/variantplaylistcreator
/usr/bin/mediastreamvalidator
/usr/bin/id3taggenerator
Descriptions from the man pages:
mediastreamsegmenter: Create segments from MPEG-2 transport streams for HTTP Live Streaming.
mediafilesegmenter: Create segments for HTTP Live Streaming from media files.
variantplaylistcreator: Create a playlist for stream switching from HTTP Live Streaming segments.
mediastreamvalidator: Validates HTTP Live Streaming streams and resources.
id3taggenerator: Create ID3 tags.
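For live events, a rough sketch of how mediastreamsegmenter is typically invoked (the base URL, output path, and UDP address below are placeholders; check man mediastreamsegmenter for the options on your version):

# read an MPEG-2 transport stream from a UDP address and keep writing
# segments plus a continuously updated playlist into the web root
# (-b: base URL written into the playlist, -f: output directory)
mediastreamsegmenter -b http://example.com/streams -f /Library/WebServer/Documents/streams 224.0.0.50:9121

Something upstream, such as a hardware encoder or ffmpeg, has to push the transport stream to that UDP address.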
Install MacPorts, go to the terminal, and run sudo port install ffmpeg. Then convert the video to a transport stream (.ts) using this FFmpeg script:
# bitrate, width, and height; you may want to change these
BR=512k
WIDTH=432
HEIGHT=240
input=${1}
# strip off the file extension
output=${input%.*}
# works for most videos
ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts
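If you save the script above as, say, encode.sh (the name is arbitrary), run it with the source video as the argument:

# produces myvideo-iphone.ts
sh encode.sh myvideo.mp4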
This will generate one .ts file. Now we need to split it into segments and create a playlist containing all of them. We can use Apple's mediafilesegmenter for this:
mediafilesegmenter -t 10 myvideo-iphone.ts
This will generate one .ts file for every 10 seconds of the video, plus a .m3u8 file pointing to all of them.
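Once the files are uploaded, it is worth sanity-checking the result with the mediastreamvalidator tool listed earlier; a sketch, assuming the playlist ended up at this made-up URL:

# fetches the playlist and segments and reports timing/format problems
mediastreamvalidator http://example.com/stream/myvideo-iphone.m3u8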
To play a .m3u8 file on iOS, we point Mobile Safari at it.
Of course, we first need to put the files on a web server. For Safari (or another player) to recognize them, we need to add their MIME types. In Apache:
AddType application/x-mpegURL m3u8
AddType video/MP2T ts
In lighttpd:
mimetype.assign = ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )
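You can verify that the server is sending the right type with a quick curl check (replace the URL with your own):

# the Content-Type header in the response should read application/x-mpegURL
curl -I http://example.com/stream.m3u8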
To link this from a web page:
<html><head>
<meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;"/>
</head><body>
<video width="320" height="240" src="stream.m3u8"></video>
</body></html>
To detect the device orientation see Detect and Set the iPhone & iPad's Viewport Orientation Using JavaScript, CSS and Meta Tags.
More things you can do: create different bitrate versions of the video for adaptive streaming, embed metadata that can be read as notifications during playback, and of course have fun programming with MPMoviePlayerController and AVPlayer.
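On the different bitrate versions: the variants are tied together by a variant playlist that the client uses to switch streams. variantplaylistcreator generates these for you, but a minimal hand-written one looks like this (the paths and bandwidths are illustrative):

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=264000
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1240000
high/index.m3u8

The client starts with the first entry and moves up or down the list based on the throughput it measures.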