I'm trying to live-stream H.264 content to HTML5 using the Media Source Extensions API.
The following method works pretty well:
ffmpeg -i rtsp://10.50.1.29/media/video1 -vcodec copy -f mp4 -reset_timestamps 1 -movflags frag_keyframe+empty_moov -loglevel quiet out.mp4
and then:
mp4box -dash 1000 -frag 1000 -frag-rap out.mp4
I can take the MP4Box output (out_dashinit.mp4) and send it through WebSockets, chunk by chunk, to a JavaScript client that feeds it to the Media Source API.
However, this is not a good method for live content.
What I'm trying to do now is create a single pipeline that does it in real time, with the minimum possible latency.
With FFmpeg it's possible to redirect the output to stdout instead of out.mp4 and grab the content.
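For instance, ffmpeg writes its muxed output to standard output when the output target is `pipe:1`, and a Node process can spawn it and consume the bytes as they arrive. A minimal sketch, reusing the command from above; `buildFfmpegArgs` is a hypothetical helper name, not part of ffmpeg:

```javascript
// Sketch: argument list for piping the fragmented MP4 to stdout
// instead of writing out.mp4. "pipe:1" means ffmpeg's standard output.
function buildFfmpegArgs(rtspUrl) {
  return [
    '-i', rtspUrl,
    '-vcodec', 'copy',
    '-f', 'mp4',
    '-reset_timestamps', '1',
    '-movflags', 'frag_keyframe+empty_moov',
    '-loglevel', 'quiet',
    'pipe:1',
  ];
}

// Usage (requires ffmpeg on the PATH and a reachable RTSP source):
//   const { spawn } = require('child_process');
//   const ff = spawn('ffmpeg', buildFfmpegArgs('rtsp://10.50.1.29/media/video1'));
//   ff.stdout.on('data', chunk => {
//     // forward chunk to connected WebSocket clients
//   });
```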
I couldn't figure out whether it's possible to bring MP4Box into the pipeline. Can MP4Box take its input from stdin and write its output to stdout, and can it do so progressively, so that whenever output data is ready I can take it and transfer it to the web client, essentially generating a never-ending DASHed MP4?

You don't need MP4Box to generate the required output, but you'll need to chunk the content yourself by looking for boxes in the generated file.
Basically you'll generate an fMP4 with H.264, and send the browser the moov box for initialization and the moof+mdat boxes for each MP4 fragment you generate. You'll have to code the player in JavaScript; you probably won't be able to use a standard DASH player.
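The chunking itself is straightforward because every top-level MP4 box starts with a 4-byte big-endian size followed by a 4-character type. A sketch of a splitter, assuming plain 32-bit box sizes (it does not handle the 64-bit `size == 1` or to-end-of-file `size == 0` cases):

```javascript
// Split a Buffer of fragmented MP4 into top-level boxes.
// Each box header is: uint32 (big-endian) total size, then a 4-char type
// such as "ftyp", "moov", "moof", or "mdat".
// Returns the complete boxes found plus any trailing partial data, so the
// remainder can be kept and prepended to the next chunk from ffmpeg.
function splitBoxes(buf) {
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= buf.length) {
    const size = buf.readUInt32BE(offset);
    const type = buf.toString('ascii', offset + 4, offset + 8);
    // Stop on an incomplete box (or an extended/zero size we don't handle).
    if (size < 8 || offset + size > buf.length) break;
    boxes.push({ type, start: offset, size });
    offset += size;
  }
  return { boxes, remainder: buf.subarray(offset) };
}
```

With this, you would send the bytes up to and including moov as the initialization segment, and each subsequent moof+mdat pair as one media segment.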
To generate the correct fragmented MP4, you need to pass this to ffmpeg: -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof.
Be sure to use the latest version available.