I'm having some trouble streaming H.264 video over RTSP. The goal is to live-stream a camera image to an RTSP client (ideally a browser plugin in the end). This has been working pretty well so far, except for one problem: the video lags on startup, stutters every few seconds, and has a ~4-second delay. This is bad.
Our setup encodes with x264 (the ultrafast preset and zerolatency tune) and packs the result into RTSP/RTP with libavformat from FFmpeg 0.6.5. For testing, I receive the stream with a GStreamer gst-launch pipeline connecting to the RTSP server. However, I've been able to reproduce the same issue when streaming straight from another GStreamer instance over plain RTP.
Sending machine:
gst-launch videotestsrc ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=10.89.6.3
Receiving machine:
gst-launch udpsrc ! application/x-rtp,payload=96 ! rtph264depay ! decodebin ! xvimagesink
You can also run these both on the same machine, just change the host to 127.0.0.1 on the sender. On the receiving end, you should notice stuttering and generally poor-performing video, along with repeated warnings on the console:
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.
One commonly suggested "fix" that I've seen all over the Internet is to use sync=false with xvimagesink:
gst-launch udpsrc ! application/x-rtp,payload=96 ! rtph264depay ! decodebin ! xvimagesink sync=false
The video will then play back with near-zero latency, even when tested with our camera software. This is useful for testing, but not much use for deployment, since it won't work with Totem, VLC, or their browser-plugin embeds.
I'd like to solve the issue at the source; I suspect that some timestamp information is missing from the H.264 stream produced by x264, or perhaps from the RTP payloads. Is there any way to modify the source gst pipeline so that I don't need to use sync=false on the receiver?
If that's not possible, how can I tell clients (via SDP or otherwise) that the stream should not be synchronized? Ultimately, we'd embed this in the browser using a VLC plugin of sorts, so a solution that would work there would be even better.
As root.ctrlc posted, you can use sync=false. However, you might notice a huge increase in CPU usage on the sending end. The reason is that sync=false tells the sink to push out buffers as soon as it receives them, and the sink drives the whole pipeline. With sync=false, the pipeline will therefore encode video and push it to UDP as fast as possible, using 100% CPU.
What you actually need is a gstrtpjitterbuffer. It also takes care of the RTP timestamps, which are what's broken here.
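For instance, you can drop it into the receiver pipeline from the question, between udpsrc and the depayloader (a sketch; untested against your exact setup):

```shell
# Receiver: same pipeline as in the question, with a jitter buffer
# inserted before the depayloader so it can reorder packets and
# reconstruct RTP timestamps.
gst-launch udpsrc ! application/x-rtp,payload=96 ! gstrtpjitterbuffer ! rtph264depay ! decodebin ! xvimagesink
```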
Example sender:
gst-launch-0.10 -v videotestsrc ! videorate ! video/x-raw-yuv, framerate=30/1 ! ffmpegcolorspace ! x264enc ! rtph264pay ! udpsink port=50000 host=<receiver IP>
Example receiver:
gst-launch-0.10 udpsrc port=50000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! gstrtpjitterbuffer ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! videoscale ! "video/x-raw-yuv, width=320, height=240" ! xvimagesink
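If the jitter buffer's default buffering window adds more delay than you want, the element exposes a latency property (in milliseconds, default 200) that you can lower. A sketch, assuming your network jitter is small enough to get away with a 50 ms window:

```shell
# Receiver with a smaller jitter-buffer window (50 ms instead of the
# default 200 ms): lower latency, but less resilience to network jitter.
gst-launch-0.10 udpsrc port=50000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! gstrtpjitterbuffer latency=50 ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! xvimagesink
```

Tune the value up if you start seeing dropped or late frames again.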