How to make rtpjitterbuffer work on a stream without timestamps?

matt · Sep 19, 2016

I am sending an H.264 bytestream over RTP using gstreamer.

# sender
gst-launch-1.0 filesrc location=my_stream.h264 ! h264parse disable-passthrough=true ! rtph264pay config-interval=10 pt=96 ! udpsink host=localhost port=5004

Then I am receiving the frames, decoding and displaying in other gstreamer instance.

# receiver
gst-launch-1.0 udpsrc port=5004 ! application/x-rtp,payload=96,media="video",encoding-name="H264",clock-rate="90000" ! rtph264depay ! h264parse ! decodebin ! xvimagesink

This works as is, but I want to try adding an rtpjitterbuffer in order to perfectly smooth out playback.

# receiver
gst-launch-1.0 udpsrc port=5004 ! application/x-rtp,payload=96,media="video",encoding-name="H264",clock-rate="90000" ! rtpjitterbuffer ! rtph264depay ! h264parse ! decodebin ! xvimagesink

However, as soon as I do, the receiver only displays a single frame and freezes.

If I replace the .h264 file with an MP4 file, the playback works great.

I assume that my h264 stream does not have the required timestamps to enable the jitter buffer to function.

I made slight progress by adding identity datarate=1000000. This allows the jitter buffer to play, but it throws off my frame rate because P-frames carry less data than I-frames. Clearly the identity element adds timestamps of the right form, just with the wrong values.
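For reference, this is a sketch of the workaround pipeline. The placement of identity between udpsrc and rtpjitterbuffer is an assumption (the question doesn't say where it was inserted), and datarate=1000000 simply stamps buffers at a fixed byte rate, which is why differently sized frames get uneven durations:

```shell
# Workaround sketch, not a fix: identity datarate re-timestamps buffers
# purely by byte count, so large I-frames get longer durations than
# small P-frames and the frame rate wobbles.
gst-launch-1.0 udpsrc port=5004 ! \
  application/x-rtp,payload=96,media=video,encoding-name=H264,clock-rate=90000 ! \
  identity datarate=1000000 ! rtpjitterbuffer ! rtph264depay ! h264parse ! \
  decodebin ! xvimagesink
```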

Is it possible to automatically generate timestamps on the sender by specifying the "framerate" caps correctly somewhere? So far my attempts have not worked.

Answer

JohnLM · Apr 19, 2017

You've partially answered the problem already:

If I replace the .h264 file with an MP4 file, the playback works great.

I assume that my h264 stream does not have the required timestamps to enable the jitter buffer to function.

Your sender pipeline has no negotiated frame rate because you're using a raw H.264 stream, while you should really be using a container format (e.g. MP4), which carries this information. Without timestamps, udpsink cannot synchronise against the clock to throttle output, so the sender spits out packets as fast as the pipeline can process them. It's not a live sink.

However, adding an rtpjitterbuffer makes your receiver act as a live source. It freezes because it's trying its best to cope with a barrage of packets carrying malformed timestamps. To the best of my knowledge, RTP doesn't transmit "missing" timestamps, so all packets will probably have the same timestamp. The jitter buffer thus probably reconstructs the first frame and drops the rest as duplicates.

I must agree with user1998586 in the sense that it would be better for the pipeline to fail with a good error message in this case rather than trying its best.

Is it possible to automatically generate timestamps on the sender by specifying the "framerate" caps correctly somewhere? So far my attempts have not worked.

No. You should really use a container.
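As a sketch of what a container-based sender could look like: this assumes my_stream.mp4 contains an H.264 video track, and lets qtdemux supply the timestamps that the raw bytestream lacks. The receiver pipeline with rtpjitterbuffer should then work unchanged:

```shell
# Sketch: demux an MP4 so buffers carry real timestamps before payloading.
# Assumes my_stream.mp4 holds an H.264 track; qtdemux provides the
# timing information that a raw .h264 file does not have.
gst-launch-1.0 filesrc location=my_stream.mp4 ! qtdemux ! h264parse ! \
  rtph264pay config-interval=10 pt=96 ! udpsink host=localhost port=5004
```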

In theory, however, an au aligned H264 raw stream could be timestamped by just knowing the frame rate, but there are no gstreamer elements (I know of) that do this and just specifying caps won't do it.