Live streaming: node-media-server + Dash.js configured for real-time low latency

Maoration · Feb 10, 2020 · Viewed 9.8k times

We're working on an app that enables live monitoring of your back yard. Each client has a camera connected to the internet, streaming to our public node.js server.

I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.

Our goal is to get as close as possible to live "real-time" so you can monitor what happens in your backyard instantly.

The technical flow already accomplished is:

  1. An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes it via RTMP on the local machine for node-media-server to use as its 'input' (we are also saving segmented files, generating thumbnails, etc.; a sketch of how such a child process might be spawned follows this list). The ffmpeg command responsible for that (input options omitted here) is:

    -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office

  2. node-media-server is running with what I found to be the default configuration for 'live-streaming':

    private NMS_CONFIG = {
    server: {
      secret: 'thisisnotmyrealsecret',
    },
    rtmp_server: {
      rtmp: {
        port: 1935,
        chunk_size: 60000,
        gop_cache: false,
        ping: 60,
        ping_timeout: 30,
      },
      http: {
        port: 8888,
        mediaroot: './server/media',
        allow_origin: '*',
      },
      trans: {
        ffmpeg: '/usr/bin/ffmpeg',
        tasks: [
          {
            app: 'live',
            hls: true,
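            // hls_time=2 → 2-second segments, hls_list_size=3 playlist entries, old segments deleted; segment length sets a floor on HLS latency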
            hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
            dash: true,
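            // window_size=3 segments kept in the DASH manifest, extra_window_size=5 older segments kept on disk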
            dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
          },
        ],
      },
    },
    };

  3. As I understand it, out of the box NMS (node-media-server) publishes the input stream it receives in multiple output formats: flv, mpeg-dash and hls. With all sorts of online players for these formats I'm able to access and play the stream using its localhost URL. With mpeg-dash and hls I'm getting anywhere between 10-15 seconds of delay, and sometimes more.
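A minimal sketch of how each per-camera ffmpeg child process might be spawned from node.js (the input URL, stream key and helper name are placeholders; only the encoding and output flags are taken from the command above):

    // Hypothetical helper: spawn one ffmpeg relay per camera and push its output
    // to the local node-media-server RTMP input.
    const { spawn } = require('child_process');

    function startCameraRelay(cameraInputUrl, streamKey) {
      const args = [
        '-i', cameraInputUrl,          // placeholder camera input
        '-c:v', 'libx264',
        '-preset', 'ultrafast',
        '-tune', 'zerolatency',
        '-b:v', '900k',
        '-f', 'flv',
        `rtmp://127.0.0.1:1935/live/${streamKey}`,
      ];
      const ffmpeg = spawn('ffmpeg', args);
      ffmpeg.stderr.on('data', (data) => console.log(`[ffmpeg ${streamKey}] ${data}`));
      ffmpeg.on('close', (code) => console.log(`ffmpeg for ${streamKey} exited with code ${code}`));
      return ffmpeg;
    }

    // e.g. startCameraRelay('rtsp://camera.example/stream', 'office');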


My goal now is to implement a local client-side mpeg-dash player using dash.js, and to configure it to be as close as possible to live.

My code for that is:
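A minimal sketch of such a player setup, assuming dash.js v3 (the manifest URL, port and element id are placeholders rather than the exact values used):

    // Assumes dash.js v3 is loaded on the page and a <video id="videoPlayer"> element exists.
    var url = 'http://localhost:8888/live/office/index.mpd'; // assumed NMS DASH output: /<app>/<stream>/index.mpd
    var player = dashjs.MediaPlayer().create();

    // Low-latency related settings (dash.js v3 updateSettings API)
    player.updateSettings({
      streaming: {
        lowLatencyEnabled: true,
        liveDelay: 2,              // target distance from the live edge, in seconds
        liveCatchUpMinDrift: 0.05, // start catching up once playback drifts past this
        liveCatchUpPlaybackRate: 0.5
      }
    });

    player.initialize(document.querySelector('#videoPlayer'), url, true);

    // Log the measured live latency once per second
    setInterval(function () {
      console.log('live latency (s):', player.getCurrentLiveLatency());
    }, 1000);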

With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see that the live latency value is close to 2 seconds (though I have no way to actually confirm that, since it's a streamed video file; with the camera in my office I can actually compare latency between real life and the stream I get). However, when working locally with my NMS, this value does not seem to want to go below 20-25 seconds.

Am I doing something wrong? Is there some configuration on the player (client-side HTML) that I'm forgetting, or a missing configuration I should add on the server side (NMS)?

Answer

Mick · Feb 14, 2020

HLS and MPEG-DASH are not particularly low latency as standard, and the figures you are getting are not unusual.

Some examples from a publicly available DASH forum document (linked below) include:

[Images: example end-to-end latency figures reported by several broadcasters and streaming organisations, taken from the publicly available DASH forum document mentioned above.]

Given the resources of some of these organisations, the results you have achieved are not bad!

There is quite a focus in the streaming industry at this time on enabling lower latency, the target being to come as close as possible to traditional broadcast latency.

One key component of the latency in chunked Adaptive Bit Rate (ABR) streaming (see this answer for more info: https://stackoverflow.com/a/42365034/334402) is the need for the player to receive and decode one or more segments of the video before it can display it. Traditionally the player had to receive the entire segment before it could start to decode and display it. The diagram from the first open source reference linked below illustrates this:

[Image: diagram showing the player downloading a full segment before it can begin decoding and displaying it.]

Low latency DASH and HLS leverage CMAF, the 'Common Media Application Format', which breaks the segments (which might be, for example, 6 seconds long) into smaller 'chunks' within each segment. These chunks are designed to allow the player to decode and start playing them before it has received the full segment.
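As a rough illustration of what this enables on the player side, a low-latency client can read a segment's CMAF chunks as they arrive over HTTP chunked transfer and append each piece to the decoder instead of buffering the whole segment first (an illustrative sketch only; the MediaSource/SourceBuffer setup and segment URL are assumed):

    // Illustrative only: append data to an existing MSE SourceBuffer as it arrives,
    // rather than waiting for the whole segment to finish downloading.
    function appendChunk(sourceBuffer, chunk) {
      return new Promise(function (resolve) {
        sourceBuffer.addEventListener('updateend', resolve, { once: true });
        sourceBuffer.appendBuffer(chunk);
      });
    }

    async function appendSegmentProgressively(sourceBuffer, segmentUrl) {
      const response = await fetch(segmentUrl);
      const reader = response.body.getReader();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        // The SourceBuffer parses complete CMAF (moof+mdat) chunks incrementally,
        // so playback can start before the rest of the segment has arrived.
        await appendChunk(sourceBuffer, value);
      }
    }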

Other sources of latency in a typical live stream are any transcoding from one format to another, and any delay introduced while a streaming server receives the feed (from the webcam, in your case) and encodes and packages it for streaming.

There is quite a lot of good information available on low latency streaming at this time, both from standards bodies and from open source discussions, which I think will really help you appreciate the issues (all links current at time of writing). From open source and standards discussions:

and from vendors:

Note: a common use case often quoted in the broadcast world is someone watching a live event, such as a game, hearing their neighbours celebrate a goal or touchdown before they see it themselves, because their feed has higher latency than their neighbours'. While this is a driver for low latency, it is really a synchronisation issue, which would require other solutions if a 'perfectly' synchronised experience were the goal.

As you can see, low latency streaming is not a simple challenge, and you may want to consider other approaches depending on the details of your use case, including how many subscribers you have, whether some loss of quality is a fair trade-off for lower latency, etc. As mentioned by @user1390208 in the comments, a more real-time-focused video communication technology like WebRTC may be a better match for the solution you are targeting.

If you want to provide a service that offers both live streaming and recordings, you may want to consider using a real-time protocol for the live view and HLS/DASH for anyone looking back through recordings, where latency may not be important but quality may matter more.