Is it possible to broadcast audio with screen sharing in WebRTC?

Ivo Pavlik · Oct 15, 2013 · Viewed 13.7k times

Is it possible to broadcast audio along with screen sharing in WebRTC? Simply calling getUserMedia with audio: true fails with a permission denied error. Is there any workaround that could be used to broadcast audio as well? Will audio be implemented alongside screen sharing?
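For reference, the failing attempt looks roughly like this (a sketch, assuming the legacy callback-style getUserMedia and Chrome's chromeMediaSource screen constraint):

// Try to capture screen video and microphone audio in a single request.
// In Chrome this fails with a permission denied error as soon as
// audio: true is combined with the screen constraint.
navigator.webkitGetUserMedia(
    {
        audio: true,
        video: { mandatory: { chromeMediaSource: 'screen' } }
    },
    function (stream) {
        // never reached when audio is requested together with the screen
    },
    function (error) {
        console.error(error); // e.g. PermissionDeniedError
    }
);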

Thanks.

Answer

Muaz Khan · Nov 19, 2013

Refer to this demo: Share screen and audio/video from a single peer connection!

Multiple streams are captured and attached to a single peer connection. AFAIK, audio along with chromeMediaSource:screen is "still" not permitted.
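A minimal sketch of that approach, assuming the legacy addStream API and Chrome's prefixed APIs of that era (the linked demo's exact code may differ):

var pc = new webkitRTCPeerConnection({ iceServers: [] });

// capture screen video (audio is not allowed in this request)
navigator.webkitGetUserMedia(
    { video: { mandatory: { chromeMediaSource: 'screen' } } },
    function (screenStream) {
        pc.addStream(screenStream);

        // capture microphone audio with a separate request
        navigator.webkitGetUserMedia(
            { audio: true },
            function (audioStream) {
                // both streams are now attached to the single peer connection
                pc.addStream(audioStream);
            },
            function (err) { console.error(err); }
        );
    },
    function (err) { console.error(err); }
);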


Updated on April 21, 2016

Now you can capture audio+screen with a single getUserMedia request on both Firefox and Chrome.

However, Chrome only supports audio+tab, i.e. you can NOT capture the full screen along with audio.

Audio+Tab means any Chrome tab along with the microphone.
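A minimal sketch of the single-request capture on Firefox, assuming the legacy mediaSource constraint (on Chrome, tab-plus-audio capture typically goes through an extension API such as chrome.tabCapture):

// Firefox: one getUserMedia request for both microphone audio and screen video
navigator.mediaDevices.getUserMedia({
    audio: true,
    video: { mediaSource: 'screen' } // or 'window' / 'application'
}).then(function (stream) {
    // stream contains both an audio track and a screen video track
    console.log(stream.getTracks());
}).catch(function (error) {
    console.error(error);
});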


Updated on Jan 09, 2017

You can capture both audio and screen streams by making two parallel (UNIQUE) getUserMedia requests.

Now you can use the addTrack method to add the audio track into the screen stream:

// two parallel getUserMedia requests: one for microphone audio, one for screen video
var audioStream = captureUsingGetUserMedia(); // e.g. getUserMedia({ audio: true })
var screenStream = captureUsingGetUserMedia(); // e.g. getUserMedia({ video: { mediaSource: 'screen' } })

var audioTrack = audioStream.getAudioTracks()[0];

// add the audio track into the screen stream
screenStream.addTrack(audioTrack);

Now screenStream has both audio and video tracks.

// attach the combined stream and create an offer (legacy callback-based API)
nativeRTCPeerConnection.addStream(screenStream);
nativeRTCPeerConnection.createOffer(success, failure, options);
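For reference, roughly the same steps with the promise-based API would look like this (a sketch, assuming nativeRTCPeerConnection supports addTrack and options is the same offer-options object as above):

// add each track of the combined stream to the peer connection
screenStream.getTracks().forEach(function (track) {
    nativeRTCPeerConnection.addTrack(track, screenStream);
});

nativeRTCPeerConnection.createOffer(options)
    .then(function (offer) {
        return nativeRTCPeerConnection.setLocalDescription(offer);
    })
    .catch(function (error) {
        console.error(error);
    });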