    peerConnection.removeStream(streamA) // __o_j_sep... in screenshots below
    peerConnection.addStream(streamB)    // SSTREAM in screenshots below
streamA is a video/audio stream coming from my camera and microphone. streamB is the screen capture I get from my extension (see *1 Remark below).
But if I remove streamA from the peerConnection and addStream(streamB) as shown above, nothing seems to happen.
The following works as expected (the stream on both ends is removed and re-added):

    peerConnection.removeStream(streamA) // __o_j_sep...
    peerConnection.addStream(streamA)    // __o_j_sep...
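For completeness, streamA itself comes from a plain getUserMedia call along these lines (a simplified sketch; my actual constraints may differ):

    // camera + microphone stream (streamA)
    navigator.webkitGetUserMedia({
        audio: true,  // microphone
        video: true   // camera
    }, function(localMediaStream) {
        streamA = localMediaStream;
    }, function(error) {
        console.log(error);
    });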
I have found this example, which does "the reverse" (Switch from screen capture to audio/video with camera), but I can't spot a significant difference.
The peerConnection RTCPeerConnection object is actually created by this SIPML library (source code available here), and I access it like this:

    var peerConnection = stack.o_stack.o_layer_dialog.ao_dialogs[1].o_msession_mgr.ao_sessions[0].o_pc

(Yes, this does not look right, but there is no official way to get access to the Peer Connection; see the discussions here and here.)
Originally I tried to just (ex)change the videoTracks of streamA with the videoTrack of streamB; see my question here. It was suggested that I should try to renegotiate the Peer Connection (by removing/adding streams to it), because addTrack does not trigger a re-negotiation.
I've also asked for help here, but the maintainer seems very busy and hasn't had a chance to respond yet.
*1 Remark: Why does streamB not have a videoTracks property? The stream plays in an HTML <video> element and seems to "work". Here is how I get it:
    navigator.webkitGetUserMedia({
        audio: false,
        video: {
            mandatory: {
                chromeMediaSource: 'desktop',
                chromeMediaSourceId: streamId,
                maxWidth: window.screen.width,
                maxHeight: window.screen.height
                //, maxFrameRate: 3
            }
        }
    // success callback
    }, function(localMediaStream) {
        SSTREAM = localMediaStream; // streamB
    // fail callback
    }, function(error) {
        console.log(error);
    });
It also seems to have a videoTrack:
I'm running:
To answer your first question: when modifying the MediaStream in an active PeerConnection, the peerConnection object will fire an onnegotiationneeded event. You need to handle that event and re-exchange your SDPs. The main reason for this is so that both parties know what streams are being sent between them. When the SDPs are exchanged, the MediaStream ID is included, and if there is a new stream with a new ID (even with all other things being equal), a re-negotiation must take place.
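A minimal sketch of such a handler, assuming a hypothetical sendOfferToRemotePeer() signalling helper (in your setup SIPML normally handles the signalling) and that the remote answer is applied elsewhere with setRemoteDescription():

    peerConnection.onnegotiationneeded = function() {
        // Create a fresh offer that describes the current set of streams.
        peerConnection.createOffer(function(offer) {
            peerConnection.setLocalDescription(offer, function() {
                // Send the new SDP to the remote side over your signalling channel.
                sendOfferToRemotePeer(offer); // hypothetical helper
            }, function(error) {
                console.log(error);
            });
        }, function(error) {
            console.log(error);
        });
    };

    // When the remote answer arrives, apply it, e.g.:
    // peerConnection.setRemoteDescription(new RTCSessionDescription(answer),
    //     function() { /* renegotiation complete */ },
    //     function(error) { console.log(error); });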
For your second question (about SSTREAM): it does indeed contain video tracks, but there is no videoTracks attribute on webkitMediaStream. You can grab tracks via their ID, however.
Since there is the possibility of having numerous tracks for each media type, there is no single attribute for a video track or audio track, but instead an array of them. The .getVideoTracks() call returns an array of the current video tracks, so you COULD grab a particular video track by indicating its index: .getVideoTracks()[0].
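For example (a sketch using the SSTREAM variable from the question):

    var videoTracks = SSTREAM.getVideoTracks(); // array of video MediaStreamTracks
    console.log('number of video tracks: ' + videoTracks.length);

    if (videoTracks.length > 0) {
        var firstVideoTrack = videoTracks[0]; // grab a particular track by index
        console.log('track id: ' + firstVideoTrack.id);

        // "grab a track via its ID": look it up in the array by its id property
        var trackId = firstVideoTrack.id;
        var sameTrack = SSTREAM.getVideoTracks().filter(function(t) {
            return t.id === trackId;
        })[0];
    }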