
I am trying to record the remote user's media stream. The MediaStream object passed from the remote user appears very similar to the local MediaStream generated by getUserMedia, but when I pass it into a media stream visualiser nothing is output (if I pass the local MediaStream, the visualiser produces output). I can hear the remote user's MediaStream, so I know that something is being passed.
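
A minimal sketch of the kind of visualiser setup this implies, assuming a Web Audio API AnalyserNode (the actual visualiser code isn't shown here, so treat these names as illustrative):

// Illustrative visualiser wiring; `stream` is the MediaStream being tested.
var audioContext = new AudioContext();
var source = audioContext.createMediaStreamSource(stream);
var analyser = audioContext.createAnalyser();
analyser.fftSize = 256;
source.connect(analyser);

var data = new Uint8Array(analyser.frequencyBinCount);
(function draw() {
  requestAnimationFrame(draw);
  // For the local stream this fills with frequency data;
  // for the remote stream it stays at zero.
  analyser.getByteFrequencyData(data);
  // ...render `data` to a canvas here...
})();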

The remote MediaStream looks something like

active: true
id: "Q7aYJkeOt5xhHJ53c3JVhr41scl6QQEib5lt"
onactive: null
onaddtrack: null
onended: ()
oninactive: null
onremovetrack: null
__proto__: MediaStream

and has an audio track

enabled: true id: "021f5032-a524-42ae-ad40-bf0798df89cd" kind: "audio" label: "021f5032-a524-42ae-ad40-bf0798df89cd" muted: false onended: null onmute: null onunmute: null readyState: "live" remote: true __proto__: MediaStreamTrack 

The local MediaStream looks something like

active: true
id: "fP3smf9D78yl9YXV8jZwGPkMNL2UkwrXc2sl"
onactive: null
onaddtrack: null
onended: ()
oninactive: null
onremovetrack: null
__proto__: MediaStream

with an audio track

enabled: true id: "32da421e-0a35-4fe4-b553-8a3206d244ec" kind: "audio" label: "Default" muted: false onended: null onmute: null onunmute: null readyState: "live" remote: false __proto__: MediaStreamTrack 

The only real difference that I can see is the remote flag in the audio track.

1 Answer

The following code has been supported since Chrome 48:

peer.onaddstream = function(event) {
  var stream = event.stream;
  // The MediaRecorder option is `mimeType` (the original snippet used `type`).
  window.recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  // A very large timeslice so the data arrives as a single blob on stop().
  recorder.start(99999999999999999);
};

btnStopRecording.onclick = function() {
  if (!window.recorder) return;
  recorder.ondataavailable = function(event) {
    var blob = event.data;
    console.log(blob.size, blob);
  };
  recorder.stop();
};

Cross-browser implementation: https://github.com/streamproc/MediaStreamRecorder
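
A rough sketch of how that library is used, with names taken from its README (verify against the current version):

// Assumes MediaStreamRecorder.js from the repository above is loaded.
var mediaRecorder = new MediaStreamRecorder(stream);
mediaRecorder.mimeType = 'audio/wav'; // or 'video/webm' for audio+video
mediaRecorder.ondataavailable = function(blob) {
  // Unlike the native MediaRecorder, the library passes the blob directly.
  console.log(blob.size, blob);
};
mediaRecorder.start(3000); // emit a blob every 3 seconds
// Later: mediaRecorder.stop();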


2 Comments

I'm intending to use a cloud-based recording infrastructure such as Twilio to record the MediaStreams, mainly to reduce the client-side network load. I don't see any difference between the local MediaStream and the remote one, other than that the local one can be visualised and recorded whereas the remote one cannot. Do I need to parse the MediaStream object, or can I take the MediaStreamTrack and submit that to the recording infrastructure?
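
For what it's worth, one common pattern (a sketch under assumptions, not something from this thread) is to upload the recorded blob rather than the MediaStreamTrack itself; '/upload' below is a hypothetical endpoint:

// Hypothetical upload of recorded data; `recorder` is the native
// MediaRecorder from the answer above, and '/upload' is a placeholder URL.
recorder.ondataavailable = function(event) {
  var form = new FormData();
  form.append('recording', event.data, 'recording.webm');
  fetch('/upload', { method: 'POST', body: form });
};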
