I tried concatenating audio blobs using the WebRTC experiment by Muaz Khan, but when I play the concatenated audio, the HTML audio element does not show the full length of the audio file; the issue persists even if you download the file and play it locally. I worked around this by concatenating the blobs with ffmpeg, but is there a way to concatenate audio blobs using the WebRTC JS experiment by Muaz Khan? A similar attempt which also did not work out: Combine two audio blob recordings
- Did you ever figure this out? I had a similar issue. – haberdasher, Dec 14, 2016 at 6:14
- Here are a few links: github.com/abhijayghildyal/Audio-Recording-App-AngularJS-NodeJS, github.com/muaz-khan/RecordRTC/tree/master/RecordRTC-to-Nodejs – PixelPioneer, Dec 14, 2016 at 6:50
1 Answer
The best way is to convert the blobs into AudioBuffers: read each blob into an ArrayBuffer with a FileReader, then decode that ArrayBuffer into an AudioBuffer with decodeAudioData. You can then merge two or more AudioBuffers into a single resultant buffer. The following code will work in such a situation:
    var blob = "YOUR AUDIO BLOB";
    var f = new FileReader();
    f.onload = function (e) {
        audioContext.decodeAudioData(e.target.result, function (buffer) {
            arrayBuffer.push(buffer);
            if (arrayBuffer.length > 1) {
                resultantbuffer = appendBuffer(arrayBuffer[0], arrayBuffer[1]);
                arrayBuffer = [];
                arrayBuffer.push(resultantbuffer);
            } else {
                resultantbuffer = buffer;
            }
        }, function (e) {
            console.warn(e);
        });
    };
    f.readAsArrayBuffer(blob);

This code reads the blob into an ArrayBuffer (e.target.result) and decodes it into an AudioBuffer (buffer). I used the appendBuffer method for appending two AudioBuffers together. Here is that method:
    function appendBuffer(buffer1, buffer2) {
        // Using AudioBuffer
        var numberOfChannels = Math.min(buffer1.numberOfChannels, buffer2.numberOfChannels);
        var tmp = audioContext.createBuffer(numberOfChannels, (buffer1.length + buffer2.length), buffer1.sampleRate);
        for (var i = 0; i < numberOfChannels; i++) {
            var channel = tmp.getChannelData(i);
            channel.set(buffer1.getChannelData(i), 0);
            channel.set(buffer2.getChannelData(i), buffer1.length);
        }
        return tmp;
    }

Do let me know if you face any problems.
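To make the merged AudioBuffer playable or downloadable, it can be encoded as a 16-bit PCM WAV file. The sketch below is one minimal way to do that; the encodeWav function name is my own illustration (not part of the WebRTC experiment or RecordRTC), and it assumes you pass it the raw Float32Array channel data and sample rate taken from the resultant AudioBuffer:

```javascript
// Minimal sketch: encode Float32 PCM channels ([-1, 1]) as a 16-bit WAV file.
// channels: array of Float32Arrays (one per channel, equal length).
// Returns an ArrayBuffer containing a complete RIFF/WAVE file.
function encodeWav(channels, sampleRate) {
    var numChannels = channels.length;
    var numFrames = channels[0].length;
    var bytesPerSample = 2;                       // 16-bit PCM
    var blockAlign = numChannels * bytesPerSample;
    var dataSize = numFrames * blockAlign;
    var buffer = new ArrayBuffer(44 + dataSize);  // 44-byte header + samples
    var view = new DataView(buffer);

    function writeString(offset, str) {
        for (var i = 0; i < str.length; i++) {
            view.setUint8(offset + i, str.charCodeAt(i));
        }
    }

    writeString(0, 'RIFF');
    view.setUint32(4, 36 + dataSize, true);       // RIFF chunk size
    writeString(8, 'WAVE');
    writeString(12, 'fmt ');
    view.setUint32(16, 16, true);                 // fmt chunk size
    view.setUint16(20, 1, true);                  // audio format: PCM
    view.setUint16(22, numChannels, true);
    view.setUint32(24, sampleRate, true);
    view.setUint32(28, sampleRate * blockAlign, true); // byte rate
    view.setUint16(32, blockAlign, true);
    view.setUint16(34, 16, true);                 // bits per sample
    writeString(36, 'data');
    view.setUint32(40, dataSize, true);

    // Interleave channels and convert floats to signed 16-bit integers.
    var offset = 44;
    for (var frame = 0; frame < numFrames; frame++) {
        for (var ch = 0; ch < numChannels; ch++) {
            var s = Math.max(-1, Math.min(1, channels[ch][frame]));
            view.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
            offset += 2;
        }
    }
    return buffer;
}
```

In a browser you would then collect the channel data from the resultant buffer (e.g. push resultantbuffer.getChannelData(i) for each channel into an array), wrap the result in new Blob([encodeWav(channels, resultantbuffer.sampleRate)], { type: 'audio/wav' }), and point an audio element at it via URL.createObjectURL. A mute WAV usually means this interleave/float-to-int16 step was skipped or the header sizes are wrong.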
4 Comments
PixelPioneer
Yes, I tried something similar, but the resultant audio file is not correct: the length shown by the media player is just the length of the first blob. I haven't tried exactly what you describe in your answer, but if you can put a working solution on git it'll be extremely helpful.
Hamza Ali
If your audio blob is valid, this solution will work. You just have to decode the audio blob into an AudioBuffer (which is appendable); the code above appends AudioBuffers. Then, once you have the resultant AudioBuffer, you have to make it playable, e.g. by encoding the buffer into WAV format. Here is a relevant link you can follow: danieldemmel.me/JSSoundRecorder
PixelPioneer
Thanks, I will try this out :)
Harshil Dholakiya
I have appended all the AudioBuffers I have and get a correct buffer. I then encoded the AudioBuffer to audio/wav format and created a blob, but there is no sound; the blob is mute when I play it as audio/wav. Please help.