
I'm developing a C# video streaming application using the NReco library. So far I have been able to encode audio and video separately and save the data into queues. I can stream video over UDP without audio and it plays nicely in ffplay; likewise, I can stream audio over UDP without video and it also plays nicely. Now I want to merge these two streams into one, stream it over UDP, and have the player play both audio and video, but I have no idea how to do it. I would appreciate it if someone could give me some pointers on how to do this, or suggest a different way to achieve it.

Thank You.

1 Answer


The answer depends heavily on the source of the video and audio streams. NReco.VideoConverter is a wrapper around the FFMpeg tool, and it can in fact combine video and audio streams (see the filters configuration in the FFMpeg documentation), as long as either the video or the audio input can be specified as an ffmpeg input source (a UDP stream or a DirectShow input device).

If both the video and the audio data are represented by byte streams in your C# code, you cannot pass them together using NReco.VideoConverter (the ConvertLiveMedia method), because it uses stdin to communicate with ffmpeg, and only one stream can be passed from C# code that way.
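To make the stdin limitation concrete, here is a minimal sketch of pushing a single encoded stream through ConvertLiveMedia. It assumes an overload that takes a Stream input and a URL output (check the overloads available in your NReco.VideoConverter version); the formats and the UDP address are placeholders:

    using System.IO;
    using NReco.VideoConverter;

    class SingleStdinDemo
    {
        // Streams ONE encoded input (here: video) to UDP. ffmpeg reads the
        // pushed bytes from stdin, and a process has only one stdin, so a
        // second byte stream (the audio) cannot be fed the same way.
        static void StreamVideoOnly(Stream encodedVideo)
        {
            var ffMpeg = new FFMpegConverter();
            var task = ffMpeg.ConvertLiveMedia(
                encodedVideo,             // the single stdin-fed input
                "h264",                   // format of the pushed bytes (assumption)
                "udp://224.1.1.1:1234",   // output target (placeholder address)
                "mpegts",                 // a container ffmpeg can mux live
                new ConvertSettings());
            task.Start();
            task.Wait();                  // blocks until the input stream ends
        }
    }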


4 Comments

Thank you. I capture video using OpenCV, encode it using ffmpeg, and save it into a queue; I capture and encode audio using ffmpeg and save it into another queue. Now I want to mux these two encoded streams into one byte stream and send it over UDP. Since, as you said, I can't feed two byte streams into NReco: is it possible to feed the encoded video data (from the queue) via stdin and the audio as a file (using -i)? Also, is there any way to use timestamps in the audio and video, stream them separately, and have the player sync them? If either of these is possible, could you please give me some pointers? Thank you.
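One way to experiment with the "video via stdin, audio via -i" idea from the comment above is ConvertSettings.CustomInputArgs, which injects extra arguments into the generated ffmpeg command line. A hedged sketch follows; it assumes that CustomInputArgs is inserted before the stdin input (so the injected -i becomes input #0) — verify the generated command line against your NReco version. The file name, formats, and address are placeholders:

    using System.IO;
    using NReco.VideoConverter;

    class StdinVideoPlusAudioFile
    {
        static void Stream(Stream encodedVideoFromQueue)
        {
            var ffMpeg = new FFMpegConverter();
            var settings = new ConvertSettings {
                // Injected before the stdin input, so (assumption) the audio
                // file becomes ffmpeg input #0 and the pushed video input #1.
                CustomInputArgs = "-i audio.aac ",
                // Copy both streams as-is: audio from input 0, video from input 1.
                CustomOutputArgs = "-map 0:a -map 1:v -c copy"
            };
            var task = ffMpeg.ConvertLiveMedia(
                encodedVideoFromQueue, "h264",
                "udp://224.1.1.1:1234", "mpegts",
                settings);
            task.Start();
            task.Wait();
        }
    }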
I may propose another solution that involves several ffmpeg processes (not sure it is suitable in your case): you may use 2 live streams that consume data from the C# binary streams and re-stream them as RTMP. Then you can use 1 ffmpeg process that takes these 2 RTMP streams as input and combines them into one resulting stream. To avoid excessive encoding/decoding of the streams you may use the "-c copy" ffmpeg parameter.
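A hedged sketch of that two-stage idea (it assumes an RTMP server such as nginx-rtmp is listening at the placeholder URLs, and that ConvertLiveMedia accepts a Stream input with a URL output in your NReco version):

    using System.IO;
    using NReco.VideoConverter;

    class TwoStageMux
    {
        static void Run(Stream videoBytes, Stream audioBytes)
        {
            // Stage 1a: one ffmpeg process re-streams the pushed video bytes as RTMP.
            var videoTask = new FFMpegConverter().ConvertLiveMedia(
                videoBytes, "h264", "rtmp://localhost/live/video", "flv",
                new ConvertSettings { CustomOutputArgs = "-c copy" });

            // Stage 1b: a second ffmpeg process does the same for the audio bytes.
            var audioTask = new FFMpegConverter().ConvertLiveMedia(
                audioBytes, "aac", "rtmp://localhost/live/audio", "flv",
                new ConvertSettings { CustomOutputArgs = "-c copy" });

            videoTask.Start();
            audioTask.Start();

            // Stage 2: a third ffmpeg process pulls both RTMP feeds and muxes
            // them into one outgoing UDP stream without re-encoding.
            new FFMpegConverter().Invoke(
                "-i rtmp://localhost/live/video -i rtmp://localhost/live/audio " +
                "-c copy -f mpegts udp://224.1.1.1:1234");
        }
    }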
Thank you. Is there any other way to merge this encoded data using some other library and then stream it over UDP (video encoded data + audio encoded data -> stream)? Thank you.
I merge the two streams like this: ffmpegConverter.Invoke(" -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1234"); but the problem is I can't sync the audio and video. Is there any other way to make this work? I also tried this, but it didn't make any difference: ffmpegConverter.Invoke(" -i udp://224.1.1.1:1250 -itsoffset 3 -i udp://224.1.1.1:1260 -c copy -map 0:0 -map 1:0 -f avi udp://224.1.1.1:1234"). Please help me figure this out. Thank you.
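For what it's worth, AVI is a weak container for live UDP streaming because it carries little timing information for the player to sync on, while MPEG-TS embeds presentation timestamps for both streams. An untested variant of the command above that switches the container and regenerates timestamps (same addresses as in the comment; -fflags +genpts is a standard ffmpeg input option):

    // Mux both UDP inputs into MPEG-TS, regenerating presentation
    // timestamps so the player has something to align audio and video by.
    ffmpegConverter.Invoke(
        "-fflags +genpts -i udp://224.1.1.1:1250 " +
        "-fflags +genpts -i udp://224.1.1.1:1260 " +
        "-c copy -f mpegts udp://224.1.1.1:1234");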
