I'm trying to capture the screen as video along with two audio streams, one from the mic and the other from an analog monitor.
    ffmpeg -video_size "$__screen_size__" \
        -framerate 50 \
        -thread_queue_size 512 -f x11grab -i :0.0+0,0 \
        -thread_queue_size 512 -f pulse -i "$__input_mic__" \
        -thread_queue_size 512 -f pulse -i "$__output_monitor__" \
        -filter_complex amerge \
        -ac 2 \
        -vcodec libx264rgb -crf 0 -preset:v ultrafast \
        -acodec pcm_s16le \
        -af aresample=async=1:first_pts=0 \
        -async 1 \
        -y \
        "$__output__"

The problem is that ffmpeg complains about applying two filters to the same stream:
    Filtergraph 'aresample=async=1:first_pts=0' was specified through the -vf/-af/-filter option for output stream 0:0, which is fed from a complex filtergraph. -vf/-af/-filter and -filter_complex cannot be used together for the same stream.

The script below (with only one audio stream, from the mic) works perfectly fine, since there I don't have to merge audio streams:
    ffmpeg -video_size "$__screen_size__" \
        -framerate 50 \
        -thread_queue_size 512 -f x11grab -i :0.0+0,0 \
        -thread_queue_size 512 -f pulse -i "$__input_mic__" \
        -vcodec libx264rgb -crf 0 -preset:v ultrafast \
        -acodec pcm_s16le \
        -af aresample=async=1:first_pts=0 \
        -async 1 \
        -y \
        "$__output__"

It is basically the first command with these lines removed:
    -thread_queue_size 512 -f pulse -i "$__output_monitor__" \
    -filter_complex amerge \
    -ac 2 \

If I instead remove `-af aresample=async=1:first_pts=0` and keep the lines above, the audio and video end up out of sync.
Basically, I want to merge the two audio streams without introducing any latency or losing A/V sync.
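Since the error says `-af` and `-filter_complex` can't be combined for the same stream, I'm wondering whether chaining `aresample` directly into the complex filtergraph would work. A sketch of what I mean (the `[aout]` label, `amerge=inputs=2`, and the `-map` lines are my guesses, not tested yet):

```shell
# Fold aresample into the filtergraph instead of using -af:
# merge the two pulse inputs, then resample the merged stream.
ffmpeg -video_size "$__screen_size__" \
    -framerate 50 \
    -thread_queue_size 512 -f x11grab -i :0.0+0,0 \
    -thread_queue_size 512 -f pulse -i "$__input_mic__" \
    -thread_queue_size 512 -f pulse -i "$__output_monitor__" \
    -filter_complex "[1:a][2:a]amerge=inputs=2,aresample=async=1:first_pts=0[aout]" \
    -map 0:v -map "[aout]" \
    -ac 2 \
    -vcodec libx264rgb -crf 0 -preset:v ultrafast \
    -acodec pcm_s16le \
    -async 1 \
    -y \
    "$__output__"
```

The idea is that the comma chains `amerge` and `aresample` inside one filtergraph, so no `-af` is needed, and `-map` selects the video input plus the labeled filtergraph output. Whether this also fixes the sync issue, I don't know.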