[FFmpeg-user] Audio-Video Synchronization and Fastest Encoding
Stephan Monecke
stephanmonecke at gmail.com
Fri Jun 5 15:39:16 EEST 2020
Hi,
in case I've been unclear, I would like to clarify two things.
Question 2 should read:
> 2. "How can I send raw video to an RTMP server as light as possible
> on the CPU?"
I would also like to add that the current setup consists of two ffmpeg
instances (question 2 refers to the first one), as follows; a quick
timestamp check is sketched after the two commands:
1. ffmpeg [...] -r 60 -i /dev/video0 [...] -r 25 -an -f flv
rtmp://localhost/live
2. ffmpeg \
-use_wallclock_as_timestamps 1 \
-fflags +genpts \
-max_delay 2000000 \
-thread_queue_size 1024 \
-i rtmp://localhost/live \
-max_delay 2000000 \
-thread_queue_size 1024 \
-itsoffset 4.8 \
-f pulse \
-i "alsa_input.pci-0000_00_1f.3.analog-stereo" \
-af "aresample=async=1" \
-codec:a aac \
-b:a 384k \
-ar 48000 \
-vcodec copy \
-tune zerolatency \
-map 0:v -map 1:a \
-max_muxing_queue_size 99999 \
-f flv \
outfile.flv
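As a quick sanity check of how far apart the two streams end up in the
muxed file, ffprobe can print the start timestamps of the video and
audio streams of outfile.flv. Note this only reflects container
timestamps, not the real-world offset between picture and sound:

ffprobe -v error -show_entries stream=codec_type,start_time \
        -of default=noprint_wrappers=1 outfile.flv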
Thanks a lot!
On Thu, 4 Jun 2020 at 17:11, Stephan Monecke
<stephanmonecke at gmail.com> wrote:
>
> Hi all,
>
>
> I have a weak computer (some i3) connected to an HDMI grabber
> (Magewell USB Capture HDMI PLUS, which acts like a webcam) and audio
> via line-in on the microphone port.
>
> I want to merge those two streams with as little audio-video
> offset as possible and as little CPU load as possible, BUT I need
> multiple other programs to be able to read the HDMI grabber
> simultaneously (hence I somehow need to mirror /dev/video0).
>
> For the latter, I currently use an instance of the nginx-rtmp plugin
> as a local video relay -- I might also have a look at the
> v4l2loopback module, but I don't know about its performance or
> side effects yet.
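> (For reference, the kind of v4l2loopback setup I have in mind --
> untested on my side; the loopback device number /dev/video10 and the
> card label are just examples. Since nothing is compressed, the copy
> into the loopback device should stay cheap on the CPU:)
>
> sudo modprobe v4l2loopback video_nr=10 card_label="hdmi-mirror"
> ffmpeg -f v4l2 -i /dev/video0 -f v4l2 /dev/video10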
>
> I now have the following questions:
>
> 1. What is the best way to automatically produce synchronized audio
> and video? Is there some resource someone can point me to? I currently
> use `-itsoffset` with an empirically determined value for the video
> and HOPE that it stays constant. Post-production would not be a
> problem as long as it is automatable and command-line based.
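> (For illustration, one variant I could try instead of a fixed
> `-itsoffset`, assuming for a test that both devices are opened by a
> single ffmpeg process, i.e. bypassing the local relay: stamp both
> inputs with wall-clock timestamps and let aresample absorb the drift.
> Whether this really keeps the offset constant is exactly what I don't
> know:)
>
> ffmpeg \
>   -use_wallclock_as_timestamps 1 -f v4l2 -framerate 60 -i /dev/video0 \
>   -use_wallclock_as_timestamps 1 -f pulse \
>   -i "alsa_input.pci-0000_00_1f.3.analog-stereo" \
>   -af aresample=async=1 \
>   -map 0:v -map 1:a \
>   -c:v libx264 -preset ultrafast -r 25 \
>   -c:a aac -b:a 384k -ar 48000 \
>   out.flv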
>
> 2. How can I send the video to the local nginx as light as possible
> on the CPU?
>
> When using the GPU via
>
> ffmpeg \
>   -vaapi_device /dev/dri/renderD128 \
>   -r 60 \
>   -i /dev/video0 \
>   -vf 'format=nv12,hwupload' \
>   -c:v h264_vaapi \
>   -r 25 \
>   -profile high \
>   -threads 1 \
>   -an \
>   -f flv rtmp://localhost/live
>
> I have around 70-80 % CPU usage on the respective core, but the
> audio-video offset seems to vary by about a second between multiple
> recordings. When I use
>
> ffmpeg \
>   -r 60 \
>   -i /dev/video0 \
>   -preset ultrafast \
>   -c:v libx264 \
>   -r 25 \
>   -threads 1 \
>   -an \
>   -f flv rtmp://localhost/live
>
> I have a dangerous 100 % CPU utilization on the respective core but
> have not noticed the offset variation so far. Is there something lighter?
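> (One thing that might shave off some CPU, sketched here under the
> assumption -- not verified -- that the Magewell can deliver 25 fps and
> a smaller frame size directly: ask v4l2 for the target rate and size,
> so no frames are captured only to be dropped and the encoder has fewer
> pixels to chew on. The 1280x720 size is just an example:)
>
> ffmpeg \
>   -f v4l2 -framerate 25 -video_size 1280x720 -i /dev/video0 \
>   -c:v libx264 -preset ultrafast -tune zerolatency \
>   -threads 1 -an \
>   -f flv rtmp://localhost/live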
>
> 3. Is there a completely different, saner approach?
>
>
> Thanks a lot for any help!
>
> Stephan