[FFmpeg-user] Multicam synchronized output
Daniel Sevelt
daniel.sevelt at gmail.com
Tue Apr 26 19:29:31 CEST 2016
Greetings,
I am trying to create a two-camera overlay with output of one composite
image to a v4l2loopback device (identical camera models; Sony PS3 Eyes and
ELP USB cams have both been tested with the same result). This is for a
machine vision application, so the composite stream needs to be as close to
real time as it can be, meaning I'm not interested in any more buffering than
is necessary to make the output smooth. Ideally the cameras should be as
synced as can be achieved with non-scientific-grade webcams, to keep the cost
down and to test the software's ability to deal with webcams that can't truly
"sync". My online research and testing has hit a void of information in that
regard, because it is hard to separate my specific issue from the flood of
people troubleshooting out-of-sync audio/video problems.
Any help, howtos or pointers in the right direction are much appreciated.
The problem I'm experiencing is a reliable, full-second delay in one camera,
with the other having no significant delay at all. My suspicion is that
since the devices are streaming data, their timestamps are going to be
arbitrary relative to each other to begin with, and somehow I need to inform
ffmpeg to compensate and start its frames for the overlay at a matching
starting point, or the earliest one it can find once both inputs are ready.
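One idea along those lines that I have not fully tested: tagging both
captures with wall-clock timestamps so they share a common reference, and
only rebasing once after the overlay. A minimal sketch, assuming
-use_wallclock_as_timestamps behaves per input for v4l2 the way I understand
it:

# Untested: stamp each capture with the wall clock so overlay aligns the
# two inputs in absolute time, then rebase the composite once at the end.
ffmpeg -use_wallclock_as_timestamps 1 -thread_queue_size 512 \
       -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video2 \
       -use_wallclock_as_timestamps 1 -thread_queue_size 512 \
       -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video1 \
       -filter_complex "[0:v][1:v] overlay=shortest=1:x=320, setpts=PTS-STARTPTS" \
       -pix_fmt yuv420p -f v4l2 /dev/video3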
I've tried many different options regarding buffering and -itsoffset with no
luck so far. The command that reliably produces this undesired behavior,
without any error output, was built on the example code taken from the filter
documentation at http://ffmpeg.org/ffmpeg-filters.html#toc-Examples-67
ffmpeg -thread_queue_size 512 -f v4l2 -framerate 30 -video_size 640x480 \
       -i /dev/video2 -i /dev/video1 -filter_complex "
nullsrc=size=640x480 [background];
[0:v] setpts=PTS-STARTPTS, scale=640x480 [left];
[1:v] setpts=PTS-STARTPTS, scale=640x480 [right];
[background][left] overlay=shortest=1 [background+left];
[background+left][right] overlay=shortest=1:x=320
" -pix_fmt yuv420p -f v4l2 /dev/video3
I can see the logic of STARTPTS for a video file, but it seems to be an
incorrect choice for live devices.
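If STARTPTS is the wrong rebase for live devices, one alternative I plan to
try comes from the setpts documentation's live-source example, which rebases
onto the wall clock instead of each stream's first frame (a sketch; I am
assuming rebasing at filter time is close enough to capture time):

[0:v] setpts='(RTCTIME - RTCSTART) / (TB * 1000000)', scale=640x480 [left];
[1:v] setpts='(RTCTIME - RTCSTART) / (TB * 1000000)', scale=640x480 [right];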
The delay seems to be tied to the second input being a different physical
device, i.e. my thought about the timestamps. When I change

[1:v] setpts=PTS-STARTPTS, scale=640x480 [right];

to read

[0:v] setpts=PTS-STARTPTS, scale=640x480 [right];

replicating the same device's output twice, there is no lag.
So it seems that the value of STARTPTS is determined for both streams, then
some processing that takes a second happens for the second camera, and ffmpeg
dutifully displays the accumulated frames, leaving one camera input
displaying a second behind?
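If it would help pin this down, I can log the timestamps each device actually
delivers with the showinfo filter and compare the two streams' starting pts
values. A minimal sketch of the probe I have in mind:

# Capture a few seconds from each device and print per-frame pts;
# showinfo logs pts and pts_time for every frame on stderr.
ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video2 \
       -vf showinfo -t 3 -f null - 2> cam2.log
ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video1 \
       -vf showinfo -t 3 -f null - 2> cam1.log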
Please let me know if I can provide any additional data to help solve this
problem. Thank you so much for reading.
Best,
Daniel