[FFmpeg-user] ffmpeg combining live video from 3 cameras
ashwin Nair
annrwork92 at gmail.com
Tue Jul 19 07:35:01 EEST 2022
Hi All,
I have 3 USB cameras streaming video at a specific resolution. I am trying
to merge the three videos side by side using hstack and display the result with ffplay.
This is the command I am using:
ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" ^
       -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" ^
       -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" ^
       -filter_complex hstack=3 -f nut - | ffplay -
There is a special case in this system: the USB camera system only starts
streaming video once all 3 cameras have been enabled (i.e., opened by ffmpeg).
However, with the above command, ffmpeg opens CAM_0 first and waits for data.
As a result, CAM_1 and CAM_2 are never opened and the video stream never
starts.
Is there any way to start all 3 inputs simultaneously with ffmpeg and then
merge them together using hstack?
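For example, would something along these lines work, where each camera is
captured by its own ffmpeg process (so all three devices are opened at
roughly the same time) and a fourth process merges the local streams? I have
not tested this; the ports 5000-5002 and the libx264 settings are only
placeholders.

:: one capture process per camera, each sending an H.264/MPEG-TS stream to a local UDP port
start /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" ^
  -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5000
start /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" ^
  -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5001
start /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" ^
  -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5002

:: a fourth process reads the three local streams, stacks them and pipes to ffplay
ffmpeg -i udp://127.0.0.1:5000 -i udp://127.0.0.1:5001 -i udp://127.0.0.1:5002 ^
  -filter_complex hstack=inputs=3 -f nut - | ffplay -

The H.264 encode is only there to keep the local streams small; piping raw
uyvy422 at 1280x2160 per camera would be very heavy, but the re-encode does
add some latency.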
Thanks!