[FFmpeg-user] Grid of videos starting at different times

Paul B Mahol onemda at gmail.com
Tue Aug 11 18:21:28 EEST 2020


On 8/11/20, Randy Johnson via ffmpeg-user <ffmpeg-user at ffmpeg.org> wrote:
> Hello,
>
> I have several videos that were recorded in a webinar-type setting.
>
> Each person in the webinar was recorded with their own feed.
>
> I then receive a zip file with each video recording and a JSON file with
> start/stop time offsets for each recording.
>
> Example:
>
> Video 1:
>
>  "filename" : "823ef68a-cb1c-4636-bdfb-ed3d9611a755.webm",
> "size" : 4766189,
>  "startTimeOffset" : 17599,
> "stopTimeOffset" : 72696,
>
> Video 2:
>
> "filename" : "a920ab58-a42d-49fe-a239-b27497b22bb1.webm",
>  "size" : 6378071,
>  "startTimeOffset" : 1439,
>  "stopTimeOffset" : 74655,
>
>
> I could have anywhere from 2 to 10 videos per job.
>
> For faster processing I convert the videos to MP4.
>
> I have had some luck merging the videos into one using hstack, but I cannot
> figure out how to merge them into one video with the different time offsets.
>
> Here is what I have so far. It gets me close, but it doesn't take the
> different offsets into account, so everyone is talking over each other and
> out of order.
>
> ffmpeg -i
> /Users/randy/Downloads/archive23/a920ab58-a42d-49fe-a239-b27497b22bb1.mp4 -i
> /Users/randy/Downloads/archive23/823ef68a-cb1c-4636-bdfb-ed3d9611a755.mp4
> -filter_complex \
> "[0:v][1:v]hstack=inputs=2[v]; \
>  [0:a][1:a]amerge[a]" \
> -map "[v]" -map "[a]" -ac 2 -y /Users/randy/Downloads/archive23/output.mp4
>
> Another tricky thing is that if the conference was 1 hour long, someone
> could show up 15 minutes late or leave 15 minutes early.
>
> I thought
> ffmpeg -itsoffset n
> could be of help, but this post says that hstack doesn't sync with timestamps:
> https://stackoverflow.com/questions/55492772/stacking-different-length-videos-not-working-with-ffmpeg-and-itsoffset

That post is incorrect. hstack does use timestamps to sync its inputs;
it expects the same timestamps from all of its inputs.

Your inputs have different start times and durations/end times, so what
should be inserted to fill the gaps?
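
One possible answer (my assumption, not something settled in this thread) is
black video and silence: tpad can prepend black frames to the video of the
feed that joined late, and adelay can prepend silence to its audio. A rough
single-input sketch, where "late.mp4" is only a placeholder name:

# Placeholder sketch: "late.mp4" stands for whichever feed has the larger
# startTimeOffset; with your example JSON the gap is 17599 - 1439 = 16160 ms.
ffmpeg -i late.mp4 \
  -vf "tpad=start_duration=16.16" \
  -af "adelay=16160|16160" \
  late_padded.mp4

tpad's default start_mode=add inserts solid-color (black) frames, while
start_mode=clone would repeat the first frame instead; adelay takes one delay
per channel, in milliseconds.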

You could, in theory, generate the filtergraph from the supplied
parameters, but that would be very complicated at best.
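
For just the two files in your example it is still manageable by hand. The
following is only a sketch under my own assumptions: that the startTimeOffset
values are milliseconds on a shared clock, so the 823ef68a... file starts
17599 - 1439 = 16160 ms later than the a920ab58... file; that mixing the
audio with amix instead of amerge is acceptable; and that hstack's default
shortest=0 keeps the output running until the longer video ends. DELAY_MS and
DELAY_S are just shell variables I made up for readability.

# Sketch only -- offsets copied from your JSON, paths from your command.
DELAY_MS=16160   # 17599 - 1439
DELAY_S=16.16    # the same delay expressed in seconds, for tpad

ffmpeg \
  -i /Users/randy/Downloads/archive23/a920ab58-a42d-49fe-a239-b27497b22bb1.mp4 \
  -i /Users/randy/Downloads/archive23/823ef68a-cb1c-4636-bdfb-ed3d9611a755.mp4 \
  -filter_complex \
  "[1:v]tpad=start_duration=${DELAY_S}[v1]; \
   [0:v][v1]hstack=inputs=2[v]; \
   [1:a]adelay=${DELAY_MS}|${DELAY_MS}[a1]; \
   [0:a][a1]amix=inputs=2:duration=longest[a]" \
  -map "[v]" -map "[a]" -ac 2 -y /Users/randy/Downloads/archive23/output.mp4

The same pattern should extend to your 2 to 10 inputs: compute each file's
delay as its startTimeOffset minus the smallest startTimeOffset in the JSON,
give every input its own tpad/adelay pair, then feed all padded video streams
into hstack (or xstack for an actual grid layout) and all padded audio streams
into amix=inputs=N.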

>
> They do give an example in the Stack Overflow post:
> ffmpeg \
> -i smaller.mp4 \
> -i bigger.mp4 \
> -filter_complex \
>  "[0]tpad=start_duration=17[left];\
>   [left][1]hstack=inputs=2;\
>  [0]adelay=17s|17s[lefta];[lefta][1]amix=2" \
> -c:v libx264 -crf 23 out.mp4
>
> Any ideas on the best way for me to proceed?
>
> Thanks,
>
> Randy


More information about the ffmpeg-user mailing list