[FFmpeg-devel] How to synchronize device sources

Noah Bergbauer noah.bergbauer at tum.de
Mon Jan 27 16:12:55 EET 2020


>> The goal is to use all of this directly from ffmpeg.exe, not only as a library.
>> And from what I can see, ffmpeg always calculates pts relative to the first
>> value it sees, which would nullify the effect of using wallclock timestamps,
>> even if I were to switch both devices to those.
>> 
>> Or is there a way to explain to ffmpeg that yes, I actually want you to correlate
>> the pts of these two devices with each other?
>
> The ffmpeg command-line tool by default will normalize the start time of
> the input files, but there are options to keep working with the unaltered
> input timestamps. The -copyts option is the first one that comes to
> mind.
>
> I do not know the current state with regard to handling live timestamps,
> but these are things that can be enhanced.
>
> But nothing can be done unless all the streams have proper and
> comparable timestamps.

Makes sense, so I switched to wallclock timestamps with a timebase in microseconds.

Enabling -copyts, however, breaks a few things in that case. First of all, in ffmpeg itself:

https://github.com/FFmpeg/FFmpeg/blob/dfc471488675aa257183745502d0074055db3bd2/fftools/ffmpeg.c#L1140

            if (ost->frame_number == 0 && delta0 >= 0.5) {
                av_log(NULL, AV_LOG_DEBUG, "Not duplicating %d initial frames\n", (int)lrintf(delta0));
                delta = duration;
                delta0 = 0;
                ost->sync_opts = lrint(sync_ipts);
            }

The long returned by lrint is a 32-bit integer on Windows. Wallclock pts are well beyond
32 bits, so this overflows and causes ffmpeg to drop every frame after the first one.
Switching to llrint is a simple fix (although I feel quite uneasy about all the other lrint and
lrintf (on doubles!) calls in this function).

But then players appear to run into similar problems. VLC hangs when trying to load the mp4.
The Windows 10 Movies app plays the video, but appears to ignore the timestamps entirely.

All of these problems go away with -start_at_zero but since the audio is still mistimed, I guess
this just puts me back to the behavior without -copyts (not sure what the difference is then).

Or maybe the container is the issue? ffprobe only shows a single start time for the mp4 file.

I think what I really need is a way to tell ffmpeg.exe to drop the leading portion of whichever
input starts first, up to the point where the later input starts. Is there any built-in way to make that happen?

Thanks,
Noah Bergbauer


Von: ffmpeg-devel <ffmpeg-devel-bounces at ffmpeg.org> im Auftrag von Nicolas George <george at nsup.org>
Gesendet: Sonntag, 26. Januar 2020 21:57
An: FFmpeg development discussions and patches
Betreff: Re: [FFmpeg-devel] How to synchronize device sources
    
Noah Bergbauer (12020-01-26):
> Capturing with DXGI essentially *must* be done on a separate thread, as it is
> extremely time sensitive and starts dropping frames at the slightest delays.
> Dropping and duplicating frames also happens on that thread. This is why
> my DXGI device implements pts as a simple frame counter - I want ffmpeg
> to reproduce exactly the frames I return without ever duplicating or dropping
> anything by itself.

It seems like a strange design. A PTS that is only a frame counter does
not bring any information. I strongly suggest you use something more
relevant. If the device does not offer anything, use the wall clock.

> The goal is to use all of this directly from ffmpeg.exe, not only as a library.
> And from what I can see, ffmpeg always calculates pts relative to the first
> value it sees, which would nullify the effect of using wallclock timestamps,
> even if I were to switch both devices to those.
> 
> Or is there a way to explain to ffmpeg that yes, I actually want you to correlate
> the pts of these two devices with each other?

The ffmpeg command-line tool by default will normalize the start time of
the input files, but there are options to keep working with the unaltered
input timestamps. The -copyts option is the first one that comes to
mind.

I do not know the current state with regard to handling live timestamps,
but these are things that can be enhanced.

But nothing can be done unless all the streams have proper and
comparable timestamps.

Regards,

-- 
  Nicolas George
    

