[FFmpeg-user] Is ffmpeg's "Output Stream" framerate... wrong?
roninpawn
roninpawn at gmail.com
Tue Mar 16 21:37:35 EET 2021
My fix was - as expected - as simple as finding the correct variable to
poll with ffprobe:
'r_frame_rate' is "the lowest framerate with which all timestamps can be
represented accurately (it is the least common multiple of all framerates
in the stream)."
'avg_frame_rate' is just that: total duration / total # of frames.
Switching to 'avg_frame_rate' gets me the correct rate to calculate against
- bing, bosh. VFR timing works as expected.
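For anyone hitting the same wall, here's roughly what my probe looks like
now - a minimal sketch, assuming ffprobe is on your PATH and 'clip.mp4'
stands in for your own file:

import subprocess
from fractions import Fraction

def avg_frame_rate(path):
    # Ask ffprobe for only the avg_frame_rate of the first video stream.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=avg_frame_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # ffprobe prints the rate as a rational, e.g. "30000/1001";
    # Fraction keeps it exact until you actually need a float.
    return Fraction(out)

print(float(avg_frame_rate("clip.mp4")))  # 'clip.mp4' is a stand-in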
---
But I still contend that the console output of ffmpeg, insofar as I
understand it, is just plain *INCORRECT*. The output stream in this
instance is 'rawvideo.' There is no frame drop/duplication enabled or
happening as far as I know -- and the frame count of the test I
originally described in this thread supports that. The source footage is
VFR with an average frame rate that calculates to 30.011923688394276fps.
When I open a stream to ffmpeg, what I receive from the output stream is
every single one of that source's frames. Literally nothing has changed
from input to output in this operation to alter the frame base/rate or
number of frames I will receive. (As far as I am aware; my stream open
looks something like the sketch below.)
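Something like this, anyway - dimensions and pix_fmt are placeholders,
not my exact code:

import subprocess

W, H = 1280, 720         # placeholder dimensions -- use your source's
FRAME_BYTES = W * H * 3  # bgr24 = 3 bytes per pixel

proc = subprocess.Popen(
    ["ffmpeg", "-v", "error", "-i", "clip.mp4",
     "-f", "rawvideo", "-pix_fmt", "bgr24", "pipe:1"],
    stdout=subprocess.PIPE,
)

count = 0
while True:
    frame = proc.stdout.read(FRAME_BYTES)
    if len(frame) < FRAME_BYTES:  # EOF (or a truncated tail)
        break
    count += 1  # every source frame shows up here -- no drop/dup

proc.wait()
print(count)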
Despite the equivalence of INPUT and OUTPUT here, the output stream is
loudly declaring itself 30fps. *Which it is not*. That's just not at all
correct. And the value '30fps' occupies the same position in the stream
metadata where the source input frame rate was displayed. Which
communicates clearly that on the input this was 30.01fps -- but the
output is now 30fps flat. A purely FALSE declaration that misled me into
thinking that both 'r_frame_rate' and the actual output stream were
converting VFR to the nearest standard timebase by some internal magic.
Trusting the console output led to a bug in my code and subsequent timing
inaccuracies... because the console output is not correct.
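To put a number on those inaccuracies - my own back-of-the-envelope
arithmetic, not measured from the app:

# Drift from timing frames against 30fps instead of the true average.
TRUE_FPS = 30.011923688394276  # the source's actual average frame rate
frames_per_hour = 3600 * TRUE_FPS     # ~108,043 frames in an hour
timed_at_30 = frames_per_hour / 30.0  # what frame-count / 30 reports
print(timed_at_30 - 3600.0)           # ~1.43 seconds off per hour

For official speedrun timing, nearly a second and a half of error per
hour is disqualifying.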
All that said... After glancing at some Super User threads, it looks like
this output-stream frame rate is accurate and helpful in other
circumstances -- like when altering the frame rate of the output with
vsync declarations, or when full-on transcoding. And it wouldn't make any
difference when you've got a CFR source at an industry-standard frame
base. But in this use case of rawvideo output from a VFR source, that fps
figure is nothing but a *wrong number* in an important place, misleading
anyone who looks at it. Which, to my mind, makes it a bug that wants some
error-trapping.
Am I wrong?
-Roninpawn
On Tue, Mar 16, 2021 at 12:56 PM Carl Zwanzig <cpz at tuunq.com> wrote:
> On 3/15/2021 6:51 PM, roninpawn wrote:
> > I develop a Python application used to conduct official timing for
> > speedrunning leaderboards based on automated video analysis.
>
> Instead of relying on output frame rate, have you considered using the
> Presentation Time Stamps (PTS) of the input? Those more accurately direct
> when a frame should be displayed (presented) so the timing ought to be
> more
> accurate.
>
> (There was much discussion about PTS on this list recently.)
>
> z!
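P.S. Re: the PTS suggestion quoted above -- that does sound like the more
robust road, and it's likely where I'll head next. For anyone curious, a
rough sketch of pulling per-frame presentation timestamps with ffprobe;
'clip.mp4' is a stand-in and I haven't wired this into my app yet:

import subprocess

# Ask ffprobe for each decoded frame's presentation timestamp.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pts_time",
     "-of", "csv=p=0", "clip.mp4"],
    capture_output=True, text=True, check=True,
).stdout

# Some frames can report "N/A"; skip those.
pts = [float(t) for t in out.splitlines() if t and t != "N/A"]
print(pts[:5])  # per-frame display times in seconds, VFR-accurate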
--
Roninpawn on YouTube
(http://www.youtube.com/user/roninpawn)