[FFmpeg-devel] [RFC] Issue with "standard" FPS/timebase detection
michaelni at gmx.at
Fri Sep 16 21:35:13 CEST 2011
On Thu, Feb 11, 2010 at 01:18:18AM -0800, Jason Garrett-Glaser wrote:
> Test case:
> In the "101fps" file, tb_unreliable gets triggered because of the
> framerate being >= 101. The if(duration_count[i] &&
> tb_unreliable(st->codec) code in libavformat then proceeds to run,
> resulting in a framerate of 1/12, which obviously doesn't make any
> sense whatsoever and is rather broken.
> I understand the basic idea behind what the duration error code is
> doing, but I don't understand why it's giving such a weird result.
> The primary reason seems to be the multiply in double error=
> duration_error[i][j] * get_std_framerate(j); , which results in a bias
> towards very small framerate values. But even if I remove that, it
> still comes up with an extremely weird fps (695/12). Something is off
> in the error calculation, but I don't fully understand it enough to
> judge what it is.
> As this is Michael's code, I figure he can probably offer the best
> insight on what's going on here.
Yes, sorry for the delay.
Fixed in FFmpeg git master.
I've also tested against a heap of other files and it all seems to be working.
If someone doesn't want the new code, this fix can likely be backported.
Michael
GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

Breaking DRM is a little like attempting to break through a door even
though the window is wide open and the only thing in the house is a bunch
of things you don't want and which you would get tomorrow for free anyway.