[FFmpeg-devel] [PATCH] frame: add a time_base field

Nicolas George george at nsup.org
Thu Sep 9 22:55:22 EEST 2021


Lynne (12021-09-09):
> It's a necessary piece of information pertinent to the correct
> presentation of each frame. Moreover, it simplifies the API,

That piece of information is already present along with all the other
pieces of information necessary to make sense of a frame.

> which new users are still finding difficult to use. Like for example
> timebase negotiation in lavf, which requires a complicated dance
> to perform, and is not documented anywhere. And which
> timebase field are you supposed to use from lavf? The global
> context's? The stream's? The codecparameter's? This

This is already documented. Maybe the documentation can be made clearer.

> field eliminates any source of doubt which timebase to use.
> And this isn't about the difference between frames and packets
> either, frames can be wrapped as packets too.
> Additionally, this will allow us to deal with stream switching and
> stream splicing correctly without needing to reinitialize components.

This is an interesting remark, but please consider that relying on the
change of a field that used to be constant is a terrible API break.

> Right now, all the arguments you've given are "it's redundant"
> (it isn't, you __need__ a timebase to make sense of any timestamps,
> and if a timebase isn't found in the frame but halfway across Jupiter,
> it's simply missing), it's complicated (it isn't, it's a 10-line patch,
> maximum), it's hard to keep in sync (it isn't, it's a frame field like
> any other which will be printed by ffprobe and tested by FATE).

And yes, those are all the arguments I have, or you could even say it in
the singular: argument, because they are one and the same: redundancy is
a bad idea because it is hard to keep in sync.

As for "halfway across Jupiter", stop this ridiculous hyperbole. The
fact is that all frames are relative to a certain context that gives
meaning to the timestamps: an AVStream when it comes from a demuxer or
goes to a muxer, an AVFilterLink in libavfilter, an OutputStream in
fftools, another application-defined data structure for any application.

There is never a frame in isolation, so designing an API as if there
were is a waste of time.

> The only honest argument you stated has been an implicit "I don't
> like this".

Please have the courtesy of not accusing me of dishonesty. This is an
absolute prerequisite for constructive discussion.

-- 
  Nicolas George