[MPlayer-G2-dev] more on frame timing, framerate-changing filters
Arpi
arpi at thot.banki.hu
Sun Jun 15 00:46:51 CEST 2003
Hi,
> Anyway, to get to the point, I'm hoping to make this code friendly for
> porting to G2 (I already have the timing-change stuff worked out for
> it), but I'd like to clear up how we want timing to work. Right now,
> it seems G2 has a duration field for the image, but I'm not sure
> whether that's preferred, or whether we should instead have a
> timestamp relative to the previous frame... IMO, for some filters, we
> may need to know *future* information to decide a duration, whereas we
> should already know the pts relative to prev frame. And of course, for
> performance purposes (and coding simplicity), it's optimal not to have
> to grab future frames from up the chain before we output a given
Yes.
Actually I spent a lot of time thinking about this, and came to the conclusion
that no optimal (or near-optimal) solution exists.
At least not in the MPlayer world, where lots of containers and codecs with
different behaviour and timing models are supported.
I've reduced this game to 2 basic types:
- absolute timestamps (when to display the frame)
- frame durations (how long to display the frame)
Both may be given for a frame, and the pts_flags tells you which ones
are available and (!!!) how accurate they are.
(some containers have only inaccurate timestamps, but a fixed fps/duration)
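Something like this, as a sketch only; the field and flag names here are
illustrative, not the actual G2 headers:

    /* Sketch only -- names are made up for illustration, not the real
     * G2 structures. */
    typedef struct mp_frame_timing {
        double pts;       /* absolute timestamp: when to display the frame */
        double duration;  /* how long to display the frame */
        int    pts_flags; /* which fields are valid, and how accurate */
    } mp_frame_timing_t;

    #define TIMING_HAS_PTS           0x01 /* pts field is valid */
    #define TIMING_HAS_DURATION      0x02 /* duration field is valid */
    #define TIMING_PTS_ACCURATE      0x04 /* pts is exact, not a container guess */
    #define TIMING_DURATION_ACCURATE 0x08 /* duration is exact (fixed fps etc.) */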
The main problem is that in several cases only the codecs (or filters) know
the final duration value, so it cannot be used at the demuxer level to calculate
the accurate timestamp (as I wanted to do earlier).
> frame. Consider for example the case of vf_decimate (near-duplicate
> frame dropper). If it wants to output duration, it has to grab frames
> into the future until it finds a non-duplicate. But with pts relative
> to previous, it can defer the task of decoding those extra frames
> until the current frame has been displayed, and *then* decoding those
> extra frames takes place during the "sleep time" anyway, so it won't
> cause the a/v sync to stutter on slow systems.
Actually you should (in fact, have to) report dropped frames too, by returning NULL.
It ensures that the next filters know that frames were dropped (some temporal
filters, and for example field<->frame splitter/merger filters, require this),
and that the final a-v sync code also knows about the frame being dropped.
Imho vf_decimate should run one frame ahead of the playback pointer, so it
always returns the previous frame if it differs enough, or returns NULL if it
is too similar.
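A minimal sketch of that one-frame-ahead idea; the types and helpers below
(vf_image_t, frames_differ, release_image) are simplified stand-ins for the
real G2 filter API, not quotes from it:

    typedef struct vf_image vf_image_t;

    int  frames_differ(vf_image_t *a, vf_image_t *b); /* hypothetical comparator */
    void release_image(vf_image_t *img);              /* hypothetical unref */

    static vf_image_t *held;  /* the frame the filter is holding back */

    /* Returns the frame to display, or NULL to report a dropped frame. */
    vf_image_t *decimate_put_image(vf_image_t *img)
    {
        vf_image_t *out = NULL;

        if (held) {
            if (frames_differ(held, img)) {
                out = held;           /* previous frame is unique: emit it */
            } else {
                release_image(held);  /* near-duplicate: drop it, and report */
                out = NULL;           /* the drop downstream by returning NULL */
            }
        }
        held = img;  /* hold the new frame until the next call (one frame latency) */
        return out;
    }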
Filters altering the playback rate should modify the duration of the incoming
frames, and reset the timestamps of newly generated (== inserted) frames.
See the tfields port for an example, or the sketch below.
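For illustration, the duration/timestamp rule for a tfields-like rate doubler
(one input frame -> two output field-frames) could look like this; the API
names are stand-ins, not the actual tfields port:

    typedef struct vf_image {
        double pts;       /* absolute timestamp */
        double duration;  /* display time */
        /* ... image planes ... */
    } vf_image_t;

    vf_image_t *extract_field(vf_image_t *in, int parity); /* hypothetical */
    void emit(vf_image_t *img);                            /* hypothetical */

    void tfields_put_image(vf_image_t *in)
    {
        vf_image_t *top = extract_field(in, 0);
        vf_image_t *bot = extract_field(in, 1);

        /* each output frame covers half of the input frame's display time */
        top->duration = bot->duration = in->duration / 2;

        /* the first field keeps the incoming timestamp; the inserted one is
         * placed relative to it, so no absolute fps is ever assumed */
        top->pts = in->pts;
        bot->pts = in->pts + in->duration / 2;

        emit(top);
        emit(bot);
    }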
And yes, I know it's not the optimal solution, so I'm open to better
models, although I know there is no better way (within the given
constraints).
> Also, on another matter. I know G1's whole a/v sync system has been
> based on a lot of approximations and feedback measurement. This is
yes
> great for mplayer, and probably also for encoding from variable-fps
> input to fixed-fps avi output with mencoder. However, especially with
> the new perfect-a/v-sync mpeg code in G2, I'd really like to see
> support for "exact" timing in G2. Maybe a way to use a time division
> base and specify all pts stuff in terms of those (exact) units. It
I was thinking about this too, and as you can see it's done that way in the
demuxer layer (rate multiplier and divisor instead of a float fps).
Although I don't think it's really worth the extra code, and the worrying
about integer ranges everywhere (you never know if the base rate is 1
or 1/100000000000000, so even a long long may overflow).
Kabi once did some calculations on ffmpeg-devel when this topic was
discussed there (about the ticker code), and with a double it's accurate
enough to run for several thousands of hours without a single frame of delay.
And the pts values may be (and should be, and usually are) calculated from
integer counters (frameno/fps or integer_pts/pts_scale) by the demuxers, so
each pts carries a single rounding error that never accumulates.
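A quick back-of-envelope check of that claim (my numbers, not Kabi's
original calculation):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double fps = 30000.0 / 1001.0;                        /* NTSC frame rate */
        long long frameno = (long long)(10000 * 3600 * fps);  /* ~10000 hours of frames */
        double pts = frameno / fps;  /* one division per pts: one rounding, no accumulation */

        /* spacing between adjacent doubles at this magnitude is ~7.5e-9 s,
         * orders of magnitude below a ~33 ms frame duration */
        printf("pts = %.6f s, ulp = %g s\n", pts, nextafter(pts, INFINITY) - pts);
        return 0;
    }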
> would be much perferred for precision encoding and video processing
> work, and nice for output to containers like nut (mpcf) which will
> support such things. I'm envisioning my inverse telecine wanting to
> use a time base of 1/120 second, for flawless output of mixed
> telecine, 30fps progressive, and 60fps interlaced content as a single
You shouldn't rely on assuming any fps; think of someone doing -speed
1.356 and then using the telecine filter. You should use only rates, not
absolute values, i.e. for inverse telecine scale the incoming durations by
5/4 (four output frames for every five input frames, so the frame rate
drops by 4/5)...
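In code the rate-only rule is just a ratio applied to the incoming
duration (a sketch; the function name is made up):

    /* Five telecined input frames become four progressive output frames,
     * so each surviving frame's duration grows by 5/4.  No absolute fps
     * appears anywhere, so it keeps working under -speed 1.356. */
    double ivtc_out_duration(double in_duration)
    {
        return in_duration * 5.0 / 4.0;
    }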
> progressive output stream. (Such a perfect rip could *really* get the
> anime fansub groups interested in using mplayer/nut instead of the
> windows junk they use now....)
:)
> If you're worried about not wanting all framerate-changing filters (or
> apps) to have to support this stuff, it could be an optional feature,
> where any filter in the chain can ignore the exact info and just use
> "float pts" the rest of the way down the chain...
argh
A'rpi / Astral & ESP-team
--
Developer of MPlayer G2, the Movie Framework for all - http://www.MPlayerHQ.hu