[MPlayer-G2-dev] more on frame timing, framerate-changing filters

D Richard Felker III dalias at aerifal.cx
Wed Jun 11 06:14:46 CEST 2003


Hey. I'm working more on improving inverse-telecine stuff. Actually I
think I'm going to test the new design on G1 (even though it's much
more of a pain to deal with fields the way I want to in G1, it's
easier to test with a fully-functional player, and people will be able
to benefit from it sooner).

Anyway, to get to the point, I'm hoping to make this code friendly for
porting to G2 (I already have the timing-change stuff worked out for
it), but I'd like to clear up how we want timing to work. Right now,
it seems G2 has a duration field for the image, but I'm not sure
whether that's preferred, or whether we should instead have a
timestamp relative to the previous frame... IMO, for some filters, we
may need to know *future* information to decide a duration, whereas we
should already know the pts relative to prev frame. And of course, for
performance purposes (and coding simplicity), it's optimal not to have
to grab future frames from up the chain before we output a given
frame. Consider for example the case of vf_decimate (near-duplicate
frame dropper). If it wants to output duration, it has to grab frames
into the future until it finds a non-duplicate. But with pts relative
to the previous frame, it can defer decoding those extra frames until
the current frame has been displayed; the extra decoding then happens
during the "sleep time" anyway, so it won't cause a/v sync to stutter
on slow systems.
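
Something like this is what I have in mind for the decimate case (a
rough C sketch with made-up struct and function names -- not the
actual G2 vf API, just an illustration of the two timing models):

/* Hypothetical frame struct, only to illustrate the two options: */
typedef struct frame {
    unsigned char *data;
    double duration;  /* option A: needs lookahead to fill in       */
    double rel_pts;   /* option B: time since prev frame, known now */
} frame_t;

/* Stubs so the sketch is self-contained; a real filter would do an
 * image comparison and pass the frame down the chain. */
static int is_duplicate(const frame_t *f) { (void)f; return 0; }
static int send_downstream(frame_t *f)    { (void)f; return 1; }

/* decimate-style logic with option B: a dropped frame's rel_pts is
 * simply folded into the next frame that actually gets output, so
 * no future frames ever have to be grabbed early. */
static double pending_time;

static int put_frame(frame_t *f)
{
    if (is_duplicate(f)) {
        pending_time += f->rel_pts;  /* remember the dropped time  */
        return 0;                    /* nothing goes downstream    */
    }
    f->rel_pts += pending_time;      /* carry dropped time forward */
    pending_time = 0;
    return send_downstream(f);
}

The point is that a dropped frame's time just gets carried into the
next output frame -- no lookahead required.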

Also, on another matter. I know G1's whole a/v sync system has been
based on a lot of approximations and feedback measurement. This is
great for mplayer, and probably also for encoding from variable-fps
input to fixed-fps avi output with mencoder. However, especially with
the new perfect-a/v-sync mpeg code in G2, I'd really like to see
support for "exact" timing in G2. Maybe a way to use a time division
base and specify all pts stuff in terms of those (exact) units. It
would be much preferred for precision encoding and video processing
work, and nice for output to containers like nut (mpcf) which will
support such things. I'm envisioning my inverse telecine wanting to
use a time base of 1/120 second, for flawless output of mixed
telecine, 30fps progressive, and 60fps interlaced content as a single
progressive output stream. (Such a perfect rip could *really* get the
anime fansub groups interested in using mplayer/nut instead of the
windows junk they use now....)
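
Just to spell out the arithmetic (toy code, hypothetical timebase
struct; I'm ignoring the NTSC 1000/1001 factor, which would only
change the base from 1/120 to 1001/120000):

#include <stdio.h>

struct timebase { int num, den; };  /* num/den seconds per tick */

int main(void)
{
    struct timebase tb = { 1, 120 };
    /* ticks per output frame, all exact integers: */
    printf("telecined film (24fps): %d ticks\n", tb.den / (24 * tb.num)); /* 5 */
    printf("30fps progressive:      %d ticks\n", tb.den / (30 * tb.num)); /* 4 */
    printf("60fps interlaced:       %d ticks\n", tb.den / (60 * tb.num)); /* 2 */
    return 0;
}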

If the worry is that not every framerate-changing filter (or app)
should have to support this stuff, it could be an optional feature,
where any filter in the chain can ignore the exact info and just use
"float pts" the rest of the way down the chain...

Rich
