[MPlayer-dev-eng] mp-g2 pre14

Arpi arpi at thot.banki.hu
Mon Apr 21 03:17:52 CEST 2003


Hi,

> > > I dunno how this should be done, but there MUST be some way for the
> > > player to 'hook in' between the filter chain and libvo2, in order to
> > > control the rate at which frames are displayed. Otherwise, filters
> > > that alter framerate (e.g. showing interlaced video at 60 fields per
> > > second, inverse telecine, etc) cannot work properly! This was one of
> > > the big deficiencies in the old design, so I hope G2's architecture
> > > will fix it.
> > 
> > I want to fix it, but can't imagine how to fix it yet.
> > Maybe i can't avoid moving the video timing/delaying code to vf_vo2, maybe
> > with a callback to the player (to be able to use idle time for something
> > more useful).
> 
> Bleh...
> 
> Would it perhaps be possible to just return the delay to the caller,
> and then have the caller signal back to libvo2 when it's time to
> actually show the frame?

it's the old (mplayer) way

> Of course, I don't know how this could handle
> filters that increase framerate (multiple output frames per input
> frame).

yes, _this_ is the problem with the old/current way.
the core calls the codec/filter to process a frame, and then it gets back 2
with different pts... without things like slices/DR it wouldn't be a problem
to return a list of picture buffers with pts attached, but things are not so
simple here... and we have to prepare for evil cases where a vo driver has
only a single, always visible buffer... the second picture would overwrite
the first one.
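
just to illustrate that 'list of picture buffers with pts' idea (purely a
hypothetical sketch, this type does not exist anywhere in g2):

    struct mp_image;                      /* the picture buffer type (assumed) */

    typedef struct vf_output_frame {
        struct mp_image *mpi;             /* filtered/decoded picture */
        double pts;                       /* its presentation timestamp (seconds) */
        struct vf_output_frame *next;     /* next frame produced from the same input */
    } vf_output_frame_t;

    /* a filter would return such a list per input frame; the problem is that
     * with slices/DR the data is already being rendered into the vo buffer
     * while the filter runs, so there is nothing left to queue up. */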

> The callback idea is just extremely ugly, but it may be the
> only way...
it seems :(
or we have to disallow increasing fps in filters.

or we have to make hacks. i have an idea, but it's also ugly and limited:
filters could send "pts requests" to the core, into a queue, and call
put_image only for the first output frame (and send the pts of the second
one). then the core, after displaying the frame, could process the requests
from the queue and call a special filter control() at the given times, so
filters could send the inserted frames at the right time.
(i know, i said it's ugly)
also these frames could be dropped, like B frames.
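
something like this (a rough, completely hypothetical sketch -- none of these
names exist in g2, vf_control()/VFCTRL_EMIT_QUEUED_FRAME are made up just to
show the idea):

    struct vf_instance;                           /* filter instance (assumed type) */
    #define VFCTRL_EMIT_QUEUED_FRAME 100          /* hypothetical control id */
    int vf_control(struct vf_instance *vf, int cmd, void *arg);  /* assumed helper */

    #define MAX_PTS_REQUESTS 32

    typedef struct pts_request {
        double pts;                  /* when the core should call back */
        struct vf_instance *vf;      /* filter that wants to emit an extra frame */
    } pts_request_t;

    static pts_request_t queue[MAX_PTS_REQUESTS];
    static int queue_len = 0;

    /* a filter calls this from put_image() for each extra output frame */
    int core_request_pts(struct vf_instance *vf, double pts){
        if (queue_len >= MAX_PTS_REQUESTS) return -1;
        queue[queue_len].pts = pts;
        queue[queue_len].vf  = vf;
        queue_len++;
        return 0;
    }

    /* the core calls this after displaying a frame: every request whose time
     * has come gets a control() call, so the filter can send the inserted
     * frame then (or the core may decide to drop it, like a B frame) */
    void core_process_pts_requests(double now){
        int i = 0;
        while (i < queue_len){
            if (queue[i].pts <= now){
                vf_control(queue[i].vf, VFCTRL_EMIT_QUEUED_FRAME, &queue[i].pts);
                queue[i] = queue[--queue_len];    /* remove (order not kept) */
            } else {
                i++;
            }
        }
    }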

> > Handling PTSs is tricky. Actually there are 2 layers of timestamps:
> > the raw ones (coming from demuxers, usually inaccurate, with
> > discontinuities; not only DVD is like that, some other formats are even
> > worse), and the 'smoothed' ones, currently in sh_audio->timer & sh_video->delay.
> > 
> > My current (not final) plan for g2:
> > 
> > Demuxers will retrieve the raw timestamps. Let decoders modify the
> > timestamps if they want (because of frame reordering and in-codec
> > timestamps (realvideo, mpeg4, h263)). I'll do the pts smoothing in
> > dec_video.c (interface between codecs and filters). Maybe I have to add a
> > demuxer hook for that purpose, because the smoothing algo depends on the
> > demuxer/container format.
> > 
> > Then the smoothed, linear timestamp will be passed to the filters along with
> > the video frames. They can alter it, or just pass it on to the next filter.
> > At the end, either the vo driver or vf_vo2 will delay and then show the
> > frame, or the ve driver writes it out with the timestamp, or drops/duplicates
> > for fixed-fps formats.
> 
> I'm not sure, but my guess would be that smoothed timestamps are bad.

ok, 'smoothed' is not the right word for that... call it 'preprocessed'.

and mpeg2 is a very special case, as it has a fixed FPS, but a duration
modifier factor for each frame.
for mpeg, 'smoothing' means: we keep our own time counter, initially synced
to the raw PTS, then incremented by (1/fps)*duration_factor at each frame.
so you have a very accurate pts, while the raw pts from the stream is
inaccurate (especially since it doesn't belong to frames, but to raw PS
packets which usually begin in the middle of a frame, or are very rare,
only one per GOP or so, on most DVDs)
(of course it should be smoothed towards the raw pts with a very low alpha,
to avoid desync, and has to be reinited when the raw pts jumps)
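
something like this (a rough sketch only, names and thresholds are made up,
it just follows the description above):

    #include <math.h>

    typedef struct pts_smoother {
        double pts;       /* our own, smoothed time counter */
        double fps;       /* nominal fixed frame rate (from the sequence header) */
        int    inited;
    } pts_smoother_t;

    #define SMOOTH_ALPHA   0.05   /* very low alpha: barely trust the raw pts */
    #define RESYNC_THRESH  0.5    /* seconds; bigger jumps count as discontinuity */

    /* raw_pts < 0 means "no pts attached to this frame" */
    double smooth_pts(pts_smoother_t *s, double raw_pts, double duration_factor){
        int have_raw = raw_pts >= 0;
        if (!s->inited){
            s->pts = have_raw ? raw_pts : 0;             /* initial sync to raw PTS */
            s->inited = 1;
        } else if (have_raw && fabs(raw_pts - s->pts) > RESYNC_THRESH){
            s->pts = raw_pts;                            /* raw pts jumped: re-init */
        } else {
            s->pts += (1.0 / s->fps) * duration_factor;  /* advance by frame duration */
            if (have_raw)                                /* drift gently toward raw pts */
                s->pts += SMOOTH_ALPHA * (raw_pts - s->pts);
        }
        return s->pts;
    }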

believe me, you don't want to meet those evil raw timestamps in filters...

> For proper inverse telecine of mixed hard- and soft-telecine content,
> you need the exact timestamp as well as the values of the tff/bff and
> repeat field flags.

tff/bff may be a problem... it's very specific to a given container & format


A'rpi / Astral & ESP-team

--
Developer of MPlayer, the Movie Player for Linux - http://www.MPlayerHQ.hu


