[MPlayer-dev-eng] video filter layer
uhmmmm at gmail.com
Sat Apr 21 14:41:33 CEST 2007
Michael Niedermayer wrote:
> * many parts are undocumented and no one really understands the whole
> libmpcodecs, RTFS also doesn't help as the authors of various filters
> might have had a different idea on how things work ...
> direct rendering with slices is "fun"
> the problem is that while frames are output in display order from codecs
> (at least all current codecs behave that way, ...) slices are output
> for the currently coded frame which is generally a future frame
> so for example a filter (independent of the filter API) would get with
> a common mpeg2 (IBBPBBP) video
[snip coded frame order]
I've thought a little about this. Here's the idea I have at the moment:
Obviously, some filters need frames in display order (temporal blur,
probably ivtc, though I admit to not understanding much about it, etc).
Certainly the final video output or video encoder does. But there are
plenty of simpler filters which can operate in any order (crop, scale,
spatial blur, etc.). Let each filter specify whether it requires display
order. All filters before the first display-order-only filter (or the
video output) can then be processed in the order they come out of the
decoder. After that, frames would need to be buffered until we can
output them to
the remaining filters in display order (I assume the API provides a way
to know when you've gotten the next frame in display order? I haven't
looked at that part of the code yet). This would be handled
automatically by the filter system.
Of course, by buffering the frames like that, we probably push the data
out of the cache and lose some of the benefit of slices. On the other
hand, I see no reason the reordered frames can't be split
up and processed as slices for the rest of the filters.
But then, maybe it's faster to do the reordering at the beginning, and
run all the frames through the filters as slices in display order?
Might be worth benchmarking.