[MPlayer-G2-dev] vp layer and config

Michael Niedermayer michaelni at gmx.at
Mon Dec 15 13:22:24 CET 2003


Hi

On Monday 15 December 2003 13:06, D Richard Felker III wrote:
> On Mon, Dec 15, 2003 at 12:10:48PM +0100, Michael Niedermayer wrote:
> > Hi
> >
> > On Monday 15 December 2003 11:49, D Richard Felker III wrote:
> > > On Mon, Dec 15, 2003 at 10:13:02AM +0100, Arpi wrote:
> > > > - split mp_image into a colorspace descriptor (see thread on this list)
> > > >   and a buffer descriptor (stride, pointers), maybe a 3rd part
> > > > containing a frame descriptor (frame/field flags, timestamp, etc., i.e.
> > > > info related to the visual content of the image, not the physical
> > > > buffer itself, so linear converters (colorspace conv, scale, expand,
> > > > etc.) could simply pass this info through and change the buffer desc only)
> > >
> > > I've been working on implementing this, but there's one element of
> > > mp_image_t I'm not sure where to put. Actually this has been bothering
> > > me for a while now. The exported quant_store (qscale). In G1 the
> > > pointer just gets copied when passing it on through filters, but this
> > > is probably between mildly and seriously incorrect, especially with
> > > out-of-order rendering.
> > >
> > > IMO storing quant table in the framedesc isn't a good idea, since
> > > quantizers are only valid for the original buffer arrangement.
> > > Actually, I tend to think they belong in the buffer descriptor, almost
> > > like a fourth plane. But who should be responsible for allocating and
> > > freeing the quant plane? IMO the only way it can really work properly
> > > is to have the same code that allocates the ordinary planes be
> > > responsible for the quant plane too..
> >
> > btw, we could also pass other stuff like motion vectors around; these
> > may be useful for fast transcoding
>
> And MB types too?? :)
yes
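
For transcoding, one way this could look (just a sketch with made-up names,
nothing like this exists in G2 yet) is exporting per-MB motion vectors as an
extra plane with 32 bits per macroblock, i.e. two packed 16 bit components:

/* Sketch only: hypothetical layout of a per-macroblock motion vector plane,
 * 32 bits per MB = two packed 16-bit components, matching the bpp comment in
 * the PlaneDescriptor quoted below. None of these names are real G2 API. */
#include <stdint.h>
#include <string.h>

typedef struct {
    int16_t x, y;            /* components in whatever unit the codec uses */
} mb_mv_t;

/* read the motion vector of macroblock (mb_x, mb_y) from such a plane */
static mb_mv_t get_mb_mv(const uint8_t *mv_plane, int stride, int mb_x, int mb_y)
{
    mb_mv_t mv;
    memcpy(&mv, mv_plane + mb_y * stride + mb_x * sizeof(mb_mv_t), sizeof(mv));
    return mv;
}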

>
> There's lots of stuff we _could_ pass around. The problem is doing it
> in a sane way that doesn't overcomplicate things for codecs/filters.
>
> > the quant plane is always (width+15)/16 x (height+15)/16 big, but we
> > could use something like
>
> This is true for mpeg1/2/4. But is it the same for mpeg2 with 4:2:2
> sampling? And what about strange codecs like svq3?
it's true for svq3 & mpeg2 4:*:* too, AFAIK
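
To make the ownership question concrete, here is a rough sketch (hypothetical
names, not existing G2 code) of the buffer-descriptor code allocating the
quant plane together with the pixel planes, using exactly that
(width+15)/16 x (height+15)/16 size:

/* Sketch only: the same code that owns the pixel planes allocates and later
 * frees the per-MB quantizer array as just another plane. */
#include <stdlib.h>
#include <stdint.h>

typedef struct {
    uint8_t *planes[8];      /* indexed by a PlaneType-style enum */
    int      stride[8];
} buf_desc_t;

static int alloc_quant_plane(buf_desc_t *b, int width, int height)
{
    int mb_w = (width  + 15) >> 4;   /* (width+15)/16  */
    int mb_h = (height + 15) >> 4;   /* (height+15)/16 */

    b->stride[4] = mb_w;             /* e.g. QUANT_PLANE == 4 in the enum below */
    b->planes[4] = malloc(mb_w * mb_h);
    return b->planes[4] ? 0 : -1;    /* freed together with the other planes */
}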

>
> > enum PlaneType{
> >     Y_PLANE,
> >     CB_PLANE,
> >     CR_PLANE,
> >     ALPHA_PLANE,
> >     QUANT_PLANE,
> >     FORWARD_MOTION_PLANE,
> >     BACKWARD_MOTION_PLANE,
> > };
> > struct PlaneDescriptor{
> >     int bpp;               // bits per pixel (32 bit for 2x16 bit motion vectors)
> >     int log2_subsample[2]; // like chroma_w/h_shift
> >     int offset[2];         // x/y offsets of the 0,0 sample relative to the
> >                            // luma plane, in 1/2 sample precision
> > };
>
> I think something in between what you proposed and the old "dumb" way
> of doing it is probably appropriate. If you have too much flexibility,
> then filters and codecs have to handle all the cases it allows. :(
we could enforce some restrictions and still keep it flexible, although we 
probably must then put some checks in the code to ensure that no one violates 
them because of lack of RTFM (or lack of WTFM)
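
As a rough illustration of the kind of check meant here (purely a sketch, and
assuming the struct PlaneDescriptor quoted above is visible; the exact
restriction values are made up):

/* Sketch only: reject a quant-plane descriptor that violates the agreed
 * restrictions, so a codec/filter that did not RTFM fails loudly instead of
 * silently corrupting things. */
#include <stdio.h>

static int check_quant_plane(const struct PlaneDescriptor *d)
{
    /* assumed restriction: 8 bits per sample, one sample per 16x16 MB */
    if (d->bpp != 8 || d->log2_subsample[0] != 4 || d->log2_subsample[1] != 4) {
        fprintf(stderr, "vp: invalid quant plane descriptor, RTFM\n");
        return -1;
    }
    return 0;
}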

>
> If we do want to define extra planes like this (quant & palette are
> definitely needed, and maybe your mv's too) they should be at fixed
> indices in the planes array (maybe that's what you meant by using the
> planetype enum). 
yes, that was exactly the reason
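
With fixed indices, a filter that only touches pixel data can pass the extra
planes through untouched. A minimal sketch (reusing the hypothetical
buf_desc_t and the PlaneType names from above):

/* Sketch only: look up the quantizer array at its fixed index and carry the
 * extra planes over to the output descriptor unchanged. */
static void filter_frame(const buf_desc_t *in, buf_desc_t *out)
{
    const uint8_t *qp = in->planes[QUANT_PLANE];   /* may be NULL if the codec
                                                      did not export it */

    /* ... process in->planes[Y_PLANE] etc., optionally using qp ... */

    out->planes[QUANT_PLANE] = in->planes[QUANT_PLANE];
    out->stride[QUANT_PLANE] = in->stride[QUANT_PLANE];
    (void)qp;
}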

[...]
-- 
Michael
level[i]= get_vlc(); i+=get_vlc();		(violates patent EP0266049)
median(mv[y-1][x], mv[y][x-1], mv[y+1][x+1]);	(violates patent #5,905,535)
buf[i]= qp - buf[i-1];				(violates patent #?)
for more examples, see http://mplayerhq.hu/~michael/patent.html
stop it, see http://petition.eurolinux.org & http://petition.ffii.org/eubsa/en


