[Ffmpeg-devel] Re: [PATCH] fix DV encoder VLC bit accounting
Wed Feb 22 21:51:52 CET 2006
On Wed, Feb 22, 2006 at 02:19:36PM -0500, Dan Maas wrote:
> > also note that the patch is full of tabs and cosmetic changes ...
> Can you send me the emacs settings to produce clean code? (I remember
> seeing a coding standards document somewhere, but I can't find it atm).
> I am working on some more substantial patches so I'd like to get this
Personally, I just avoid cosmetic changes by not touching the code a lot ;-)
Tabs are also easy to avoid, depending on the editor (sorry, I don't know
how to teach emacs to do it for you -- I'm a vi guy ;-)). But generally
speaking, these are the only rules one has to pay attention to.
> > and last the patch recalculates all coefficients of all areas
> > after any !0 -> 0 change, while only a single coefficient would
> > need to be checked and only if the change was the last non zero
> > coeff of an area, ugh this is inefficient code
> > so either
> > * recalc only the single coeff and only when needed
> > * find the worst case bitsize increase and add that to the amount of
> > bits needed in case a !0 -> 0 change at an area end happens
> > * detect during final encoding that too many bits are there and simply
> > redo the whole with higher quantization
> I'd prefer the first option, since it preserves the accuracy of AC bit
> consumption tracking. But, I will have to think about how to implement
> this efficiently.
> Roman, do you have any ideas?
Basically, IIRC the only thing that we have to do when a particular
AC coeff. gets "quantized" to 0 is to add a delta to bit_size[area]
which would account for a longer "run-length" from a previous non-0
AC coef. We don't have to recompute anything else. How to implement
it efficiently -- I don't know. Off the top of my head -- maybe we
need a sort of "back" pointer to the previous non-0 AC coef.
or something. That way we'd reduce the number of recalculations
significantly by trading off a little memory.
The other alternative would be to have a sort of transition diagram
for the deltas. However, it could be rather big, so I'm not sure this
one is worth pursuing.
> The "only when needed part" could be accomplished by keeping track of
> the last non-zero coefficient for each area. But I don't see how to
> recalculate only a single coefficient, other than keeping a
> per-coefficient array of VLC lengths, which seems like it would defeat
> the purpose of this optimization.
> > The fixed encoder is about 6% slower than the unmodified encoder.
> > rejected
> Note that the unmodified encoder produces visible image
> corruption. This is a correctness issue.
To be fair -- yes, I agree. Although, having done a lot of DVD -> DV
conversion, I haven't really noticed anything (and I think I'm usually
pretty good about these things -- the green tint of libdv was obvious
to me right away). Anyway, side notes aside, I agree -- this is a
correctness issue, but then again it is not the only one that has been
put on the back burner in the absence of *visible* proof. Weighting
is another thing that is currently missing in ffmpeg's DV codec.