[FFmpeg-devel] zlib decoder
Måns Rullgård
mans
Wed Jul 4 03:17:05 CEST 2007
Michael Niedermayer <michaelni at gmx.at> writes:
> Hi
>
> On Tue, Jul 03, 2007 at 10:40:34PM +0100, Måns Rullgård wrote:
>> Michael Niedermayer <michaelni at gmx.at> writes:
>>
>> > Hi
>> >
>> > On Mon, Jul 02, 2007 at 10:30:16PM +0100, Måns Rullgård wrote:
>> >> Here, at long last, is my highly anticipated zlib decoder.
>> >>
>> >> It decompresses a random choice of gzip files I've tried it on
>> >> correctly. I'm sure there still are corner cases I haven't covered,
>> >> though. Any help finding, and better yet fixing, these is
>> >> appreciated.
>> >>
>> >> Speedwise it's on par with gunzip, with large buffer sizes even a bit
>> >> faster.
>> > [...]
>> >> static const unsigned int len_tab[29][2] = {
>>
>> [...]
>>
>> >> static const unsigned int dist_tab[32][2] = {
>>
>> [...]
>>
>> > this fits in a short
>>
>> Yes, it does.
>>
>> > are all those macros really needed?
>> > can't this be implemented in a more readable way?
>>
>> Some could probably be made into functions. Every place that reads
>> bits needs to be reachable from the switch statement though. I moved
>> the Huffman code parsing out of the main loop to make that part easier
>> to read.
>>
>> Do you see any real bugs? Optimisations? API comments?
>
> bugs no, optimizations yes: get rid of the macros and the switch :)
> that is instead of
>
> case foobar:
>     ctx->state = foobar;
>     if (!"do we have enough bits") {
>         "safe state" and return, we will continue from the case above
>     }
>
> I would rather not even start decompressing if I am low on input data,
> but rather accumulate a minimum number of kbytes unless it's the last call.
> This should simplify the code a lot; things like headers and Huffman
> tables would then be guaranteed to be fully available.
> I would just check in the two block loops whether I am close to running
> out of data, and if so return and continue from there.
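For reference, the save-state-and-return pattern referred to above looks
roughly like this. This is only a minimal sketch with hypothetical names
(ZDec, have_bits, the state labels); it is not the code from the patch:

/* A minimal sketch of the save-state-and-return pattern being discussed;
 * the names (ZDec, have_bits, the state labels) are made up here and are
 * not taken from the patch. */
#include <stdint.h>

enum { STATE_HEADER, STATE_TABLES, STATE_BLOCK, STATE_DONE };

typedef struct ZDec {
    int      state;   /* case label to resume from on the next call */
    uint32_t bitbuf;  /* bits carried over between calls            */
    int      bits;    /* number of valid bits in bitbuf             */
} ZDec;

/* Refill bitbuf from *buf; return 0 if fewer than n bits are available. */
static int have_bits(ZDec *z, const uint8_t **buf, const uint8_t *end, int n)
{
    while (z->bits < n && *buf < end) {
        z->bitbuf |= (uint32_t)*(*buf)++ << z->bits;
        z->bits   += 8;
    }
    return z->bits >= n;
}

static int decode(ZDec *z, const uint8_t *buf, int size)
{
    const uint8_t *end = buf + size;

    switch (z->state) {
    case STATE_HEADER:
        z->state = STATE_HEADER;
        if (!have_bits(z, &buf, end, 16))
            return 0;        /* "safe state" saved; resume here next call */
        /* ... parse the header bits ... */
        /* fall through */
    case STATE_TABLES:
        z->state = STATE_TABLES;
        if (!have_bits(z, &buf, end, 14))
            return 0;
        /* ... parse the Huffman table description, field by field ... */
        /* fall through */
    case STATE_BLOCK:
        z->state = STATE_BLOCK;
        /* ... decode block data, saving state before every bit read ... */
        break;
    }
    z->state = STATE_DONE;
    return 0;
}
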
The problem is that there is no telling ahead of time how many bits
are needed. For instance, the maximum theoretical size of the Huffman
tables is much larger than is practical to buffer.
We also need to cope with use cases like the zmbv decoder. There, each
GOP is a continuous zlib stream, but the decoder assumes that each
frame of input will produce the required output. If we buffer the
bitstream, this might not happen as expected.
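
For reference, a minimal sketch of that usage pattern, written against
zlib's own inflate() API rather than against this patch (the FrameDec
struct and the buffer size are made up for illustration): each call hands
the decoder one frame's worth of compressed input and is expected to get
the complete frame of output back immediately.

#include <stdint.h>
#include <string.h>
#include <zlib.h>

typedef struct FrameDec {
    z_stream zs;
    uint8_t  out[1 << 16];   /* one decompressed frame (illustrative size) */
} FrameDec;

static int frame_dec_init(FrameDec *d)
{
    memset(&d->zs, 0, sizeof(d->zs));
    return inflateInit(&d->zs) == Z_OK ? 0 : -1;
}

/* Decode one frame: all of 'size' input bytes must yield the complete
 * frame now; the decoder cannot hold the data back for a later call. */
static int frame_dec_frame(FrameDec *d, const uint8_t *in, unsigned size)
{
    int ret;

    d->zs.next_in   = (uint8_t *)in;
    d->zs.avail_in  = size;
    d->zs.next_out  = d->out;
    d->zs.avail_out = sizeof(d->out);

    ret = inflate(&d->zs, Z_SYNC_FLUSH);      /* flush all decodable output */
    if (ret != Z_OK && ret != Z_STREAM_END)
        return -1;
    return sizeof(d->out) - d->zs.avail_out;  /* bytes produced */
}

Anything that holds compressed input back across calls breaks that
expectation.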
--
Måns Rullgård
mans at mansr.com