[Libav-user] avcodec_decode_video2 doesn't get pal8 palette and how to get "closest" pix_fmt
Matthew Einhorn
moiein2000 at gmail.com
Thu Jul 28 14:15:00 CEST 2011
On Thu, Jul 28, 2011 at 6:02 AM, Stefano Sabatini
<stefano.sabatini-lala at poste.it> wrote:
> On date Tuesday 2011-07-26 13:58:00 -0400, Matthew Einhorn encoded:
>> On Sun, Jul 24, 2011 at 9:41 AM, Stefano Sabatini
>> <stefano.sabatini-lala at poste.it> wrote:
>> > On date Sunday 2011-07-24 04:51:49 -0400, Matthew Einhorn encoded:
> [...]
>> Upon spending some time with the debugger I've isolated the problem
>> into a very weird corner. But first a bit more about my code (partly
>> attached). My code is a dll wrapper to the ffmpeg dlls. Upon one dll
>> call you create an object for a video and initialize all the
>> format/conversion/codec contexts (open function). Then with further
>> dll calls you request the next frame. As said, this works fine now for
>> all video formats I tested with except pal8 (rawvideo). With pal8,
>> calling avcodec_decode_video2(m_pCodecCtx, m_pFrame, &nFrameFinished,
>> m_pAVPacket) copies an all-zero palette into m_pFrame->data[1]
>> (palette_has_changed is also zero). So this has nothing to do with
>> sws_scale, because sws_scale is handed a bad palette to begin with. So
>> the question is why avcodec_decode_video2 doesn't read the palette. The
>> video file isn't broken, for the reason described next.
>
> So far so good.
>
>> I was able to fix this by adding, before returning from the open
>> function, one call to avcodec_decode_video2 (preceded, of course, by a
>> call to av_read_frame). That is, if I asked ffmpeg to decode the first
>> frame before I returned from the function that initialized the frames,
>> contexts etc., the palette was read correctly in the first and
>> subsequent frames (palette_has_changed was one). But if I requested the
>> first frame only after returning from my open function, in a separate
>> call, the palette wasn't read properly.
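(For reference, the workaround at the end of my open function looks
roughly like this. It's a simplified sketch: m_pFormatCtx here just
stands for whatever member holds my AVFormatContext (name made up for
the sketch), the other names are the real members from my wrapper, and
in the real code I also skip packets from other streams.)

    int nFrameFinished = 0;
    /* decode one frame before open() returns; without this,
       m_pFrame->data[1] stays all zero on later calls */
    if (av_read_frame(m_pFormatCtx, m_pAVPacket) >= 0) {
        avcodec_decode_video2(m_pCodecCtx, m_pFrame, &nFrameFinished,
                              m_pAVPacket);
        av_free_packet(m_pAVPacket);
    }
    /* after this, m_pFrame->data[1] holds a real palette and
       palette_has_changed is 1 on this and subsequent frames */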
>>
>> Now, this smells of something going out of scope and getting freed
>> when my open function returns. It cannot be my own variables, because
>> they are all class members created beforehand which stay put, and I
>> don't use any smart pointers or the like. So it must be (I think) that
>> one of the av alloc functions clears something if I don't decode a
>> frame before returning from the function that called it. I think it's
>> something with the decoder, possibly a buffer?
>>
>> My dlls are called from the same thread every time and they don't
>> unload or move between calls. ffplay, by contrast, does all its work
>> from one central main function that calls the other functions (and it
>> isn't a dll), which is why I think ffplay doesn't run into this issue.
>>
>
>> Now, I understand that this might be difficult to debug, so I'm mostly
>> asking for clues about what to look at. I.e., in the format/codec
>> context structs, is there some function pointer or member variable that
>> is responsible for reading the palette and that would help me track
>> down the issue?
>
> Why the
> av_free_packet(m_pAVPacket);
>
> in cDecodeFrame()?
>
> This looks suspicious.
>
The reason for the av_free_packet call after decoding the frame is
that that's how the dranger example does it. ffplay also does the same
thing, at line 1773:
http://www.ffmpeg.org/doxygen/trunk/ffplay_8c-source.html#l01771
When I removed the av_free_packet call it introduced a memory leak: the
app's memory use grew with each call to get the next frame. Removing it
also didn't fix the pal8 palette issue.
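For context, my per-frame decode follows the same pattern as dranger and
ffplay. Roughly (a simplified sketch; m_pFormatCtx stands in for whatever
member holds my AVFormatContext and video_stream for the stream index I
saved at open time, both names made up here, the rest are the real
members from my wrapper):

    int nFrameFinished = 0;
    while (av_read_frame(m_pFormatCtx, m_pAVPacket) >= 0) {
        if (m_pAVPacket->stream_index == video_stream)
            avcodec_decode_video2(m_pCodecCtx, m_pFrame, &nFrameFinished,
                                  m_pAVPacket);
        /* same placement of av_free_packet as dranger / ffplay */
        av_free_packet(m_pAVPacket);
        if (nFrameFinished) {
            /* for pal8, m_pFrame->data[0] holds the 8-bit indices and
               m_pFrame->data[1] should hold the 256-entry RGBA palette;
               this is where data[1] comes back all zero for me */
            break;
        }
    }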
>> avcodec_decode_video2 ends up calling some function
>> pointer, so I couldn't follow through the code to see where the palette
>> is actually read. It could also be that the problem is with the
>> Zeranoe dlls, in which case this might not be the best place to solve
>> it, but I doubt it because it works fine for all the other videos.
>>
>>
>> >> In particular, from what I've read and seen of ffmpeg, for
>> >> pal8 AVFrame data[0] is the data, while data[1] is the palette. When
>> >> calling avcodec_decode_video2 on a pal8 video, data[0] is indeed data
>> >> (bunch of different values), while data[1] is an array with all
>> >> elements zero. Indeed, when I edited data[1] to some random values the
>> >> sws_scale output image was not black anymore and you could see the
>> >> remnants of my picture.
>> >>
>> >
>> >> So I'm wondering, is the video file broken and that's why the palette
>> >> doesn't show up? Or did I miss a flag when initializing codec/format
>> >> context etc. so that the palette isn't read?
>> >
>> > AFAIK you don't need any special hacks for working with palette
>> > formats.
>> >
>> >> 2. I'm looking for a function similar to avcodec_find_best_pix_fmt.
>> >> What I want is to pass in a list of formats and the function would
>> >> return what's the closest format. For example, say the source format
>> >> is pal8 and I pass in as possible destination formats: RGB24 and
>> >> GRAY8. Then the function should return GRAY8.
>> >> avcodec_find_best_pix_fmt would in that case return RGB24, which "is"
>> >> the best format, but here it would waste 2 extra bytes per pixel,
>> >> since pal8 is only 8 bits deep and gray to start with.
>> >>
>> >> Does a function like this exist? Would it be easy for me to write such
>> >> a function using the ffmpeg API? And if so can I get some pointers?
>> >
>> > Should be easy to hack the logic of avcodec_find_best_pix_fmt() for
>> > implementing an avcodec_find_closest_pix_fmt() or such.
>> >
>>
>> I looked through the code for the above functions and I think that, as
>> is, avcodec_find_best_pix_fmt should return the closest pix format like
>> I want. The only reason it doesn't, I think, is that the pal8 entry in
>> particular might be set up wrongly.
>>
>
>> If you look at the pix_fmt_info array that the
>> avcodec_find_best_pix_fmt1 func is referring to, you'll see this
>> definition for pal8:
>> [PIX_FMT_PAL8] = {
>>     .is_alpha = 1,
>>     .color_type = FF_COLOR_RGB,
>>     .depth = 8,
>> },
>>
>> shouldn't it be .color_type = FF_COLOR_GRAY? Because it's set to
>> FF_COLOR_RGB, the avcodec loss function (avcodec_get_pix_fmt_loss)
>> reports a chroma and colorspace loss when converting from pal8 to
>> gray8. That's why RGB24 gets picked over gray8. But I thought that pal8
>> is already gray (B/W), so there shouldn't be any loss? Admittedly, I
>> don't know much about the pix formats.
>
> Pal8 works by storing a palette in data[1], which maps an integer in
> the range 0-255 to a 32-bit RGBA entry.
>
> The actual chromatic image features can only be guessed by analyzing
> the palette itself: if all the RGBA entries are set to a gray color
> then the image is gray, but in general a PAL8 image will contain
> colored (R != G != B) data.
>
I didn't know that. The pal8 videos I saw were gray, so I assumed all
of them were (although if they were all gray a palette wouldn't really
be needed...). So avcodec_find_best_pix_fmt should work fine for me
then.
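Just to check my understanding, I'd call it roughly like this (untested
sketch; needs libavcodec/avcodec.h, and I pass 0 for has_alpha since I
don't care about alpha here):

    int loss = 0;
    int64_t dst_mask = (1LL << PIX_FMT_RGB24) | (1LL << PIX_FMT_GRAY8);
    enum PixelFormat best = avcodec_find_best_pix_fmt(dst_mask,
                                                      PIX_FMT_PAL8,
                                                      0 /* has_alpha */,
                                                      &loss);
    /* loss ends up with the FF_LOSS_* flags for whatever the chosen
       format gives up relative to pal8 */

And if I ever do want gray8 when the palette happens to be gray, I can
walk the 256 RGBA entries in data[1] and check R == G == B myself.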
Thanks,
Matt