[FFmpeg-devel] [PATCH 1/2] lavf/avienc: Simplify palette handling

Mats Peterson matsp888 at yahoo.com
Sat Feb 27 18:05:20 CET 2016


On 02/27/2016 05:45 PM, Reimar Döffinger wrote:
>>>>> I agree myself that it should normally be stored in a side data
>>>>> packet, and that this is a somewhat weird construction. It probably
>>>>> has to do with the nut format originally, which stores raw palettized
>>>>> data after the video data in the packets. Anyway, I have accepted the
>>>>> facts. For the record, the new ff_reshuffle_raw_rgb() function written
>>>>> by Michael in lavf/rawutils.c, which aligns strides properly for AVI
>>>>> and QuickTime, will set a CONTAINS_PAL flag if the packet size is
>>>>> larger than the actual video data. He has hardcoded the palette size
>>>>> to 1024 bytes in that file.
>>>>>
>>>>> Mats
>>>>>
>>>>
>>>> The nut format stores the PALETTE after the video data in the packets,
>>>> nothing else :)
>>>>
>>>
>>> In any case, on muxing, the packets will have the palette after the
>>> video data, whether it's AVI or QuickTime. Neither avienc.c nor
>>> movenc.c uses any side data packets for the palette.
>>>
>>> Michael's intention has been to enable palette switching in the middle
>>> of the stream, hence the storage of the palette in each packet, and AVI
>>> supports it by using 'xxpc' chunks in the video data. That support is
>>> implemented by now as well.
>>>
>>> Mats
>>>
>>
>> Not that it couldn't be done with side data packets, though.
>
> If it doesn't support side data then the muxers are plain broken.

The muxers should support side data, I agree on that. The thing is that 
the packets come into the AVI, QuickTime and nut muxers with the palette 
data appended to the video data. That's a choice Michael has made, I 
think; possibly someone else, I don't know. Anyway, Michael once told me 
that he would like to view palettized frames as "self-contained", with 
the palette being part of the video data in an atomic way.
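
To make the layout concrete, the check for such a packet boils down to 
something like the following (just a sketch, not the actual lavf code; 
"expected_size" and "find_appended_palette" are made-up names for 
illustration, and AVPALETTE_SIZE is the usual 1024 bytes):

#include <stdint.h>
#include <libavutil/pixfmt.h>    /* AVPALETTE_SIZE (1024) */
#include <libavcodec/avcodec.h>  /* AVPacket */

/* Return a pointer to the palette appended after the pixel data, or
 * NULL if the packet only holds the expected amount of video data.
 * expected_size would be the size of one raw frame. */
static const uint8_t *find_appended_palette(const AVPacket *pkt,
                                            int expected_size)
{
    if (pkt->size >= expected_size + AVPALETTE_SIZE)
        return pkt->data + pkt->size - AVPALETTE_SIZE;
    return NULL;
}

That is more or less the situation in which ff_reshuffle_raw_rgb() sets 
CONTAINS_PAL, as quoted above.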

> If the nut muxer stores palette by appending it to the frames,
> then the demuxer should split it out into side data.
> Note that I am absolutely not a fan of this side data stuff,
> but since we already decided to do it like that then that's
> the way we need to go, not randomly doing one way in one
> place and differently in another, that just makes for unusable
> API.

I agree on that. I'm a bit confused myself about the differing ways of 
storing the palette...
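
For what it's worth, splitting it out on the demuxer side, as you 
suggest, would be mechanically simple. A rough sketch (hypothetical 
helper, assuming the caller already knows that the last AVPALETTE_SIZE 
bytes of the packet are the palette):

#include <stdint.h>
#include <string.h>
#include <libavcodec/avcodec.h>  /* AVPacket, av_packet_new_side_data() */
#include <libavutil/pixfmt.h>    /* AVPALETTE_SIZE */

static int split_palette_to_side_data(AVPacket *pkt)
{
    uint8_t *pal;

    if (pkt->size <= AVPALETTE_SIZE)
        return 0;                       /* no room for an appended palette */

    pal = av_packet_new_side_data(pkt, AV_PKT_DATA_PALETTE, AVPALETTE_SIZE);
    if (!pal)
        return AVERROR(ENOMEM);

    memcpy(pal, pkt->data + pkt->size - AVPALETTE_SIZE, AVPALETTE_SIZE);
    pkt->size -= AVPALETTE_SIZE;        /* drop the trailing palette bytes */
    return 0;
}

Something along those lines in the nut demuxer would hand decoders the 
palette via the "official" AV_PKT_DATA_PALETTE side data.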

> 2) There are file formats that store it that way and we cannot easily
>     split it into side data. Not sure that can really happen.
>
In its current shape, nut stores the palette in every frame of the file.

Anyway, things work OK right now, and even though the situation is 
slightly confusing, it's probably not a good idea to start messing with 
things until a clearly defined way of storing the palette for the muxers 
and demuxers has been agreed on.

Mats


