[FFmpeg-devel] [PATCH] WebP native muxer bugfix: frames should have alpha blending off
Urvang Joshi
urvang at google.com
Tue Jul 14 02:20:43 CEST 2015
Hi Michael,
How about sending the blend method info using side_data in the AVFrame?
Here's a patch using that approach. If it looks fine, I can separate it
out into avformat/* changes and avcodec/* changes.
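
Roughly what I have in mind, as a sketch (AV_FRAME_DATA_WEBP_BLEND below is a
made-up side data type for illustration; the attached patch has the actual
changes):

    #include <libavutil/frame.h>
    #include <libavutil/error.h>

    /* Producer side: record the chosen blend method on the frame. */
    static int attach_blend_method(AVFrame *frame, uint8_t blend_none)
    {
        AVFrameSideData *sd = av_frame_new_side_data(frame,
                                                     AV_FRAME_DATA_WEBP_BLEND,
                                                     1);
        if (!sd)
            return AVERROR(ENOMEM);
        sd->data[0] = blend_none;  /* 1 = do not blend, 0 = alpha-blend */
        return 0;
    }

    /* Consumer side: read it back, defaulting to "do not blend". */
    static uint8_t get_blend_method(const AVFrame *frame)
    {
        AVFrameSideData *sd = av_frame_get_side_data(frame,
                                                     AV_FRAME_DATA_WEBP_BLEND);
        return sd ? sd->data[0] : 1;
    }
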
Thanks,
Urvang
On Thu, Jun 25, 2015 at 6:08 PM Michael Niedermayer <michaelni at gmx.at>
wrote:
> On Thu, Jun 25, 2015 at 08:07:06PM +0000, Urvang Joshi wrote:
> > On Thu, May 28, 2015 at 8:30 PM Michael Niedermayer <michaelni at gmx.at>
> > wrote:
> >
> > > On Thu, May 28, 2015 at 05:52:35PM +0000, Urvang Joshi wrote:
> > > > On Wed, May 27, 2015 at 5:33 PM Michael Niedermayer <michaelni at gmx.at>
> > > > wrote:
> > > >
> > > > > On Wed, May 27, 2015 at 03:10:05PM -0700, Urvang Joshi wrote:
> > > > > > All the frames that the native muxer gets are fully reconstructed
> > > > > > frames,
> > > > >
> > > > > wrong
> > > >
> > > >
> > > > >
> > > > > > and they should not be alpha-blended with the previous frames.
> > > > > >
> > > > > > As per the WebP container spec
> > > > > > https://developers.google.com/speed/webp/docs/riff_container#animation,
> > > > > > ANMF chunk should specify blending method = do not blend (and
> > > > > > disposal method = do not dispose).
> > > > > >
> > > > > > However, the native muxer was wrongly setting blending method =
> > > > > > use alpha blending.
> > > > > > This bug can be reproduced by converting a source with transparency
> > > > > > (e.g. animated GIF with transparency) to an animated WebP, and
> > > > > > viewing it with vwebp.
> > > > > > ---
> > > > > > libavformat/webpenc.c | 2 +-
> > > > > > 1 file changed, 1 insertion(+), 1 deletion(-)
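
For context on the flag being discussed: in my reading of the container spec,
the last byte of the 16-byte ANMF frame header carries the blending method in
bit 1 and the disposal method in bit 0. A rough illustration (the macro names
are mine, not taken from webpenc.c, and the bit layout is worth double-checking
against the spec):

    #include <stdint.h>

    #define ANMF_BLEND_NO_BLEND     (1 << 1)  /* B = 1: do not blend */
    #define ANMF_BLEND_ALPHA        0         /* B = 0: alpha-blend over canvas */
    #define ANMF_DISPOSE_NONE       0         /* D = 0: do not dispose */
    #define ANMF_DISPOSE_BACKGROUND (1 << 0)  /* D = 1: dispose to background */

    /* What the fix amounts to when the muxer writes the flags byte: */
    static uint8_t anmf_flags_no_blend(void)
    {
        return ANMF_BLEND_NO_BLEND | ANMF_DISPOSE_NONE;  /* == 0x02 */
    }
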
> > > > >
> > > > > this breaks the encoder completely
> > > > > the testcase is the same as previously but probably any testcase that
> > > > > enables encoding multi frame animations will do
> > > > > try -cr_threshold 10000 -cr_size 16 for example
> > > > >
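
For anyone trying to reproduce this, a full command along these lines should
exercise the multi-frame animation path (the file names are just placeholders):

    ffmpeg -i input.gif -c:v libwebp -cr_threshold 10000 -cr_size 16 output.webp
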
> > > >
> > > > Ah, the problem seems to be for sources which don't have alpha, and
> > > > then alpha is introduced by 'cr_threshold' and 'cr_size' for example.
> > > > [There are other cases too, but this is one example].
> > > >
> > > > I believe the logic for cr_threshold / cr_size is incorrect then,
> > > > unfortunately. Here's why:
> > > > 1. The original frame that the encoder gets (before possibly being
> > > > modified based on cr_threshold and cr_size) is fully-reconstructed,
> > > > and should NOT be alpha-blended with the previous frame.
> > > > [Yes, this is true. You can repro this bug by converting this GIF to
> > > > WebP before this patch: http://dhelemann.de/images/Flug1.gif]
> > > >
> > > > For example, if this frame had a transparent pixel, it should be shown
> > > > as a transparent pixel, and the corresponding pixel from the previous
> > > > frame should NOT show through. This would be achieved by setting
> > > > blending method = "do not blend".
> > > >
> > > > 2. On the other hand, based on cr_threshold and cr_size, some pixels
> > > > which are 'similar' to the corresponding pixels in the previous frame
> > > > are modified to be transparent. So, this logic expects that the frame
> > > > is alpha-blended with the previous frame (so that pixels from the
> > > > previous frame show through).
> > > >
> > > > Clearly, the two requirements conflict and cannot both be met.
> > > >
> > >
> > > > One solution I can think of:
> > > > (1) By default, set blending method = "do not blend"
> > > > (2) Some pixels can be modified to become transparent ONLY IF the
> > > > original frame doesn't have any transparent pixels. And if some
> > > > pixels are made transparent, we set blending method = "blend".
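
A rough sketch of the check that (2) above implies on the encoder side; this
assumes packed RGBA input and only shows the idea, not the actual libwebp
wrapper code:

    #include <stdint.h>
    #include <libavutil/frame.h>

    /* Returns 1 if any pixel of a packed RGBA frame is not fully opaque.
     * Illustration only; assumes AV_PIX_FMT_RGBA data in frame->data[0]. */
    static int frame_has_transparency(const AVFrame *frame)
    {
        int x, y;
        for (y = 0; y < frame->height; y++) {
            const uint8_t *row = frame->data[0] + y * frame->linesize[0];
            for (x = 0; x < frame->width; x++)
                if (row[4 * x + 3] != 0xFF)
                    return 1;
        }
        return 0;
    }

    /* Then, before the conditional-replenishment pass:
     *   - if frame_has_transparency(in): skip the cr blanking and use
     *     blend method = "do not blend";
     *   - else, if some blocks end up blanked: use blend method = "blend". */
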
> > >
> > > sounds possible unless I am too tired and misunderstand
> > >
> >
> > Alright, I'm implementing this behavior then.
> >
> > One question:
> > - Whether the original frame had transparent pixels or not is determined
> >   in the encoder; but
> > - The 'blending method' flag is written in the muxer.
> >
> > Given this, how do I pass along this info (selected "blend method" for
> > each frame) from encoder to muxer?
>
> I would try to separate the demuxer and decoder in such a way that
> the output from the encoder is self-contained and can be fed to the
> decoder and decoded correctly. That is, I would make the chunks that
> are needed for decoding part of the data that belongs to the
> decoder/encoder layer, if that is possible
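
One way to read this suggestion (a sketch of the idea only, not how the
current encoder or muxer is structured): the encoder would emit the per-frame
ANMF header as part of its packet data, so each packet is self-contained and
the muxer mostly concatenates. Roughly:

    #include <stdint.h>
    #include <string.h>
    #include <libavcodec/avcodec.h>

    /* Hypothetical helper: prepend an already-built ANMF header to an encoded
     * packet so the bitstream carries its own blend/dispose flags. */
    static int prepend_frame_header(AVPacket *pkt, const uint8_t *hdr,
                                    int hdr_size)
    {
        int ret = av_grow_packet(pkt, hdr_size);
        if (ret < 0)
            return ret;
        memmove(pkt->data + hdr_size, pkt->data, pkt->size - hdr_size);
        memcpy(pkt->data, hdr, hdr_size);
        return 0;
    }
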
>
>
>
> [...]
> --
> Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 0001-WebP-native-muxer-bugfix-frames-should-have-appropri.patch
Type: text/x-patch
Size: 10975 bytes
Desc: not available
URL: <http://ffmpeg.org/pipermail/ffmpeg-devel/attachments/20150714/ffa4d6f7/attachment.bin>