[FFmpeg-devel] [PATCH] WebP native muxer bugfix: frames should have alpha blending off
urvang at google.com
Thu Jun 25 22:07:06 CEST 2015
On Thu, May 28, 2015 at 8:30 PM Michael Niedermayer <michaelni at gmx.at>
wrote:
> On Thu, May 28, 2015 at 05:52:35PM +0000, Urvang Joshi wrote:
> > On Wed, May 27, 2015 at 5:33 PM Michael Niedermayer <michaelni at gmx.at>
> > wrote:
> > > On Wed, May 27, 2015 at 03:10:05PM -0700, Urvang Joshi wrote:
> > > > All the frames that the native muxer gets are fully reconstructed
> > > > frames,
> > >
> > > wrong
> > >
> > > > and they should not be alpha-blended with the previous frames.
> > > >
> > > > As per the WebP container spec, the ANMF chunk should specify
> > > > blending method = "do not blend" (and disposal method = "do not
> > > > dispose").
> > > >
> > > > However, the native muxer was wrongly setting blending method = use
> > > > alpha blending.
> > > > This bug can be reproduced by converting a source with transparency
> > > > (e.g. an animated GIF with transparency) to an animated WebP, and
> > > > viewing it with vwebp.
> > > > ---
> > > > libavformat/webpenc.c | 2 +-
> > > > 1 file changed, 1 insertion(+), 1 deletion(-)
> > >
> > > this breaks the encoder completely
> > > the testcase is the same as previously but probably any testcase that
> > > enables encoding multi frame animations will do
> > > try -cr_threshold 10000 -cr_size 16 for example
> > >
> > Ah, the problem seems to be for sources which don't have alpha, and then
> > alpha is introduced by 'cr_threshold' and 'cr_size' for example. [There are
> > other cases too, but this is one example.]
> > I believe the logic for cr_threshold / cr_size is incorrect then,
> > unfortunately. Here's why:
> > 1. The original frame that the encoder gets (before possibly being modified
> > based on cr_threshold and cr_size) is fully reconstructed, and should NOT
> > be alpha-blended with the previous frame.
> > [Yes, this is true. You can repro this bug by converting this GIF to WebP
> > before this patch: http://dhelemann.de/images/Flug1.gif]
> > For example, if this frame had a transparent pixel, it should be shown as
> > transparent pixel and should NOT see-through the corresponding pixel from
> > the previous frame. This would be achieved by setting blending method = "do
> > not blend".
> > 2. On the other hand, based on the cr_threshold and cr_size, some pixels
> > which are 'similar' to the corresponding pixels in the previous frame are
> > modified to be transparent. So, this logic expects that the frame is
> > alpha-blended with the previous frame (to see-through pixels from the
> > previous frame).
> > Clearly, the two requirements conflict and cannot both be met.
> > One solution I can think of:
> > (1) By default, set blending method = "do not blend"
> > (2) Some pixels can be modified to become transparent ONLY IF the
> > frame doesn't have any transparent pixels. And if some pixels are made
> > transparent, we set blending method = "blend".
> sounds possible unless i'm too tired and misunderstand
Alright, I'm implementing this behavior then.
- Whether the original frame had transparent pixels or not is determined in
the encoder; but
- The 'blending method' flag is written in the muxer.
Given this, how do I pass along this info (selected "blend method" for each
frame) from encoder to muxer?
> > Thoughts?
> webp allows updating the last frame by using alpha, but it does not
> allow a normal RGBA difference update like a P frame with 0,0 MVs
> would be IIUC.
> supporting something similar to P frames would be better than trying
> to emulate GIF
> I never tried to reencode a gif to webp and don't plan to in the future
> either; i think the whole webp design is too much based on gif, not
> considering that it could be used for material that is not from a gif
> Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB