[FFmpeg-devel] [PATCH 1/4] vp8: Add hwaccel hooks

Hendrik Leppkes h.leppkes at gmail.com
Mon Feb 20 13:08:10 EET 2017


On Mon, Feb 20, 2017 at 11:44 AM, Michael Niedermayer
<michael at niedermayer.cc> wrote:
> On Mon, Feb 20, 2017 at 10:23:27AM +0000, Mark Thompson wrote:
>> On 20/02/17 02:35, Michael Niedermayer wrote:
>> > On Sun, Feb 19, 2017 at 09:29:33PM +0000, Mark Thompson wrote:
>> >> On 19/02/17 21:04, Ronald S. Bultje wrote:
>> >>> Hi,
>> >>>
>> >>> On Sun, Feb 19, 2017 at 12:23 PM, Mark Thompson <sw at jkqxz.net> wrote:
>> >>>
>> >>>> diff --git a/libavcodec/webp.c b/libavcodec/webp.c
>> >>>>
>> >>> [..]
>> >>>
>> >>>> +        avctx->get_format = webp_get_format;
>> >>>
>> >>>
>> >>> Docs say:
>> >>> "decoding: Set by user, if not set the native format will be chosen."
>> >>> So I don't think decoders are supposed to set this.
>> >>
>> >> The webp decoder does not support get_format.  I suppose the user could technically store something there and then read it back, so save and restore the value across the relevant calls?
>> >
>> > This is quite ugly; why do you want to do that?
>> >
>> > get_format is set by the user. The get_format API requires the
>> > function to choose one of the caller-provided formats, and it can
>> > choose any of them.
>> > I don't know what your function does, but it's different from the
>> > API. It smells very much like a hack ...
>> >
>> > The one situation in which you can set get_format from libavcodec
>> > is when there is a separate codec context you created, that is, one
>> > for which you are the user.
>> >
>> > Can you explain why this code messes with avctx->get_format, and
>> > doesn't, for example, change the code that calls get_format so that
>> > it passes a correct pixel format list, which would then produce a
>> > correct format from any get_format()?
>> > Or am I missing something?
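
For reference, a conforming callback may only pick from the list the
caller offers. A minimal sketch follows; the hwaccel preference is
purely illustrative and not from the patch:

    static enum AVPixelFormat pick_format(AVCodecContext *avctx,
                                          const enum AVPixelFormat *fmt)
    {
        const enum AVPixelFormat *p;
        /* fmt is terminated by AV_PIX_FMT_NONE; a conforming callback
         * returns one of its entries. */
        for (p = fmt; *p != AV_PIX_FMT_NONE; p++) {
            if (*p == AV_PIX_FMT_VAAPI) /* e.g. prefer a hwaccel format */
                return *p;
        }
        return fmt[0]; /* otherwise take the decoder's first choice */
    }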
>>
>
>> The current WebP decoder calls the VP8 decoder /using the same AVCodecContext/.
>
> So they are kind of the same decoder.
>
>
>> Previously the format was set directly on the VP8Context, but now that the VP8 decoder uses ff_get_format(), that initial value is not used.  This change adds a get_format() callback on the AVCodecContext used by the VP8 decoder to set the format appropriately.
>
> But this is semantically wrong: the formats supported by the decoder
> implementation are chosen by the code calling get_format(), and the
> get_format callback chooses among these formats depending on the
> user's preference/usefulness.
>
>
>>
>> Now, because that is the same AVCodecContext as the one supplied by the user, the user could themselves have stored something into the get_format field (even though it isn't used by the WebP decoder) and expect to be able to read it back correctly later.  I think this would be madness, but it is, I suppose, technically allowed; saving and restoring the value across the VP8 calls would fix that case.
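
A sketch of that save/restore idea (names hypothetical, not from the
patch):

    enum AVPixelFormat (*saved)(struct AVCodecContext *,
                                const enum AVPixelFormat *);

    saved             = avctx->get_format;
    avctx->get_format = webp_get_format;
    ret = ff_vp8_decode_frame(avctx, frame, got_frame, avpkt);
    avctx->get_format = saved;     /* restore the user's callback */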
>
> Why do you try to misuse the API?
> I mean, I think we agree that this violates the API. Making sure it
> works doesn't change the fact that it violates the API. And anyone
> working on the API would likely assume that code conforms to the API
> documentation.
> -> the developer would think one thing and the code would do another;
> that's a recipe for creating bugs.
>
> I think the code calling *get_format() should pass the correct list
> of formats to the callback, rather than having some other part
> override the user's get_format callback to work around the list
> being wrong.
>
> Am I missing something?
> Is what I suggested difficult to do, or do you see an issue with that
> solution?
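
What Michael suggests would presumably look something like this inside
the vp8 decoder; a sketch under that assumption (decode_alpha is a
hypothetical flag the webp decoder would set), not code from the patch:

    /* Offer the alpha-capable format only when the embedded webp
     * caller asked for it; any conforming get_format() then yields
     * a correct result. */
    enum AVPixelFormat fmts[] = {
        decode_alpha ? AV_PIX_FMT_YUVA420P : AV_PIX_FMT_YUV420P,
        AV_PIX_FMT_NONE
    };
    int ret = ff_get_format(avctx, fmts);
    if (ret < 0)
        return ret;
    avctx->pix_fmt = ret;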
>

The entire WebP alpha concept is a huge hack, because neither the vp8
bitstream nor the vp8 decoder knows anything about the alpha; it's
coded separately.
However, to make room to store the alpha, the vp8 decoder needs to be
told to decode into an alpha-enabled format, so that the webp decoder
can fill it in.

The vp8 decoder can't know it's supposed to do this unless someone
tells it externally - which is where get_format comes in.
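
Concretely, the hunk quoted at the top presumably pairs with a callback
along these lines in webp.c (a sketch of the idea, not necessarily the
exact patch code):

    /* Steer the embedded vp8 decoder into an alpha-capable format, so
     * the webp decoder can later fill in the separately coded alpha
     * plane. */
    static enum AVPixelFormat webp_get_format(AVCodecContext *avctx,
                                              const enum AVPixelFormat *fmt)
    {
        WebPContext *s = avctx->priv_data;
        return s->has_alpha ? AV_PIX_FMT_YUVA420P : AV_PIX_FMT_YUV420P;
    }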

- Hendrik

