[FFmpeg-devel] [PATCH] lavc/mediacodec: add hwaccel support

wm4 nfxjfg at googlemail.com
Fri Apr 15 12:57:30 CEST 2016


On Thu, 14 Apr 2016 14:19:42 +0200
Matthieu Bouron <matthieu.bouron at gmail.com> wrote:

> On Thu, Apr 7, 2016 at 4:18 PM, wm4 <nfxjfg at googlemail.com> wrote:
> 
> > On Fri, 18 Mar 2016 17:50:39 +0100
> > Matthieu Bouron <matthieu.bouron at gmail.com> wrote:
> >  
> > > From: Matthieu Bouron <matthieu.bouron at stupeflix.com>
> > >
> > > ---
> > >
> > > Hello,  
> >
> > Can't say much about this, so just some minor confused comments.
> >  
> 
> Thanks for your comments and sorry for the late reply.

Well, I've also taken some time to send that review...

> 
> >  
> > >
> > > The following patch adds hwaccel support to the mediacodec (h264) decoder
> > > by allowing the user to render the output frames directly on a surface.
> > >
> > > In order to do so, the user needs to initialize the hwaccel through the
> > > av_mediacodec_alloc_context and av_mediacodec_default_init functions.
> > > The latter takes a reference to an android/view/Surface as a parameter.
> > >
> > > If the hwaccel initializes successfully, the pixel format of the
> > > decoder's output frames will be AV_PIX_FMT_MEDIACODEC. The following
> > > snippet demonstrates how to render the frames on the surface:
> > >
> > >     AVMediaCodecBuffer *buffer = (AVMediaCodecBuffer *)frame->data[3];
> > >     av_mediacodec_release_buffer(buffer, 1);
> > >
> > > The last argument of av_mediacodec_release_buffer enables rendering of
> > > the buffer on the surface (pass 0 to discard the buffer instead).
> > >  
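(For reference, the whole path described above looks roughly like this — a
sketch, assuming a valid android/view/Surface jobject `surface` and an open
H.264 AVCodecContext `avctx`, with error handling trimmed:)

    // Set up the MediaCodec hwaccel before opening the decoder.
    AVMediaCodecContext *mc_ctx = av_mediacodec_alloc_context();
    if (!mc_ctx)
        return AVERROR(ENOMEM);
    if (av_mediacodec_default_init(avctx, mc_ctx, surface) < 0)
        return AVERROR_EXTERNAL;

    // Later, when a decoded frame comes back with the hwaccel pix fmt,
    // data[3] carries the MediaCodec output buffer:
    if (frame->format == AV_PIX_FMT_MEDIACODEC) {
        AVMediaCodecBuffer *buffer = (AVMediaCodecBuffer *)frame->data[3];
        av_mediacodec_release_buffer(buffer, 1); // 1 = render, 0 = discard
    }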
> >
> > I don't understand this (at all), but unreferencing the AVFrame should
> > unref the underlying surface.
> >  
> 
> In this case, the underlying surface will remain (it is owned by the codec
> itself) but the output buffer (that should be rendered to the surface)
> will be discarded.
> 

So: the AVFrame (and AVMediaCodecBuffer) really reference two buffers,
the codec buffer and the output buffer? And this API call releases the
codec buffer? Why can't it be released immediately?

> >  
> > > Regarding the internal changes in the mediacodec decoder:
> > >
> > > MediaCodec.flush() discards both input and output buffers, meaning that
> > > once MediaCodec.flush() is called, all output buffers the user holds a
> > > reference on become invalid (and cannot be used).
> > > This behaviour does not fit well with the avcodec API.
> > >
> > > When the decoder is configured to output software buffers, there is no
> > > issue as the buffers are copied.
> > >
> > > Now, when the decoder is configured to output to a surface, the user
> > > might not want to render all the frames as fast as the decoder can go,
> > > and might want to control *when* the frames are rendered, so we need to
> > > make sure that the MediaCodec.flush() call is delayed until all the
> > > frames the user retains have been released or rendered.
> > >
> > > Delaying the call to MediaCodec.flush() means buffering any input that
> > > comes to the decoder until the user has released/rendered the frames he
> > > retains.
> > >
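(Schematically, the delayed flush described above could look like the
following sketch — frames_in_flight is a hypothetical field name, not
necessarily what the patch uses:)

    // Sketch: only flush MediaCodec once the user holds no more frames,
    // otherwise just remember that a flush is pending.
    static int mediacodec_dec_flush_delayed(AVCodecContext *avctx,
                                            MediaCodecDecContext *s)
    {
        if (s->frames_in_flight > 0) {
            // Flushing now would invalidate the buffers the user retains,
            // so defer the real MediaCodec.flush() call.
            s->flushing = 1;
            return 0;
        }
        return mediacodec_dec_flush_codec(avctx, s);
    }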
> > > This is a limitation of this hwaccel implementation: if the user retains
> > > a frame (a) and then issues a flush command to the decoder, the packets
> > > he feeds to the decoder at that point will be queued in the internal
> > > decoder packet queue (until he releases the frame (a)). This scenario
> > > leads to a memory usage increase, to say the least.
> > >
> > > Currently there is no limit on the size of the internal decoder packet
> > > queue, but this is something that can be added easily. Then, if the
> > > queue is full, what should the behaviour of the decoder be? Can it
> > > block? Or should it return something like AVERROR(EAGAIN)?
> >
> > The current API can't do anything like this. It has to output 0 or 1
> > frame per input packet. (If it outputs nothing, the frame is either
> > discarded or queued internally. The queue can be emptied only when
> > draining the decoder at the end of the stream.)
> >
> > So it looks like all you can do is block. (Which could lead to a
> > deadlock in the API user, depending on how the user's code works?)
> >  
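(For reference, this is the contract of the 2016-era decode API the wrapper
has to fit into — a sketch, where handle_frame() is a placeholder:)

    // At most one frame can come out per input packet.
    int got_frame = 0;
    int ret = avcodec_decode_video2(avctx, frame, &got_frame, &pkt);
    if (ret < 0)
        return ret;
    if (got_frame)
        handle_frame(frame);

    // At end of stream, the internal queue is emptied by draining with
    // empty packets (data = NULL, size = 0) until no frame comes back.
    AVPacket drain;
    av_init_packet(&drain);
    drain.data = NULL;
    drain.size = 0;
    do {
        avcodec_decode_video2(avctx, frame, &got_frame, &drain);
        if (got_frame)
            handle_frame(frame);
    } while (got_frame);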
> 
> Yes, if I block at some point it can lead to a deadlock if the user never
> releases all the frames. I'm considering buffering a few input packets
> before blocking.

OK, I guess this can't be avoided. Maybe the user should be given a way to
control how many surfaces are buffered at most (before a deadlock can
happen).
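(Something along these lines, perhaps — a purely hypothetical sketch, where
queued_pkt_nb and max_pending_packets are illustrative names, not part of
the patch:)

    // Hypothetical bound on how much input is buffered while the user
    // still retains frames and a flush is pending.
    if (s->flushing && s->queued_pkt_nb >= s->max_pending_packets) {
        // Either block here until the user releases a frame, or hand the
        // decision back to the caller:
        return AVERROR(EAGAIN);
    }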

> 
> >  
> > >
> > > About the other internal decoder changes I introduced:
> > >
> > > The MediaCodecDecContext is now refcounted (using the lavu/atomic API).
> > > Since the (hwaccel) frames can be retained by the user, we need to
> > > delay the destruction of the codec until the user has released all the
> > > frames he holds a reference on.
> > > The reference counter of the MediaCodecDecContext is incremented each
> > > time a (hwaccel) frame is output by the decoder, and decremented each
> > > time a (hwaccel) frame is released.
> > >
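(The refcounting scheme described above, sketched with the lavu/atomic API —
the helper and field names here are illustrative:)

    #include "libavutil/atomic.h"

    static void mediacodec_dec_ref(MediaCodecDecContext *s)
    {
        avpriv_atomic_int_add_and_fetch(&s->refcount, 1);
    }

    static void mediacodec_dec_unref(MediaCodecDecContext *s)
    {
        if (!avpriv_atomic_int_add_and_fetch(&s->refcount, -1)) {
            // Last reference released: every outstanding frame is gone,
            // so the codec itself can finally be destroyed.
            ff_AMediaCodec_delete(s->codec);
            av_freep(&s);
        }
    }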
> > > Also, when the decoder is configured to output to a surface, the pts
> > > values given to the MediaCodec API are now rescaled based on the
> > > codec_timebase, as those timestamp values are propagated to the frames
> > > rendered on the surface since Android M. Not sure if it's really
> > > useful though.
> >
> > That's all nice, but where is this stuff documented at all?
> >  
> 
> It is documented here:
> http://developer.android.com/reference/android/media/MediaCodec.html#queueInputBuffer%28int,%20int,%20int,%20long,%20int%29

OK, I was just thinking that there might be things specific to the
ffmpeg wrapper.
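(For the record, the rescaling described above boils down to converting pts
into the microseconds that MediaCodec.queueInputBuffer() expects — a sketch,
with `timebase` standing in for the patch's codec_timebase:)

    // MediaCodec timestamps are in microseconds, i.e. AV_TIME_BASE_Q.
    int64_t pts_us = AV_NOPTS_VALUE;
    if (pkt->pts != AV_NOPTS_VALUE)
        pts_us = av_rescale_q(pkt->pts, timebase, AV_TIME_BASE_Q);
    // pts_us is then passed as presentationTimeUs to queueInputBuffer().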

> > > +
> > > +static int mediacodec_dec_flush_codec(AVCodecContext *avctx, MediaCodecDecContext *s)
> > > +{
> > > +    FFAMediaCodec *codec = s->codec;
> > > +    int status;
> > > +
> > > +    s->queued_buffer_nb = 0;
> > > +    s->dequeued_buffer_nb = 0;
> > > +
> > > +    s->draining = 0;
> > > +    s->flushing = 0;
> > > +
> > > +    status = ff_AMediaCodec_flush(codec);
> > > +    if (status < 0) {
> > > +        av_log(NULL, AV_LOG_ERROR, "Failed to flush MediaCodec %p\n", codec);
> > > +        return AVERROR_EXTERNAL;
> > > +    }
> > > +
> > > +    s->first_buffer = 0;
> > > +    s->first_buffer_at = av_gettime();  
> >
> > What is the system time doing here?
> >  
> 
> It is here for debugging/informational purposes, to know how long it took
> the codec to buffer and output its first buffer when it starts or after a
> flush (discard).
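(i.e. something like this sketch:)

    // How long the codec took to produce its first buffer after start/flush
    // (av_gettime() returns microseconds).
    int64_t first_buffer_latency = av_gettime() - s->first_buffer_at;
    av_log(avctx, AV_LOG_DEBUG, "First buffer after %"PRId64" us\n",
           first_buffer_latency);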

Well, a bit questionable to have in the final code, but ok I guess.

> 
> I will wait for the hwaccel stuff from libav to be merged before
> resubmitting (and pushing, if OK'd) this patch.

I don't think this changes much. There is now AVHWFramesContext, which
may help with refcounting some things and simplifying the
opaque/readback code paths. I find it useful, but it's not an absolute
requirement. The old API still works, both internally and externally.

Also, most of these changes have been already merged for a while.


