[FFmpeg-devel] [PATCH] lavfi: Add VAAPI deinterlacer

Mark Thompson sw at jkqxz.net
Sun Jan 8 23:37:04 EET 2017


On 08/01/17 20:48, Andy Furniss wrote:
> Mark Thompson wrote:
> 
>> To offer a bit more information about this:
>>
>> It is adding a filter to deinterlace video on the GPU using VAAPI.
>> This works on both Intel (i965) and AMD (mesa).
> 
> Not so sure about the working-with-AMD/mesa bit. On git it doesn't for
> me, and I kind of didn't expect it to, given that the encoder needs an
> env to disable interlaced buffers in order to work normally.

VAAPI decode-deinterlace-download works perfectly with the filter for me, running mesa git on Polaris (I was testing with an older version, but I updated and rebuilt just now to check).
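For reference, the pipeline I mean is roughly the following - a sketch only: the device path, filenames and the software encoder are placeholders, and it assumes the filter is exposed as deinterlace_vaapi as in the patch:

  ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
         -hwaccel_output_format vaapi -i interlaced.ts \
         -vf 'deinterlace_vaapi,hwdownload,format=nv12' \
         -c:v libx264 deinterlaced.mp4

That is: decode on the GPU, deinterlace with the new filter, download the frames back to system memory and encode them in software.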

With the default settings, VAAPI encode is not so good: the encoder runs, but the output is broken (it looks like two separate luma fields, and the chroma is just random).
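To be concrete, the failing case is the full-hardware path, i.e. something along the lines of (again only a sketch, with placeholder paths and filenames):

  ffmpeg -hwaccel vaapi -hwaccel_output_format vaapi \
         -vaapi_device /dev/dri/renderD128 -i interlaced.ts \
         -vf deinterlace_vaapi -c:v h264_vaapi output.mp4

where the decoder, the deinterlacer and the encoder all stay on the GPU.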

> With VAAPI_DISABLE_INTERLACE=1 set, the first example will produce output
> (with -bf 0), but it won't be de-interlaced. With the env at 0 I am in
> VCE/GPU lock-up territory.

Setting VAAPI_DISABLE_INTERLACE=1 makes the encoder output sensible, but it also disables the deinterlacer.  (I normally have to set this to make the encoder work.)
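Concretely, that means running the full-hardware command with the variable set, roughly:

  VAAPI_DISABLE_INTERLACE=1 ffmpeg -hwaccel vaapi -hwaccel_output_format vaapi \
         -vaapi_device /dev/dri/renderD128 -i interlaced.ts \
         -vf deinterlace_vaapi -c:v h264_vaapi -bf 0 output.mp4

(still a sketch: paths and filenames are placeholders, and the -bf 0 is taken from your example).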

Everything there succeeds with no errors or hangs; it's just the output that isn't as desired.

> Maybe the download examples will work - no time to test yet.

Based on my experience, I think they will.  It would still be helpful if you could check with your setup, though :)

> TBH mesa VAAPI temporal de-interlacing has had issues from day 1 (VDPAU
> calling the same code doesn't).
> 
> It's good this is going in, though - I am soon opening a bug about the
> "crappiness" of the env (it breaks mpv --vo=vaapi), and this adds another case.

Thanks,

- Mark

