[Libav-user] FFmpeg parallel H264 Decoding

Michael Zucchi notzed at gmail.com
Mon Aug 6 02:15:50 CEST 2012


On 06/08/12 03:13, Christian Brümmer wrote:

> I've another question, since it seems that you are familiar with FFmpeg
> and its dependencies. The problem besides decoding (on an Android phone)
> is the colour conversion from YUV to RGBA. I'm using swscale (the
> performance is poor, roughly comparable to the decoding time itself) and
> I heard about splitting the image into two parts and running swscale on
> each part in parallel. Is that possible? If not, is there a way to speed
> up colour conversion (something like "thread_count") on Android? I get a
> log message saying there is no hardware acceleration for colour
> conversion. Is that still the case, or is there a patch somewhere that
> uses NEON instructions for colour conversion?

I used OpenGL ES 2 for this. It works well, gives you the scaling for 
free, and is pretty much ideal if all you're doing is displaying the 
frames (it also needs smaller texture uploads, which are painfully slow; 
the Android media player uses external textures that load YUV directly 
from CPU memory, but those APIs aren't exposed).  swscale is far too slow 
(unless it has been improved for ARM lately), but if you want to try it, 
the API handles 'slices' directly, so converting it to a multi-threaded 
version should be fairly simple and obvious.
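For what it's worth, here is a rough sketch of that split (my own 
illustration, not from the original post): the top and bottom halves of a 
YUV420P frame are converted to RGBA by two threads, one sws_scale() call 
each. It assumes no scaling, an even split row so the 4:2:0 chroma planes 
line up, and one SwsContext per thread, since a context is not safe to 
share between threads; the struct and function names are made up.

#include <stdint.h>
#include <pthread.h>
#include <libswscale/swscale.h>
#include <libavutil/frame.h>

struct half_job {
    struct SwsContext *sws;   /* one context per thread */
    const AVFrame *src;       /* decoded YUV420P frame */
    uint8_t *dst;             /* RGBA output buffer (full image) */
    int dst_linesize;         /* bytes per output row */
    int y0, h;                /* first row and row count of this half */
};

static void *convert_half(void *arg)
{
    struct half_job *j = arg;
    /* Point the source planes at the start of this half; the chroma
     * planes advance at half the rate because of 4:2:0 subsampling. */
    const uint8_t *src_data[4] = {
        j->src->data[0] + j->y0     * j->src->linesize[0],
        j->src->data[1] + j->y0 / 2 * j->src->linesize[1],
        j->src->data[2] + j->y0 / 2 * j->src->linesize[2],
        NULL
    };
    /* Point the packed RGBA destination at the matching output rows. */
    uint8_t *dst_data[4]    = { j->dst + j->y0 * j->dst_linesize };
    int      dst_linesize[4] = { j->dst_linesize };
    sws_scale(j->sws, src_data, j->src->linesize, 0, j->h,
              dst_data, dst_linesize);
    return NULL;
}

Each context would be created for the half-frame size, e.g.
sws_getContext(width, height/2, AV_PIX_FMT_YUV420P, width, height/2,
AV_PIX_FMT_RGBA, SWS_BILINEAR, NULL, NULL, NULL), and the two halves
launched with pthread_create() and joined before the buffer is used. For
a pure format conversion the split is exact; once real scaling is
involved the halves are filtered independently, so a seam at the boundary
is possible.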

As for optimised versions: the YUV conversion itself isn't too hard to 
write (I'm sure implementations already exist for NEON, or at least plain 
ARM, so search around), but if you also need scaling it complicates 
things a bit.
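As a rough idea of what that inner loop looks like (again my own sketch, 
not from the original post), a fixed-point BT.601 conversion of one pixel 
is just a few multiplies, shifts and clamps, which is exactly the kind of 
code a NEON version would vectorise:

#include <stdint.h>

/* Fixed-point BT.601 "video range" YUV -> RGB for a single pixel.
 * A full converter wraps this in a loop over the planes and handles
 * the 4:2:0 chroma subsampling (one U/V pair per 2x2 block of Y). */
static inline uint8_t clamp8(int x)
{
    return x < 0 ? 0 : x > 255 ? 255 : (uint8_t)x;
}

static inline void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                              uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y - 16, d = u - 128, e = v - 128;
    *r = clamp8((298 * c           + 409 * e + 128) >> 8);
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp8((298 * c + 516 * d           + 128) >> 8);
}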

You can also do the YUV conversion concurrently with decoding the next 
frame, so if all of your threads are already busy decoding video, doing a 
multi-threaded conversion won't help.
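A sketch of that pipelining idea (my own illustration, with made-up 
names): the decode loop hands each finished frame to a converter thread 
through a one-slot mailbox and immediately starts on the next frame, so 
conversion of frame N overlaps decoding of frame N+1.

#include <pthread.h>
#include <libavutil/frame.h>

/* One-slot hand-off between the decode loop and the convert thread. */
struct mailbox {
    pthread_mutex_t lock;
    pthread_cond_t  cond;
    AVFrame        *frame;   /* NULL when the slot is empty */
    int             done;    /* set when decoding has finished */
};

/* Called from the decode loop: blocks only if the converter is behind. */
static void post_frame(struct mailbox *m, AVFrame *f)
{
    pthread_mutex_lock(&m->lock);
    while (m->frame)
        pthread_cond_wait(&m->cond, &m->lock);
    m->frame = f;
    pthread_cond_broadcast(&m->cond);
    pthread_mutex_unlock(&m->lock);
}

/* Converter thread: take a frame, convert/display it, free it, repeat. */
static void *convert_thread(void *arg)
{
    struct mailbox *m = arg;
    for (;;) {
        pthread_mutex_lock(&m->lock);
        while (!m->frame && !m->done)
            pthread_cond_wait(&m->cond, &m->lock);
        AVFrame *f = m->frame;
        m->frame = NULL;
        pthread_cond_broadcast(&m->cond);
        pthread_mutex_unlock(&m->lock);
        if (!f)
            break;              /* done flag set and slot drained */
        /* convert_and_display(f);   -- sws_scale() or the GLES upload */
        av_frame_free(&f);
    }
    return NULL;
}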

But either way, H.264 is just slow to decode, and you hit limits trying 
to do it in software.  If you have control over both ends of the pipeline 
you're not tied to one format, though.

  Michael
