[Libav-user] Multithreaded decoding of multi-program transport streams
antoine at nagafix.co.uk
Tue Dec 3 08:36:05 CET 2013
> I use multiple encoders, in my case, simultaneously in different
> threads, and it works fine. I think decoders should work fine too.
> One thing I do is register a custom locking function via
> av_lockmgr_register() just in case some bit of internal library code
> wants to serialize threads through a critical section.
Can someone please clarify for me when using "av_lockmgr_register" is
required as I am not 100% sure I understand: on what types of builds
(win32threads? pthreads? all?) and under what circumstances? (encoders,
decoders, both? type of streams?)
Maybe it's always required and we've just not hit the problem yet
because we don't use multiple clients often (each client gets its own
encoding thread). And multiple contexts from the same thread work just
fine.
> Regarding the packet question, I found the following commentary in
> avformat.h (from ffmpeg 2.0.1):
> * If AVPacket.buf is set on the returned packet, then the packet is
> * allocated dynamically and the user may keep it indefinitely.
> * Otherwise, if AVPacket.buf is NULL, the packet data is backed by a
> * static storage somewhere inside the demuxer and the packet is only
> * valid until the next av_read_frame() call or closing the file. If the
> * caller requires a longer lifetime, av_dup_packet() will make an
> * av_malloc()ed copy of it.
> * In both cases, the packet must be freed with av_free_packet() when
> * it is no longer needed.
> I think you can make this work. As far as I've seen, the code has
> been written with concurrency in mind.
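Put concretely, the rule from that comment might look like this in the reader thread (a sketch only; `read_and_keep` is an illustrative name, FFmpeg 2.0-era packet API):

```c
/* Sketch: make a packet from av_read_frame() safe to queue and keep past
   the next read, following the avformat.h comment quoted above.
   read_and_keep is an illustrative name (FFmpeg 2.0-era API). */
#include <libavformat/avformat.h>

static int read_and_keep(AVFormatContext *fmt, AVPacket *pkt)
{
    int ret = av_read_frame(fmt, pkt);
    if (ret < 0)
        return ret;
    if (!pkt->buf) {
        /* Data lives in demuxer-owned storage: copy it before queueing. */
        ret = av_dup_packet(pkt);
        if (ret < 0) {
            av_free_packet(pkt);
            return ret;
        }
    }
    /* pkt can now outlive the next av_read_frame(); the consumer thread
       must call av_free_packet() when done with it. */
    return 0;
}
```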
> On 12/2/2013 4:46 PM, Bruce Wheaton wrote:
>> On Dec 1, 2013, at 2:43 AM, Adi Shavit <adishavit at gmail.com> wrote:
>>> Does anyone have any insights or some references I should follow
>>> regarding this issue?
>> Adi, are you aware that ffmpeg does/can employ multi-threaded
>> decoding already? If you set the correct number of threads via
>> thread_count in your AVCodecContext before opening the codec, it will
>> do exactly what you propose.
>> In effect, the first few decode calls will return immediately, then
>> your frames will start to come out, having been delayed by the number
>> of threads you requested.
>>> On Tue, Nov 26, 2013 at 9:15 PM, Adi Shavit <adishavit at gmail.com> wrote:
>>> I am consuming a multi-program transport stream with several
>>> video streams and decoding them simultaneously. This works well.
>>> I am currently doing it all on a single thread.
>>> Each AVPacket received by av_read_frame() is checked for the
>>> relevant stream_index and passed to a corresponding decoder.
>>> Hence, I have one AVCodecContext per decoded elementary stream. Each
>>> such AVCodecContext handles one elementary stream, calling
>>> avcodec_decode_video2() etc.
>>> The current single-threaded design means that the next packet isn't
>>> decoded until the previous one has been fully decoded.
>>> I'd like to move to a multi-threaded design where each
>>> AVCodecContext resides in a separate thread with its own AVPacket
>>> (concurrent SPSC-)queue and the master thread calls av_read_frame()
>>> and inserts the coded packet into the relevant queue (Actor Model /
>>> Erlang style).
>>> Note that each elementary stream is always decoded by the same
>>> single thread.
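The per-stream queue in the design above can be a plain bounded SPSC queue guarded by a mutex and condition variables. A self-contained pthreads sketch (the names `PacketQueue`, `pq_*`, and `QUEUE_CAP` are mine; it holds `void *` where the real code would hold `AVPacket *`):

```c
/* Minimal bounded single-producer/single-consumer queue sketch for the
   actor-model design: the demux thread pushes, one decode thread pops.
   Names are illustrative, not libav API. */
#include <pthread.h>

#define QUEUE_CAP 64

typedef struct PacketQueue {
    void *items[QUEUE_CAP];   /* would hold AVPacket* in the real design */
    int head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t not_empty, not_full;
} PacketQueue;

static void pq_init(PacketQueue *q)
{
    q->head = q->tail = q->count = 0;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
    pthread_cond_init(&q->not_full, NULL);
}

static void pq_push(PacketQueue *q, void *item)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == QUEUE_CAP)             /* block when full */
        pthread_cond_wait(&q->not_full, &q->lock);
    q->items[q->tail] = item;
    q->tail = (q->tail + 1) % QUEUE_CAP;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
}

static void *pq_pop(PacketQueue *q)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)                     /* block when empty */
        pthread_cond_wait(&q->not_empty, &q->lock);
    void *item = q->items[q->head];
    q->head = (q->head + 1) % QUEUE_CAP;
    q->count--;
    pthread_cond_signal(&q->not_full);
    pthread_mutex_unlock(&q->lock);
    return item;
}
```

The master thread maps `pkt->stream_index` to the right `PacketQueue` and calls `pq_push`; each decoder thread loops on `pq_pop` and feeds `avcodec_decode_video2()`.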
>>> Before I refactor my code to do this, I'd like to know if there is
>>> anything on the avlib side preventing me from implementing this
>>> design. An AVPacket points to both internal and external data. Is
>>> any such data shared between elementary streams?
>>> What should I beware of?
>>> Please advise,
>>> Libav-user mailing list
>>> Libav-user at ffmpeg.org