[Libav-user] Speeding up codec detection of raw bitstream
wiebe at halfgaar.net
Sun Feb 11 13:51:01 EET 2018
I'm working on an application for a hardware device I built. In essence, I obtain raw data from an optical receiver, which can be AC3, DTS, etc., and use ffmpeg to decode it into multi-channel analog output. It all works, but I was wondering what I can do to speed up codec detection, especially when header packets are missed (because capture starts mid-stream).
To give ffmpeg IO, I make my own AVIOContext with a callback function that reads from a semaphore-controlled buffer. I assign that to avFormatContext->pb. Then I open the stream with:
avformat_open_input(&avFormatContext, NULL, NULL, NULL) 
This function blocks until a codec is reliably detected (or does it eventually give up?). For DTS in particular, this can take a very long time when the headers were missed.
Is there a way to speed this up? Would it help, for instance, not to call av_register_all() and avcodec_register_all()? If so, how do I register only what I need? (The documentation is unclear to me; it doesn't give examples.)
If there was any information in the original S/PDIF bitstream, it is lost: the DIR9001 chip only outputs the payload.