[FFmpeg-user] Asynchronous overlays
baptiste.coudurier at gmail.com
Wed May 18 22:29:26 CEST 2011
On 05/18/2011 09:24 AM, Alexandre Ferrieux wrote:
> I'd like to put side-by-side two video sources (with the overlay
> filter), with a completely asynchronous/decoupled scheme. Indeed my two
> video sources have varied and unstable frame rates, and in a naive setup
> of the overlay filter, ffmpeg insists on fetching a frame from each one
> for every output frame.
> The idea is to decode both streams independently (threads) into the same
> overlay buffer, and have a third thread sample this at a regular rate
> (the wanted output frame rate), and feed that into the output chain
> (encoder + container). So, if at any given time one of the sources lags,
> the same frozen frame from it will be reused in several output frames
> (i.e. the overlay buffer is not updated in that area), but the overall output
> will not be stalled (as it is today).
> Q1: is this doable with command-line flags to the ffmpeg executable ?
> Q2: if not, I'd appreciate a sketch of where to look in the sources to
> do this in C.
Doing this is much simpler, I think: it's just a matter of fetching a
new overlay frame at the right time, depending on the pts of the main
video. I don't think you need any threads here.
Key fingerprint 8D77134D20CC9220201FC5DB0AC9325C5C1ABAAA
FFmpeg maintainer http://www.ffmpeg.org