[FFmpeg-user] Detecting frames on Raw video stream
charl.wentzel at vodamail.co.za
Thu Aug 18 23:50:50 EEST 2016
On 15/07/2016 10:20, Carl Eugen Hoyos wrote:
>>> You should teach FFmpeg to read the frames from the camera.
>> Any ideas how? Should I consider writing my own container
>> format library for FFmpeg?
> I meant instead of implementing the API in your own piece of
> software you could use it in FFmpeg (or reverse engineer it).
Thanks for the advice. I managed to find a workable solution. I had
previously tested piping on the command line, e.g. "myfeedprogram | ffmpeg
-i - [lots of parameters]", and this worked. The problem was that I
needed up to three simultaneous streams: live video, video recording and
snapshots. A single pipeline couldn't work, because three instances of
the same capture software can't all access the one camera.
The solution was simple: create a pipe, fork the process, connect the
output end of the pipe to stdin of the child process, replace the child
process with ffmpeg (via exec()), then feed raw frames into the pipe. This
can be done as many times as necessary. So I have one program capturing
frames from the camera, but then "piping" them to 3 separate instances of
ffmpeg.