[FFmpeg-devel] (sponsored) feature request
bouke at editb.nl
Thu Jul 3 10:56:47 CEST 2014
Feature request / want to hire a coder, and IMHO not such a difficult one.
(Well, the devil will be in the details, of course / famous last words...)
I would like to have proper timecode on capturing live streams.
As it stands, I can start the recording with a timecode value, but there is
no fixed relation between the moment FFmpeg is fired and the moment the first
frame is captured, so there is always an unpredictable offset.
What I think 'could' work is adding something that will hold FFmpeg at the
point just before it starts buffering. (So start up FFmpeg with '-timecode
hold_horses' or whatever.)
Then, ask for a timecode value as input.
On receiving it, start filling the buffer and encoding. (Taking the time
between receiving the value and initializing the buffer into account, if
needed.)
That 'should' result in +/- one frame accuracy, as the start time could fall
just after the beginning of an incoming frame, or very close to the end of
one, so there can be an offset of slightly less than one frame duration.
No idea how to cope with that at the moment. Perhaps I could fire the values
at fixed time intervals of N*frameDur, so the offset would always be the
same. (All sources may be genlocked.)
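To illustrate the N*frameDur idea, here is a minimal sketch of what the trigger side could do: wait until the system clock hits a whole-frame boundary before firing FFmpeg, so every recorder's start offset is identical. This assumes a 25 fps genlocked source; `FRAME_DUR`, `next_boundary`, and `wait_for_frame_boundary` are my own names, not anything that exists in FFmpeg.

```python
# Hypothetical sketch: align the start trigger to a frame boundary so the
# residual offset is the same for every recorder. Assumes 25 fps; none of
# these names come from FFmpeg itself.
import math
import time

FPS = 25
FRAME_DUR = 1.0 / FPS  # seconds per frame

def next_boundary(now):
    """Return the next instant (in seconds) that is a whole multiple of
    FRAME_DUR strictly after `now`."""
    return (math.floor(now / FRAME_DUR) + 1) * FRAME_DUR

def wait_for_frame_boundary():
    """Block until the system clock reaches the next frame boundary,
    then return that instant (the moment to fire FFmpeg)."""
    now = time.time()
    target = next_boundary(now)
    time.sleep(max(0.0, target - now))
    return target
```

With all machines NTP-synced (or driven by the same trigger), each one would sleep until the same boundary and launch its capture there, making the per-recorder offset constant rather than random.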
Instead of taking the timecode from the user, it could also use the system
time. My main concern is keeping multiple recordings that start at different
moments in sync.
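If the timecode were derived from the system clock, the conversion could be as simple as the following sketch (my assumption: non-drop-frame 25 fps, with the input given as seconds since midnight; `timecode_from_seconds` is a hypothetical helper, not an FFmpeg API):

```python
# Hypothetical sketch: map a wall-clock time (seconds since midnight) to an
# hh:mm:ss:ff timecode string, non-drop-frame. Two recorders started on the
# same frame boundary would compute the same start timecode.
def timecode_from_seconds(sec, fps=25):
    """Convert seconds since midnight to an hh:mm:ss:ff timecode string."""
    total = int(round(sec * fps))   # total frame count since midnight
    ff = total % fps                # frame number within the second
    s = total // fps                # whole seconds
    return "%02d:%02d:%02d:%02d" % (s // 3600, (s % 3600) // 60, s % 60, ff)
```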
Does this make sense?