[FFmpeg-devel] (sponsored) feature request
andrey.krieger.utkin at gmail.com
Thu Jul 3 11:57:20 CEST 2014
2014-07-03 11:56 GMT+03:00 Edit B <bouke at editb.nl>:
> Hi all,
> Feature request / want to hire a coder, and IMHO not so difficult a one.
> (Well, the devil will be in the details of course / famous last words...)
> I would like to have proper timecode when capturing live streams.
> The way it currently works, I can start the recording with a timecode
> value, but there is no fixed relation between the moment FFmpeg is started
> and the moment the first frame is captured, so there is always an
> unpredictable offset.
> What I think 'could' work is adding something that will hold FFmpeg at the
> point just before it starts buffering. (So start up FFmpeg with '-timecode
> hold_horses' or whatever)
> Then, ask for a timecode value input.
> On receiving it, start filling the buffer and encode. (Taking the time
> between that and initializing the buffer into account, if needed.)
> That 'should' result in +/- one frame accuracy, as the start time could be
> just after the beginning of an incoming frame, or very close to the end of
> one, so there can be an offset of slightly less than one frame duration.
> No idea how to cope with that at the moment. Perhaps I could fire the
> values at fixed time intervals of N*frameDur so the offset would always be
> the same. (All sources may be genlocked.)
> Instead of taking the timecode from the user, it could also use the system
> time. My main concern is that I get multiple recordings starting at
> different moments in sync.
> Does this make sense?
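The N*frameDur idea above can be sketched in a few lines. This is a hypothetical illustration, not existing FFmpeg code; the 25 fps rate and the function names are assumptions. The point is that if every recording snaps its requested start time to the next frame boundary of a genlocked source, the residual offset is bounded by one frame duration and identical across recordings:

```python
import math

FRAME_DUR = 1.0 / 25.0  # assumed 25 fps source; frame duration in seconds


def next_frame_boundary(t, frame_dur=FRAME_DUR):
    """Return the first frame-boundary time >= t (boundaries at N*frame_dur)."""
    n = math.ceil(t / frame_dur)
    return n * frame_dur


def start_offset(t, frame_dur=FRAME_DUR):
    """Offset between the requested start and the frame actually captured.

    Always in [0, frame_dur) -- the '+/- one frame' uncertainty described
    above; snapping to N*frame_dur makes it the same for every recording.
    """
    return next_frame_boundary(t, frame_dur) - t
```

With genlocked sources, two recordings requested at different wall-clock moments would both land on boundaries of the same N*frameDur grid, so their captured frames stay in sync.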
Sorry for my ignorance, but what exactly is your use case, and what
inconvenience do you suffer from with current FFmpeg?