[FFmpeg-user] Timestamps for mpjpeg demuxer

Dmpfbck xbtw0hx96ql8t8aj_ffmpeg at freenet.de
Thu Feb 18 12:15:58 CET 2016


On 18.02.2016 at 00:27, Moritz Barsnick wrote:
> I had wondered myself:
> http://lists.ffmpeg.org/pipermail/ffmpeg-user/2016-January/030178.html
> (I'm using a "multipart/replace" stream, or whatever that crude MIME
> invention is called.)
>
> So the input option "-use_wallclock_as_timestamps" should do the trick,
> possibly in combination with "-re".
>
> I don't recall whether ffmpeg could grab the timestamps from the stream
> - I believe not. I do know that my stream, coming from "motion", does
> not provide a "Last-modified" header or the likes per frame.

"-use_wallclock_as_timestamps" can be very helpful, thanks! Seems not to
be documented. I've tried it, and the timestamps are the number of
seconds since 1/1/1970.
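
For reference, this is roughly how I'm invoking it (the TCP address,
port and output file are just placeholders for my setup):

  ffmpeg -use_wallclock_as_timestamps 1 -f mpjpeg -i tcp://127.0.0.1:8090 \
         -c:v libx264 out.mkv

Since it is an input option, it has to be placed before the "-i" it
applies to.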

The full story: I once wrote a screen capturing program (yes, there are
others out there... it's just for my own personal use and a prototype
for trying out programming ideas). Video is saved as MJPEG in an AVI
container. In addition, audio is recorded from the microphone (to a WAV
file). This works perfectly, without A/V sync problems. I haven't been
using FFmpeg for this purpose so far; encoding and muxing are done with
AviDemux after recording has finished.

Now I'm about to change this procedure completely and implement
real-time encoding, with FFmpeg as the encoding backend. FFmpeg has two
input sources: video via TCP as already described, and audio from the
webcam via FFmpeg's DirectShow capabilities. The timestamps for audio
are relative to the system start, e.g. the last recording started at
18716.449 (~5 hours, 11 minutes). If I specify
"-use_wallclock_as_timestamps 1" for video, the timestamps of the two
streams are not relative to the same point in time.
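
For illustration, the command I'm experimenting with looks roughly like
this (TCP address/port and the DirectShow device name are placeholders
for my setup):

  ffmpeg -use_wallclock_as_timestamps 1 -f mpjpeg -i tcp://127.0.0.1:8090 \
         -f dshow -i audio="Microphone (Webcam)" \
         out.mkv

Note that "-use_wallclock_as_timestamps 1" is a per-input option and
here only applies to the first (video) input, which is exactly where the
mismatch between the two time bases shows up.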

To keep the story short, I'll skip some other thoughts, because it can't
work this way anyhow: as FFmpeg can lag behind, my application buffers
the captured screenshots before sending them to FFmpeg via TCP.
Consequently, the point in time when FFmpeg reads a packet and creates a
timestamp for it (if I use use_wallclock_as_timestamps) can be seconds
(or minutes) /after/ the frame was actually captured. The timestamp
would be completely wrong, so I can't proceed this way.

Conclusion: I must find a way to provide the real timestamp for every
video frame. It is recorded within my program, but I can't send it along
with every frame within an mpjpeg stream. That's why I'm curious whether
there is another simple streaming protocol in which I can send a JPEG
frame plus a timestamp, and that doesn't require me to read dozens of
specification documents.
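
One workaround I'm considering (untested, and it means going through
files instead of a stream): FFmpeg's concat demuxer reads an ffconcat
script with per-file "duration" directives, so if I write the captured
JPEGs to disk together with such a script, the recorded capture times
could be turned into frame durations and the encoder would get
(approximately) correct timestamps, e.g.:

  # frames.ffconcat - durations computed from my recorded capture times
  ffconcat version 1.0
  file frame000001.jpg
  duration 0.042
  file frame000002.jpg
  duration 0.117
  file frame000003.jpg
  duration 0.038

and then:

  ffmpeg -f concat -safe 0 -i frames.ffconcat -c:v libx264 out.mkv

But that of course gives up the streaming/real-time aspect, so it's only
a fallback.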

[Sending this message for the third time now, as the previous ones don't
seem to have arrived, even after a longer waiting period.]

