[Libav-user] Video and audio timing / syncing

Alex Cohn alexcohn at netvision.net.il
Sun Mar 31 09:03:40 CEST 2013


On Sun, Mar 31, 2013 at 9:12 AM, Brad O'Hearne
<brado at bighillsoftware.com> wrote:
> On Mar 30, 2013, at 4:16 AM, René J.V. Bertin <rjvbertin at gmail.com> wrote:
>> Seriously, why not post the information provided by QTKit, and how you convert it? Seems it could be quite easy to confound QT's time scale and FFmpeg's time_base units?
>
> It appears I've discovered what the problem is; however, I'm not yet clear on how to fix it. The problem is not in my pts or dts values, or in the formulas I'm using to convert QTSampleBuffer presentationTime and decodeTime to time_base units. (In fact, as an aside, I commented all that code out and used the muxing.c pts-setting code, and it changed absolutely nothing -- the same problem remained.)
>
> I am configuring QTKit to have a minimumFrameRate of 24, which is the value I'm using for time_base.den, according to the documentation. What I discovered is that despite configuring this frame rate in QTKit, it isn't the actual frame rate being received -- at runtime, capture is actually producing closer to 15 fps. I determined this by simply reading the log of pts values up to the highest pts <= time_base.den -- about 15 frames had been consistently processed per second. So I then manually hardcoded time_base.den to 15, and boom, both video and audio are right on the money, completely in sync.
>
> The problem is that I don't want to hard-code this value (or, more properly put, I don't think doing so would be prudent or bug-free), as I expect the frame rate will in reality vary based on computer, camera, etc. At present, I've got a question out to the QuickTime API users mailing list, because there does not appear to be a way to query the actual frame rate being captured from the sample buffer received, the capture device, or the capture connection.
>
> But this raises the question: what is the proper way to deal with a varying frame rate during encoding, so as to properly set pts and dts? It would appear that the intention is for a codec context's time_base to be set once, prior to the encoding cycle. So I'd guess that even if I could get the runtime frame rate as capture was taking place, I couldn't change the time_base.den value on the fly during encoding.
>
> How would you suggest one deals with this? What happens if you set the time_base.den to the expected frame rate, such as 24, only to actually receive 15 (or some other number) frames per second? How do you deliver proper timings in this scenario?
>
> Thanks,
>
> Brad

The trick is that not all containers and codecs support variable frame
rate. For example, mp4 with the h264 codec allows you to set time_base
to a very fine resolution (a large time_base.den) and then set pts for
video frames at irregular intervals in terms of that time base.
(time_base is a rational number, so the actual expected time in
seconds is pts*tb.num/tb.den.) On the other hand, h263 only supports
fixed frame rates, and only from a limited set of predefined values.
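
To make the variable-frame-rate approach concrete, here is a minimal
sketch in C against the libavcodec API: pick a fine-grained time_base
up front and derive each frame's pts from its capture timestamp rather
than from a frame counter. The 1/90000 time base and the
capture_time_sec parameter are assumptions for illustration (e.g. a
QTSampleBuffer presentationTime converted to seconds), not anything
QTKit hands you directly.

#include <libavcodec/avcodec.h>
#include <libavutil/rational.h>

/* Use a time base much finer than any plausible frame interval, so
 * the actual capture rate does not matter. */
static void setup_timebase(AVCodecContext *c)
{
    c->time_base = (AVRational){1, 90000};
}

/* Convert a capture timestamp in seconds to pts in time_base units:
 * seconds = pts * time_base.num / time_base.den. */
static int64_t pts_from_capture_time(const AVCodecContext *c,
                                     double capture_time_sec)
{
    return (int64_t)(capture_time_sec * c->time_base.den
                     / c->time_base.num + 0.5);
}

With this scheme the encoder never needs to know the real frame rate;
dropped frames or capture jitter simply show up as irregular pts
values.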

And let me repeat: timestamps are set both at the container level
(avformat) and at the codec level (avcodec). The rules of the two
levels sometimes differ, and then the numerical values must be
rescaled between them. And remember, players may handle some videos
differently when either the spec or the file allows different
interpretations.
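
As a hedged sketch of that container-vs-codec point: if you generate
pts in the codec context's time_base, rescale the packet's timestamps
to the stream's time_base before handing it to
av_interleaved_write_frame(). The helper below uses av_rescale_q();
the names pkt, c, and st are placeholders for your own muxing code.

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Rescale a packet's timestamps from the codec time_base to the
 * stream (container) time_base before muxing. */
static void rescale_packet(AVPacket *pkt,
                           AVCodecContext *c, AVStream *st)
{
    if (pkt->pts != AV_NOPTS_VALUE)
        pkt->pts = av_rescale_q(pkt->pts, c->time_base, st->time_base);
    if (pkt->dts != AV_NOPTS_VALUE)
        pkt->dts = av_rescale_q(pkt->dts, c->time_base, st->time_base);
    if (pkt->duration > 0)
        pkt->duration = av_rescale_q(pkt->duration,
                                     c->time_base, st->time_base);
    pkt->stream_index = st->index;
}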

BR, and have a nice holiday,

Alex
