[FFmpeg-devel] [PATCH] Incorrect Ogg Theora duration

Henrik Gulbrandsen henrik at gulbra.net
Tue Apr 22 11:24:11 CEST 2008


On Sun, 2008-04-20 at 13:28 +0100, Måns Rullgård wrote:
> Henrik Gulbrandsen <henrik at gulbra.net> writes:
> 
> > On Fri, 2008-04-18 at 00:45 -0700, Baptiste Coudurier wrote: 
> >> I think the "start" value is incorrect. According to specs, ogg
> >> granule stores the pts plus the duration of the ogg packet, so last
> >> packet granule is the stream duration.
> >
> > As mentioned previously, Ogg Theora bitstreams prior to version 3.2.1
> > don't follow this rule. Since libtheora uses 3.2.0 for versions before
> > 1.0beta1, I think there is a good case for being backwards compatible.
> 
> How many such files exist in the wild?  Given that I've never seen a
> wild Theora file, other than Xiph promotional stuff, I reckon there
> can't be that many.

Well, more to the point: my test system happens to have libtheora alpha7
installed, so the files I write with FFmpeg can't actually be read back
correctly by FFmpeg without these fixes. I could get around this by
upgrading the library, but I prefer to kill bugs rather than run from them.
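For the record, the incompatibility itself is small: the granule position
packs the number of the last keyframe in its upper bits and the count of
frames since that keyframe in the lower bits; the only difference is
whether the result is a frame index or a frame count. Roughly like this
(just a sketch; the names are illustrative, not necessarily the ones used
in oggparsetheora.c, and the version is assumed packed as 0xMMmmrr):

    #include <stdint.h>

    /* Sketch of version-dependent Theora granule handling. */
    static uint64_t theora_granule_to_frames(uint64_t gp, int gpshift,
                                             uint64_t gpmask, int version)
    {
        uint64_t iframe = gp >> gpshift;  /* number of the last keyframe */
        uint64_t pframe = gp & gpmask;    /* frames since that keyframe  */

        /* Bitstreams older than 3.2.1 (everything libtheora wrote before
         * 1.0beta1) store the frame index here; 3.2.1 and later store the
         * frame count, which is one higher. Normalize the old convention
         * so both kinds of files end up meaning the same thing. */
        if (version < 0x030201)
            iframe++;

        return iframe + pframe;
    }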

> > On Fri, 2008-04-18 at 09:10 +0100, Måns Rullgård wrote:
> >> > Index: libavformat/oggdec.c
> >> > ===================================================================
> >> > --- libavformat/oggdec.c	(revision 12884)
> >> > +++ libavformat/oggdec.c	(working copy)
> >> > @@ -307,6 +307,7 @@ ogg_packet (AVFormatContext * s, int *st
> >> >      ogg_stream_t *os;
> >> >      int complete = 0;
> >> >      int segp = 0, psize = 0;
> >> > +    uint64_t start = url_ftell(s->pb);
> >> >  
> >> >  #if 0
> >> >      av_log (s, AV_LOG_DEBUG, "ogg_packet: curidx=%i\n", ogg->curidx);
> >> > @@ -368,6 +369,7 @@ ogg_packet (AVFormatContext * s, int *st
> >> >      if (os->header < 0){
> >> >          int hdr = os->codec->header (s, idx);
> >> >          if (!hdr){
> >> > +          url_fseek (s->pb, start, SEEK_SET);
> >> >            os->header = os->seq;
> >> >            os->segp = segp;
> >> >            os->psize = psize;
> >> 
> >> This looks very wrong.
> >
> > Interleaved multi-page packets. Right. That is a potential problem if
> > higher specs allow it, but somehow I doubt that this was your point.
> > Could you please clarify? This particular line is only meant to run
> > precisely when the first data packet has already been fetched and is
> > about to be discarded after failing to be accepted as a header packet.
> 
> I still don't understand.  Why do you seek back in the file after
> reading the headers?  It doesn't make any sense to me.

Have a look at ogg_get_headers()! We're looping over an unknown number
of header packets for each stream and must hand each one to the codec to
find out whether it really is a header. This means we end up reading not
only the headers but also the first data packet, which is currently
discarded. I just want it back in the stream so it can be read later.
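Schematically, with the patch, the packet-reading path does this (not the
literal code, just the idea, with the bookkeeping details left out):

    uint64_t start = url_ftell(s->pb);  /* offset before this packet */

    /* ... the next packet for stream idx is read into os->buf ... */

    if (os->header < 0 && !os->codec->header(s, idx)) {
        /* The codec refused to treat it as a header, so this is the
         * first data packet. Seek back so the normal read path picks
         * it up again instead of discarding it. */
        url_fseek(s->pb, start, SEEK_SET);
        os->header = os->seq;
    }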

> >> > @@ -463,14 +465,14 @@ ogg_get_length (AVFormatContext * s)
> >> >  
> >> >      if (idx != -1){
> >> >          s->streams[idx]->duration =
> >> > -            ogg_gptopts (s, idx, ogg->streams[idx].granule);
> >> > +            ogg_gptopts (s, idx, ogg->streams[idx].granule + 1);
> >> >      }
> >> 
> >> Baptiste is right, the last value gives the length of the stream.
> >
> > In this case, it's irrelevant what the low-level granule semantics are.
> > The function is named ogg_gptopts(), and - unless that is just a random
> > selection of letters - should presumably convert Granule Position to
> > Presentation Time Stamp. Unless PTS is the time when presentation stops,
> > we really need a granule from the (imaginary) packet after the last one.
> 
> Read the code again.  The value is used as the PTS of the first packet
> on the next page.

OK. Sure. I can live with any definition, as long as the value returned
from ogg_gptopts() is well-defined. In that case, the theora_gptopts()
fix should be reverted, and the result of calling it for header packets
would be undefined, so I hope nobody ever needs to do that.
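Concretely, if the version handling moves into theora_gptopts() (something
like the check sketched above), then old and new files both follow
Baptiste's rule, and ogg_get_length() can keep using the last granule the
way it did before my patch:

    if (idx != -1){
        /* With granules normalized to the 3.2.1+ convention, the last
         * granule already covers the final packet's duration. */
        s->streams[idx]->duration =
            ogg_gptopts (s, idx, ogg->streams[idx].granule);
    }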

> >> >      ogg->size = size;
> >> >      ogg_restore (s, 0);
> >> >      ogg_save (s);
> >> >      while (!ogg_read_page (s, &i)) {
> >> > -        if (i == idx && ogg->streams[i].granule != -1 && ogg->streams[i].granule != 0)
> >> > +        if (i == idx && ogg->streams[i].granule != -1)
> >> >              break;
> >> >      }
> >> >      if (i == idx) {
> >> 
> >> Why?
> >
> > Because 0 is a legal granule value for data packets in old Theora files,
> > as noted above.
> 
> I see.
> 
> > ogg_get_length() is a static function and is called only once when
> > the headers have already been skipped, if that worried you.
> 
> I fail to see the relevance of that remark.

Just trying to be helpful. Skipping the header pages is presumably the
reason the "granule != 0" check was there to begin with; since the headers
are already behind us by the time ogg_get_length() runs, the check is
unnecessary, and it wrongly skips old Theora data packets whose granule
happens to be 0.

/Henrik
