[FFmpeg-devel] [PATCH] Incorrect Ogg Theora duration

Henrik Gulbrandsen
Sun Apr 20 07:40:07 CEST 2008


On Fri, 2008-04-18 at 00:45 -0700, Baptiste Coudurier wrote: 
> I think the "start" value is incorrect. According to specs, ogg
> granule stores the pts plus the duration of the ogg packet, so last
> packet granule is the stream duration.

As mentioned previously, Ogg Theora bitstreams prior to version 3.2.1
don't follow this rule: their granules name the frame being decoded,
not one past it. Since libtheora wrote bitstream version 3.2.0 until
release 1.0beta1, I think there is a good case for staying backwards
compatible.
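
For concreteness, here is a minimal sketch of a version-aware
granule-to-frame-count conversion; the names (gpshift, gpmask) and the
exact layout are illustrative assumptions, not the actual oggdec code:

    #include <stdint.h>

    /* Theora packs the index of the last keyframe into the high granule
     * bits and the count of P-frames since it into the low bits. */
    static uint64_t theora_granule_to_frames(uint64_t gp, int gpshift,
                                             uint64_t gpmask,
                                             uint32_t version)
    {
        uint64_t iframe = gp >> gpshift; /* keyframe index               */
        uint64_t pframe = gp & gpmask;   /* P-frames since that keyframe */

        /* Before 3.2.1 the granule names the frame being decoded; from
         * 3.2.1 on it points one frame past it (pts + duration). Bump
         * the old form so both follow the same convention. */
        if (version < 0x030201)
            iframe++;

        return iframe + pframe;          /* divide by fps for seconds    */
    }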

On Fri, 2008-04-18 at 09:10 +0100, Måns Rullgård wrote:
> > Index: libavformat/oggdec.c
> > ===================================================================
> > --- libavformat/oggdec.c	(revision 12884)
> > +++ libavformat/oggdec.c	(working copy)
> > @@ -307,6 +307,7 @@ ogg_packet (AVFormatContext * s, int *st
> >      ogg_stream_t *os;
> >      int complete = 0;
> >      int segp = 0, psize = 0;
> > +    uint64_t start = url_ftell(s->pb);
> >  
> >  #if 0
> >      av_log (s, AV_LOG_DEBUG, "ogg_packet: curidx=%i\n", ogg->curidx);
> > @@ -368,6 +369,7 @@ ogg_packet (AVFormatContext * s, int *st
> >      if (os->header < 0){
> >          int hdr = os->codec->header (s, idx);
> >          if (!hdr){
> > +          url_fseek (s->pb, start, SEEK_SET);
> >            os->header = os->seq;
> >            os->segp = segp;
> >            os->psize = psize;
> 
> This looks very wrong.

Interleaved multi-page packets. Right. That is a potential problem if
later versions of the spec allow it, but somehow I doubt that this was
your point. Could you please clarify? This particular line is meant to
run only when the first data packet has already been fetched and is
about to be discarded after failing to be accepted as a header packet.
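
As a standalone analogue of that rewind (illustrative only; the 0x80
header marker is invented), the pattern is a speculative read, a test,
and a seek back so the first data packet is re-read instead of lost:

    #include <stdio.h>

    /* Try to consume a header at the current position; if the bytes
     * turn out to be data, seek back so the caller re-reads them. */
    static int try_header(FILE *f)
    {
        long start = ftell(f);      /* position before the read    */
        int  c     = fgetc(f);

        if (c == 0x80)              /* pretend 0x80 marks a header */
            return 1;               /* consumed as a header packet */

        fseek(f, start, SEEK_SET);  /* not a header: rewind        */
        return 0;                   /* data packet left unread     */
    }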

> > @@ -463,14 +465,14 @@ ogg_get_length (AVFormatContext * s)
> >  
> >      if (idx != -1){
> >          s->streams[idx]->duration =
> > -            ogg_gptopts (s, idx, ogg->streams[idx].granule);
> > +            ogg_gptopts (s, idx, ogg->streams[idx].granule + 1);
> >      }
> 
> Baptiste is right, the last value gives the length of the stream.

In this case, the low-level granule semantics are irrelevant. The
function is named ogg_gptopts(), and - unless that is just a random
selection of letters - it should convert a Granule Position to a
Presentation Time Stamp. Unless PTS is the time when presentation
stops, we really need the granule of the (imaginary) packet after the
last one.
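
A worked example, assuming old-style (pre-3.2.1) granules and an
invented 100-frame stream at 25 fps:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Old semantics: the last packet's granule encodes the index
         * of the last frame, 99, not a frame count of 100. */
        uint64_t last_granule = 99;
        double   fps          = 25.0;

        printf("pts of last frame: %.2f s\n", last_granule / fps);
        printf("stream duration:   %.2f s\n", (last_granule + 1) / fps);
        return 0;                   /* prints 3.96 s and 4.00 s */
    }

Converting the last granule directly gives 3.96 s, the instant the last
frame starts; only granule + 1 yields the actual 4.00 s duration.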

> >      ogg->size = size;
> >      ogg_restore (s, 0);
> >      ogg_save (s);
> >      while (!ogg_read_page (s, &i)) {
> > -        if (i == idx && ogg->streams[i].granule != -1 && ogg->streams[i].granule != 0)
> > +        if (i == idx && ogg->streams[i].granule != -1)
> >              break;
> >      }
> >      if (i == idx) {
> 
> Why?

Because 0 is a legal granule value for data packets in old Theora
files, as noted above. If the placement worried you: ogg_get_length()
is a static function and is called only once, when the headers have
already been skipped.
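
A sketch of why 0 is legal there (same assumed layout and invented
names as above):

    #include <stdint.h>

    /* Keyframe index in the high bits, P-frames since it in the low. */
    static uint64_t theora_make_granule(uint64_t iframe, uint64_t pframe,
                                        int gpshift)
    {
        return (iframe << gpshift) | pframe;
    }

    /* In a pre-3.2.1 stream the first data packet describes frame 0:
     * theora_make_granule(0, 0, gpshift) == 0 for any gpshift, so 0 is
     * a valid granule and cannot serve as a "none yet" sentinel. */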

/Henrik