[FFmpeg-devel] [PATCH 5/9] sbc: implement SBC encoder (low-complexity subband codec)

Aurelien Jacobs aurel at gnuage.org
Thu Mar 1 22:45:27 EET 2018


On Thu, Mar 01, 2018 at 01:27:06AM +0000, Rostislav Pehlivanov wrote:
> On 28 February 2018 at 23:34, Aurelien Jacobs <aurel at gnuage.org> wrote:
> 
> > On Wed, Feb 28, 2018 at 12:59:40AM +0000, Rostislav Pehlivanov wrote:
> > > On 27 February 2018 at 23:56, Aurelien Jacobs <aurel at gnuage.org> wrote:
> > >
> > > >
> > > > So I've updated the patch with only a msbc and a delay option and
> > > > I've added some sane parameters decisions, mainly based on the
> > following
> > > > study :
> > > > https://pdfs.semanticscholar.org/1f19/561d03bc88b67728375566c95bbf77e730d5.pdf
> > > > This seems to give pretty good results.
> > > >
> > > >
> > > I think you ought to use a float for this, like:
> > >
> > > >{ "sbc_delay", "Maximum delay in milliseconds", offsetof(<context>,
> > > <option>), AV_OPT_TYPE_FLOAT, { .dbl = <some reasonable default> },
> > <min>,
> > > <max>, <flags>, <option name in structure> },
> >
> > Why would I want to use a float for this?
> > AV_OPT_TYPE_DURATION is more appropriate for this. It has microsecond
> > precision, which is more than enough, and you can still use a "float"
> > string as a parameter (e.g. -sbc_delay 0.0035 for a 3.5 ms maximum delay).
> >
> >
> The unit is the issue. AV_OPT_TYPE_DURATION expects a duration in seconds,
> but all codecs (except .ape) have frame sizes expressible in a (near)
> integer number of milliseconds, this one included.

AV_OPT_TYPE_DURATION supports various input formats. There is no reason
it can't also support millisecond or microsecond input.
I've just sent a patch to implement exactly this.
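
For illustration, here is a minimal sketch of what the declaration could
look like with AV_OPT_TYPE_DURATION (the context, field and flag choices
here are hypothetical, not the actual patch; the option system stores the
value as an int64_t number of microseconds):

#include "libavutil/opt.h"

typedef struct SBCEncContext {
    const AVClass *class;
    int64_t max_delay;   /* filled in by the option system, in microseconds */
} SBCEncContext;

static const AVOption sbc_options[] = {
    /* default/min/max are in microseconds: 13 ms / 1 ms / 13 ms,
     * picked here purely as an example */
    { "sbc_delay", "set maximum algorithmic latency",
      offsetof(SBCEncContext, max_delay), AV_OPT_TYPE_DURATION,
      { .i64 = 13000 }, 1000, 13000,
      AV_OPT_FLAG_AUDIO_PARAM | AV_OPT_FLAG_ENCODING_PARAM },
    { NULL },
};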

> Take a look at both Opus
> encoders, they both use a unit of milliseconds in their avopts to set frame
> durations. Having a codec which uses seconds unlike everything else isn't
> right (and we had the same issue just a month ago with a timeout option).
> You don't have to use a float; you can use an int as well, as long as
> frame sizes are an integer number of milliseconds.

"unlike everything else" ???
What is everything else ?
Do you have codecs example other than Opus that use foat milliseconds ?

I don't see much in avcodec, but in avformat there are various similar
examples:
- http: reconnect_delay_max (INT in seconds)
- gifdec: min_delay, max_gif_delay... (INT in hundredths of seconds)
- mpegenc: preload (INT in microseconds)

That's what happens when each codec/muxer uses a simple number type
(INT or FLOAT) and picks its preferred unit. Total inconsistency!

That's what AV_OPT_TYPE_DURATION avoids: it prevents each codec from
picking its own unit (everyone gets microseconds, which should fit pretty
much all situations) and it allows users to enter parameters in a
convenient way, whatever the most appropriate unit is.
That gives consistency across the codebase, through a common parameter
parser and a common internal unit (the microsecond).

So what I propose is to rename sbc_delay to sbc_latency (for example),
and to add an opus_latency option using AV_OPT_TYPE_DURATION that would
deprecate opus_delay. I think that's the best way forward if you care
about consistency.

