[FFmpeg-devel] [PATCH] GSM-MS decoder and encoder

Michel Bardiaux mbardiaux
Fri Mar 28 15:22:03 CET 2008


Baptiste Coudurier wrote:
> Hi,
> 
> Michel Bardiaux wrote:
>> Baptiste Coudurier wrote:
>>> Michael Niedermayer wrote:
>>>> Hi
>>>>
>>>> On Mon, Feb 19, 2007 at 11:59:14AM +0100, Michel Bardiaux wrote:
>>>>> Michael Niedermayer wrote:
>>>>>> Hi
>>>>>>
>>>>>> On Sat, Feb 17, 2007 at 03:42:52PM +0100, Michel Bardiaux wrote:
>>>>>>> Michael Niedermayer wrote:
>>>>>>>> Hi
>>>>> [snip]
>>>>>>>> is 13000 exactly correct, isn't it 13200 for CODEC_ID_GSM?
>>>>>>> Yes and no. The last 4 bits of each frame are not actually used, they
>>>>>>> come on stage only because we (both libgsm and lavc) use a
>>>>>>> byte-oriented
>>>>>>> API, and files made of bytes. They should be considered as container
>>>>>>> overhead in TOAST files, hence not part of the codec bitrate.
>>>>>> but with this argumentation we would have to subtract the padding
>>>>>> bits from
>>>>>> mpeg from the bitrate too and that's something we don't 
>>>>> I think you're talking codec-level padding here, but I wrote of
>>>>> container overhead.
>>>>>
>>>>>> also it would cause
>>>>>> problems for containers which expect the bitrate to actually be
>>>>>> the bitrate of
>>>>>> what they get, not to be slightly less due to some padding bits
>>>>>> they don't
>>>>>> know about ...
>>>>> AFAIK currently only some MS containers (AVI, WAV) accept GSM, and
>>>>> then only MS-GSM, which does not have the problem. The only
>>>>> container for non-MS-GSM is TOAST (currently not implemented) and
>>>>> that one knows about the 4 extra bytes.
>>>> well, a grep for gsm shows a hit in aiff.c, so it seems it is supported
>>>> in a
>>>> non-TOAST format currently, and as there is no gsm-specific code in it
>>>> I would
>>>> guess it will end up with 13200 as bitrate
>>>>
>>>> anyway I won't fight about this, apply the patch; I will remove the
>>>> 13000 check
>>>> when it breaks something
>>>>
>>> I was fixing agsm in mov, and this bitrate check causes init to fail for
>>> decoding (encoding shares the function); IMHO we don't care about the
>>> bitrate when decoding.
>>>
>>> Can I remove it ?
>>>
>> Oops, forgot I am maintainer for libgsm!
>>
>> Since bitrate is also irrelevant for encoding, what about this?
>>
>>
>> ------------------------------------------------------------------------
>>
>> Index: libavcodec/libgsm.c
>> ===================================================================
>> --- libavcodec/libgsm.c	(revision 12579)
>> +++ libavcodec/libgsm.c	(working copy)
>> @@ -36,8 +36,10 @@
>>  #define GSM_FRAME_SIZE 160
>>  
>>  static av_cold int libgsm_init(AVCodecContext *avctx) {
>> -    if (avctx->channels > 1 || avctx->sample_rate != 8000 || avctx->bit_rate != 13000)
>> +    if (avctx->channels > 1 || avctx->sample_rate != 8000)
>>          return -1;
>> +    if (avctx->bit_rate != 13000)
>> +        av_log(avctx, AV_LOG_WARNING, "Bit rate %d instead of expected 13000 - ignored\n", avctx->bit_rate);
>>  
> 
> Well, I'm not really OK with annoying the user with a false message; when
> decoding this is irrelevant.

It is not false. IMO a file with any bitrate other than 13000 is 
corrupted - well, at least it does not strictly follow the spec. Yes, 
this is debatable, but to debate it we need facts, that is, a collection 
of files with that problem. Printing the message will inform us about them.

> 
> What is this check good for, since bitrate is irrelevant ?
> 

Remove the message on encoding, and very soon we will have someone 
complaining that they used -acodec libgsm -ab 26000 but the output file is 
too small and WMP says it is 13 kbit/s...

Greetings,
-- 
Michel Bardiaux
http://www.mediaxim.com/
