[FFmpeg-devel] Clarification about bits_per_coded_sample vs bits_per_raw_sample
shawnsingh at google.com
Wed May 3 01:01:37 EEST 2017
We are trying to understand two fields in AVCodecParameters:
bits_per_coded_sample and bits_per_raw_sample.
In the comments in libavcodec/avcodec.h, bits_per_coded_sample is described
as "the bitrate per sample". That sounds like (encoded bitrate / sample
rate); for example, a 128 kbps, 44.1 kHz audio stream would come out to
roughly 3 bits per coded sample. However, actual usage in the code suggests
that this field is really "the bit depth of each sample, as if the sample
were uncompressed", which is also how the comments and usage describe
bits_per_raw_sample. For example, the mov.c demuxer initializes
bits_per_coded_sample when parsing the "sample size" field of the mp4
AudioSampleEntry (in the stsd atom, for both audio and video).
Various codecs/formats initialize one value, or the other, or both, at
different times. For example, the pcm.c audio codec sets
bits_per_coded_sample on encoding, and bits_per_raw_sample on decoding.
But the mov.c demuxer and movenc.c muxer both use bits_per_coded_sample
for muxing.
So, what really is the difference between these two values? Is it
possible that these fields should just be merged into one field? Or, if
there is a pattern we don't see, perhaps only the comments need to be
clarified?