# [FFmpeg-devel] [PATCH 3/3] avformat/format: av_register_output_format() and av_register_intput_format() that work in O(1) time

Reimar Döffinger Reimar.Doeffinger at gmx.de
Tue Dec 17 21:16:46 CET 2013

On Tue, Dec 17, 2013 at 07:51:53PM +0100, Michael Niedermayer wrote:
> On Mon, Dec 16, 2013 at 05:20:49PM -0800, Timothy Gu wrote:
> > On Dec 16, 2013 4:09 PM, "Michael Niedermayer" <michaelni at gmx.at> wrote:
> > > Subject: [FFmpeg-devel] [PATCH 3/3] avformat/format:
> > > av_register_output_format() and av_register_intput_format() that
> > > work in O(1) time
> >
> > Stupid question, what does O(1) mean? I read the Wikipedia article about
> > big O notation but it seems so confusing.
>
> it means that, independent of some parameter (here the number of
> registered elements), there's an upper bound
> on the time that the operation takes on average.

A simpler (not 100% correct) way to put it: adding one element takes the
same time no matter how long the list is.
What makes the correct definition confusing (and Michael's simplification
not quite correct) is that an algorithm that e.g. never terminates
when there are 3 elements in the list can still be O(1).
E.g. 1/x is in O(1).
The general idea is to express how the runtime of an algorithm changes
as you add more and more elements.

(Side rant: the Wikipedia article doesn't get any better from the fact that
the notation itself is utter crap, and they chose the variant that makes even
less sense. While they go into why it should be "element of" instead of "=",
they completely miss that f(x) is not a function, f is - though they use
both notations in different places. I guess there's a reason why some people
make fun of it being called computer "science" when they don't even manage
to come up with a notation that isn't broken from front to end.)