[FFmpeg-devel] [RFC] Runtime-inited versus Hardcoded tables.

Zdenek Kabelac zdenek.kabelac
Thu Jan 31 14:14:53 CET 2008

2008/1/31, Kostya <kostya.shishkov at gmail.com>:
> On Thu, Jan 31, 2008 at 01:23:43PM +0100, Reimar Döffinger wrote:
> > Hello,
> > On Thu, Jan 31, 2008 at 12:54:16PM +0100, Zdenek Kabelac wrote:
> > > But I think that files like rv34vlc.h,indeo3data.h are already quite
> > > too much if there exists simple code to generate them at runtime I
> > > would prefer rather this version.
> >
> > After compression at least the indeo ones are less than 20 kB as well
> > though.
> > Nevertheless, if they are really big and easy to create, IMO they should
> > not just be created at runtime but be part of the codec context - at
> > least I consider the usual case that only one codec is open at a time,
> > and big tables should not use up memory when the codec is no longer
> > used, just because it has been used once.
> > Though if there are big tables and they can be easily calculated at
> > runtime, there is quite a probability that those tables could be made
> > smaller without much of a speed loss.
> >
> > > The scenario is simple - you release binary/source package which is
> > > then downloaded by thousands of user - which spends time/money waiting
> > > for download :)
> >
> > Even if we were pessimistic and assumed 20 codecs with tables as huge as
> > indeo3's, that would be about 400 kB of data to transfer. Even for a modem
> > user that is not much of a deal - at least that is what I think, and why
> > I find this argument a bit contrived.
> Alternatively we can consider taking ScummVM path - moving large tables
> into binary file and loading data from it when needed.
> This will add a bit of user confusion too.

Well, I've just checked some of the largest *.h files and forgot to
mention dcdata.h - but if the VLC tables are just the result of
reverse engineering and there is no known way to generate them
efficiently by an algorithm at runtime, then of course they must be
kept there - separate binary data files are probably not worth it,
because of the confusion they might cause.

But there is always a chance you will want to install ffmpeg on some
size-limited EEPROM/CD/distro and save space for other things - so
while you may have plenty of dynamic memory, you are limited by the
ROM size....

It's probably not so easy to say which tables should be created at
codec initialization - should it take less than 0.1 ms on a 400 MHz
CPU? :) Though personally I would even support the idea of having the
tables created every time the codec is opened/closed.
But it might be nice to have a build option choosing between 'static'
and dynamically computed tables - there is already a zillion of
defines, so this one extra wouldn't be a problem - the routines for
table generation could be kept in one .c file and added only when
such a build is requested.

As for the 400 kB per modem user - well, if you multiply that by the
number of ffmpeg users all over the world, it adds up to quite a lot
of traffic which might have been used for faster po*n ;) transfers or
whatever you choose ;) And btw, in many countries there are still
Internet providers charging you per transferred megabyte....
