Wed May 6 17:23:52 CEST 2009
> On Wed, May 06, 2009 at 01:37:46AM -0700, Jason Garrett-Glaser wrote:
>> On Wed, May 6, 2009 at 1:33 AM, Baptiste Coudurier
>> <baptiste.coudurier at gmail.com> wrote:
>>> Hi Vitor,
>>> On 5/6/2009 12:23 AM, Vitor Sessak wrote:
>>>> I was expecting Michael to reject my patch, not the list moderators (see
>>> Sorry :/
No problem. I imagine that the signal/noise ratio of the moderation
queue is pretty low. I resent it.
>>> Patch is huge :(
>>> List says your mail is 800k !!!! ?
>>> Cause: Message body is too big: 822865 bytes with a limit of 200 KB
>>> I thought that was a mistake.
>> Have you seen the sheer size of the TwinVQ tables? Hopefully they are
>> a lesson to codec developers to never do such an insane thing ever
Each single file uses "only" 40 kB of table data (plus 26 kB per frame).
The problem is that _every single one_ of the nine possible
combinations of bitrate and sample rate has a different set of tables.
While this is extremely ugly, it probably made some sense to them at
the time, since:
1- They didn't plan to create a VQF spec anytime soon (so they didn't
care about ugliness)
2- They didn't expect it to run on anything less powerful than a 486
with 64 MB of RAM (so 350 kB "is not that much")
3- They wanted the best quality per bitrate, no matter what, to compete
My main rant is that they could have designed their codebooks to fit in
int16_t (and occupy only half the size) with practically no disadvantages...
> It is well learnt - DCA tables take only ~300kB and >250kB for RealVideo 3 and 4
In case you are wondering, a stripped twinvq.o is 357196 bytes.
>> The easiest way to deal with this would probably be to approve the
>> tables separately so that a megabyte patch doesn't have to be sent
>> every time.
That's why I asked Michael to start the review with the tables. I'm
resending them because I renamed quite a few codebooks in the latest
version of my patch...