[FFmpeg-devel] [PATCH] NellyMoser audio decoder
Wed Sep 12 17:33:44 CEST 2007
On Wed, Sep 12, 2007 at 10:50:23AM -0400, Daniel Serpell wrote:
> > On Wed, Sep 12, 2007 at 12:45:57AM -0400, Rich Felker wrote:
> > On Wed, Sep 12, 2007 at 12:02:18AM -0400, Daniel Serpell wrote:
> > >
> > > But you could do it only on page-out. Just hash the page before going
> > > to disk. If the hash matches an already paged-out one, you don't have
> > > to store it again.
> > Even without a vulnerability, you'll get a random collision given
> > enough time. Not likely on a single machine, but the likelihood that
> > some computer somewhere in the world would be affected by it within a
> > one-year period is probably nontrivial unless you make the hashes so
> > large that paging-out actually leaves a big hash in memory.
> Well, with a 128-bit hash and 2^20 pages (so you have 4 GB of virtual RAM),
> the probability of a collision is 1 - exp(-(2^40) / (2^129)) ≈ 1.6*10^-27.
> At 100 MB/s of writes to disk, you could page out about 25000 4 KB pages each
> second. So you could replace all 2^20 pages in about 40 seconds, which means
> you could repeat the experiment 788400 times per year.
> With 10^9 computers, the probability of a collision in 10 years is then
> 1.6*10^-27 * 8*10^5 * 10 * 10^9 ≈ 1.3*10^-11.
> IMHO, you only need a good hash.
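The arithmetic above can be checked with the standard birthday-bound approximation P ≈ 1 - exp(-n² / 2^(b+1)) for n items and a b-bit hash; the constants (2^20 pages, 788400 repeats per year, 10 years, 10^9 machines) are the ones from the post:

```python
from math import expm1

def collision_probability(n_items: int, hash_bits: int) -> float:
    """Birthday-bound approximation: P ~= 1 - exp(-n^2 / 2^(b+1)).

    Uses expm1 because the result is far below double-precision epsilon,
    so a literal 1.0 - exp(-x) would round to 0.0.
    """
    return -expm1(-(n_items ** 2) / 2.0 ** (hash_bits + 1))

# One machine: 2^20 pages of 4 KB (4 GB of virtual RAM), 128-bit hash.
p_once = collision_probability(2 ** 20, 128)   # ~1.6e-27, matching the post

# At 100 MB/s you replace all 2^20 pages every ~40 s,
# i.e. ~788400 independent "experiments" per year.
experiments_per_year = 788400
p_total = p_once * experiments_per_year * 10 * 10 ** 9  # 10 years, 10^9 machines
```

Evaluating this reproduces the post's figures of roughly 1.6e-27 per fill and 1.3e-11 world-wide over a decade.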
I agree that the failure probability is lower than the failure probability
of even very reliable hardware.
Note, though, that in practice you would likely want to use two hashes: one
fast, and, after finding a match, a slow secure hash to check that the
blocks really match; otherwise you would likely slow down all page-out
operations noticeably.
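A minimal sketch of that two-level idea: a cheap hash indexes every stored page, and the expensive confirmation (here a full byte comparison standing in for the slow secure hash) runs only on a fast-hash hit, so the common case pays just one cheap hash. The class and method names are illustrative, not from any real pager:

```python
import zlib

class DedupStore:
    """Toy paged-out store: cheap CRC32 index, full compare on candidate hits."""

    def __init__(self):
        self._by_crc = {}   # crc32 value -> list of stored page ids
        self._pages = {}    # page id -> page bytes
        self._next_id = 0

    def page_out(self, page: bytes) -> int:
        """Store a page and return its id; identical pages share one copy."""
        crc = zlib.crc32(page)                 # fast hash, paid on every page-out
        for pid in self._by_crc.get(crc, []):  # only entered on a fast-hash hit
            if self._pages[pid] == page:       # slow check confirms a real match
                return pid                     # duplicate: reuse the stored copy
        pid = self._next_id                    # no match: store a new copy
        self._next_id += 1
        self._pages[pid] = page
        self._by_crc.setdefault(crc, []).append(pid)
        return pid
```

With a decent fast hash, false candidates are rare, so the expensive comparison almost never runs on non-duplicate pages.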
Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
I do not agree with what you have to say, but I'll defend to the death your
right to say it. -- Voltaire