[cryptography] random number generator
pkejjy at gmail.com
Sat Nov 22 10:58:43 EST 2014
All, in the interest of clarity:
1. Let's do the math. Let's assume that we have a really dumb entropy
extractor which waits around for 128 interrupts to occur. It just sits in a
loop sampling the timestamp until this criterion is satisfied. It saves all
these time stamps to a big chunk of memory. Suppose, further, that this
system is so very quiescent (but not _perfectly_ so) that the timing of
each interrupt arrives predictably, but for an error of 1 CPU clock tick,
at random. So for instance the time between interrupts is either 10 or 11
ticks. Thus 128 interrupts give us 128 bits of entropy. In other words,
taken in the aggregate, the timestamp stream will be worth 128 bits of
entropy. But of course we need to produce a "short" output in the interest
of practicality, so let's say we hash this long timestamp stream through a
cryptographically wonderful PRNG, yielding 128 bits of noise. Applying the
reflexive density constant, we expect that (1-1/e) or so of the 2^128
_theoretically_ possible hashes will be _actually_ possible. So, roughly
speaking, we drop down to 127 bits of entropy. Next, adjust for the fact
that maybe our PRNG ain't so wonderful after all because it has unseen
biases, and maybe we're down to 120 bits. Whatever. We still have a
freaking strong random number at the end of the day -- all from a very
quiescent system.
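The pipeline in point 1 can be sketched in Python. This is a toy simulation, not real interrupt sampling: the +/-1-tick jitter is faked with a coin flip, and SHA-256 stands in for the "cryptographically wonderful" hash. It also includes a small-scale check of the (1-1/e) density claim.

```python
import hashlib
import secrets

def simulate_timestamps(n=128, base_interval=10):
    """Fake a quiescent system: n interrupt timestamps whose
    inter-arrival times are 10 or 11 ticks, at random."""
    t, stamps = 0, []
    for _ in range(n):
        t += base_interval + secrets.randbelow(2)  # +/-1 tick of jitter
        stamps.append(t)
    return stamps

def extract(stamps):
    """Hash the whole timestamp stream down to a 128-bit output."""
    data = b"".join(t.to_bytes(8, "little") for t in stamps)
    return hashlib.sha256(data).digest()[:16]

print(extract(simulate_timestamps()).hex())

# Small-scale check of the (1 - 1/e) density claim: a random
# function on a 16-bit space reaches ~63.2% of possible outputs.
N = 1 << 16
reached = {hashlib.sha256(i.to_bytes(4, "little")).digest()[:2]
           for i in range(N)}
print(len(reached) / N)  # roughly 0.632
```

The density check works at 16 bits for the same reason it works at 128: a random function from an N-element set to itself misses each output with probability (1-1/N)^N, which tends to 1/e.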
2. Empirically, it seems to me that, historically, entropy extractors have
paid none too careful attention to hash quality. So for instance, we end up
doing a commutative operation -- say, a plain sum -- on the aforementioned
timestamp stream. In that case, the _sum_ would be uncertain by only
something like +/-11 (the square root of 128), which amounts to a handful
of bits of entropy rather than 128.
So then we somehow manage to get this code into general circulation, and no
one looks at it until stuff starts getting hacked. Finally, we end up
with a paper like in DJ's link, which makes some of us assume that
coldbootishness is to blame for poor entropy, when in fact it was the bias
of the timestamp stream hash. Jytter's hash, in particular, is
noncommutative and reasonably "fair"; there are vastly better hashes which
could be implemented, but not without touching memory in such a way as to
risk autopseudorandomness (i.e. confusing the TRNG's own pseudorandom
timing with true background system entropy).
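A quick simulation makes the commutativity point concrete. This is hypothetical Python, with `random` modeling the jitter and SHA-256 standing in for a fair noncommutative hash (it is not Jytter's actual hash): summing the stream collapses 128 bits of timing entropy into a few dozen distinct outcomes, while an ordered hash keeps the trials distinct.

```python
import hashlib
import random
from collections import Counter

random.seed(1)  # reproducible simulation

def jitters():
    # 128 inter-arrival times of 10 or 11 ticks each
    return [10 + random.randrange(2) for _ in range(128)]

trials = [jitters() for _ in range(20_000)]

# Commutative "hash": just sum the intervals.  All orderings collide,
# and the sum piles up within roughly +/-11 of its mean of 1344.
sums = Counter(sum(t) for t in trials)

# Noncommutative hash: digest the ordered stream instead.
hashes = Counter(hashlib.sha256(bytes(t)).digest()[:16] for t in trials)

print(len(sums))    # a few dozen distinct sums -- a handful of bits
print(len(hashes))  # essentially every trial distinct
```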
3. "world's leading experts": Stu has spoken to way more people than I have
about Jytter. But most of his conversations are covered under NDA. I only
met one of these experts personally, who said he had run the gamut of
statistical tests on Jytter and couldn't find a weakness. Then again, he
didn't do the biasing trick that I previously suggested. I would recommend
that everyone assume that no one has ever tested Jytter at all, so
therefore you need to test it yourself and demonstrate how weak it is.
On Sat, Nov 22, 2014 at 10:13 AM, James A. Donald <jamesd at echeque.com> wrote:
> On 2014-11-22 06:31, dj at deadhat.com wrote:
>>> OK, if you think my Jytter TRNG is weak,
>> I did not say it was weak. I said Jytter (and any other algorithm) is
>> deterministic when run on an entropy free platform. This is a simple fact.
> All platforms have entropy.
> If they boot from a physical disk, microturbulence creates true randomness
> in data availability.
> If they are on the net, packets arrive at random times with random delays.
> If they are using wifi, not only are packets arriving at random times, but
> are affected by random noise.
> The question is, does all this entropy show up in Jytter? I rather think
> it does.