[cryptography] Enranda: 4MB/s Userspace TRNG
pkejjy at gmail.com
Wed May 27 09:18:53 EDT 2015
I realize that anyone can whiten a long stream of weak entropy into a short
stream of strong entropy. But I disagree that doing so properly, in this
case, is trivial. The way Enranda does it, you need to predict events
separated by tens of thousands of aperiodic timedelta sequences in order to
create a predictable bit. If someone just, say, compressed and then
encrypted the long weak stream, it would not be as secure, because the
sequence hashes are slightly biased and slightly susceptible to prediction.
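To make the sequence-hash idea concrete, here is a minimal sketch in C. This is not Enranda's actual code: the window size, the FNV-1a hash, and the linear-scan history are all invented here for illustration (a real implementation would use a hash set rather than scanning 2^16 entries per lookup).

```c
#include <stdint.h>
#include <stddef.h>

#define WINDOW  8          /* timedeltas per sequence (illustrative) */
#define HISTORY (1u << 16) /* remember the last 2^16 sequence hashes */

/* FNV-1a hash over the bytes of a window of timedeltas. */
static uint32_t hash_window(const uint64_t *d, size_t n) {
    uint32_t h = 2166136261u;
    for (size_t i = 0; i < n; i++)
        for (size_t b = 0; b < 8; b++) {
            h ^= (uint8_t)(d[i] >> (8 * b));
            h *= 16777619u;
        }
    return h;
}

/* Ring buffer of recent sequence hashes.  accept_sequence() returns 1
   iff the window's hash has NOT been seen among the last HISTORY
   hashes -- i.e. the sequence looks like aperiodic residue.  Repeats
   (periodicity) are discarded. */
typedef struct {
    uint32_t seen[HISTORY];
    size_t   n, head;
} dedup_t;

static int accept_sequence(dedup_t *s, const uint64_t *d, size_t n) {
    uint32_t h = hash_window(d, n);
    for (size_t i = 0; i < s->n; i++)
        if (s->seen[i] == h)
            return 0;                    /* seen recently: discard */
    s->seen[s->head] = h;                /* bank the new hash */
    s->head = (s->head + 1) % HISTORY;
    if (s->n < HISTORY)
        s->n++;
    return 1;                            /* aperiodic residue: keep */
}
```

A periodic timestamp source produces the same windows over and over, so every window after the first is rejected and nothing accumulates; only sequences unseen in the recent history contribute.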
If your concern is that the timedelta stream could be made to look
"interesting" by virtue of complicated processes with deterministic timing,
then we can restrict Enranda to "fill" mode, wherein the timestamp is only
read in a tight loop. In the absence of real world disturbances, it would
never fill its own entropy buffer. Instead, it would see lots of
periodicity, and just sit there waiting.
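A crude sketch of that fill-mode behavior, using clock_gettime as a portable stand-in for the raw timestamp counter. The aperiodicity test here (a delta must differ from its predecessor) is a deliberately naive placeholder, not Enranda's actual filter; the bound on reads is added so the sketch terminates.

```c
#define _POSIX_C_SOURCE 199309L
#include <stdint.h>
#include <stddef.h>
#include <time.h>

/* Monotonic timestamp in nanoseconds (stand-in for rdtsc). */
static uint64_t now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000u + (uint64_t)ts.tv_nsec;
}

/* "Fill" mode sketch: spin on the timestamp in a tight loop and bank
   only deltas that differ from the previous delta.  On a perfectly
   periodic clock with no real-world disturbances, nothing would ever
   be banked -- the loop would just sit there.  Returns the number of
   deltas banked before giving up after max_reads samples. */
static size_t fill_mode(uint64_t *out, size_t cap, size_t max_reads) {
    size_t   n = 0;
    uint64_t prev = now_ns(), prev_delta = 0;
    for (size_t i = 0; i < max_reads && n < cap; i++) {
        uint64_t t = now_ns(), delta = t - prev;
        prev = t;
        if (delta != prev_delta)   /* naive aperiodicity test */
            out[n++] = delta;
        prev_delta = delta;
    }
    return n;
}
```

On real hardware the banked deltas are still raw protoentropy, not output; they would feed the sequence-hash filtering and whitening stages, never be emitted directly.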
Do you really think that even a "smart" attacker can model a live CPU well
enough to predict tens of thousands of unique timedelta sequences in a row?
Maybe, if it's a CPU connected to nothing but an ICE, and if we assume that
it can't throttle itself in response to temperature. Not if it's a real
system running enough of an OS to be interesting enough to attack, let
alone one which has received network or harddrive packets (and in the
absence of unrelated kernel vulns and the like), whether or not it's
currently connected to the network.
Yes, thermal noise or quantum noise is noise to everyone. We all know that.
But we also know that busses are radiative, especially those which hang out
outside of the chassis. So you're assuming that it's easier to predict tens
of thousands of unique timedelta sequences in a row, correctly, than to
ensure safe transport of external bits to internal memory locations. Is it
possible to design a physically secure conduit for storing thermal bits to
main memory? Yes, sure. But how practical and scalable is it, unless you
have custom designed laptops? Versus: who could possibly model live X86
behavior well enough to predict those sequences successfully, if the kernel
and firmware are uninfected? I think, no one, even if we assume that it's
all pseudorandom after the last IRQ arrived. An attacker would have much
better luck trying to read the output entropy from the radio noise it
emanates while traversing the frontside bus, which, although entirely
possible, is a lot harder than reading it from USB or Ethernet.
IRQ noise is not just a matter of the time it takes to service an
interrupt. It's a matter of the cache pollution etc. which occurs as a
result and manifests itself over time as a cascade of events in a
complicated cellular automaton. The entire X86 system is a giant, extremely
complicated random number generator, which as I said we should assume is
entirely deterministic. But it's not modellable, in practice. And
definitely not to the tune of tens of thousands of unique sequences in a
row.
Is that hard proof? No. But neither is: "Look at my thermal noise. Trust me
that nothing can intercept it on the way to the CPU."
On Wed, May 27, 2015 at 12:14 PM, Krisztián Pintér <pinterkr at gmail.com> wrote:
> On Wed, May 27, 2015 at 3:12 AM, Russell Leidich <pkejjy at gmail.com> wrote:
> > "if your proposed method comes with a complex extractor, it is bullshit"
> > OK point well taken. I should offer a raw mode.
> no, you actually shouldn't. you should offer raw mode only. maybe some
> clever compression just to reduce the amount of data going into the
> slower secure whitening.
> > What this leaves behind is the aperiodic residue. Or more specifically,
> > ((the hashes (of all sequences)) that have not been seen in the last 2^16
> > such hashes). I realize that this isn't hard proof (as nothing in
> > hardware can be proven)
> this is much worse than "not a hard proof". it is next to nothing. you
> have a hypothesis, which you don't clearly state, and then you have a
> countermeasure, which you don't explain.
> > cache misses, pipeline stalls, CPU circuit clock gating, etc. that
> > constitute the majority of the protoentropy.
> the CPU is a deterministic system. cache misses and all the other
> stuff are not random, but depend on previous instructions, thus the
> internal state of the cpu. this is NOT a source of entropy. the source
> of entropy comes from outside of the CPU, namely anything that changes
> its internal state. these are: responses from mass storage or other IO
> drivers, user input, network events, etc. that is: IRQs. the CPU as a
> system is chaotic, and so tiny differences in those inputs cause huge
> differences later. but this is NOT entropy. this is a completely
> deterministic process.
> at this point, we could dwell on the nature of entropy. by definition,
> entropy is anything the attacker does not know. considering your
> probable attackers, the entire internal state of the CPU is entropy.
> but this is not the case for limited hardware and more potent
> attackers.
> that is why it is crucial to separate the actual entropy from the
> deterministic chaos on top of it. with a nice usual thermal noise
> generator, we can be pretty sure about the real entropy, which is
> entropy for all attackers. that so called CPU jitter is not entropy,
> but a chaotic complex postprocessing on top of some IRQ based minimal
> real entropy. the amount of which is unknown.