[cryptography] Quality of HAVEGE algorithm for entropy?

Stephan Mueller smueller at chronox.de
Thu Nov 28 04:29:01 EST 2013

Am Donnerstag, 28. November 2013, 10:12:19 schrieb Joachim Strömbergson:

Hi Joachim,

> Aloha!
> Stephan Mueller wrote:
> > The only challenge that I see with Havege is that the algorithm is
> > quite complex and that the description does not fully explain why and
> > where the entropy comes from. Looking into the source code of
> > oneiteration.h, the code is also not fully clear.
> Havege is (if I remember correctly) a magnificent example of Duff's
> Device: https://en.wikipedia.org/wiki/Duff's_device
> The code tries to force instruction cache misses at different points in
> the switch loop, thereby causing a lot of pipeline flushes and instruction
> loads from lower-level caches all the way to main store.
> A good comparison to Havege is Jytter, which basically (AFAIK) tries
> to get entropy from the same source (measuring variance in instruction
> timing). But Havege tries to force the creation of variance and can thus
> generate a higher rate of entropy. In my measurements I get kbps from
> Jytter but Mbps from Havege. I have yet to compare the quality as
> measured using Dieharder, but from my memory Havege was really good.

The problem is that dieharder & Co only show the statistical quality. 
Based on my real-world attempts to get the CPU jitter approach used as a 
noise source for /dev/random, the questions around the entropy of the data 
still remain -- see the email thread on LKML.
> > Considering the grilling I get with a similar RNG that I ask to be
> > used as a seed source for /dev/random or other crypto libs (see
> > thread http://lkml.org/lkml/2013/10/11/582), I would have concerns on
> > the algorithm.
> As long as one does not rely on one source - and _always_ feeds the
> entropy into the RNG-CSPRNG chain (not replacing the chain and connecting
> the source directly to the /dev/random output, like with Bull Mountain) -
> I have a hard time seeing where much controversy would emerge. As long as
> the source produces entropy of OK quality.

Then please chime in on the LKML discussion in my support :-D
> One issue I'm thinking of is if you have more than one source, but one
> of them dwarfs the other sources in capacity. Say you have a microphone
> providing whitish noise at kbps rates and then RdRand from your
> Haswell CPU generating data at Gbps speeds - will the microphone entropy
> matter?

You are absolutely spot on. The key difference with a CPU jitter noise 
source vs the other noise sources in /dev/random is that the jitter is an 
on-demand production of entropy, whereas the others are generated during 
the operation of the OS. That means, if you are not careful, the on-demand 
generation can easily outpace all other noise sources.

That is why my current patch set only uses the jitter noise source as a 
last resort, i.e. when /dev/random is about to block. As long as the other 
noise sources produce entropy, my jitter noise source is not even asked.
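The on-demand idea described above can be sketched in a few lines: time a short burst of work with a high-resolution clock and fold the low bits of the timing deltas, which carry the execution-time variance. This is a hedged illustration of the concept only -- the function names, workload size, and folding scheme are made up for this sketch and are not Stephan's actual jitter-RNG code, which does considerably more conditioning and health testing:

```c
#define _POSIX_C_SOURCE 199309L
#include <stdint.h>
#include <time.h>

/* Read a high-resolution monotonic timestamp in nanoseconds. */
static uint64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}

/* Time one small burst of ALU work; the delta varies run to run
 * due to cache, pipeline and scheduling effects (the "jitter"). */
static uint64_t jitter_sample(void)
{
    volatile uint32_t sink = 0;
    uint64_t t0 = now_ns();
    for (int i = 0; i < 256; i++)
        sink += (uint32_t)i * 2654435761u;
    return now_ns() - t0;
}

/* Fold the least-significant bit of eight deltas into one byte.
 * Illustrative only: real designs must estimate and test the
 * per-sample entropy before crediting it to the pool. */
static unsigned char jitter_byte(void)
{
    unsigned char b = 0;
    for (int i = 0; i < 8; i++)
        b = (unsigned char)((b << 1) | (jitter_sample() & 1u));
    return b;
}
```

Because each byte is produced on demand by simply running the loop again, such a source has no natural rate limit -- which is exactly why, as noted above, it can outpace event-driven noise sources unless it is deliberately held back as a last resort.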

With that approach, however, /dev/random will never block any more on any 

| Cui bono? |
