[cryptography] Quality of HAVEGE algorithm for entropy?

Stephan Mueller smueller at chronox.de
Wed Nov 27 06:10:39 EST 2013


On Tuesday, 26 November 2013, 14:33:54, coderman wrote:

Hi coderman,

> On Tue, Nov 26, 2013 at 10:09 AM, Joachim Strömbergson
> 
> <Joachim at strombergson.com> wrote:
> > ...
> > I have concerns though on embedded SSL stacks that use Havege as
> > entropy source on MCUs such as AVR32 and ARM.
> > ...
> > On an x86-based server you can use Havege, but use it to feed
> > /dev/random, not as a RNG directly. The same goes for Jytter.
> 
> good points!
> 
> haveged should work fine on StrongArm, A8, A9, Xscale, anything with a
> high res timer like ARM Cycle Counter (in place of TSC).
> 
> older ARM processors and x86 without high res TSC (pre-pentium?) will
> have trouble.

The way haveged is implemented, not really. The reason is that it uses 
clock_gettime, which goes through the Linux kernel clocksource framework. 
That framework has drivers for a number of different timers on various 
architectures.
> and as mentioned, all entropy sources should feed into host entropy
> pool via an entropy daemon that verifies entropy, mixes / compresses
> it, and then feed into host pool.

I would not concur with this statement: at runtime, you cannot verify 
entropy beyond simple pattern checks. Moreover, compression (i.e. 
whitening) is not meaningful when mixing data into /dev/random, as it 
has its own whitening function.

The key, however, is that the entropy estimate you supply when injecting 
the data into /dev/random with the appropriate IOCTL must be 
conservative. That way, the data stream added to the entropy pool need 
not have full entropy.
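A sketch of that injection path, using the Linux RNDADDENTROPY ioctl: the caller credits the pool with an entropy estimate alongside the raw bytes. The 1-bit-per-byte credit below is an illustrative assumption of a conservative estimate, not a measured figure; the call needs CAP_SYS_ADMIN.

```c
/* Sketch: feed gathered bytes into the kernel entropy pool via
 * RNDADDENTROPY, crediting entropy conservatively. Linux-specific;
 * requires CAP_SYS_ADMIN (typically root). */
#include <fcntl.h>
#include <linux/random.h>
#include <stdlib.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* Illustrative conservative estimate: credit 1 bit per byte even if
 * the raw stream is believed to carry more. */
static int conservative_bits(size_t len)
{
	return (int)len;
}

int add_entropy(const unsigned char *buf, size_t len)
{
	struct rand_pool_info *info;
	int fd, ret;

	info = malloc(sizeof(*info) + len);
	if (!info)
		return -1;
	info->entropy_count = conservative_bits(len);
	info->buf_size = (int)len;
	memcpy(info->buf, buf, len);

	fd = open("/dev/random", O_WRONLY);
	if (fd < 0) {
		free(info);
		return -1;
	}
	ret = ioctl(fd, RNDADDENTROPY, info);
	close(fd);
	free(info);
	return ret;
}
```

Because the pool's own accounting only ever adds the credited amount, understating the estimate is safe; overstating it is what must be avoided.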

Ciao
Stephan
-- 
| Cui bono? |
