[cryptography] Quality of HAVEGE algorithm for entropy?

coderman coderman at gmail.com
Wed Nov 27 15:00:50 EST 2013


On Wed, Nov 27, 2013 at 3:10 AM, Stephan Mueller <smueller at chronox.de> wrote:
> ...
> The way haveged is implemented, not really. The reason is that it uses
> clock_gettime, which uses the Linux kernel clocksource framework. That
> framework has drivers for a number of different timers on various
> architectures.


do you know if the list of supported clock sources is documented somewhere?



> ... you cannot verify
> entropy beyond simple pattern checks. Moreover, compression (i.e.
> whitening) is not meaningful when mixing it into /dev/random as it has its
> own whitening function.

simple checks (sanity checks) are useful.  they are perhaps most
applicable to physical noise sources (XSTORE), but if there were an
issue with the clocks, i'd want the entropy daemon to stop gathering
rather than feed /dev/random useless samples.

as for compression, i suppose i agree.  you may write fewer bytes to
/dev/random at a higher entropy density, but practically speaking, the
effect is nil.



> The key however is that the entropy estimation that you use to inject the
> data with the appropriate IOCTL into /dev/random must be conservative.


agreed!
(for example, XSTORE is rated >96%, but i always cap my estimate at 75%)



best regards,

