[cryptography] is there an iteration-incremental version of PBKDF2?
mheyman at gmail.com
Fri Sep 10 10:32:11 EDT 2010
On Thu, Sep 9, 2010 at 2:28 PM, Marsh Ray <marsh at extendedsubset.com> wrote:
> On 09/08/2010 05:06 PM, travis+ml-rbcryptography at subspacefield.org wrote:
>>
>>>> On Fri, Sep 03, 2010 at 09:45:20AM +0100, Ben Laurie wrote:
>>>>
>>>> ...narrow-pipe designs have a huge null space for messages which
>>>> are exactly as big as the compression function input size. For
>>>> instance hashing inputs that are multiples of 512 bits, SHA-256
>>>> will only produce about 63% of the possible 2^256 outputs.
>>>>
>>> So we deal with SHA-255.33 instead of SHA-256. Not a big enough
>>> difference to worry about.
>
> But without a fresh injection of entropy, the effective entropy in the
> resulting hash is reduced more than half a bit per log2
> of the iteration count.
>
First of all, the loss isn't quite half a bit per log2 of the
iteration count.
See P. Flajolet and A. M. Odlyzko, "Random Mapping Statistics",
Advances in Cryptology - EUROCRYPT '89, 1990
<http://www.springerlink.com/index/32q2qh4n325evy7f.pdf>.
The paper shows the bits of entropy lost after k iterations is
approximately:
-log2(1 - t(k))
where t(0) = 0 and:
t(k+1) = e^(t(k)-1)
So, for instance, by the 256th iteration you have lost only 7.01 bits
of entropy, not 8 bits. And the number of reachable states will never
drop below
( ( pi*(2^n) )/2 )^0.5
where 'n' is the number of bits in the hash you iterate over; the log2
of that count is about 128.3 bits for SHA-256.
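As a quick sanity check (my own sketch, not code from the paper), the
recurrence is easy to evaluate numerically:

```python
import math

def bits_lost(k):
    """Approximate bits of entropy lost after k iterations of a random
    mapping, per the Flajolet-Odlyzko recurrence t(0) = 0,
    t(k+1) = e^(t(k)-1): the k-th iterate's image covers roughly a
    (1 - t(k)) fraction of the state space, so the loss in bits is
    -log2(1 - t(k))."""
    t = 0.0
    for _ in range(k):
        t = math.exp(t - 1.0)
    return -math.log2(1.0 - t)

print(round(bits_lost(256), 2))  # ~7.01 bits, not 8

# The entropy floor: log2 of the expected number of cyclic points,
# sqrt(pi * 2^n / 2), which for n = 256 comes out near 128.3 bits.
n = 256
print(round(0.5 * (math.log2(math.pi) + n - 1), 1))
```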
Restated with no equations: An iterated hash will make a graph with
multiple trees attached to multiple cycles. If you start on a tree,
each iteration walks down the tree and eventually lands on a cycle. It
looks like a pile of hairy loops or a flock of flying spaghetti
monsters. At every iteration, the entropy goes down simply because the
set of possible states shrinks. After one iteration, the state cannot
possibly be at a branch terminal; after two iterations, it cannot be
at a branch terminal or adjacent to one; and so on.
Once you have iterated enough that you cannot possibly be on a tree
but must be in a cycle, the entropy will never go down again.
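That flattening-out is easy to see empirically on a toy state space.
Here is a small illustration of my own (the state-space size and seed
are arbitrary): build a random function, then count how many distinct
states survive each iteration.

```python
import random

def image_sizes(n_states, iters, seed=1):
    """Random mapping on n_states points; after each iteration, count
    the distinct states still reachable. The count shrinks while some
    trajectories are still walking down trees, then eventually goes
    flat once every surviving state sits on a cycle."""
    rng = random.Random(seed)
    f = [rng.randrange(n_states) for _ in range(n_states)]
    states = set(range(n_states))
    sizes = []
    for _ in range(iters):
        states = {f[s] for s in states}
        sizes.append(len(states))
    return sizes

sizes = image_sizes(4096, 300)
# The first step prunes roughly a 1/e fraction of states (the branch
# terminals); the counts then decay ever more slowly toward the number
# of cyclic points.
print(sizes[0], sizes[-1])
```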
Second of all, the vast majority of discussion about losing entropy
this way is completely moot.
These entropy discussions are moot because in the real world we don't
care about 'entropy'; we care about what I have heard referred to as
'conditional computational entropy': the entropy experienced by
somebody with a real device, not a device that can enumerate all the
states of an iterated 256-bit hash and know which states can be
excluded.
Back in the real world, we don't lose any 'conditional computational
entropy' upon iteration, because we have no machines powerful enough
to know which states are branch terminals, or next to branch
terminals, or next to those, and so on.
----
-Michael Heyman