[cryptography] philosophical question about strengths and attacks at impossible levels

Marsh Ray marsh at extendedsubset.com
Thu Oct 14 14:32:41 EDT 2010


On 10/13/2010 11:56 PM, Zooko O'Whielacronx wrote:
>
> What if a hash has 512-bit collision-resistance? What would that mean?
> That an attacker might spend about 2^512 resources to find a collision
> in it?

The attacker can match any hash he generates with any other, not just 
one specific target value. The more hashes he has "on file" to compare 
with, the more likely the next one he generates will match. So for an 
n-bit output, the collision resistance ends up being approximately 
2^(n/2) instead of 2^n.
http://en.wikipedia.org/wiki/Birthday_attack
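
To see where that 2^(n/2) comes from, here's a toy Python sketch (my 
own illustration, not an attack on any real hash): truncate SHA-256 to 
a small n and count how many random inputs it takes before any two of 
them collide. The count tends to land around 2^(n/2), not 2^n.

import hashlib
import os

def truncated_hash(data, n_bits):
    """First n_bits of SHA-256(data), as an integer."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, 'big') >> (256 - n_bits)

def tries_until_collision(n_bits):
    """Hash random messages until any two collide; return how many were tried."""
    seen = {}
    tries = 0
    while True:
        msg = os.urandom(16)
        tries += 1
        h = truncated_hash(msg, n_bits)
        if h in seen and seen[h] != msg:
            return tries
        seen[h] = msg

# With n_bits = 32 a collision typically shows up after roughly
# 2^16 = 65536 tries, nowhere near 2^32.
print(tries_until_collision(32))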

I haven't heard of anything which required 512 bit collision resistance. 
But NIST is specifying a 512 bit hash output length variant of SHA-3.

Without saying whether or not I agree with it, here's my guess as to how 
a 512 bit output length might be justified (the arithmetic is sketched 
just after the list):

1. The output hash value must be a power-of-two number of bits long, 
because everyone likes it that way.

2. If the hash value were 256 bits long it could have no better than 
about 128 bits collision resistance.

3. There are theorized quantum computer attacks which appear to cut the 
exponent in half again. Thus a 256 bit hash could possibly be collided 
in about 2^64 operations on some future machine.

4. 2^64 work seems plausible for an adversary assumed to have mastered 
quantum computer technology, therefore a 256 bit hash is too small.

5. Therefore, we need a 512 bit hash function.
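
To make that concrete, here's a minimal Python restatement of the 
arithmetic in points 2 through 4 (the function name and the halve-twice 
model are just my rendering of the assumptions above, not anything from 
NIST):

def effective_collision_bits(output_bits, quantum=False):
    bits = output_bits // 2   # classical birthday attack: about 2^(n/2)
    if quantum:
        bits //= 2            # "cut the exponent in half again" (point 3)
    return bits

assert effective_collision_bits(256) == 128                # point 2
assert effective_collision_bits(256, quantum=True) == 64   # point 3
assert effective_collision_bits(512, quantum=True) == 128  # the case for 512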

> That is a meaningless possibility to discuss since 2^512
> resources will never exist in the life of this universe, so it can't
> mean that, or if it does mean that then there is no use in talking
> about "512-bit collision-resistance". Maybe it means something else?

There's this concept of "safety margin" which seems like a good idea, 
except that it lumps any and all future weaknesses (the known unknowns 
and unknown unknowns alike) under the same system of units as the 
precise mathematical predictions about entropy.

I have a deep suspicion about using units of 'bits' to quantify the risk 
of future attacks. If we were instead designing, say, steampunk-style 
Babbage engines then we might estimate an adversary's future 
capabilities in units of "PSI" or "horsepower" and compensate 
accordingly. But wouldn't that be missing an essential point?

Since everything gained in yet-to-be-discovered future attacks is hoped 
to be compensated for in this safety margin, any future analytic 
breakthrough which results in an effective attack is likely to be 
construed as evidence of an insufficient safety margin. Nobody wants to 
be the guy who's remembered as the overconfident cryptographer who chose 
the insufficient safety margin, so there's probably a bit of a CYA 
factor in there as well.

The process sometimes gives the appearance of having mathematicians do a 
bunch of rigorous analysis and proofs, then at the end throwing in a 
fudge factor of 2x.

I don't think it's an entirely fair characterization though.

> Or perhaps not. Perhaps the bridge which is designed to withstand
> 10^300 tons of pressure is actually *more* likely to fail than the
> other one when hit by this unpredicted, unmodelled event. Who can
> tell?

Case in point: AES-256 appears to be significantly broken in a way that 
AES-128 is not.

> One reasonable position to take is that it was a mistake for NIST to
> specify that some of the SHA-3 hashes had to have 512-bit preimage
> resistance. (If it *was* a mistake then I really have no idea what to
> do about it at this juncture!)

On one hand, it's NIST's job to standardize stuff as perfectly as 
possible, which is usually far in excess of what nearly anybody needs 
(they have a fascinating website BTW). If they want to standardize a 
65536-bit hash length, that's OK with me.

Except that the requirements for the 512 bit function are spilling over 
into the smaller output sizes that we'll actually be using from day to 
day. People want to be able to use substantially the same circuitry for 
hardware implementations, so some of that cost may be paid regardless.

If there's any CYA or length-envy going on at NIST, that pressure is 
likely to be felt tenfold by the junior engineer in Asia who's proposing 
his first new design to the bosses (for a product you will soon use). 
It's likely that the largest standard hash length will be selected 
simply because that's easier than justifying something "less secure". 
Other parts of the system's security may end up being reduced
to compensate.

We may have seen a similar phenomenon with the NIST-required transition 
to 2048 bits for RSA certs.

> That position says that there *is* a need for a hash function which
> takes much more CPU time than SHA-3-256 does in order to provide much
> less likelihood that an attacker will be able to find a pre-image in
> it than in SHA-3-256, but that this "much less likelihood" is not in
> any meaningful sense correlated with the idea of having "512-bit
> pre-image resistance".

You won't find many people requesting that their hash function take more 
time. The only places I'm familiar with cycles being consumed for their 
own sake are key derivation from passwords and password hashing for 
authentication. If the good guy's operation is assumed to be infrequent 
(e.g., authenticating terminal session logins) and he has some cycles to 
spare, he can greatly multiply the effort for an attacker to brute force 
it without noticeable cost to himself.
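
For what it's worth, here's a minimal sketch of that pattern using 
PBKDF2 from the Python standard library (hashlib.pbkdf2_hmac); the 
200,000-iteration count is an arbitrary illustrative figure, not a 
recommendation:

import hashlib
import hmac
import os

def hash_password(password, iterations=200_000):
    """Derive a salted, deliberately slow hash suitable for storage."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    """Recompute with the stored salt and cost; compare in constant time."""
    candidate = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

# The defender pays the iteration cost once per login; a brute-forcing
# attacker pays it once per guess.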

> Another reasonable position to take is that a hash function which is
> known to have at most 384-bit pre-image resistance is *more likely to
> fail* than one which is known to have at most 512-bit pre-image
> resistance. This is where my limited understanding of hash function
> cryptanalysis comes to an end.

Those who know tend not to talk about it much; those who don't know do 
the talking.

Here's what I think:

> Is that plausible?

Function A: susceptible to a preimage attack at 2^384 work
Function B: susceptible to a preimage attack at 2^512 work
This is all the information we have.

> If I give you two
> hash functions like that, are you confident that you could learn how
> to find pre-images in the former before they find pre-images in the
> latter?

You mixed 'you' and 'they' there, so I'm not sure exactly what the 
question is.

> How sure are you?

No one's yet published a preimage for MD5, a seriously broken 128 bit 
function, so I doubt you'll find anyone who will express confidence that 
they can find a preimage for any reasonable 384 or 512 bit hash function.

> Is it possible that it would be the other
> way around--that you would discover a method of finding pre-images in
> the latter before discovering a method of finding pre-images in the
> former?

My guess is that there's not a chance in hell of actually constructing a 
preimage in any reasonably well designed 384-bit or longer hash function.

So my only hope is that one of the functions turns out to be utterly 
broken, worse than even MD5, and that isn't directly related to the 384 
vs 512 bit preimage resistance. (It would be interesting to know the 
actual output lengths of the functions. A 384-bit output length might 
suggest that function was better designed than a 512-bit output function 
that only delivered 384 bits of preimage resistance.)

Anything might be possible with no specification of the actual functions 
under discussion. Perhaps A is NIST SHA-3-384 and B is "XOR-512". Absent 
any evidence that something else distinguishes them, having two functions 
to attack appears to double my chances if I can keep my options open.

Thus my answer is "Yes, it is possible".

> If someone who has real hash function cryptanalysis expertise and who
> takes the latter position could explain what they mean by "more likely
> to fail", then I would be fascinated to hear it.

Yeah, me too.

But I suspect the people who have that "real hash function cryptanalysis 
expertise" and like to chat about it publicly would all fit together 
comfortably in a medium-sized elevator.

> In any case, I'm pretty sure that as a *user* of hash functions what I
> care about is "more likely to fail" (and efficiency), not about "bits
> of security" for any bit-level greater than about 128 (including
> consideration of quantum attacks, multi-target attacks, etc.)

Usually "more is better", but in this case, more output bits may not 
mean a more clueful designer.

- Marsh


