[cryptography] Current state of brute-forcing random keys?

Marsh Ray marsh at extendedsubset.com
Thu Jun 9 17:22:16 EDT 2011

On 06/09/2011 12:14 PM, Paul Hoffman wrote:
> What is the current state of brute-force attacks on AES-128 blobs?
> Are there recent results where we can estimate the cost of
> brute-forcing 64-bit and 80-bit keys?

The article is behind a paywall, but Google quotes the relevant statistic:
"The standard-cell implementation on a 0.35 µm CMOS process from Philips 
Semiconductors occupies an area of only 0.25 mm²"

Let's assume:
* the attacker has only one file for which he wishes to find the key
* you generate the key using a method for which the attacker can do no 
better than brute force for the nominal key size.
* the attacker can do no better than brute force against AES
* the attacker has ASICs using an AES implementation similar in chip 
area to Philips' standard cell
* the attacker can run his chips at a rate of 1 GHz

The last two assumptions are probably wrong. The Philips standard cell 
may not be optimized for rapid re-keying, and one that is might be 
larger in area. But the estimate of 1 GHz is probably conservative, and 
there's an inherent output-rate vs. area tradeoff in the pipelining of 
the design. Still, I think these assumptions are probably within a 
factor of 10x, i.e., a few bits of key size.

So we're close to a nice round figure... 2^32 trials per mm2-second. Now 
you just have to find out how many mm2 of silicon the attacker is going 
to commit to decrypting your file! But that's not really the right 
metric: in large systems, power and cooling are the limiting factor. How 
big of an air conditioner will he use?
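As a sanity check on that round figure, here is a quick back-of-the-envelope 
in Python. The one-trial-per-clock rate and the 1 GHz clock are the 
assumptions stated above, not measured hardware:

```python
# Back-of-the-envelope: trials per mm^2-second, under the assumptions above.
# Assumes one pipelined AES core in 0.25 mm^2 (the Philips standard-cell
# figure) completing one key trial per cycle at an assumed 1 GHz clock.
CORE_AREA_MM2 = 0.25   # area of the Philips standard-cell AES implementation
CLOCK_HZ = 1e9         # assumed clock rate, one trial per clock

trials_per_mm2_second = CLOCK_HZ / CORE_AREA_MM2
print(trials_per_mm2_second)   # 4.0e9, close to 2**32 ~= 4.29e9
```

The result, 4.0e9, is within about 7% of 2^32, which is where the "nice 
round figure" comes from.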

Looking up a random modern Intel chip (Core i5-650 4M Cache, 3.20 GHz), 
it has a die size of 81 mm2 and a max TDP of 73 W. A recent Nvidia chip 
(GTX 480) is 529 mm2 and said to have a 250 W TDP. The brute-force 
algorithm need not spend any time at all waiting for data, so it could 
easily be pushing the limits of power density. 1 W per mm2 seems like a 
reasonable guess, but unless the data center is built underwater you 
could probably double it to account for the cost of heat removal.

That works out neatly to 2^32 trials per watt-second. A real engineer 
would probably design the chips to minimize energy per trial, but I 
think our estimate is still within an order of magnitude or two.

Last I checked, electric power in the US runs around $0.07 per kWh in 
areas considered "cheap". So each trial costs about $4.53e-18 in 
electric power.
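That per-trial figure follows from the unit conversion; a sketch using 
only the numbers above (the kWh-to-joule factor is exact, everything 
else is the estimates already stated):

```python
# Energy cost per trial: 2**32 trials per watt-second, at $0.07 per kWh.
PRICE_PER_KWH = 0.07       # assumed "cheap" US electricity price
JOULES_PER_KWH = 3.6e6     # exact: 1 kWh = 3.6 MJ = 3.6e6 watt-seconds
TRIALS_PER_JOULE = 2**32   # estimated trials per watt-second, from above

cost_per_trial = PRICE_PER_KWH / JOULES_PER_KWH / TRIALS_PER_JOULE
print(cost_per_trial)      # ~4.53e-18 dollars per trial
```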

For a 64-bit key, you expect the adversary to need 2^63 trials, for 
which he might pay a power bill of about $42.

For an 80-bit key, you expect the adversary to need 2^79 trials, for 
which he might pay a power bill of about $2.7M.

For a 128-bit key, you expect the adversary to need 2^127 trials, for 
which he might pay a power bill of about $7.7e20.
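Those bills can be reproduced directly from the per-trial cost (all 
inputs are the estimates above, so treat the outputs as order-of-magnitude 
figures, not precise ones):

```python
# Expected power bill per key size: 2**(bits-1) trials on average,
# at the ~$4.53e-18 per-trial cost derived earlier.
COST_PER_TRIAL = 0.07 / 3.6e6 / 2**32   # dollars per trial, ~4.53e-18

for bits in (64, 80, 128):
    expected_trials = 2 ** (bits - 1)   # expect success halfway through
    bill = expected_trials * COST_PER_TRIAL
    print(f"{bits}-bit key: ~${bill:.3g}")
# roughly $42 for 64-bit, $2.7M for 80-bit, $7.7e20 for 128-bit
```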

Note that some botnet operators control millions of nodes and they do 
not usually pay their own power bills. But they probably don't have the 
efficiency of our specialized ASIC attacker either.

As others have said, you need a real key derivation function (like 
PBKDF2) with at least a little added work factor for this. If the 
password is going to be human-chosen or human-remembered, it's not 
likely to have even this much entropy in practice.
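For illustration, a minimal PBKDF2 sketch using Python's standard 
library; the passphrase, salt size, and iteration count here are 
placeholder values, not recommendations:

```python
# Hypothetical sketch: derive a 128-bit key from a passphrase with PBKDF2.
# hashlib.pbkdf2_hmac is in the Python standard library; the iteration
# count is the "added work factor" that slows each brute-force trial.
import hashlib
import os

password = b"correct horse battery staple"   # placeholder passphrase
salt = os.urandom(16)                        # random per-key salt
iterations = 100_000                         # illustrative work factor

key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=16)
print(key.hex())                             # 16 bytes -> a 128-bit key
```

Note the work factor only multiplies the attacker's cost per trial; it 
cannot rescue a passphrase with too little entropy to begin with.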

- Marsh
