[cryptography] RSA question
fgrieu at gmail.com
Sun Sep 5 11:13:17 EDT 2010
[reposted with a correction, the log(2) factor]
On 04/09/2010 15:07, Victor.Duchovni at morganstanley.com apparently wrote:
> one could look-up the brute-force cost of RSA
> vs. (ideal) symmetric algorithms, and discover that
> while brute forcing an ideal symmetric algorithm doubles
> in cost for every additional key bit, GNFS factoring cost
> is approximately proportional to
> where "n" is the number of key bits.
No. According to sources such as
the effort to factor a number of n bits is approximately proportional to
exp( (n*(log(2)*64/9 + o(1)))^(1/3) * (log(n*log(2)))^(2/3) )
where log denotes the natural logarithm, so that n*log(2) = log(2^n).
Any of the log(2), 64/9 and o(1) terms changes the effort estimate
by way more than a constant factor.
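For concreteness, the formula can be evaluated in a few lines of Python
(my own sketch; math.log is the natural logarithm, and the o(1) term is
set to zero):

```python
from math import exp, log, log2

def gnfs_effort(n, c=64/9):
    # Effort estimate for factoring an n-bit number, o(1) neglected.
    # log is natural, so n*log(2) = log(2^n).
    return exp((c * n * log(2)) ** (1/3) * log(n * log(2)) ** (2/3))

# Exponent (in bits) of the effort for a 1024-bit modulus,
# with and without the 64/9 factor:
print(log2(gnfs_effort(1024)))       # ~87
print(log2(gnfs_effort(1024, c=1)))  # ~45
```

Dropping the 64/9 factor moves the exponent from about 87 to about 45
bits, i.e. the estimate changes by a factor of roughly 2^42, not by a
constant.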
> So compared to 1k RSA bits, 16k RSA bits has a GNFS cost
> that is (16*1.96)^(1/3) ~ 3.15 times higher.
This is wrong well beyond the omission of the log(2)*64/9+o(1) term.
By application of the above formula with n=16384 and n=1024, and
expressing the ratio as a power of 2, I (now) get that the cost of
factoring 16 kbits RSA numbers would be approximately 2^190 times that
of factoring a 1 kbits RSA number if we neglect the o(1) term [and still
2^99 times, rather than 3.15 times, when additionally omitting the
(necessary) 64/9 factor].
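These ratios can be reproduced numerically (a quick Python check of my
arithmetic; the o(1) term is neglected and log is natural):

```python
from math import exp, log, log2

def gnfs_effort(n, c=64/9):
    # GNFS effort estimate for an n-bit modulus, o(1) neglected.
    return exp((c * n * log(2)) ** (1/3) * log(n * log(2)) ** (2/3))

# Ratio of efforts for 16384-bit vs 1024-bit moduli, as a power of 2:
print(log2(gnfs_effort(16384) / gnfs_effort(1024)))            # ~190
# Same ratio when the 64/9 factor is (wrongly) omitted:
print(log2(gnfs_effort(16384, c=1) / gnfs_effort(1024, c=1)))  # ~99
```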
> If 1k RSA bits is comparable to 80 symmetric bits, then
> 16k RSA bits is comparable to 80*3.15 or 252 bits.
One can't multiply the key bit size by the ratio of efforts; one must
instead *add* the *logarithm* in base 2 of that ratio.
We get that 16k bits RSA would be comparable to 80+190 = 270 bits
symmetric key if we neglect the o(1) term [80+99 = 179 bits when
additionally omitting the 64/9 factor].
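The conversion to an equivalent symmetric key size, adding the base-2
logarithm of the effort ratio to the 80-bit baseline, can be sketched
as follows (same estimate as above, o(1) neglected):

```python
from math import exp, log, log2

def gnfs_effort(n, c=64/9):
    # GNFS effort estimate for an n-bit modulus, o(1) neglected.
    return exp((c * n * log(2)) ** (1/3) * log(n * log(2)) ** (2/3))

# A k-bit symmetric key costs 2^k to brute-force, so equal-security
# sizes differ by log2 of the effort ratio -- added, not multiplied:
equiv = 80 + log2(gnfs_effort(16384) / gnfs_effort(1024))
print(equiv)   # ~270
```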
[Wondering if the financial and engineering crisis may have to do with
how *we* do math]