# [cryptography] RSA question

Francois Grieu fgrieu at gmail.com
Sun Sep 5 06:12:15 EDT 2010

```
On 04/09/2010 15:07, Victor.Duchovni at morganstanley.com apparently wrote:
> one could look-up the brute-force cost of RSA
> vs. (ideal) symmetric algorithms, and discover that
> while brute forcing an ideal symmetric algorithm doubles
> in cost for every additional key bit, GNFS factoring cost
> is approximately proportional to
>
>     exp(n^(1/3)*log(n)^(2/3))
>
> where "n" is the number of key bits.
>

No. As far as is known, the effort is approximately proportional to

    exp( (n*(64/9+o(1)))^(1/3) * log(n)^(2/3) )

Both the 64/9 factor and the o(1) term change the effort estimate by far
more than a constant factor.

> So compared to 1k RSA bits, 16k RSA bits has a GNFS cost
> that is  (16*1.96)^(1/3) ~ 3.15 times higher.

This is wrong well beyond the omission of the 64/9+o(1) term. By
straight application of the above formula with n=16384 and n=1024, and
expressing the ratio as the nearest power of 2, we get that the cost of
factoring a 16-kbit RSA number would be approximately 2^219 times that
of factoring a 1-kbit RSA number if we neglect the o(1) term [and still
2^114, rather than 3.15, times neglecting the 64/9+o(1) term].

> If 1k RSA bits is comparable to 80 symmetric bits, then 16k RSA bits
> is comparable to 80*3.15 or 252 bits.

One can't multiply the key bit size by the ratio of efforts; one must
instead *add* the base-2 *logarithm* of the ratio of efforts.
We get that 16k bits RSA would be comparable to 80+219 = 299 bits
symmetric key if we neglect the o(1) term [80+114 = 194 bits neglecting
the 64/9+o(1) term].

Francois Grieu

[Wondering if the crisis of financial institutions may have to do with
how they do math]

```