mersenneforum.org  

Old 2008-06-09, 19:16   #1
BotXXX
 
BotXXX's Avatar
 
Aug 2003
Europe

11000010₂ Posts
Default Factoring a 1024-bit RSA with GNFS?

As you may have heard, Kaspersky Lab has found a new version of a virus that encrypts files for ransom. This time it looks like they are using a well-implemented encryption engine, and the key used is a 1024-bit RSA key.

What would be the effort to factor this key and help free the information instead of paying the ransom? As one of the programmers/researchers at the anti-virus companies writes: [quote]Along with antivirus companies around the world, we're faced with the task of cracking the RSA 1024-bit key. This is a huge cryptographic challenge. We estimate it would take around 15 million modern computers, running for about a year, to crack such a key.[/quote] In my opinion you wouldn't need that many machines, or would you?
BotXXX is offline   Reply With Quote
Old 2008-06-09, 19:20   #2
jasonp
Tribal Bullet
 
jasonp's Avatar
 
Oct 2004

3,541 Posts
Default

Quote:
Originally Posted by BotXXX View Post
What would be the effort to factor this key and help free the information instead of paying the ransom? As one of the programmers/researchers at the anti-virus companies writes
Quote:
Along with antivirus companies around the world, we're faced with the task of cracking the RSA 1024-bit key. This is a huge cryptographic challenge. We estimate it would take around 15 million modern computers, running for about a year, to crack such a key.
In my opinion you wouldn't need that many machines, or would you?
It doesn't matter; this is just another virus out of thousands. Why does it merit millions of years of machine time, when everyone's botnet also uses a plugin architecture protected with RSA?

Nobody has a very good feel for the amount of effort needed for 1024-bit GNFS.
jasonp is offline   Reply With Quote
Old 2008-06-09, 19:33   #3
xilman
Bamboozled!
 
xilman's Avatar
 
"π’‰Ίπ’ŒŒπ’‡·π’†·π’€­"
May 2003
Down not across

10,753 Posts
Default

Quote:
Originally Posted by jasonp View Post
Nobody has a very good feel for the amount of effort needed for 1024-bit GNFS.
I take slight issue with that statement.

I'm fairly sure we (meaning a bunch of the usual suspects) know to within a factor of ten how much effort will be needed with present-day algorithms and hardware. Some of the usual suspects published a paper on estimates for kilobit GNFS not so long ago.

At the moment, kilobit GNFS is out of reach of all but an incredibly massive computation; much harder in real terms than RSA-129 was in 1994 and than 512-bit GNFS was in 2000. The current state of the art is, IMO, somewhere between 700 and 800 bits. That estimate is based in large part on the relatively recent factorizations of RSA-200 (i.e. 663 bits) by GNFS and of a 1039-bit number by SNFS, in conjunction with the 2/3 rule for SNFS/GNFS difficulty.
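[Editor's sketch of the 2/3 rule mentioned above, treating it as the folklore heuristic that an n-bit SNFS job costs roughly as much as a (2n/3)-bit GNFS job; the exact conversion is only a rule of thumb, not a theorem.]

```python
# Folklore "2/3 rule": an n-bit SNFS factorization is comparable in
# cost to a GNFS factorization of roughly 2n/3 bits. (Heuristic only.)

def snfs_to_gnfs_equivalent(snfs_bits):
    """GNFS-equivalent difficulty of an SNFS job, via the 2/3 rule."""
    return snfs_bits * 2 / 3

kilobit_snfs = snfs_to_gnfs_equivalent(1039)  # the 1039-bit SNFS job
rsa200_bits = 663                             # RSA-200, done by GNFS

# Both data points land in the 660-700 bit range, which is why the
# state of the art for general numbers sits "somewhere between 700
# and 800 bits" -- and why 1024 bits remains well out of reach.
print(round(kilobit_snfs), rsa200_bits)
```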


Paul
xilman is offline   Reply With Quote
Old 2008-06-09, 21:08   #4
jasonp
Tribal Bullet
 
jasonp's Avatar
 
Oct 2004

3,541 Posts
Default

Quote:
Originally Posted by xilman View Post
I take slight issue with that statement.

I'm fairly sure we (meaning a bunch of the usual suspects) know to within a factor of ten how much effort will be needed with present-day algorithms and hardware. Some of the usual suspects published a paper on estimates for kilobit GNFS not so long ago.
I know, and that paper was very interesting. In my head I was thinking about the implementation side, i.e. how much you'd have to multiply the theoretical runtime by in order to build the shortcuts that let sieving happen on today's computers. Plus dealing with a matrix that has a billion columns.
jasonp is offline   Reply With Quote
Old 2008-06-10, 04:48   #5
Andi47
 
Andi47's Avatar
 
Oct 2004
Austria

4662₈ Posts
Default

Quote:
Originally Posted by jasonp View Post
I know, and that paper was very interesting. In my head I was thinking about the implementation side, i.e. how much you'd have to multiply the theoretical runtime by in order to build the shortcuts that let sieving happen on today's computers. Plus dealing with a matrix that has a billion columns.
Just out of curiosity: How much RAM would such a matrix monster take?
Andi47 is offline   Reply With Quote
Old 2008-06-10, 08:14   #6
xilman
Bamboozled!
 
xilman's Avatar
 
"π’‰Ίπ’ŒŒπ’‡·π’†·π’€­"
May 2003
Down not across

10,753 Posts
Default

Quote:
Originally Posted by Andi47 View Post
Just out of curiosity: How much RAM would such a matrix monster take?
A 10M-square matrix takes about 3 GB with the implementations of block Lanczos with which I'm familiar. That number is an estimate good to, at best, a factor of 2, so don't take it too literally. I'm not so familiar with block Wiedemann, so can't give a corresponding estimate for that algorithm.

A fair estimate for a 1G matrix is that it would take (1G/10M) × (a few) times as much memory. Memory usage depends on the density and on a number of implementation details (that's where "a few" comes from), but I'm only giving an estimate good to within a factor of a few or so. The scaling factor is, perhaps, 1000 --- so around 3 terabytes. Note that not all the memory need be on a single machine, though distributing the storage will itself increase the amount of storage required.

Answer to your question: between 1 and 10TB.

Executive summary: lots.
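[Editor's back-of-envelope sketch of the arithmetic above. The 75 nonzeros per row and 4-byte column indices are assumptions chosen to reproduce the 3 GB figure for a 10M matrix; they are not from the thread.]

```python
# Rough memory model for the sparse NFS matrix: each nonzero stored
# as a 32-bit column index, ~75 nonzeros per row (both assumed).

BYTES_PER_NONZERO = 4    # 32-bit column index (assumption)
NONZEROS_PER_ROW = 75    # assumed average density

def matrix_bytes(dimension, density_factor=1.0):
    """Rough memory for a square sparse matrix of the given dimension.

    density_factor absorbs the "a few": bigger matrices are denser,
    and real implementations carry extra overhead.
    """
    return dimension * NONZEROS_PER_ROW * BYTES_PER_NONZERO * density_factor

GB, TB = 1e9, 1e12
print(matrix_bytes(10e6) / GB)     # ~3 GB for a 10M matrix
print(matrix_bytes(1e9, 10) / TB)  # ~3 TB for a 1G matrix, "a few" = 10
```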


Paul

Last fiddled with by xilman on 2008-06-10 at 08:18 Reason: Fix a power-of-ten error!
xilman is offline   Reply With Quote
Old 2008-06-10, 13:45   #7
R.D. Silverman
 
R.D. Silverman's Avatar
 
Nov 2003

2²×5×373 Posts
Default

Quote:
Originally Posted by xilman View Post
A 10M-square matrix takes about 3 GB with the implementations of block Lanczos with which I'm familiar. That number is an estimate good to, at best, a factor of 2, so don't take it too literally. I'm not so familiar with block Wiedemann, so can't give a corresponding estimate for that algorithm.

A fair estimate for a 1G matrix is that it would take (1G/10M) × (a few) times as much memory. Memory usage depends on the density and on a number of implementation details (that's where "a few" comes from), but I'm only giving an estimate good to within a factor of a few or so. The scaling factor is, perhaps, 1000 --- so around 3 terabytes. Note that not all the memory need be on a single machine, though distributing the storage will itself increase the amount of storage required.

Answer to your question: between 1 and 10TB.

Executive summary: lots.


Paul
I think that it will be close to the upper end of that estimate.

Space requirements scale with sqrt(time).

RSA-1024 will take about 32,000 times as long as RSA-200, so the matrix will take about 180× as much space. RSA-200 required about 44 GB.
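[Editor's sketch: these ratios can be reproduced from the standard heuristic GNFS complexity L_N[1/3, (64/9)^(1/3)], ignoring the o(1) term; such an extrapolation is indicative only.]

```python
import math

def gnfs_l(bits):
    """Heuristic GNFS cost L_N[1/3, (64/9)^(1/3)], o(1) term ignored."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)  # ~1.923
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

time_ratio = gnfs_l(1024) / gnfs_l(663)  # RSA-200 is a 663-bit number
space_ratio = math.sqrt(time_ratio)      # space scales with sqrt(time)

# This crude formula gives a time ratio in the low tens of thousands
# and a space ratio a bit under 200x -- the same ballpark as the
# 32,000x and 180x quoted above. 180 x 44 GB is ~8 TB.
print(f"time ratio  ~ {time_ratio:.3g}")
print(f"space ratio ~ {space_ratio:.3g}")
```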
R.D. Silverman is offline   Reply With Quote
Old 2008-06-10, 15:57   #8
xilman
Bamboozled!
 
xilman's Avatar
 
"π’‰Ίπ’ŒŒπ’‡·π’†·π’€­"
May 2003
Down not across

2A01₁₆ Posts
Default

Quote:
Originally Posted by R.D. Silverman View Post
I think that it will be close to the upper end of that estimate.

Space requirements scale with sqrt(time).

RSA-1024 will take about 32,000 times as long as RSA-200, so the matrix will take about 180× as much space. RSA-200 required about 44 GB.
I'm happy with that estimate. You're predicting 8 TB, I 3 TB. I doubt that either of us can honestly say our error bars are much less than a factor of 2 either way. Indeed, your estimate is based on a rather smaller extrapolation than mine and so stands a good chance of being more accurate.

We should set up another of those silly polls where the clueless pull figures out of thin air and then pontificate on their supposed significance.


Paul

Last fiddled with by xilman on 2008-06-10 at 16:00
xilman is offline   Reply With Quote
Old 2008-06-10, 16:49   #9
bsquared
 
bsquared's Avatar
 
"Ben"
Feb 2007

3·1,171 Posts
Default

Just for fun, let's look at the space required to hold all the relations.

From 6^383+1, RSA-640, and RSA-200, I found that the large-prime bound in bits grows fairly linearly with the digit size. A fit to the three points (165, 31), (193, 34), (200, 35) gives the linear relationship bits = 0.1122 × (digits) + 12.4558. So the 309-digit RSA-1024 would require 48-bit large primes (silly assumption #1).

Using 48-bit large primes would require roughly 1.1e13 relations, based on the following heuristic: 0.65 * 2 * pi(2^48). The 0.65 is a slight lowering of the 0.68 that was observed with RSA-200 (silly assumption #2).

If each relation can be stored in 100 bytes (silly assumption #3, based on piles of relations in ggnfs format I have lying around), then we'd need about 1.1e15 bytes of storage for the relations: over 1 petabyte.

This is about 137 thousand dual-layer DVDs, which at the going rate at Best Buy is ~$345k USD. Let's not even consider the shipping cost.
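[Editor's sketch re-running Ben's numbers under his three "silly assumptions"; the 8.0 GB usable capacity per dual-layer DVD is an additional assumption, chosen to match the 137k-disc figure.]

```python
import math

lp_bits = 48                                    # silly assumption #1
pi_2_48 = 2**lp_bits / (lp_bits * math.log(2))  # pi(x) ~ x / ln(x)
relations = 0.65 * 2 * pi_2_48                  # silly assumption #2
bytes_needed = relations * 100                  # silly assumption #3

DL_DVD_BYTES = 8.0e9                            # assumed usable capacity
dvds = bytes_needed / DL_DVD_BYTES

print(f"relations   ~ {relations:.2e}")         # ~1.1e13
print(f"storage     ~ {bytes_needed:.2e} B")    # ~1.1e15 B, over a petabyte
print(f"DVDs needed ~ {dvds:,.0f}")             # ~137,000
```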

- ben.

[edit]
OK, so one would probably compress the relations... but how many CPU-years does zipping/unzipping a petabyte add to the computational effort?

Last fiddled with by bsquared on 2008-06-10 at 16:51
bsquared is offline   Reply With Quote
Old 2008-06-10, 17:03   #10
retina
Undefined
 
retina's Avatar
 
"The unspeakable one"
Jun 2006
My evil lair

1834₁₆ Posts
Default

Quote:
Originally Posted by bsquared View Post
... over 1 petabyte.

This is about 137 thousand dual-layer DVDs, which at the going rate at Best Buy is ~$345k USD. Let's not even consider the shipping cost
But why would you waste your money buying DVDs? Better to buy 1100 × 1 TB HDDs; then you can use them again after the factorisation is done.

Or, for a completely different approach that requires minuscule data storage, just start TFing the modulus. Shouldn't take more than a gazillion (± a zillion or 3) years to finish.
retina is online now   Reply With Quote
Old 2008-06-10, 17:21   #11
bsquared
 
bsquared's Avatar
 
"Ben"
Feb 2007

3513₁₀ Posts
Default

Quote:
Originally Posted by retina View Post
But why would you waste your money buying DVDs? Better to buy 1100 × 1 TB HDDs; then you can use them again after the factorisation is done. ... Shouldn't take more than a gazillion (± a zillion or 3) years to finish.
Ok, fine.
I'll buy 1100 1 TB external HDDs at $300 a pop, for a total of $330k, saving $15k and gaining reusability. Either one gives a feel for the scale involved, no?

[edit - just saw the white lettering...]
How about ECM? How many curves at what B1 bound would one expect to run to find a 154-digit factor? Even more laughable, how much RAM would stage 2 require?
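[Editor's sketch of just how laughable: a very crude ECM curve-count estimate using the coarsest Dickman approximation rho(u) ≈ u^(-u) for the chance one curve finds the factor. This ignores stage 2, the group-order shift, and everything else, so treat it as an order-of-magnitude joke, not an estimate.]

```python
import math

def expected_curves(factor_digits, b1):
    """Crude expected ECM curve count for a factor of the given size.

    One curve succeeds with probability ~ rho(u), u = ln(p) / ln(B1),
    approximated here by rho(u) ~ u**(-u). Stage 2 is ignored.
    """
    ln_p = factor_digits * math.log(10)
    u = ln_p / math.log(b1)
    return u ** u  # ~ 1 / rho(u)

# Even with an absurdly large B1 = 1e10, a 154-digit factor needs
# on the order of 10^18 curves -- each of which is itself expensive.
print(f"{expected_curves(154, 1e10):.1e}")
```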
bsquared is offline   Reply With Quote