#1
Aug 2003
Europe
11000010₂ Posts
As you may have heard, Kaspersky Labs found a new version of a virus that encrypts files for ransom. This time it looks like they are using a well-implemented encryption engine, and the key used is 1024-bit RSA.
What would be the effort to factor this key and help free the information instead of paying the ransom? As one of the programmers/researchers at the anti-virus companies writes: [quote]Along with antivirus companies around the world, we're faced with the task of cracking the RSA 1024-bit key. This is a huge cryptographic challenge. We estimate it would take around 15 million modern computers, running for about a year, to crack such a key.[/quote] In my opinion you wouldn't need that many machines, or would you?
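For a rough sense of the scale being asked about, one can plug numbers into the standard heuristic GNFS running-time estimate L_n[1/3, (64/9)^(1/3)]. This is an editor's back-of-envelope sketch, not something from the thread: the o(1) term is dropped, so the absolute values are meaningless and only the ratio between two key sizes carries any information.

```python
import math

def gnfs_work(bits):
    """Heuristic GNFS running time L_n[1/3, (64/9)^(1/3)] for an n of
    the given bit length, with the o(1) term dropped -- only ratios
    between two calls are meaningful, never the absolute value."""
    ln_n = bits * math.log(2)           # ln(n)
    c = (64.0 / 9.0) ** (1.0 / 3.0)     # ~1.923
    return math.exp(c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0))

# How much harder is a 1024-bit GNFS job than RSA-200 (663 bits)?
ratio = gnfs_work(1024) / gnfs_work(663)
```

The ratio comes out around 3e4, i.e. a 1024-bit GNFS job is tens of thousands of times harder than RSA-200 by this crude measure.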
#2
Tribal Bullet
Oct 2004
3,541 Posts
Quote:
Nobody has a very good feel for the amount of effort needed for 1024-bit GNFS.
#3
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
10,753 Posts
Quote:
I'm fairly sure we (meaning a bunch of the usual suspects) know to within a factor of ten how much effort will be needed with present-day algorithms and hardware. Some of the usual suspects published a paper on estimates for kilobit GNFS not so long ago.

At the moment, kilobit GNFS is out of reach of all but an incredibly massive computation; much harder in real terms than RSA-129 was in 1994 and than 512-bit GNFS was in 2000. Current state of the art is, IMO, somewhere between 700 and 800 bits. That estimate is based in large part on the relatively recent factorizations of RSA-200 (i.e. 663 bits) by GNFS and a 1039-bit SNFS, in conjunction with the 2/3 rule for SNFS/GNFS difficulty.

Paul
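One common reading of the 2/3 rule mentioned here is that an SNFS job of difficulty d bits costs about as much as GNFS on a number of roughly (2/3)*d bits. Under that reading (an assumption, stated plainly), the 1039-bit SNFS job translates as follows:

```python
def snfs_to_gnfs_equiv_bits(snfs_bits):
    """Rule-of-thumb conversion: an SNFS job of difficulty d bits is
    comparable in cost to GNFS on a (2/3)*d-bit general number.
    The 2/3 factor is the heuristic quoted in the thread."""
    return round(snfs_bits * 2 / 3)

# The 1039-bit SNFS factorization mentioned in the post
equiv = snfs_to_gnfs_equiv_bits(1039)
```

That gives about 693 bits of GNFS-equivalent difficulty, which sits just below the "700 to 800 bits" state-of-the-art band the post arrives at.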
#4
Tribal Bullet
Oct 2004
3,541 Posts
Quote:
#5
Oct 2004
Austria
4662₈ Posts
Quote:
#6
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
10,753 Posts
Quote:
A fair estimate for a 1G matrix is that it would take (1G/10M) * (a few) times as much memory as a 10M one. Memory usage depends on the density and on a number of implementation details (that's where "a few" comes from), but I'm only giving an estimate good to within a factor of a few or so. The scaling factor is, perhaps, 1000 --- so around 3 terabytes.

Note that not all the memory need be on a single machine, though distributing the storage will itself increase the amount of storage required.

Answer to your question: between 1 and 10 TB. Executive summary: lots.

Paul

Last fiddled with by xilman on 2008-06-10 at 08:18 Reason: Fix a power-of-ten error!
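The scaling argument in this post can be replayed in a few lines. The reference point (a 10M-dimension matrix taking about 3 GB, with "a few" taken as 10 so that the overall factor is the post's 1000) is an assumption reverse-engineered from the 3 TB answer, not a measured figure:

```python
def matrix_memory_tb(dim, ref_dim=10e6, ref_tb=0.003, fudge=10):
    """Memory for the NFS linear algebra, assuming it scales linearly
    with matrix dimension times a small implementation-dependent fudge
    factor ("a few"). ref_dim/ref_tb encode the assumed baseline:
    a 10M-dimension matrix at ~3 GB."""
    return (dim / ref_dim) * fudge * ref_tb

# A 1G-dimension matrix, as discussed for kilobit GNFS
estimate_tb = matrix_memory_tb(1e9)
```

This reproduces the post's "around 3 terabytes"; the honest error bars ("between 1 and 10 TB") come from the fudge factor being only order-of-magnitude.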
#7
Nov 2003
2²×5×373 Posts
Quote:
Space requirements scale with sqrt(time). RSA-1024 will take about 32000 times as long as RSA-200, so the matrix will take about 180x as much space. RSA-200 required about 44 Gbytes.
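The arithmetic here checks out: the square root of the quoted time ratio is about 179, and multiplying the quoted 44 GB by it gives a terabyte-scale figure consistent with the 1-10 TB estimate earlier in the thread. A sketch using only the post's own numbers:

```python
import math

time_ratio = 32000                   # post's estimate: RSA-1024 vs RSA-200 time
space_ratio = math.sqrt(time_ratio)  # "space scales with sqrt(time)" -> ~179
rsa200_gb = 44                       # post's figure for RSA-200

projected_tb = rsa200_gb * space_ratio / 1000  # projected RSA-1024 matrix size
```

The projection lands near 8 TB, inside the earlier "between 1 and 10 TB" band.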
#8
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
2A01₁₆ Posts
Quote:
We should set up another of those silly polls where the clueless pull figures out of thin air and then pontificate on their supposed significance.

Paul

Last fiddled with by xilman on 2008-06-10 at 16:00
#9
"Ben"
Feb 2007
3·1,171 Posts
Just for fun, let's look at the space required to hold all the relations.

From 6^383+1, RSA-640, and RSA-200 I found that the digit size is fairly linear with bits in the large prime bounds. A fit to the three points (165,31), (193,34), (200,35) gives the linear relationship 0.1122 * (digits) + 12.4558. So the 309-digit RSA-1024 would require 48-bit large primes (silly assumption #1).

Using 48-bit large primes would require roughly 1.1e13 relations based on the following heuristic: 0.65 * 2 * pi(2^48). 0.65 is a slight lowering from the 0.68 that was observed with RSA-200 (silly assumption #2).

If each relation can be stored in 100 bytes (silly assumption #3, based on piles of relations in ggnfs format I have lying around), then we'd need about 1.1e15 bytes of storage for the relations: over a petabyte. This is about 137 thousand dual-layer DVDs, which at the going rate at Best Buy is ~$345k USD. Let's not even consider the shipping cost.

- ben.

[edit] O.k., so one would probably compress the relations... but how many CPU-years does zipping/unzipping a petabyte add to the computational effort?

Last fiddled with by bsquared on 2008-06-10 at 16:51
#10
Undefined
"The unspeakable one"
Jun 2006
My evil lair
1834₁₆ Posts
Quote:
Or, for a completely different approach that requires minuscule data storage, just start TFing (trial-factoring) the modulus. Shouldn't take more than a gazillion (± a zillion or 3) years to finish.
#11
"Ben"
Feb 2007
3513₁₀ Posts
Quote:
I'll buy 1100 1TB external HDDs at $300 a pop for a total of $330k, saving $15k and gaining reusability. Either one gives a feel for the scale involved, no?

[edit - just saw the white lettering...] How about ECM? How many curves at what B1 bound would one expect to run to find a 154-digit factor? Even more laughable, how much RAM would stage 2 require?
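The ECM question can be given a crude order-of-magnitude answer using the standard ECM cost heuristic L_p[1/2, sqrt(2)]. This is an editor's sketch under stated assumptions: the o(1) term is dropped so only ratios mean anything, and the 65-digit comparison point (roughly ECM record territory at the time) is an assumption chosen for scale:

```python
import math

def ecm_work(digits):
    """Heuristic ECM cost L_p[1/2, sqrt(2)] = exp(sqrt(2*ln p*ln ln p))
    to find a prime factor p of the given digit count, with the o(1)
    term dropped -- useful only for comparing two factor sizes."""
    ln_p = digits * math.log(10)
    return math.exp(math.sqrt(2.0 * ln_p * math.log(ln_p)))

# Relative cost of a 154-digit ECM hit vs. a ~65-digit one
# (65 digits is an assumed reference point, roughly the record scale then)
ratio = ecm_work(154) / ecm_work(65)
```

The ratio is on the order of 1e11, which is why the post calls the idea laughable: ECM is hopeless against a well-formed RSA-1024 modulus, whose smaller factor has about 154 digits.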
Similar Threads
| Thread | Thread Starter | Forum | Replies | Last Post |
| factoring special form of 1024-bit RSA key | Johnatan | YAFU | 20 | 2016-04-22 04:33 |
| Factoring 1024-bit number using Yafu | Anyone | YAFU | 30 | 2015-09-01 13:25 |
| Factor 1024-bit on 2x512 bit (help) | Ogonia | YAFU | 23 | 2012-11-15 06:51 |
| Seeking GNFS factoring advice... | WraithX | Msieve | 18 | 2012-05-20 22:19 |
| any good GNFS factoring programs? | ixfd64 | Factoring | 1 | 2004-04-27 09:41 |