#1
Sep 2004
2²×5 Posts
I just know there's lots of eye-rolling and sighing going on when those in the know read the thread name, but anyhoo, here goes! Some of this has probably been covered already, but I was wondering how long it might take with current resources to factor RSA-640 (193 digits with GNFS, a tall order?). I know this would be record-breaking and would depend very much on the chosen starting values, number of participants, luck, etc., but any rough ideas? My main reason for asking is a serious case of crypto-crack withdrawal, and I can't think of anything better than getting going again! Is it difficult to choose accurate starting parameters, and is there therefore a risk of a project that never finds the factors?

I think an attempt at RSA-640 would definitely bring in a lot of support from the old md5crk/ecc2 people as well as others, both for the RSA-640 challenge itself and possibly people who stay on afterwards to build NFSNet up. It was previously mentioned that there was a problem with manpower on the NFSNet side: if feasible, would RSA-640 need more or less maintenance after it was set running than current projects? Would there be a problem with the resulting matrix size? Just a few thoughts, anyway.
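[For a rough sense of scale, the standard GNFS heuristic running time L_n[1/3, (64/9)^(1/3)] can be used to compare the effort for RSA-640 against the just-factored RSA-576. A minimal back-of-envelope sketch; the digit counts come from the RSA challenge numbers, and the formula only indicates relative effort, not wall-clock time:]

```python
import math

def gnfs_work(digits):
    """Log of the GNFS heuristic L_n[1/3, (64/9)^(1/3)] for an n-digit
    number. Absolute values are meaningless; only ratios between two
    numbers are indicative of relative sieving effort."""
    ln_n = digits * math.log(10)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)

# RSA-576 has 174 decimal digits, RSA-640 has 193.
ratio = math.exp(gnfs_work(193) - gnfs_work(174))
print(f"RSA-640 is roughly {ratio:.0f}x harder than RSA-576 by this heuristic")
```

By this crude measure the step from RSA-576 to RSA-640 costs roughly an order of magnitude more work, which squares with the "record-breaking but conceivable" tone of the question.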
#2
Nov 2003
7460₁₀ Posts
Quote:
Sieving could be done without too much difficulty. Dealing with the matrix would be a MAJOR problem.
#3
Aug 2003
Europe
2·97 Posts
Indeed, by the time the previous number was factored, the talk was already of RSA-640. In the meantime there have also been hints from certain people that it would not be wise to invest computing time in this 640-bit number (I am sure there are people privately working on it). There is always the chance of getting lucky if methods other than GNFS are used, but those can also run for the equivalent of thousands of years.

The biggest problem is solving the matrix; sieving the relations can be done without any hassle, it just takes a bit longer. But the matrix is mind-breaking. Developing a matrix solver that can run on a distributed computing platform is the answer, but the difficulty, if it is possible at all, is enormous.
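[To make the "distributed matrix solver" point concrete: the inner loop of the solvers used for these matrices (block Lanczos / block Wiedemann) is a sparse matrix-times-vector-block product over GF(2), cheap per step but requiring the whole matrix's worth of data movement if split across machines. A toy single-process sketch of that inner loop, purely illustrative and nothing like production code:]

```python
# Each row of the sparse GF(2) matrix is stored as the list of column
# indices holding a 1. A block of up to 64 vectors is packed into one
# integer per column, so one XOR advances all vectors in the block.

def spmv_gf2(rows, v):
    """rows[i]: column indices with a 1 in row i.
    v[j]: integer whose bit k is entry j of vector k.
    Returns A*V as one integer per row."""
    out = []
    for cols in rows:
        acc = 0
        for j in cols:
            acc ^= v[j]          # addition over GF(2) is XOR
        out.append(acc)
    return out

# tiny 3x3 example: A = [[1,1,0],[0,1,1],[1,0,1]]
rows = [[0, 1], [1, 2], [0, 2]]
v = [0b1, 0b0, 0b1]              # a single vector: (1, 0, 1)^T
print(spmv_gf2(rows, v))         # A*v = (1, 1, 0)^T -> [1, 1, 0]
```

The per-step cost is tiny; the pain in a distributed setting is that every iteration needs the freshly updated vector block everywhere, which is why the posts treat the matrix, not the sieving, as the bottleneck.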
#4
"Nancy"
Aug 2002
Alexandria
2,467 Posts
Quote:
I seem to remember a comment in a newspaper article after Franke's team factored RSA-576 which said they were working on a 200-digit number next. That probably means RSA-640, but it wasn't clear whether they were already attacking it or merely planning it as the next project for the future. In an email to me, Jens Franke mentioned that they are optimizing the siever for larger numbers (probably allowing larger or more large primes, he wasn't specific) and that they are working on Athlon64 code.
Alex
#5
Aug 2003
Europe
2×97 Posts
Quote:
#6
"Nancy"
Aug 2002
Alexandria
2,467 Posts
Hmm... the number of elementary particles in the universe is estimated at around 10^80, so it's more like searching for a particular electron in this and the next 10^15 parallel universes...
Alex
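[The arithmetic behind the quip, for anyone checking: trial-dividing a 193-digit number means testing candidates up to its square root, about 10^96.5, against roughly 10^80 particles. A quick sketch; the 10^80 particle count is the commonly quoted estimate, not a measured value:]

```python
digits = 193                      # RSA-640
log10_candidates = digits / 2     # trial division tests up to sqrt(N)
log10_particles = 80              # common estimate for the visible universe
log10_universes = log10_candidates - log10_particles
print(f"about 10^{log10_universes:.1f} universes' worth of particles")
```

That lands at about 10^16.5, the same order of magnitude as the 10^15 in the quip.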
#7
Sep 2004
10100₂ Posts
Thanks for all the replies, this is very interesting stuff. I hadn't really considered that someone might already be attacking RSA-640, which is funny because pretty much the same thing happened to people from the md5crk project. I suppose I could be outrageous and suggest we attack RSA-704 pre-emptively instead!!! (212 digits, hehehe)

A quick question, though: how would the matrix produced by an RSA-640 attack differ from the one generated by, e.g., 11,199- (208 digits; 10 days on a G5 can't be bad, and even 100 days for RSA would probably be an acceptable period if the machine could handle the memory needed for the matrix)? Is there a GNFS/SNFS difference in the matrix too, or is it the number of relations that squares things up?
#8
Nov 2003
16444₈ Posts
Quote:
[The matrix for RSA-640 would be] unsolvable on a 32-bit machine. The matrix for 11,199- was quite small as such things go [and very sparse!]. The matrix for RSA-512 took 2.4 GB. Of course you could partition the matrix and solve it in parallel on 32-bit machines, but you still need a central machine to hold the entire thing.
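[To see why a figure like 2.4 GB is plausible: these matrices are stored sparsely, essentially one column index per nonzero entry. With purely illustrative numbers (hypothetical dimension and row weight, not the actual RSA-512 matrix statistics), the storage comes out in the same ballpark:]

```python
def sparse_matrix_bytes(dimension, avg_row_weight, index_bytes=4):
    """Rough storage for a sparse GF(2) matrix: one 4-byte column
    index per nonzero entry, no packing tricks."""
    return dimension * avg_row_weight * index_bytes

# hypothetical ~7M-dimension matrix averaging ~80 entries per row
gb = sparse_matrix_bytes(7_000_000, 80) / 1e9
print(f"~{gb:.1f} GB")
```

A few gigabytes also shows why 32-bit addressing is the wall: the matrix alone exceeds the 2-4 GB address space before the solver's working vectors are even counted.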
#9
Dec 2003
2³·3³ Posts
Quote:
#10
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
10,753 Posts
Quote:
I have frequently solved matrices on Windows clusters where the matrix size on disk is markedly greater than the memory used by any cluster member. NTFS will very happily store multi-gigabyte files on a 32-bit machine.
Paul
#11
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
25001₈ Posts
Quote:
Paul