mersenneforum.org  

Other Stuff > Archived Projects > NFSNET Discussion
Old 2004-09-25, 23:47   #1
Minty
 
Sep 2004

RSA-640

I just know there's lots of eye-rolling and sighing going on when those in the know read the thread name, but anyhoo, here goes! Some of this has probably been covered already, but I was wondering how long it might take, with current resources, to factor RSA-640 (193 digits with GNFS - tall order?). I know this would be record-breaking and depend very much on the chosen start values, number of participants, luck, etc., but any rough ideas? My main reason for asking is a serious case of crypto-crack withdrawal, and I can't think of anything better than to get going again! How accurately do the start parameters need to be chosen - is there a risk of ending up with a project that never finds the factors?
I think an attempt at RSA-640 would definitely bring in a lot of support from the old md5crk/ecc2 people as well as others, both for the RSA-640 challenge itself and possibly people who stay on afterwards to build NFSNet up. Previously it was mentioned that there was a problem with manpower on the NFSNet side - if feasible, would RSA-640 need more or less maintenance than current projects once set running? Would there be a problem with the resulting matrix size? Just a few thoughts, anyway.
Old 2004-09-27, 13:00   #2
R.D. Silverman
 
Nov 2003

Quote:
Originally Posted by Minty
I was wondering how long it might take with the current resources to factor RSA-640 (193 digits with GNFS - tall order?) [...] Would there be a problem with the resulting matrix size?

Sieving could be done without too much difficulty. Dealing with the matrix would be a MAJOR problem.
Old 2004-09-28, 06:50   #3
BotXXX
 
Aug 2003
Europe

Indeed, around the time the previous number was factored, the talk was already about RSA-640. In the meantime there have also been hints from certain people that it would not be wise to invest computing time in this number (though I am sure there are people privately working on it). There is always the chance of being lucky if methods other than GNFS are used, but those can also run for thousands of years.

The biggest problem is solving the matrix; sieving the relations can be done without any hassle - it just takes a bit longer. But the matrix is mind-breaking.

Developing a matrix solver that can run on a distributed computing platform would be the answer, but the difficulty - if it is possible at all - is enormous.
Old 2004-09-28, 07:23   #4
akruppa
 
"Nancy"
Aug 2002
Alexandria

Quote:
There is always the chance of being lucky if methods other than GNFS are used.
You mean special factoring algorithms like ECM? Forget it. I haven't computed what the optimal parameters for a p95 would be, but I expect each curve would take something like a year, for a one-in-several-millions chance of success.
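For a sense of the scale here, a back-of-envelope sketch. This is purely illustrative: the u^-u smoothness heuristic ignores ECM's stage 2 (so it understates the true per-curve odds), and the B1 bound is an assumed round number, not an optimized choice.

```python
import math

# Crude estimate of ECM's per-curve odds against a hypothetical 95-digit
# factor, using the rho(u) ~ u^-u smoothness heuristic.  This ignores
# stage 2, so the real odds are somewhat better; B1 is an assumed value.
p_digits = 95
B1 = 1e10                       # assumed stage-1 bound, for illustration
ln_p = p_digits * math.log(10)  # natural log of a 95-digit number
u = ln_p / math.log(B1)         # smoothness parameter
per_curve = u ** -u             # rough chance that one curve succeeds
print(f"u = {u:.2f}, per-curve success ~ {per_curve:.1e}")
print(f"expected curves needed ~ {1 / per_curve:.1e}")
```

Even allowing for the crudeness of the estimate, the point stands: the expected number of curves is far beyond anything feasible at roughly a year per curve.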

I seem to remember a comment in a newspaper article after Franke's team factored RSA-576 which said they were working on a 200-digit number next. That probably means RSA-640, but it wasn't clear whether they were already attacking it or merely planning it as their next project. In an email to me, Jens Franke mentioned that they are optimizing the siever for larger numbers (probably allowing larger or more large primes - he wasn't specific) and that they are working on Athlon64 code.

Alex
Old 2004-09-28, 09:44   #5
BotXXX
 
Aug 2003
Europe

Quote:
Originally Posted by akruppa
You mean special factoring algorithms like ECM? Forget it. I haven't computed what optimal parameters for a p95 would be, but I expect each curve would take something like a year for a one-in-several-millions chance of success.
I meant more the trial-factoring approach. Way, way, way too many possibilities, but you might strike lucky. Like finding the only talking dust particle in the world ;)
Old 2004-09-28, 10:23   #6
akruppa
 
"Nancy"
Aug 2002
Alexandria

Hmm... the number of elementary particles in the universe is estimated at around 10^80, so it's more like searching for a particular electron in this and the next 10^15 parallel universes...
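The analogy can be sanity-checked with the prime number theorem. A quick sketch, using the rough 10^80 particle figure above:

```python
import math

# Trial division of a 640-bit modulus must, in the worst case, test all
# primes up to sqrt(N) ~ 2^320.  The prime number theorem puts roughly
# x / ln(x) primes below x.
x = 2.0 ** 320
primes_to_test = x / math.log(x)    # ~ number of primes below 2^320
particles = 1e80                    # rough particle count of the universe
universes = primes_to_test / particles
print(f"primes to test ~ {primes_to_test:.1e}")
print(f"~ {universes:.1e} universes' worth of particles")
```

This gives on the order of 10^94 candidate primes, i.e. around 10^14 universes' worth of particles, within a couple of orders of magnitude of the figure above.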

Alex
Old 2004-09-29, 15:35   #7
Minty
 
Sep 2004

Thanks for all the replies - this is very interesting stuff. I hadn't really thought that someone might already be attacking RSA-640, which is funny because much the same thing happened to people from the md5crk project. I suppose I could be outrageous and suggest we attack RSA-704 (212 digits, hehehe) pre-emptively instead! A quick question, though: how would the matrix produced by an RSA-640 attack differ from the one generated by, e.g., 11,199- (208 digits)? Ten days on a G5 can't be bad - even 100 days for RSA would probably be an acceptable period if the machine could handle the memory needed for the matrix. Is there a GNFS/SNFS difference in the matrix too, or is it the number of relations that squares things up?
Old 2004-09-29, 16:15   #8
R.D. Silverman
 
Nov 2003

Quote:
Originally Posted by Minty
[...] how would the matrix produced by an RSA-640 attack differ from that generated from e.g. 11,199- (208 digits)? [...] is there a GNFS/SNFS difference in the matrix too? or is it number of relations squaring things up?
It is highly probable that the matrix would exceed 4 GB, and would therefore be unsolvable on a 32-bit machine. The matrix for 11,199- was quite small as such things go [and very sparse!]. The matrix for RSA-512 took 2.4 GB.

Of course you could partition the matrix and solve it in parallel on 32-bit machines, but you still need a central machine to hold the entire thing.
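To see why 4 GB is the sticking point, here is a rough storage estimate for the kind of sparse GF(2) matrix GNFS produces. The dimension and average row weight are hypothetical round numbers chosen for illustration, not measured figures for RSA-640:

```python
# Each nonzero entry of a sparse GF(2) matrix is typically stored as a
# 32-bit column index.  Assumed (illustrative) post-filtering size:
dim = 20_000_000        # rows ~ columns (assumption)
row_weight = 100        # average nonzeros per row (assumption)
bytes_per_entry = 4     # one uint32 column index per nonzero
total_bytes = dim * row_weight * bytes_per_entry
print(f"matrix storage ~ {total_bytes / 2**30:.1f} GiB")  # ~ 7.5 GiB
```

Anything much over 4 GiB cannot even be addressed by a single 32-bit process, which is the problem described above.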
Old 2004-09-29, 16:38   #9
S00113
 
Dec 2003

Quote:
Originally Posted by Bob Silverman
It is highly probable that the matrix would exceed 4 GB, and therefore be unsolvable on a 32-bit machine.
Would a 64-way Itanium 2 SuperDome with 128 GB of RAM be sufficient?
Old 2004-09-29, 22:24   #10
xilman
Bamboozled!
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

Quote:
Originally Posted by Bob Silverman
[...] Of course you could partition the matrix and solve it in parallel on 32-bit machines, but you still need a central machine to hold the entire thing.
As long as the 32-bit machines have access to a filesystem with greater-than-32-bit offsets, all should be well. There is no need for a central machine to store the entire matrix in memory, only for it to be able to read the matrix from a file.

I have frequently solved matrices on Windows clusters where the matrix size on disk is markedly greater than the memory used by any cluster member. NTFS will quite happily store multi-gigabyte files on a 32-bit machine.
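A minimal sketch of this out-of-core idea: the iterative solvers used for these matrices (e.g. block Lanczos) mainly need repeated matrix-times-vector passes over GF(2), and each pass can stream the matrix from a file rather than hold it in RAM. The binary row format below is an assumption invented for the example, not any real solver's format:

```python
import os
import struct
import tempfile

# Assumed file layout, one row at a time: a uint32 count of nonzeros,
# then that many uint32 column indices.  Only one row is in memory at once.
def matvec_from_file(path, x_bits, n_rows):
    """Compute y = A.x over GF(2), streaming A's rows from disk."""
    y = [0] * n_rows
    with open(path, "rb") as f:
        for i in range(n_rows):
            (k,) = struct.unpack("<I", f.read(4))
            cols = struct.unpack(f"<{k}I", f.read(4 * k))
            acc = 0
            for c in cols:      # y[i] = parity of x over this row's nonzeros
                acc ^= x_bits[c]
            y[i] = acc
    return y

# Tiny demo: the 3x3 identity matrix in the format above.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    for i in range(3):
        tmp.write(struct.pack("<II", 1, i))  # one nonzero, at column i
    path = tmp.name
print(matvec_from_file(path, [1, 0, 1], 3))  # identity: prints [1, 0, 1]
os.remove(path)
```

Each solver iteration then costs one sequential read of the file, so disk size, not RAM, bounds the matrix.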

Paul
Old 2004-09-29, 22:26   #11
xilman
Bamboozled!
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

Quote:
Originally Posted by S00113
Would a 64 way Itanium 2 SuperDome with 128GB RAM be sufficient?
Almost certainly.

Paul
 
