P1 stage 2 on 2 cpu
Hi George,
Is it in your plan to have P-1 stage 2 run on 2 CPUs? In stage 2 I see two separate processes, where one computes the differences between primes and the other computes the product minus the last step. They could both be working at the same time. Is there a memory limitation?
Also a 2nd issue: when doing stage 2 with a small amount of RAM, why not increase the exponent by two each time instead of saving a list of prime steps? I know your way is a lot faster, but it's also RAM-aggressive. This way we could test stage 2 much higher.
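For concreteness, here is a minimal sketch of the low-RAM variant Joss describes (hypothetical code, not the software under discussion): instead of storing a table of precomputed powers covering the prime gaps, we keep only x^2 and advance the exponent by two at every step, folding in a term whenever the exponent is prime.

```python
from math import gcd

def small_primes(limit):
    """Plain sieve of Eratosthenes (a real implementation would sieve in segments)."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i in range(limit + 1) if sieve[i]]

def stage2_low_ram(x, n, b1, b2):
    """P-1 stage 2, stepping by x^2 each time: walk every odd q in
    (b1, b2], keeping only x^2 and the running power x^q, and fold
    (x^q - 1) into the product when q is prime.  gcd(product, n) then
    hopefully reveals a factor p whose p-1 is b1-smooth except for
    one prime in (b1, b2]."""
    primes = set(small_primes(b2))        # primality test stands in for a sieve
    x2 = pow(x, 2, n)
    q = b1 + 1 if b1 % 2 == 0 else b1 + 2  # first odd number above b1
    xq = pow(x, q, n)
    prod = 1
    while q <= b2:
        if q in primes:
            prod = prod * (xq - 1) % n
        xq = xq * x2 % n                   # advance the exponent by 2
        q += 2
    return gcd(prod, n)
```

As a toy example: with N = 139 * 107, the factor 139 has 139 - 1 = 2 * 3 * 23, so a stage 1 to B1 = 5 (exponent lcm(1..5) = 60) followed by this stage 2 to B2 = 30 picks up the prime 23 and recovers 139.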

Actually, stage 2 can be run on many CPUs, as it can be separated into ranges. NFS will soon reach the limit of feasibility. ECM will need even more RAM to find 60- or 70-digit factors. And most of the time large factors are out of reach of P-1, especially when one prime power is large. We could team up to find one factor of M2137 and find large factors using P-1. ECM is almost done to 45 digits. We would need to test it to a certain B1 in stage 1 and then assign ranges of P-1 stage 2. Would anyone be interested?
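The range-splitting idea can be sketched as follows (again hypothetical code, not any existing distributed client): assume every worker receives the same stage-1 residue x, computes a partial product over its own subrange of primes, and the partial products are multiplied together before the final gcd. Each call to the range worker is independent, so the subranges could run on different CPUs or machines.

```python
from math import gcd

def primes_in_range(lo, hi):
    """Primes q with lo < q <= hi, via a plain sieve
    (a segmented sieve in a real implementation)."""
    flags = bytearray([1]) * (hi + 1)
    flags[0:2] = b"\x00\x00"
    for i in range(2, int(hi ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [p for p in range(lo + 1, hi + 1) if flags[p]]

def stage2_range(x, n, lo, hi):
    """Partial stage-2 product over primes q in (lo, hi]."""
    prod = 1
    for q in primes_in_range(lo, hi):
        prod = prod * (pow(x, q, n) - 1) % n
    return prod

def stage2_split(x, n, b1, b2, workers):
    """Split (b1, b2] into equal subranges, one per worker.  Here the
    'workers' run sequentially, but each stage2_range call depends only
    on (x, n) and its own bounds, so they can run anywhere."""
    step = (b2 - b1) // workers
    bounds = [b1 + i * step for i in range(workers)] + [b2]
    partials = [stage2_range(x, n, bounds[i], bounds[i + 1])
                for i in range(workers)]
    prod = 1
    for p in partials:
        prod = prod * p % n
    return gcd(prod, n)
```

The only data each worker needs is the stage-1 residue and its range bounds, which is why assigning ranges of P-1 stage 2 to volunteers, as proposed above, is straightforward in principle.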

[QUOTE=jocelynl]NFS will soon reach the limit of feasibility.[/QUOTE]Could you explain in more detail what you mean by that statement, please?
Paul 
I'm talking about RAM size: analyzing the 100 million relations of the recent finds takes in excess of 2GB of RAM. But then again, advances in technology may permit reaching higher limits.

[QUOTE=jocelynl]I'm talking about RAM size: analyzing the 100 million relations of the recent finds takes in excess of 2GB of RAM. But then again, advances in technology may permit reaching higher limits.[/QUOTE]Ok, it's making a bit more sense, but I'm still not fully understanding your meaning.
For a start, it's certainly possible to analyze 100M relations in less than 2GB of RAM, but I concede it can be tricky. I've developed processes that let me do it, as have other workers, but it does seem to require rather more effort than any of the simple approaches. For instance, removing duplicates and (most) singletons can be done with a multipass algorithm in relatively modest amounts of memory. Merges can also be done in stages, each stage using a different range of large primes, but it can take quite a bit of time and labour.

I think what I'm really not understanding is your claim that NFS is running out of steam. There are many, many integers within reach of current implementations of NFS, far too many to do with a reasonable amount of time and effort. So I guess you may mean that there are relatively few integers of the form 2^n-1 (and possibly 2^n+1) which are still feasible to factor by SNFS. Is that what you mean?

Paul
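The singleton removal Paul mentions can be illustrated with a toy sketch (hypothetical code, not his actual tooling): a relation is useless if it contains a prime ideal appearing in no other relation, and removing such relations can create new singletons, so the filter iterates until it stabilizes. A multipass disk-based version would stream relations and keep only a compact count table per pass; this in-memory version shows just the logic.

```python
def remove_singletons(relations):
    """Repeatedly drop relations that contain a prime appearing in only
    one surviving relation, until no more can be removed.  Each relation
    is modelled as a tuple of the primes it contains."""
    rels = list(relations)
    while True:
        counts = {}
        for rel in rels:
            for p in rel:
                counts[p] = counts.get(p, 0) + 1
        survivors = [rel for rel in rels
                     if all(counts[p] > 1 for p in rel)]
        if len(survivors) == len(rels):
            return survivors       # fixed point reached
        rels = survivors           # removals may expose new singletons
```

For example, in the set {(2,3), (3,5), (5,2), (7,11)} the primes 7 and 11 each occur once, so (7,11) is dropped; the remaining three relations all share their primes pairwise and survive.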
Yes Paul, that is exactly what I meant. As the numbers grow, it will take more and more time to do. It will be kind of stuck working on the many low-digit numbers for a long time. With today's technology, what is the highest number (in digits) that can be done with SNFS in a reasonable amount of time (months)?
Joss 
Joss,
Do you plan on updating your P-1 software for k*2^n-1? I plan on adding this as another stage in RMA. I value your input on this. Thanks, Shane F. 
[QUOTE=jocelynl]Yes Paul, that is exactly what I meant. As the numbers grow, it will take more and more time to do. It will be kind of stuck working on the many low-digit numbers for a long time. With today's technology, what is the highest number (in digits) that can be done with SNFS in a reasonable amount of time (months)?
Joss[/QUOTE]In this thread [url]http://www.mersenneforum.org/showthread.php?t=5396[/url] the factorization of a 274-digit (i.e. 907-bit) integer was reported. They started on 10 September 2005 and finished on 23 January 2006. That counts as "months" to me.

Paul 
[QUOTE=jocelynl]Yes Paul, that is exactly what I meant. As the numbers grow, it will take more and more time to do. It will be kind of stuck working on the many low-digit numbers for a long time. With today's technology, what is the highest number (in digits) that can be done with SNFS in a reasonable amount of time (months)?
Joss[/QUOTE] Aoki et al. just did 6,353 C274 (about 911 bits) with SNFS. It took them several months. Allow me to quote John Selfridge (paraphrased): factoring will ALWAYS be a difficult problem, because any new method is very quickly applied and pushed to its limits. We have not yet reached the limit for NFS, even when restricted to just the Cunningham project. BTW, the definition of "reasonable" will vary from person to person. 
[QUOTE=TTn]Joss,
Do you plan on updating your P-1 software for k*2^n-1? I plan on adding this as another stage in RMA. I value your input on this. Thanks, Shane F.[/QUOTE] That P-1 software was slow, using giantint. Perhaps with the GIMPS library it could be an add-on to LLR. Have you asked Jean if he'd be interested in adding it to his software? And thanks, Bob and Paul, for your quick and generous replies. Also, I'm in no way trying to put down NFS: not only does it accomplish large-scale work, it also brings the developers to a higher level of thinking. Joss 