2019-04-03, 15:20  #23
"Luke Richards"
Jan 2018
Birmingham, UK
3^{2}×31 Posts 


2019-04-04, 06:54  #24
"Luke Richards"
Jan 2018
Birmingham, UK
3^{2}×31 Posts 


2019-04-04, 07:07  #25
"Luke Richards"
Jan 2018
Birmingham, UK
3^{2}×31 Posts 
How does the Cunningham Project function?
I'll post this here; mods may decide this is worthy of a separate thread and can therefore move it as appropriate.
I've just reread all the posts on this thread to try to make sure that the answer hadn't already been given. I've managed to glean a few key points:
I have opted to do 3) for now, and I'm close to reaching 100k BOINC credit in the past week on this project. I do however have a few questions, largely of general curiosity, but which may be of interest to anyone looking to join TCP in the future.

1) I'm not 100% clear on what my NFS@Home activities are doing. My understanding is that I will not be finding factors myself, but rather doing a lot of the preparatory work to allow others to do the postprocessing, which is where the factors will be found. Is that correct?

2) Is NFS@Home the most efficient/productive thing to be done by someone who has over $1000 in Google Cloud credit to use up? I'm still on the trial credit tier, so I'm limited to 8 cores per instance and 24 cores total at any one time.

3) If someone (not necessarily me; I'm content with NFS@Home at the moment) wanted to do GNFS or SNFS, how would they go about it: acquiring the software, setting it up, choosing the composites to factor, reporting the results, etc.?

There are other questions I've had over the past few days but can't remember them right now; I'll post back later when I've got them.
2019-04-04, 11:41  #26
jasonp
Tribal Bullet
Oct 2004
2^{2}×3×293 Posts 
To get started with the factoring software we use, see the Factoring subforum here and ask questions in the NFS@Home subforum (or the NFS@Home message board, though there's barely any activity there).

NFS uses sieving to find relations, and NFS@Home uses the BOINC client to do the sieving. You won't get credit for factors found, because finding the factors requires piling all of the relations in one place and performing extremely complex postprocessing on them. You can of course volunteer to run NFS software yourself to do the postprocessing, and you will get credit for completed jobs (but not BOINC credit, in case that matters to you), but those are jobs that run for weeks and need a pretty big computer.

The postprocessing for the largest jobs (pretty much all the Cunningham numbers) requires national-level computing resources, and Greg Childers has an NSF grant to occasionally use big clusters to do the postprocessing. Think 1000-2000 cores with high-speed interconnects, working together on a single linear algebra problem. That isn't anything a hobbyist can reasonably hope to contribute to.

This won't get you closer to factoring your target number, but it's also a valuable lesson for your students: smart people can change their mind because of new things that they learn.
2019-04-04, 15:12  #27
Mar 2018
201_{8} Posts 
Quote:
I'm not 100% clear on what my NFS@home activities are doing.

I'll say things that jasonp said, but in a different way. Factoring with NFS consists of three "steps": preparation (finding a poly), sieving, and postprocessing. With NFS@Home the first and last steps are done by a few dedicated people, and the general crowd using BOINC does the middle part: sieving. Sieving is something that can be parallelized very well (I'm surprised it's still not done on GPUs). You contribute the "relations" that you sieve out, and they are found by the thousands in every work unit you run on BOINC. To factor a number as large as the remaining Cunningham numbers, you need hundreds of millions of those relations: multiple gigabytes of data.

The postprocessing consists of filtering the relations, linear algebra, and the square root. The factors are found at the square root step. The linear algebra is what takes the most time and resources. Filtering is the most annoying one (it can result in "nope, need more sieving", and you might have to do it multiple times to get a better matrix for the linear algebra). The processing of a smaller number, like a 150-160 digit GNFS, can be done by an enthusiast. The remaining Cunningham numbers are much larger than that (200 digit GNFS, 270 digit SNFS) and require Big Guns power.

Quote:
Is NFS@home the most efficient/productive thing to be done

I mean, depends on your goals? If your goal is to help humanity factor some numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically: yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no; just keep on ECMing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham Project numbers there.
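The pipeline described above (collect relations, do linear algebra over GF(2), then take a square root and a gcd) can be seen in miniature in a toy Dixon-style factorization. This is hypothetical teaching code, nothing like the real NFS tools: the function names and the tiny factor-base bound are made up, and real jobs use sparse matrix solvers on millions of relations rather than the naive collection here.

```python
from math import gcd, isqrt

def factor_base(bound):
    """Primes up to bound, by a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (bound + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, isqrt(bound) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [p for p in range(2, bound + 1) if sieve[p]]

def smooth_exponents(m, primes):
    """Exponent vector of m over the factor base, or None if m is not smooth."""
    exps = []
    for p in primes:
        e = 0
        while m % p == 0:
            m //= p
            e += 1
        exps.append(e)
    return exps if m == 1 else None

def dependency(masks):
    """Find a nonempty subset of GF(2) vectors (bitmasks) XORing to zero.
    Returns a bitmask over the rows, or None. Real jobs use sparse solvers."""
    basis = {}  # pivot bit -> (reduced vector, combination of input rows)
    for i, v in enumerate(masks):
        combo = 1 << i
        while v:
            pivot = v & -v
            if pivot not in basis:
                basis[pivot] = (v, combo)
                break
            bv, bc = basis[pivot]
            v, combo = v ^ bv, combo ^ bc
        else:
            return combo  # v reduced to zero: an even ("square") combination
    return None

def dixon(n, bound=50):
    """Toy Dixon factorization: relations -> linear algebra -> square root."""
    primes = factor_base(bound)
    rels = []  # (x, exponents) with x^2 = prod(p^e) (mod n)
    x = isqrt(n) + 1
    while len(rels) < len(primes) + 5 and x < n:  # naive "sieving"
        exps = smooth_exponents(x * x % n, primes)
        if exps is not None:
            rels.append((x, exps))
        x += 1
    combo = dependency(
        [sum((e & 1) << j for j, e in enumerate(exps)) for _, exps in rels])
    if combo is None:
        return None
    # Square root step: X = prod x, Y = prod p^(e_total/2); gcd may split n.
    X, exp_sum = 1, [0] * len(primes)
    for i, (x, exps) in enumerate(rels):
        if combo >> i & 1:
            X = X * x % n
            exp_sum = [a + b for a, b in zip(exp_sum, exps)]
    Y = 1
    for p, e in zip(primes, exp_sum):
        Y = Y * pow(p, e // 2, n) % n
    g = gcd((X - Y) % n, n)
    return g if 1 < g < n else None
```

The "filtering can send you back for more sieving" pain shows up here too: if `dependency` returns None, or the gcd comes out trivial, you need more relations.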
Quote:
If someone wanted to do GNFS or SNFS, how would they go about it

There is a good thread with an intro to that in one of the subforums, but for some reason I cannot find it right now? Gah. I vaguely remember it was titled something like "welcome to number field sieve", but searching by titles doesn't find anything with the word "welcome" right now. Started this year. It had links for the software and a guide to factoring a 100-digit number yourself to get a feel for the tools.

Last fiddled with by DukeBG on 2019-04-04 at 15:17

2019-04-04, 18:01  #28
Jun 2012
2×3^{2}×151 Posts 
Last fiddled with by swellman on 2019-04-04 at 18:23
2019-04-04, 22:11  #29
6809 > 6502
Aug 2003
1111001010110_{2} Posts 
Dario Alpern's factoring tool, which switches between methods as it runs, demonstrates changing methods at different levels.
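A toy sketch of that idea (hypothetical code, nothing to do with Alpern's actual implementation): start with cheap trial division, then switch to Pollard's rho once the small factors are exhausted.

```python
from math import gcd
import random

def is_prime(n):
    """Miller-Rabin with fixed bases; deterministic for 64-bit-ish inputs."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def pollard_rho(n):
    """One nontrivial factor of an odd composite n (Floyd cycle-finding)."""
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n
            y = (y * y + c) % n
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        if d != n:
            return d

def factor(n, trial_bound=10_000):
    """Full factorization, switching methods as the cofactor gets harder."""
    factors = []
    for p in range(2, trial_bound):       # level 1: trial division
        if p * p > n:
            break
        while n % p == 0:
            factors.append(p)
            n //= p
    stack = [] if n == 1 else [n]
    while stack:                          # level 2: Pollard's rho
        m = stack.pop()
        if is_prime(m):
            factors.append(m)
        else:
            d = pollard_rho(m)
            stack += [d, m // d]
    return sorted(factors)
```

A serious tool would keep escalating the same way: rho, then P-1, then ECM, then QS/NFS, each picking up where the cheaper method stalls.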

2019-04-05, 06:43  #30
"Luke Richards"
Jan 2018
Birmingham, UK
3^{2}·31 Posts 


2019-04-07, 09:26  #31
Jul 2018
2^{2}·7 Posts 
So far I've done around 4k curves at B1=11000000, B2=10*B1 for the C1996.
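For the curious, this is what one such curve does conceptually: a bare-bones sketch of ECM stage 1 on affine Weierstrass curves. This is hypothetical teaching code; real runs use GMP-ECM with Montgomery arithmetic, a stage 2, and bounds like the B1=11e6 above rather than the tiny ones here. The trick is that a failed modular inversion during curve arithmetic *is* the factor.

```python
from math import gcd, isqrt
import random

class FactorFound(Exception):
    """Raised when a modular inverse fails; the failing gcd is the factor."""
    def __init__(self, f):
        self.f = f

def inv_mod(a, n):
    g = gcd(a % n, n)
    if g != 1:
        raise FactorFound(g)
    return pow(a, -1, n)

def ec_add(P, Q, a, n):
    """Affine addition on y^2 = x^3 + a*x + b (mod n); None = infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % n == 0:
        return None
    if P == Q:
        s = (3 * x1 * x1 + a) * inv_mod(2 * y1, n) % n
    else:
        s = (y2 - y1) * inv_mod(x2 - x1, n) % n
    x3 = (s * s - x1 - x2) % n
    return (x3, (s * (x1 - x3) - y1) % n)

def ec_mul(k, P, a, n):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, n)
        P = ec_add(P, P, a, n)
        k >>= 1
    return R

def ecm_stage1(n, B1=1000, curves=100, seed=1):
    """Try random curves; multiply a point by every prime power <= B1."""
    rng = random.Random(seed)
    primes = [p for p in range(2, B1 + 1)
              if all(p % q for q in range(2, isqrt(p) + 1))]
    for _ in range(curves):
        a = rng.randrange(n)
        P = (rng.randrange(n), rng.randrange(n))  # b is implied by the point
        try:
            for p in primes:
                pe = p
                while pe * p <= B1:
                    pe *= p
                P = ec_mul(pe, P, a, n)
                if P is None:
                    break
        except FactorFound as e:
            if 1 < e.f < n:
                return e.f
    return None
```

Each curve is an independent lottery ticket, which is why the work is counted in "curves at B1" and parallelizes so well.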

2019-04-10, 12:20  #32
Nov 2003
5^{3}×59 Posts 
Do you *really* want to help?
If people *really* want to help: there are currently 69 unfinished numbers from the 1987 hardcover edition of the Cunningham book. It would be nice to finish them. They are all from base 2, with index < 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA: 2,2078M and 2,2098L. Two more are about to start sieving: 2,2102L and 2,2158M. One of them is relatively easy: 2,1144+ (exponent divisible by 11). Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+, 2,1076+, 2,2150M, 2,2158L. They start to get quite a bit harder after that via SNFS. Of course the 2- table was finished to index 1200, so the rest are all doable, but it would take a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1136 with B1 = 3G. I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can? Note that because they are base 2, they are particularly efficient for GMP-ECM. Perhaps yoyo might tackle these with B1 = 850M?
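For anyone wanting to contribute curves to such an effort, here is a minimal sketch of driving GMP-ECM from a script. It assumes the `ecm` binary is installed and on your PATH; the invocation shape (`ecm -c <curves> <B1>`, with the candidate, which may be an expression like 2^1063+1, fed on stdin) follows GMP-ECM's command line, while the helper names are made up.

```python
import subprocess

def ecm_command(curves, b1):
    """Build the GMP-ECM argument list: `curves` curves at stage-1 bound b1."""
    return ["ecm", "-c", str(curves), str(b1)]

def run_curves(expr, curves=10, b1="850e6"):
    """Feed an expression such as '2^1063+1' to GMP-ECM, return its output.
    Any factor found is reported in the tool's stdout."""
    proc = subprocess.run(ecm_command(curves, b1),
                          input=expr, capture_output=True, text=True)
    return proc.stdout

# e.g.: print(run_curves("2^1063+1", curves=1, b1="1e6"))
```

Since curves are independent, one could launch several such processes, one per core, each running its own batch.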

2019-04-10, 14:23  #33
Sep 2002
Database er0rr
2^{2}·5·157 Posts 
Last fiddled with by paulunderwood on 2019-04-10 at 14:25
