Old 2019-04-03, 15:20   #23
lukerichards
"Luke Richards"
Jan 2018
Birmingham, UK
3²×31 Posts

Quote:
Originally Posted by Dr Sardonicus
Just by way of "not giving up," it occurred to me to wonder if you had checked, for your PRP number N = 3^504206 + 2 and some base b,

gcd(N, Mod(b,N)^((N-1)/f) - 1), for

f = each known prime factor of 3^504206 + 1, and also (just for the sake of thoroughness)

f = each composite cofactor with no known proper factors.
No. Some time ago I wrote a little program in Python to test something to do with GCDs, but it wasn't this. Thanks for the suggestion.
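For anyone who wants to try this, a minimal Python sketch of the check (the factor list is a placeholder - the actual known factors of 3^504206 + 1 would go there):

Code:
from math import gcd

# N is the PRP in question; the trial values f are the known prime
# factors (and, for thoroughness, the unfactored composite cofactors)
# of N - 1 = 3^504206 + 1.
N = 3**504206 + 2
known_factors = [2]  # placeholder: the other known factors of N - 1 go here

b = 5  # any base with gcd(b, N) = 1
for f in known_factors:
    # PARI's gcd(N, Mod(b,N)^((N-1)/f) - 1), in Python.
    # Each test costs one PRP-sized modular exponentiation, so it is slow.
    g = gcd(pow(b, (N - 1) // f, N) - 1, N)
    if 1 < g < N:
        print("nontrivial factor of N:", g)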
Old 2019-04-04, 06:54   #24
lukerichards
"Luke Richards"
Jan 2018
Birmingham, UK
3²×31 Posts

Quote:
Originally Posted by penlu
I'd offer to help ECM some of the smaller remaining composites up to 60 digits... but not for free: as payment, I want an explanation as to why you want this particular number factored.

Edit: though I'll warn you first that going up to 60 digits on C1996 (the smallest composite amongst those remaining) has a factor-finding probability lower than the probability of grabbing two random hydrogen atoms from the universe, with replacement, and then discovering that they were the same.
Only just spotted this - thanks for the very kind offer, which I'll gladly accept if you're satisfied with my explanation of why I'm doing this. It would be a great lesson for my students about asking for help and working collaboratively. No worries if you think you have better things to spend your CPU cycles on, though :)
Old 2019-04-04, 07:07   #25
lukerichards
"Luke Richards"
Jan 2018
Birmingham, UK
3²×31 Posts

How does the Cunningham Project function?

I'll post this here; mods may decide this is worthy of a separate thread and can therefore move it as appropriate.

I've just re-read all the posts on this thread to try to make sure that the answer hadn't already been given. I've managed to glean a few key points:
  • As we know, Cunningham numbers are b^n ± 1
  • The Cunningham Project seeks to factor these numbers
  • The Cunningham tables are a list of factors of these numbers ordered by exponent
  • Numbers are generally referred to in the form b,n,+/- where, for example, 3,676,+ refers to 3^{676}+1 and 3,676,- refers to 3^{676}-1 (a small parsing sketch follows this list)
  • Cunningham numbers have already been factored to the levels that are relatively easy to reach - any further work requires a lot of computing
  • Those wishing to contribute seem to have a few options: 1) ECM existing known composite factors (expect to get factors no larger than ~60 digits); 2) GNFS or SNFS existing known composites at home; or 3) download BOINC and join NFS@Home
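The b,n,+/- notation is mechanical to translate; a throwaway Python sketch (plain forms only - aurifeuillian L/M labels like 2,2158M are not handled):

Code:
def cunningham_number(label):
    """Turn Cunningham notation like '3,676,+' into the integer it names."""
    b, n, sign = label.split(",")
    return int(b) ** int(n) + (1 if sign.strip() == "+" else -1)

assert cunningham_number("3,676,+") == 3**676 + 1
assert cunningham_number("3,676,-") == 3**676 - 1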

I have opted for 3) for now, and I've earned close to 100k BOINC credit on this project in the past week. I do however have a few questions, largely of general curiosity, but they may be of interest to anyone looking to join TCP in the future.

1) I'm not 100% clear on what my NFS@home activities are doing. My understanding is that I will not be finding factors myself, but rather doing a lot of the preparatory work that lets others run the post-processing, which is where the factors are found. Is that correct?
2) Is NFS@home the most efficient/productive thing for someone who has over $1000 in Google Cloud credit to use up? I'm still on the trial credit tier, so I'm limited to 8 cores per instance and 24 cores total at any one time.
3) If someone (not necessarily me - I'm content with NFS@home at the moment) wanted to do GNFS or SNFS, how would they go about it: acquiring the software, setting it up, choosing the composites to factor, reporting the results, etc.?

There are other questions I've thought of over the past few days but can't remember right now - I'll post back later when they come back to me.
Old 2019-04-04, 11:41   #26
jasonp
Tribal Bullet
Oct 2004
2²×3×293 Posts

To get started with the factoring software we use, see the Factoring subforum here and ask questions in the NFS@Home subforum (or the NFS@Home message board, though there's barely any activity there).

NFS uses sieving to find relations, and NFS@Home uses the BOINC client to do the sieving. You won't get credit for factors found, because finding the factors requires piling all of the relations in one place and performing extremely complex postprocessing on them. You can of course volunteer to run the NFS postprocessing software yourself, and you will get credit for completed jobs (but not BOINC credit, in case that matters to you), but those are jobs that run for weeks and need a pretty big computer.

The postprocessing for the largest jobs (pretty much all the Cunningham numbers) requires national-level computing resources and Greg Childers has an NSF grant to occasionally use big clusters to do the postprocessing. Think 1000-2000 cores with high-speed interconnects, working together on a single linear algebra problem. That isn't anything a hobbyist can reasonably hope to contribute to.

This won't get you closer to factoring your target number, but it's also a valuable lesson for your students that smart people can change their minds because of new things they learn.
Old 2019-04-04, 15:12   #27
DukeBG
Mar 2018
2018 Posts

Quote:
Originally Posted by lukerichards
  • The Cunningham tables are a list of factors of these numbers ordered by exponent
  • Numbers are generally referred to in the form b,n,+/- where, for example 3,676,+ would refer to 3^{676}+1 and 3,676,- would refer to 3^{676}-1
  • Cunningham numbers have already been factored to levels which are relatively easy to do - any further work requires a lot of computing
Some more points from me to mull over:
  • Cunningham tables have a fixed largest exponent at which they stop. As you've already seen, I assume, for the tables you are interested in: the 3^n-1 table goes to n <= 850; 3^n+1 likewise to n <= 850; aurifeuillian L,M to n = 6k-3 <= 1695.
  • Cunningham numbers with larger n than the original tables might not have received that much heavy factoring. If you go for larger n you might find some numbers not fully factored, at a level an enthusiast can finish by themselves. I did that myself in base 2 and I'm very happy about it.
  • The numbers you're interested in factoring for your PRP are indeed all way out in the "larger n" territory.
  • NFS@Home does a lot of other types of numbers, not just Cunningham. As of this writing, of the numbers being sieved right now in the 15e Lattice Sieve only 2,2158M is a Cunningham number, and there are none in the 14e Lattice Sieve. However, there are also numbers in the 16e Lattice Sieve V5 (there are no detailed pages for those, so I don't know which are being sieved right now). I guess, looking at the history there, 3,653+ will be done some time this year, maybe 3,763+ and 3,763- too?

Quote:
Originally Posted by lukerichards
I'm not 100% clear on what my NFS@home activities are doing.
I'll say the things jasonp said, but in a different way.

Factoring with NFS consists of three "steps": preparation (finding a poly), sieving, and post-processing. With NFS@Home the first and last steps are done by a few dedicated people, and the general crowd using BOINC does the middle part - sieving. Sieving is something that can be parallelized very well; I'm surprised it's still not done on GPUs. You contribute the "relations" that you sieve out, and every work unit you run on BOINC finds them by the thousands. To do the factoring for a large number (the size of the remaining Cunningham-table numbers), you need hundreds of millions of those - multiple gigabytes of data. The post-processing consists of filtering the relations, linear algebra, and the square root. The factors are found in the square root step. The linear algebra is what takes the most time and resources. Filtering is the most annoying one (it can end in "nope, need more sieving", and you might have to do it multiple times to get a better matrix for the linear algebra).
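To make those three stages concrete: NFS itself is far too involved for a forum post, but its ancestor, Dixon's method, runs the same relations -> linear algebra -> square root pipeline in miniature. A toy Python sketch (brute force where real jobs use sieving and sparse GF(2) solvers like block Lanczos/Wiedemann):

Code:
from itertools import combinations
from math import gcd, isqrt

def toy_dixon(n, factor_base=(2, 3, 5, 7, 11, 13)):
    # "Sieving": collect relations x^2 = product of factor-base primes (mod n).
    relations = []  # pairs (x, exponent vector over the factor base)
    x = isqrt(n) + 1
    while len(relations) < len(factor_base) + 2 and x < n:
        value, exponents = x * x % n, []
        for p in factor_base:
            e = 0
            while value % p == 0:
                value, e = value // p, e + 1
            exponents.append(e)
        if value == 1:  # smooth: keep this relation
            relations.append((x, exponents))
        x += 1

    # "Filtering/linear algebra": find a subset whose exponent vectors
    # sum to all-even, i.e. a dependency mod 2.
    for r in range(1, len(relations) + 1):
        for subset in combinations(relations, r):
            sums = [sum(e[i] for _, e in subset) for i in range(len(factor_base))]
            if any(s % 2 for s in sums):
                continue
            # "Square root": both sides are now squares, so gcd may split n.
            xprod, yprod = 1, 1
            for xi, _ in subset:
                xprod = xprod * xi % n
            for p, s in zip(factor_base, sums):
                yprod = yprod * pow(p, s // 2, n) % n
            g = gcd(xprod - yprod, n)
            if 1 < g < n:
                return g
    return None

print(toy_dixon(1649))  # prints 17: 1649 = 17 * 97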

Processing a smaller number, like a 150-160 digit GNFS, can be done by an enthusiast. The remaining Cunningham-table numbers are much larger than that (200-digit GNFS, 270-digit SNFS) and require Big Guns power.

Quote:
Originally Posted by lukerichards
Is NFS@home the most efficient/productive thing to be done?
I mean, it depends on your goals? If your goal is to help humanity factor some numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically, then yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham-project numbers there.

Quote:
Originally Posted by lukerichards
If someone wanted to do GNFS or SNFS, how would they go about it?
There is a good intro thread for that in one of the subforums, but for some reason I cannot find it right now? Gah. I vaguely remember it being titled something like "welcome to number field sieve", but search by title doesn't find anything with the word "welcome" right now. It was started this year, and it had links to the software and a guide to factoring a 100-digit number yourself to get a feel for the tools.

Old 2019-04-04, 18:01   #28
swellman
Jun 2012
2×3²×151 Posts

How to get started with Factoring


Other links that may be of interest are on Jonathan Crombie's page.

Old 2019-04-04, 22:11   #29
Uncwilly
6809 > 6502
Aug 2003
1111001010110₂ Posts

Quote:
Originally Posted by lukerichards
It would be a great lesson for my students about asking for help etc and working collaboratively. No worries if you think you have better things to spend your CPU cycles on though :)
Using your project as a teaching tool sounds interesting. You can show how different tools can be brought to bear in factoring attempts.
Dario Alpern's factoring tool, which switches between different methods, demonstrates changing methods at different levels.
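Not Alpern's actual code, but a minimal Python sketch of the same escalate-when-stuck idea, with just two levels (trial division, then Pollard rho) where the applet goes further, to ECM and SIQS:

Code:
import random
from math import gcd

def is_probable_prime(n):
    """Miller-Rabin with fixed bases (more than enough for this demo)."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def pollard_rho(n):
    """Floyd-cycle Pollard rho; n must be odd and composite."""
    while True:
        c, x = random.randrange(1, n), random.randrange(2, n)
        y, d = x, 1
        while d == 1:
            x = (x * x + c) % n
            y = (y * y + c) % n
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        if d != n:
            return d

def factor(n):
    """Level 1: trial division. Level 2: rho on whatever survives."""
    factors = []
    for p in range(2, 10000):
        while n % p == 0:
            factors.append(p)
            n //= p
    stack = [n] if n > 1 else []
    while stack:
        m = stack.pop()
        if is_probable_prime(m):
            factors.append(m)
        else:
            d = pollard_rho(m)
            stack += [d, m // d]
    return sorted(factors)

print(factor(2**64 + 1))  # [274177, 67280421310721]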

Old 2019-04-05, 06:43   #30
lukerichards
"Luke Richards"
Jan 2018
Birmingham, UK
3²·31 Posts

Quote:
Originally Posted by DukeBG
<snip> If your goal is to help the Cunningham Project specifically, then yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. <snip>
Thanks! At present, my immediate goal is just to help the Cunningham Project. I had intended to use my Google Cloud credit to run as many ECM curves as I could on my large composites, but until I get a bit more organised on that front I'll focus on just contributing to TCP.
Old 2019-04-07, 09:26   #31
penlu
Jul 2018
2²·7 Posts

So far I've done around 4k curves at B1=11000000, B2=10*B1 on the C1996.
Old 2019-04-10, 12:20   #32
R.D. Silverman
Nov 2003
5³×59 Posts

Do you *really* want to help?

Quote:
Originally Posted by penlu
So far I've done around 4k curves at B1=11000000, B2=10*B1 on the C1996.
This last number is not currently part of the Cunningham project.

If people *really* want to help:

There are currently 69 unfinished numbers from the 1987 hardcover edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index < 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA (2,2078M, 2,2098L).
Two of them are about to start sieving (2,2102L, 2,2158M).
One of them is relatively easy: 2,1144+ (its exponent is divisible by 11, so 2^104+1 divides it algebraically and the primitive cofactor makes a smaller SNFS job).
Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+, 2,1076+, 2,2150M, 2,2158L.

They start to get quite a bit harder after that via SNFS. Of course the 2- table was finished to index 1200, so the rest are all doable, but it would take a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to exponent 1136 with B1 = 3G. I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can? Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?
Old 2019-04-10, 14:23   #33
paulunderwood
Sep 2002
Database er0rr
2²·5·157 Posts

Quote:
Originally Posted by R.D. Silverman
<snip>
How about a very large ECM effort to pick off as many of the rest as we can? Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?
Is there a server to which I can attach an AMD 1090T or two? If so, what do I need? Getting GMP-ECM under Debian is no problem, but what about client scripts?
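In the meantime, a crude do-it-yourself driver is easy enough; a Python sketch around the ecm binary (the candidate list, curve count, and B1 are illustrative only, and in practice you would feed in the remaining cofactors rather than the full numbers):

Code:
import subprocess

# Assumes GMP-ECM's `ecm` binary is on PATH; -c sets the curve count
# and the positional argument is B1 (B2 is left to GMP-ECM's default).
# B1 = 850e6 echoes the suggestion above.
candidates = ["2^1063+1", "2^1076+1"]
for expr in candidates:
    run = subprocess.run(["ecm", "-c", "5", "850e6"],
                         input=expr, capture_output=True, text=True)
    print("---", expr, "---")
    print(run.stdout)
    if "Factor found" in run.stdout:
        break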
