mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Cunningham Tables (https://www.mersenneforum.org/forumdisplay.php?f=51)
-   -   Contributing to Cunningham Project (https://www.mersenneforum.org/showthread.php?t=24211)

lukerichards 2019-04-03 15:20

[QUOTE=Dr Sardonicus;512529]Just by way of "not giving up," it occurred to me to wonder if you had checked, for your PRP number N = 3^504206 + 2 and some base b,

gcd(N, Mod(b,N)^((N-1)/f) - 1), for

f = each known prime factor of 3^504206 + 1, and also (just for the sake of thoroughness)

f = each composite cofactor with no known proper factors.[/QUOTE]

No, some time ago I wrote a little program in Python to test something to do with GCDs, but it wasn't this. Thanks for the suggestion.
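
For concreteness, the suggested check is only a few lines in Python. A minimal sketch, with placeholder choices: b = 2 as an arbitrary base and f = 2 (which does divide 3^504206 + 1); in practice f would run over every known prime factor and composite cofactor, and gmpy2 would be far faster than pure Python at this size.

[CODE]
from math import gcd

# N is the PRP in question; note N - 1 = 3^504206 + 1.
N = 3**504206 + 2
b = 2   # arbitrary base
f = 2   # placeholder: 2 is a known prime factor of 3^504206 + 1

g = gcd(pow(b, (N - 1) // f, N) - 1, N)
# g == 1 or g == N is uninformative; anything in between is a factor of N.
print(g)
[/CODE]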

lukerichards 2019-04-04 06:54

[QUOTE=penlu;512434]I'd offer to help ECM some of the smaller remaining composites up to 60 digits... but not for free: as payment, I want an explanation as to why you want this particular number factored.

Edit: though I'll warn you first that going up to 60 digits on C1996 (the smallest composite amongst those remaining) has a factor-finding probability lower than the probability of grabbing two random hydrogen atoms from the universe, with replacement, and then discovering that they were the same.[/QUOTE]

Only just spotted this - thanks for the very kind offer, which I'll gladly accept if you're satisfied with my explanation of why I'm doing this. It would be a great lesson for my students about asking for help and working collaboratively. No worries if you think you have better things to spend your CPU cycles on though :)

lukerichards 2019-04-04 07:07

How does the Cunningham Project function?
 
I'll post this here; mods may decide this is worthy of a separate thread and can move it as appropriate.

I've just re-read all the posts on this thread to try to make sure that the answer hadn't already been given. I've managed to glean a few key points:
[LIST]
[*]As we know, Cunningham numbers are [TEX]b^n\pm1[/TEX]
[*]The Cunningham Project seeks to factor these numbers
[*]The Cunningham tables are a list of factors of these numbers, ordered by exponent
[*]Numbers are generally referred to in the form b,n,+/- where, for example, 3,676,+ refers to [TEX]3^{676}+1[/TEX] and 3,676,- refers to [TEX]3^{676}-1[/TEX]
[*]Cunningham numbers have already been factored to the levels that are relatively easy to reach; any further work requires a lot of computing
[*]Those wishing to contribute have a few options: 1) ECM the known composite cofactors (expect to find factors no larger than ~60 digits); 2) run GNFS or SNFS on known composites at home; or 3) download BOINC and join NFS@Home
[/LIST]
I have opted to do 3) for now, and in the past week I've earned close to 100k BOINC credit on this project. I do, however, have a few questions, largely out of general curiosity, but they may be of interest to anyone looking to join TCP in the future.

1) I'm not 100% clear on what my NFS@home activities are doing. My understanding is that I won't be finding factors myself, but rather doing a lot of the preparatory work that allows others to do the post-processing, which is where the factors are actually found. Is that correct?
2) Is NFS@home the most efficient/productive thing to be done by someone who has over $1000 in Google Cloud credit to use up? I'm still on the trial credit tier, so I'm limited to 8 cores per instance and 24 cores total at any one time.
3) If someone (not necessarily me - I'm content with NFS@home at the moment) wanted to do GNFS or SNFS, how would they go about it: acquiring the software, setting it up, choosing the composites to factor, reporting the results, etc.?

There are other questions I've had over the past few days but can't remember them right now - I'll post back later when I've got them.

jasonp 2019-04-04 11:41

To get started with the factoring software we use, see the Factoring subforum here and ask questions in the NFS@Home subforum (or the NFS@Home message board, though there's barely any activity there).

NFS uses sieving to find relations, and NFS@Home uses the BOINC client to do the sieving. You won't get credit for factors found, because finding the factors requires piling all of the relations in one place and performing extremely complex postprocessing on them. You can of course volunteer to run the NFS postprocessing software yourself, and you will get credit for completed jobs (though not BOINC credit, in case that matters to you), but those are jobs that run for weeks and need a pretty big computer.
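
As a rough illustration of what a "relation" is, here is a toy Python sketch in the style of the quadratic sieve, a much simpler relative of NFS: it looks for x near sqrt(N) such that x^2 - N factors completely over a small factor base. Real NFS relations come from an algebraic number field and a lattice siever, but the flavor is similar.

[CODE]
from math import isqrt

N = 8051                        # tiny toy composite (= 83 * 97)
factor_base = [2, 3, 5, 7, 13]

def smooth_exponents(m):
    """Return the exponent vector of m over factor_base, or None if not smooth."""
    exps = []
    for p in factor_base:
        e = 0
        while m % p == 0:
            m //= p
            e += 1
        exps.append(e)
    return exps if m == 1 else None

relations = []
x = isqrt(N) + 1
while len(relations) < 3:       # a real job needs more relations than base primes
    exps = smooth_exponents(x * x - N)
    if exps is not None:
        relations.append((x, exps))   # records x^2 ≡ prod(p^e) (mod N)
    x += 1

for x, exps in relations:
    print(x, exps)              # finds x = 90, 99, 106 for this toy N
[/CODE]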

The postprocessing for the largest jobs (pretty much all the Cunningham numbers) requires national-level computing resources and Greg Childers has an NSF grant to occasionally use big clusters to do the postprocessing. Think 1000-2000 cores with high-speed interconnects, working together on a single linear algebra problem. That isn't anything a hobbyist can reasonably hope to contribute to.

This won't get you closer to factoring your target number, but it's also a valuable lesson for your students that smart people can change their mind because of new things that they learn.

DukeBG 2019-04-04 15:12

[QUOTE=lukerichards;512613][LIST][*]The Cunningham tables are a list of factors of these numbers ordered by exponent[*]Numbers are generally referred to in the form b,n,+/- where, for example 3,676,+ would refer to [TEX]3^{676}+1[/TEX] and 3,676,- would refer to [TEX]3^{676}-1[/TEX][*]Cunningham numbers have already been factored to levels which are relatively easy to do - any further work requires a lot of computing[/LIST][/QUOTE]

Some more points from me to mull over:
[LIST]
[*]Cunningham tables have a fixed largest exponent at which they stop. As you've already seen, I assume, for the tables you are interested in: the 3^n-1 table is for n<=850; 3^n+1, n<=850; Aurifeuillian L,M for n=6k-3<=1695.
[*]Cunningham numbers with [B]larger[/B] n's than the original tables [b]might[/b] not have received that much heavy factoring. If you go for larger n's you might find some that are not fully factored, at a level an enthusiast can finish by themselves. I did that myself in base 2 and I'm very happy about it.
[*]The numbers you need to factor for your PRP are indeed all well into "larger n" territory.
[*]NFS@Home does a lot of other types of numbers, not just Cunninghams. As of this writing, of the numbers being sieved right now in [I]15e Lattice Sieve[/I] there's only 2,2158M, and there are no Cunningham numbers in [I]14e Lattice Sieve[/I]. However, there are also numbers in the [URL="https://escatter11.fullerton.edu/nfs/numbers.html"]16e Lattice Sieve V5[/URL] queue (there are no other detailed pages for those, so I don't know which are being sieved right now). I guess, looking at the history there, 3,653+ will be done some time this year, maybe 3,763+ and 3,763- too?
[/LIST]
[i]I'm not 100% clear on what my NFS@home activities are doing.[/i]
I'll repeat some of what jasonp said, but in a different way.

Factoring with NFS consists of three "steps": preparation (finding a poly), sieving, and post-processing. With NFS@Home the first and last steps are done by a few dedicated people, and the general crowd using BOINC does the middle part – sieving. Sieving is something that can be parallelized very well. I'm surprised it's still not done on GPUs. You contribute "relations" that you [i]sieve out[/i], and they are found in the thousands by every work unit you run on BOINC. To do the factoring for a large number (the size of the remaining Cunningham-table numbers), you need hundreds of millions of those – multiple gigabytes of data. The post-processing consists of [i]filtering[/i] the relations, [i]linear algebra[/i] and a [i]square root[/i] step. The factors are found in the square root step. The linear algebra is what takes the most time and resources. Filtering is the most annoying one (it can end in "nope, need more sieving", and you might have to do it multiple times to get a better matrix for the linear algebra).
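
To make the last two steps concrete, here is a toy continuation of the quadratic-sieve-style sketch earlier in the thread: a subset of relations whose exponent vectors sum to all-even entries gives a congruence of squares, and a gcd then splits N. Finding that subset is exactly what the linear algebra over GF(2) does; in this hand-picked example a single relation already has even exponents.

[CODE]
from math import gcd

N = 8051
# One relation from the toy sieve above: 90^2 - N = 49 = 7^2,
# i.e. 90^2 ≡ 7^2 (mod N) with an already-even exponent vector.
x, y = 90, 7
print(gcd(x - y, N), gcd(x + y, N))   # prints "83 97", the factors of 8051
[/CODE]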

The processing of a smaller number, like a 150-160 digit GNFS, can be done by an enthusiast. The remaining numbers in the Cunningham tables are much larger than that (200-digit GNFS, 270-digit SNFS) and require Big Guns power.

[i]Is NFS@home the most efficient/productive thing to be done[/i]
I mean, it depends on your goals? If your goal is to help humanity factor [i]some[/i] numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically – yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham-project numbers there.

[i]If someone wanted to do GNFS or SNFS, how would they go about it[/i]
There is a good thread with an intro to that in one of the subforums, but for some reason I cannot find it right now? Gah. I vaguely remember it was titled something like "welcome to number field sieve", but searching by title doesn't find anything with the word "welcome" right now. It was started this year. It had links to the software and a guide to factoring a 100-digit number yourself to get a feel for the tools.

swellman 2019-04-04 18:01

[url=https://mersenneforum.org/showthread.php?t=23078]How to get started with Factoring[/url]


Other links that may be of interest are at [url=http://cownoise.com]Jonathan Crombie’s page[/url].

Uncwilly 2019-04-04 22:11

[QUOTE=lukerichards;512612]It would be a great lesson for my students about asking for help etc and working collaboratively. No worries if you think you have better things to spend your CPU cycles on though :)[/QUOTE]
Using your project as a teaching tool sounds interesting. You can show how different tools can be brought to bear in factoring attempts.
Dario Alpern's [URL="https://www.alpertron.com.ar/ECM.HTM"]factoring tool[/URL], which switches between methods as it runs, demonstrates how different methods apply at different levels.

:batalov:

lukerichards 2019-04-05 06:43

[QUOTE=DukeBG;512662]
[i]Is NFS@home the most efficient/productive thing to be done[/i]
I mean, it depends on your goals? If your goal is to help humanity factor [i]some[/i] numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically – yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham-project numbers there.[/QUOTE]

Thanks! At present, my immediate goal is just to help the Cunningham Project. I had intended to use my Google Cloud credit to run as many ECM curves as I could on my large composites, but I thought that until I get a bit more organised on that front, I'll focus on just contributing to TCP.

penlu 2019-04-07 09:26

So far I've done around 4k curves at B1=11000000, B2=10*B1 on the C1996.
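
(For anyone wanting to reproduce this kind of run: a minimal driver sketch in Python, assuming GMP-ECM is installed as ecm and you have the C1996 cofactor at hand; the number itself is not reproduced here.)

[CODE]
import subprocess

cofactor = "..."   # paste the C1996 cofactor here
B1, B2, curves = "11e6", "110e6", 4000   # B2 = 10 * B1, as above

# GMP-ECM reads the number on stdin: ecm -c <curves> <B1> [B2]
result = subprocess.run(
    ["ecm", "-c", str(curves), B1, B2],
    input=cofactor, capture_output=True, text=True,
)
print(result.stdout)   # any factor found is reported in the output
[/CODE]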

R.D. Silverman 2019-04-10 12:20

Do you *really* want to help?
 
[QUOTE=penlu;512948]So far done around 4k curves at B1=11000000, B2=10*B1 for the C1996.[/QUOTE]

This last number is not currently part of the Cunningham project.

If people *really* want to help:

There are currently 69 unfinished numbers from the 1987 hardcover
edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index
< 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA. (2,2078M, 2,2098L)
Two of them are about to start sieving: (2,2102L, 2,2158M).
One of them is relatively easy: 2,1144+ (exponent divisible by 11)
Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+,
2,1076+, 2,2150M, 2,2158L

They start to get quite a bit harder after that via SNFS. Of course the 2- table
was finished to index 1200, so the rest are all doable, but it would take
a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1136 with B1 = 3G.
I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can?
Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?
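
For readers new to ECM, here is a heavily simplified stage-1 sketch in Python of what "running a curve" means. GMP-ECM uses Montgomery curves, a stage 2, and far faster arithmetic; the small modulus below is only a stand-in for the real targets.

[CODE]
from math import gcd, isqrt, log
from random import randrange

class FactorFound(Exception):
    """Raised when a modular inverse fails, revealing a factor of n."""
    def __init__(self, f):
        self.f = f

def primes_up_to(n):
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return [i for i, is_p in enumerate(sieve) if is_p]

def inv_mod(a, n):
    g = gcd(a % n, n)
    if g != 1:
        raise FactorFound(g)    # the "failure" that ECM is hoping for
    return pow(a, -1, n)        # needs Python 3.8+

def ec_add(P, Q, a, n):
    """Add points on y^2 = x^3 + a*x + b over Z/nZ (b is never needed)."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % n == 0:
        return None             # point at infinity
    if P == Q:
        m = (3 * x1 * x1 + a) * inv_mod(2 * y1, n) % n
    else:
        m = (y2 - y1) * inv_mod(x2 - x1, n) % n
    x3 = (m * m - x1 - x2) % n
    return (x3, (m * (x1 - x3) - y1) % n)

def ec_mul(k, P, a, n):
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, n)
        P = ec_add(P, P, a, n)
        k >>= 1
    return R

def ecm_stage1(n, B1, max_curves=50):
    """Try random curves, multiplying a random point by all prime powers <= B1."""
    prime_list = primes_up_to(B1)
    for _ in range(max_curves):
        x, y, a = randrange(n), randrange(n), randrange(n)
        P = (x, y)              # pick a point, then a random curve through it
        try:
            for p in prime_list:
                P = ec_mul(p ** int(log(B1) / log(p)), P, a, n)
        except FactorFound as e:
            if 1 < e.f < n:
                return e.f
    return None

# Usually prints 1000003 or 1000033; None on the rare run where all curves fail.
print(ecm_stage1(1000003 * 1000033, B1=1000))
[/CODE]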

paulunderwood 2019-04-10 14:23

[QUOTE=R.D. Silverman;513331]How about a very large ECM effort to pick off as many of the rest as we can?
Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?[/QUOTE]

Is there a server to which I can attach an AMD 1090T or two? If so, what do I need? Getting GMP-ECM under Debian is no problem, but what about client scripts?

