mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Cunningham Tables (https://www.mersenneforum.org/forumdisplay.php?f=51)
-   -   Contributing to Cunningham Project (https://www.mersenneforum.org/showthread.php?t=24211)

lukerichards 2019-03-26 12:12

Contributing to Cunningham Project
 
Hi,

I'm interested in devoting some CPU work to factoring Cunningham numbers. Unsurprisingly, to anyone who has followed my exploits, I'm interested particularly in 3+ numbers.

I can't find an obvious page on the project web page or an obvious thread on here which clearly explains how to get started. Can anyone point me in the right direction? I'm happy to write it up as a guide to others, which can be stickied if desired.

xilman 2019-03-26 13:00

[QUOTE=lukerichards;511824]Hi,

I'm interested in devoting some CPU work to factoring Cunningham numbers. Unsurprisingly, to anyone who has followed my exploits, I'm interested particularly in 3+ numbers.

I can't find an obvious page on the project web page or an obvious thread on here which clearly explains how to get started. Can anyone point me in the right direction? I'm happy to write it up as a guide to others, which can be stickied if desired.[/QUOTE]Above all, you should realize that finding new Cunningham factors requires a lot of computation or incredibly good luck. I tell you this not to dissuade you but to set expectations and avoid disappointment.

As a rough guide, you need to be able to factor 200-digit numbers by GNFS, 300-digit numbers by SNFS, or believe that you are likely to find 60 to 65-digit factors by ECM. Most individuals do not have the resources for the first two tasks and so join co-operatives like NFS@Home.

To set the scale of effort required for ECM: I'm currently finding about one 45 to 50-digit factor (of GCWs, not Cunninghams) every few weeks with approximately 20 cores running 24/7. I doubt I'd find even one Cunningham factor per annum with that amount of computation. However, ECM is easy to install and easy to run in fire-and-forget mode. Load it up, set it going with parameters suitable for finding 65-digit factors and check on it every so often. You might get lucky.
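
In practice, "parameters suitable for finding 65-digit factors" means B1 on the order of 850M (the figure proposed later in this thread). With stock GMP-ECM that is something like the following, the input file name being just a placeholder; many thousands of curves are needed at this level, hence fire-and-forget:

[code]ecm -c 1000 850e6 < candidate.txt[/code]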

If you are still seriously interested let us know and we'll go into detail, but I've already spent enough time on this if your question is only out of idle curiosity.

lukerichards 2019-03-26 17:20

Fire away, I'm definitely interested. Consider my expectations set and my enthusiasm undampened.

My aim is to factor 3[SUP]504206[/SUP]+1 and 3[SUP]504205[/SUP]+1 sufficiently [b]in my lifetime[/b] to provide a primality proof of 3[SUP]504206[/SUP]+2. As intractable a goal as that might be, I need to start somewhere and the Cunningham project seems a sensible place to start.

For the time being, a number of friends have signed up to Google Cloud Console to get the free credit, with no intention of using it themselves. As a result I have about $1200 of VM credit to use up. When that's all gone, I'll look into my options then.

So is NFS@Home the way forward? I've run BOINC before with PrimeGrid so I'm familiar with that.

GP2 2019-03-27 20:09

[QUOTE=lukerichards;511836]My aim is to factor 3[SUP]504206[/SUP]+1 and 3[SUP]504205[/SUP]+1 sufficiently [b]in my lifetime[/b] to provide a primality proof of 3[SUP]504206[/SUP]+2. As intractable a goal as that might be, I need to start somewhere and the Cunningham project seems a sensible place to start.[/QUOTE]

Except for extremely improbable scenarios with odds comparable to winning the lottery multiple times in a row, this goal is completely infeasible with any currently known algorithm. Not in your lifetime. Not before the heat death of the universe.

Imagine you're trying to collect a few dozen eggs. Some of them are right next to you. Some are next door. Some are across town. Some are within one day's driving distance. Some are across the ocean. Some are on the Moon. Some are on Pluto. Some are on Alpha Centauri. Some are across the galaxy. Some are in the Andromeda Galaxy.

The vast majority of the eggs you are looking for have either been found already, or are forever out of reach. A tremendous lifelong effort might find a few more eggs in the "across the ocean" category. That's the only category where your efforts, or anyone else's, can actually make a difference.

lukerichards 2019-03-27 20:13

[QUOTE=GP2;511978]Except for extremely improbable scenarios with odds comparable to winning the lottery multiple times in a row, this goal is completely infeasible with any currently known algorithm. Not in your lifetime. Not before the heat death of the universe.
[/QUOTE]

I'm 30 years old; nobody knows what will happen in the next 50 years in terms of advances in computing power and algorithm research. Current achievements would not have been thought possible 50 years ago. Everyone needs a hobby!

Anyway, I'm offering to contribute to Cunningham, so let's not question why.

GP2 2019-03-27 23:42

[QUOTE=lukerichards;511980]Anyway, I'm offering to contribute to Cunningham, so let's not question why.[/QUOTE]

The Cunningham project actually deals with much smaller exponents, around 1k rather than 500k. So the work you intend to do is entirely different.

If you actually want to contribute to the Cunningham project (i.e., very small exponents), the best way would be to join NFS@Home. A great deal of ECM has already been done there, but there might be a bit left over if you coordinate with the people who've already worked there.

On the other hand, if your heart is set on the quixotic quest for this one particular huge exponent, then it seems there's not much anyone can say to dissuade you.

lukerichards 2019-03-28 08:51

[QUOTE=GP2;512006]The Cunningham project actually deals with much smaller exponents, around 1k rather than 500k. So the work you intend to do is entirely different.

If you actually want to contribute to the Cunningham project (i.e., very small exponents), the best way would be to join NFS@Home. A great deal of ECM has already been done there, but there might be a bit left over if you coordinate with the people who've already worked there.

On the other hand, if your heart is set on the quixotic quest for this one particular huge exponent, then it seems there's not much anyone can say to dissuade you.[/QUOTE]

Thanks for all the information - all of which I was actually aware of, but I understand that you are trying to make sure I'm not approaching this with undue expectations.

I do know that TCP is focussed on exponents much smaller than mine. I'm not expecting to get the factors for the ~500k exponent through this, but my interest in this number has led me to The Cunningham Project, which is probably a more useful goal in the short term.

Furthermore, factoring smaller 3+ Cunningham numbers will help factor the one I'm trying to factor, so it's not a completely disconnected interest.

Dr Sardonicus 2019-03-30 14:21

[QUOTE=lukerichards;512025]Furthermore, factoring smaller 3+ Cunningham numbers will help factor the one I'm trying to factor, so it's not a completely disconnected interest.[/QUOTE] I looked at the factordb page for 3^504206 + 3. Dividing out the factor 3, borrowing the found prime factors from the page, fiddling a bit with a text editor, and writing a mindless Pari-GP script to exhibit the known factors as divisors of cyclotomic factors gives the following:[code]M = [2,2; 61,1; 139627,1; 398581,1; 180117541,1; 630728992609,1; 545454568700731,1; 1254120593677177,1; 62262627532596661,1; 85741124649607123,1; 105919308797935444986721,1];[/code]

[code]{
  v = divisors(504205); a = #v; cf = vector(a);
  \\ cf[i] = prod over d|m of (3^d+1)^moebius(m/d): the primitive part for m = v[i]
  for(i=1,a, f=1; m=v[i]; fordiv(m,d, f *= (3^d+1)^moebius(m/d)); cf[i]=f);
  r = #M[,1];
  for(i=1,a, q=cf[i];
    for(j=1,r, g=gcd(q, M[j,1]^M[j,2]); if(g>1, print(v[i]" "g); q=q/g));
    if(q>1, l=1+floor(log(q)/log(10)); print(v[i]" C"l)));
}[/code]

[code]1 4
5 61
13 398581
65 105919308797935444986721
7757 139627
7757 1254120593677177
7757 85741124649607123
7757 C3664
38785 180117541
38785 C14795
100841 630728992609
100841 C44395
504205 545454568700731
504205 62262627532596661
504205 C177595[/code]

Judging by the results for 9^252103 + 1, I would imagine the C's have been ECM'd to the point of all but excluding any additional factors < 10^40. So I imagine "will help" has become "has helped all it can." You might try ECMing the remaining C's to look for somewhat larger factors. You might get lucky and find one or more. You might even get luckier, and find a large PRP cofactor (but don't use the base 3 for a PRP test with these numbers). The [i]smallest[/i] remaining cofactor is C3664.

Good luck...

chris2be8 2019-03-30 17:13

[QUOTE=lukerichards;511836] My aim is to factor 3[SUP]504206[/SUP]+1 and 3[SUP]504205[/SUP]+1 sufficiently [B]in my lifetime[/B] to provide a primality proof of 3[SUP]504206[/SUP]+2.[/QUOTE]

The only reasonable hope with foreseeable technology is that someone builds a quantum computer able to factor the algebraic factors of 3[SUP]504205[/SUP]+1. Since the largest factor is 44395 digits, it will need about 148000 qubits (*after* error correction). Even if that's possible it will be *very* expensive, and I've no idea how long it would take to build.
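
The 148000 figure is essentially the bit length of a 44395-digit number, i.e. roughly one logical qubit per bit; actual qubit counts for Shor's algorithm vary by construction, so treat it as a rough lower bound. In Pari-GP:

[code]ceil(44395 * log(10) / log(2))  \\ = 147477, i.e. roughly 148000[/code]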

Without that you need several miraculously lucky ECM hits or a mathematical breakthrough in factoring.

Chris

lukerichards 2019-04-01 20:51

Anyone reading this thread would be forgiven for thinking nobody wants me to contribute to the Cunningham Project.

penlu 2019-04-02 12:16

This thread reads to me like people telling you not to expect to find factors of 3^504205+1 or 3^504206+1. For contributing to the Cunningham project, a number of posters have indicated that signing up for NFS@Home would be the best way to go.

As for factoring 3^504205+1 and 3^504206+1, Dr. Sardonicus's post is helpful: if you take a look at the corresponding factordb pages, you can find targets for ECM:
[url]http://factordb.com/index.php?id=1100000001124804267[/url]
[url]http://factordb.com/index.php?id=1100000001122158773[/url]

I'd offer to help ECM some of the smaller remaining composites up to 60 digits... but not for free: as payment, I want an explanation as to why you want this particular number factored.

Edit: though I'll warn you first that going up to 60 digits on C1996 (the smallest composite amongst those remaining) has a factor-finding probability lower than the probability of grabbing two random hydrogen atoms from the universe, with replacement, and then discovering that they were the same.

Dr Sardonicus 2019-04-02 12:39

[QUOTE=penlu;512434]This thread reads to me like people telling you not to expect to find factors of 3^504205+1 or 3^504206+1. For contributing to the Cunningham project, a number of posters have indicated that signing up for NFS@Home would be the best way to go.
I'd offer to help ECM some of the smaller remaining composites up to 60 digits... but not for free: as payment, I want an explanation as to why you want this particular number factored.[/QUOTE]
This may be found in a 2018-04-25, 13:16 post [url=https://www.mersenneforum.org/showpost.php?p=486161&postcount=24]here[/url]:[quote]I'm working on factoring because I'm seeking to prove the primality of 3^504206+2.[/quote] Calling this number N, we have N - 1 = 3^504206 + 1, and N + 1 = 3*(3^504205 + 1). And, of course, factoring N - 1 and/or N + 1 "sufficiently" can give a primality proof.

So the [i]reason[/i] for wanting these factorizations is clear enough. I'm sure having at least one of the smaller remaining composites ECM'ed beyond current limits would be appreciated.
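
A quick sanity check of that algebra in Pari-GP, with a small stand-in exponent since the real one is half a million (the identities hold for any exponent):

[code]e = 6;  \\ stand-in for 504206
N = 3^e + 2;
print(N - 1 == 3^e + 1);            \\ 1, i.e. true
print(N + 1 == 3 * (3^(e-1) + 1));  \\ 1, i.e. true[/code]

"Sufficiently" means, for the usual BLS-style N-1/N+1 tests, that the fully factored part needs to be a sizable fraction of N (roughly a cube root or more, with exact thresholds depending on the test) - hence the interest in those huge cofactors.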

penlu 2019-04-02 14:30

At risk of being the 5-year-old with the "why"s, I want to know why the poster wants to know that that number is prime...

Dr Sardonicus 2019-04-02 15:06

[QUOTE=penlu;512444]At risk of being the 5-year-old with the "why"s, I want to know why the poster wants to know that that number is prime...[/QUOTE]Beyond that it tested as a PRP, I don't know.

There are, of course, methods for proving N prime other than those involving the factorization of N - 1 or N + 1. It is possible that one or more of them might be better suited to the purpose than the factorization approach.

In the almost year-old thread, and in subsequent threads, it has been pointed out that the remaining composites are [i]much[/i] larger than composites on "most-wanted" lists. [url=http://homes.cerias.purdue.edu/~ssw/cun/want135]This one[/url] is for remaining factors of Cunningham numbers.

The "most wanted" is 2^1207 - 1, a C337.

lukerichards 2019-04-02 15:11

[QUOTE=penlu;512444]At risk of being the 5-year-old with the "why"s, I want to know why the poster wants to know that that number is prime...[/QUOTE]

It's a good question and one I don't mind answering at all!

So this PRP is one I discovered in March 2018. I'm a high school teacher, so in order to spark some interest in primes among my classes I had all 240568 digits printed on a huge poster and put up in my classroom. It's great; the kids love it. Every now and again a kid takes a real interest in primes, goes away and looks into them, and comes back, having found out that large primes are very hard to prove, to ask if I've proved it.

I always say 'no', but some kids ask me how I'm doing with proving it. Obviously the answer is usually 'not very far', but I like to have something to show for what I'm trying. Saying "it's practically impossible, I've pretty much given up" doesn't give the kind of message to the kids that I want to give them about not giving up etc. (I'm sure others will argue that I should be teaching kids to be realistic about their goals etc, which I do agree with, but it's a nuance that would be lost on teenagers. They would just hear 'I've given up', which I don't believe is a great message to give them.)

GP2 2019-04-02 17:15

[QUOTE=penlu;512434]Edit: though I'll warn you first that going up to 60 digits on C1996 (the smallest composite amongst those remaining) has a factor-finding probability lower than the probability of grabbing two random hydrogen atoms from the universe, with replacement, and then discovering that they were the same.[/QUOTE]

Wait... why so low probability for even a single factor of up to 60 digits?

The [URL="http://factordb.com/index.php?id=1100000001124158887"]C1996[/URL] is (3^4462+1)*10/(3^194+1)/(3^46+1)/135641573315088856753

Surely (3^4462+1) hasn't been ECM'd to anywhere near t=60. I was finding numerous factors for 3+ with prime exponents in low ranges as recently as a few months ago with a mere t=25.

Just one example: the 28-digit factor 3,1637,+,[URL="http://factordb.com/index.php?id=1100000001212313285"]4426039274041115597684571331[/URL] was unknown to FactorDB before December of last year and was also unknown to the Brent tables ([URL="http://myfactors.mooo.com/"]myfactors.mooo.com[/URL]). Not sure who found that one; it wasn't me.

VBCurtis 2019-04-02 20:41

[QUOTE=GP2;512459]Wait... why so low probability for even a single factor of up to 60 digits?
[/QUOTE]
I suspect penlu conflated "find a factor" with "fully factor". The usual heuristic of a 1/n chance to find an n-digit factor should still apply; the catch is that the cofactor is highly likely to be composite, so finding a factor in the 40 to 60 digit range with ECM, while cool, is highly unlikely to help one along the way to "fully factored". No idea where penlu got the 1/10^85ish chance, though; a single newfound factor has a higher chance than that to result in a prime cofactor!
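
For what it's worth, a back-of-envelope reading of that 1/n heuristic in Pari-GP, treating each digit length in the 40-60 window as an independent 1-in-n event and ignoring whether the ECM effort actually reaches the matching t-level:

[code]1 - prod(n = 40, 60, 1 - 1/n)  \\ = 7/20, i.e. about a 35% chance of at least one such factor[/code]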

GP2 2019-04-02 21:59

[QUOTE=lukerichards;512447]Saying "it's practically impossible, I've pretty much given up" doesn't give the kind of message to the kids that I want to give them about not giving up etc. (I'm sure others will argue that I should be teaching kids to be realistic about their goals etc, which I do agree with, but it's a nuance that would be lost on teenagers. They would just hear 'I've given up', which I don't believe is a great message to give them.)[/QUOTE]

You have set yourself a task equivalent to winning the lottery.

If you actually won, it would be due to sheer incredible luck and not hard work or perseverance. And that would still be true even if you'd been diligently spending £100 on lottery tickets every week without fail for your entire life.

So how does that send a positive message?

The lottery analogy breaks down somewhat because technological advances will steadily improve the odds over time, as others have pointed out. But probably not enough to make a difference in your lifetime, unless you make a fortune and spend it all on your deathbed.

penlu 2019-04-03 04:38

[QUOTE=VBCurtis;512474]I suspect penlu conflated "find a factor" with "fully factor". The usual heuristic of a 1/n chance to find an n-digit factor should still apply; the catch is that the cofactor is highly likely to be composite, so finding a factor in the 40 to 60 digit range with ECM, while cool, is highly unlikely to help one along the way to "fully factored". No idea where penlu got the 1/10^85ish chance, though; a single newfound factor has a higher chance than that to result in a prime cofactor![/QUOTE]

Big mistake: I was looking at the probability of being about-60-digit-smooth, obviously far smaller than the factor-finding probability. And I'm off by thirty orders of magnitude even then. No idea why all of that didn't trip all kinds of ballpark violation alarms in my head...

lukerichards 2019-04-03 07:34

[QUOTE=GP2;512480]You have set yourself a task equivalent to winning the lottery.

If you actually won, it would be due to sheer incredible luck and not hard work or perseverance. And that would still be true even if you'd been diligently spending £100 on lottery tickets every week without fail for your entire life.

So how does that send a positive message?[/QUOTE]

Because I do not live in the real world between the hours of 8.30 and 15.30. I live in the world inhabited by teenagers, who are not experts in maths or number theory and are just starting out on their mathematical journey. You or I might know the futility of this endeavour, we might understand the true scale of the number we're talking about here, but at this point they don't. They see a number, and a role model they look up to who has faced a difficult problem but hasn't yet given up on it.

I do completely get what you are saying though - there's a message about being realistic about our challenges and setting achievable goals which is also important. However, it's important to realise and appreciate that as a high school teacher the biggest challenge we often face is "this is difficult, I give up", and I'm modelling the behaviour I want to see from them.

DukeBG 2019-04-03 08:33

I, for one, approve of your stance and your desire to prove this number, no matter how futile and improbable that is. Though I don't really like kids, this not-giving-up example is important for a teacher.

But yeah, as others said, helping the Cunningham project doesn't directly or even indirectly help unless you're working with ECM on the specific remaining cofactors (C1996, C2329, C10253 and maybe C225698 too from 3^504206+1, and C3664, C14795, C44395 and maybe C177595 too from 3^504205+1), which the Cunningham project currently doesn't do and won't for probably a long time. [url=https://www.mersenneforum.org/showpost.php?p=512453&postcount=33]This reply in another thread[/url] was a good outline of chances and directions for the hobby project's future.

Dr Sardonicus 2019-04-03 13:08

Just by way of "not giving up," it occurred to me to wonder if you had checked, for your PRP number N = 3^504206 + 2 and some base b,

gcd(N, Mod(b,N)^((N-1)/f) - 1), for

f = each known prime factor of 3^504206 + 1, and also (just for the sake of thoroughness)

f = each composite cofactor with no known proper factors.
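
A minimal Pari-GP sketch of that test, using a toy stand-in (N = 3^4 + 2 = 83, which happens to be prime) since the real N has 240568 digits; in the real case f would range over the known prime factors and composite cofactors listed earlier, rather than over divisors of N - 1:

[code]N = 3^4 + 2;  \\ toy stand-in for N = 3^504206 + 2
b = 2;        \\ an arbitrary base
{
  fordiv(N - 1, f,
    if(isprime(f),
      g = gcd(lift(Mod(b, N)^((N-1)/f) - 1), N);
      \\ a result with 1 < g < N would expose a nontrivial factor of N
      print(f, ": gcd = ", g)));
}[/code]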

lukerichards 2019-04-03 15:20

[QUOTE=Dr Sardonicus;512529]Just by way of "not giving up," it occurred to me to wonder if you had checked, for your PRP number N = 3^504206 + 2 and some base b,

gcd(N, Mod(b,N)^((N-1)/f) - 1), for

f = each known prime factor of 3^504206 + 1, and also (just for the sake of thoroughness)

f = each composite cofactor with no known proper factors.[/QUOTE]

No - some time ago I wrote a little program in Python to test something to do with GCDs, but it wasn't this. Thanks for the suggestion.

lukerichards 2019-04-04 06:54

[QUOTE=penlu;512434]I'd offer to help ECM some of the smaller remaining composites up to 60 digits... but not for free: as payment, I want an explanation as to why you want this particular number factored.

Edit: though I'll warn you first that going up to 60 digits on C1996 (the smallest composite amongst those remaining) has a factor-finding probability lower than the probability of grabbing two random hydrogen atoms from the universe, with replacement, and then discovering that they were the same.[/QUOTE]

Only just spotted this - thanks for the very kind offer, which will be gladly accepted if you're satisfied with my explanation of why I'm doing this. It would be a great lesson for my students about asking for help etc and working collaboratively. No worries if you think you have better things to spend your CPU cycles on though :)

lukerichards 2019-04-04 07:07

How does the Cunningham Project function?
 
I'll post this here; mods may decide this is worthy of a separate thread and can therefore move it as appropriate.

I've just re-read all the posts on this thread to try to make sure that the answer hadn't already been given. I've managed to glean a few key points:
[LIST]
[*]As we know, Cunningham numbers are [TEX]b^n\pm1[/TEX]
[*]The Cunningham Project seeks to factor these numbers
[*]The Cunningham tables are a list of factors of these numbers ordered by exponent
[*]Numbers are generally referred to in the form b,n,+/- where, for example 3,676,+ would refer to [TEX]3^{676}+1[/TEX] and 3,676,- would refer to [TEX]3^{676}-1[/TEX]
[*]Cunningham numbers have already been factored to levels which are relatively easy to do - any further work requires a lot of computing
[*]Those wishing to contribute seem to have a few options: 1) ECM existing known composite factors (expect to get factors no larger than ~60 digits); 2) GNFS or SNFS existing known composites at home; or 3) Download BOINC and join NFS@home
[/LIST]
I have opted to do 3) for now, and I'm close to reaching 100k BOINC credit on this project in the past week. I do however have a few questions, largely of general curiosity, but which may be of interest to anyone looking to join TCP in the future.

1) I'm not 100% clear on what my NFS@home activities are doing. My understanding is that I will not be finding factors myself, but rather doing a lot of the preparatory work to allow others to post-process, which is where the factors will be found. Is that correct?
2) Is NFS@home the most efficient/productive thing to be done by someone who has over $1000 in Google Cloud credit to use up? I'm still on the trial credit tier, so I'm limited to 8 cores per instance and 24 cores total at any one time.
3) If someone (not necessarily me - I'm content with NFS@home at the moment) wanted to do GNFS or SNFS, how would they go about it: acquiring the software, setting it up, choosing the composites to factor, reporting the results etc?

There are other questions I've had over the past few days but can't remember them right now - I'll post back later when I've got them.

jasonp 2019-04-04 11:41

To get started with the factoring software we use, see the Factoring subforum here and ask questions in the NFS@Home subforum (or the NFS@Home message board, though there's barely any activity there).

NFS uses sieving to find relations, and NFS@Home uses the BOINC client to do the sieving. You won't get credit for factors found, because finding the factors requires piling all of the relations in one place and performing extremely complex postprocessing on them. You can of course volunteer to run NFS software yourself to do the postprocessing, and you will get credit for completed jobs (but not BOINC credit, in case that matters to you), but those are jobs that run for weeks and need a pretty big computer.

The postprocessing for the largest jobs (pretty much all the Cunningham numbers) requires national-level computing resources and Greg Childers has an NSF grant to occasionally use big clusters to do the postprocessing. Think 1000-2000 cores with high-speed interconnects, working together on a single linear algebra problem. That isn't anything a hobbyist can reasonably hope to contribute to.

This won't get you closer to factoring your target number, but it's also a valuable lesson for your students that smart people can change their mind because of new things that they learn.

DukeBG 2019-04-04 15:12

[QUOTE=lukerichards;512613][LIST][*]The Cunningham tables are a list of factors of these numbers ordered by exponent[*]Numbers are generally referred to in the form b,n,+/- where, for example 3,676,+ would refer to [TEX]3^{676}+1[/TEX] and 3,676,- would refer to [TEX]3^{676}-1[/TEX][*]Cunningham numbers have already been factored to levels which are relatively easy to do - any further work requires a lot of computing[/LIST][/QUOTE]

Some more points from me to mull over:
[LIST]
[*]Cunningham tables have a fixed largest exponent at which they stop. As you've already seen, I assume, for the tables you are interested in: the 3^n-1 and 3^n+1 tables are for n<=850, and the Aurifeuillian L,M tables for n=6k-3<=1695.
[*]Cunningham numbers with [B]larger[/B] n's than the original tables [b]might[/b] not have received that much heavy factoring. If you go for larger n's you might find some not fully factored at a level that an enthusiast can do by themselves. I did so myself in base 2 and I'm very happy about it.
[*]The numbers you're interested in factoring for your PRP are indeed all way in the "larger n" territory.
[*]NFS@Home does a lot of other types of numbers, not just Cunninghams. As of this writing, out of the numbers being sieved right now in [I]15e Lattice Sieve[/I] there's only 2,2158M, and there are no Cunningham numbers in [I]14e Lattice Sieve[/I]. However, there are also numbers in [URL="https://escatter11.fullerton.edu/nfs/numbers.html"]16e Lattice Sieve V5[/URL] (there are no detailed pages for those, so I don't know which are being sieved right now). I guess, looking at the history there, 3,653+ will be done some time this year, maybe 3,763+ and 3,763- too?
[/LIST]
[i]I'm not 100% clear on what my NFS@home activities are doing.[/i]
I'll say things that jasonp said, but in a different way.

Factoring with NFS consists of three "steps": preparation (finding a poly), sieving, and post-processing. With NFS@Home the first and last steps are done by a few dedicated people, and the general crowd using BOINC is doing the middle part – sieving. Sieving is something that can be parallelized very well. (I'm surprised it's still not done on GPUs.) You contribute "relations" that you [i]sieve out[/i], and they are found in thousands by every work unit you run on BOINC. To do the factoring for a large number (the size of the remaining Cunningham numbers of the Cunningham tables), you need hundreds of millions of those - multiple gigabytes of data. The post-processing consists of [i]filtering[/i] the relations, [i]linear algebra[/i] and [i]square root[/i]. The factors are found in the square root step. The linear algebra is what takes the most time and resources. Filtering is the most annoying one (it can result in "nope, need more sieving"; you might have to do it multiple times to get a better matrix for the linear algebra).

The processing of a smaller number, like a 150-160 digit GNFS, can be done by an enthusiast. Remaining-Cunningham-numbers-of-Cunningham-tables are much larger than that (200-digit GNFS, 270-digit SNFS) and require Big Guns power.
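
For the curious: the square root step works because the whole NFS pipeline is engineered to produce a congruence of squares. The classic toy illustration of that final mechanism in Pari-GP (nothing NFS-specific about it; NFS spends all that sieving and linear algebra just to construct such an x and y for a 200+ digit N):

[code]N = 8051;        \\ toy composite: 83 * 97
x = 90; y = 7;   \\ 90^2 = 8100 = 7^2 + 8051, so x^2 == y^2 (mod N)
gcd(x - y, N)    \\ = 83, a nontrivial factor of N[/code]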

[i]Is NFS@home the most efficient/productive thing to be done[/i]
I mean, depends on your goals? If your goal is to help humanity factor [i]some[/i] numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically – yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham-project numbers there.

[i]If someone wanted to do GNFS or SNFS, how would they go about it[/i]
There is a good thread with an intro to that in one of the subforums, but for some reason I cannot find it right now? Gah. I vaguely remember it was titled something like "welcome to number field sieve", but search by titles doesn't find anything with the word "welcome" right now. It was started this year. It had links for the software and a guide to factoring a 100-digit number yourself to get a feel for the tools.

swellman 2019-04-04 18:01

[url=https://mersenneforum.org/showthread.php?t=23078]How to get started with Factoring[/url]


Other links that may be of interest are at [url=http://cownoise.com]Jonathan Crombie’s page[/url].

Uncwilly 2019-04-04 22:11

[QUOTE=lukerichards;512612]It would be a great lesson for my students about asking for help etc and working collaboratively. No worries if you think you have better things to spend your CPU cycles on though :)[/QUOTE]
Using your project as a teaching tool sounds interesting. You can show how different tools can be brought to bear in factoring attempts.
Dario Alpern's [URL="https://www.alpertron.com.ar/ECM.HTM"]factoring tool[/URL], which switches between methods as it runs, demonstrates changing methods at different levels.

:batalov:

lukerichards 2019-04-05 06:43

[QUOTE=DukeBG;512662]
[i]Is NFS@home the most efficient/productive thing to be done[/i]
I mean, depends on your goals? If your goal is to help humanity factor [i]some[/i] numbers that are in the GNFS/SNFS-feasible range, then yes. If your goal is to help the Cunningham Project specifically – yes, but I guess only run the 16e subproject? If your goal is to factor the composite cofactors mentioned above to prove your number, then no: just keep on ECM-ing them and look into the stuff wblipp wrote in that other thread. There's also another BOINC project, yoyo@home, that does ECM, but there are no Cunningham-project numbers there.[/QUOTE]

Thanks! At present, my immediate goal is just to help the Cunningham Project. I had intended to use my Google Cloud credit to run as many ECM curves as I could on my large composites, but I thought that until I can get a bit more organised on that front, I'll focus on just contributing to TCP for now.

penlu 2019-04-07 09:26

So far I've done around 4k curves at B1=11000000, B2=10*B1 for the C1996.
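
For anyone wanting to reproduce that kind of run with stock GMP-ECM (the input file name here is just an example), this amounts to something like:

[code]ecm -c 4000 11e6 11e7 < c1996.txt[/code]

i.e. 4000 curves at B1=11e6 with B2 forced to 10*B1; left to its own devices GMP-ECM would pick a much larger default B2.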

R.D. Silverman 2019-04-10 12:20

Do you *really* want to help?
 
[QUOTE=penlu;512948]So far I've done around 4k curves at B1=11000000, B2=10*B1 for the C1996.[/QUOTE]

This last number is not currently part of the Cunningham project.

If people *really* want to help:

There are currently 69 unfinished numbers from the 1987 hardcover
edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index
< 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA. (2,2078M, 2,2098L)
Two of them are about to start sieving: (2,2102L, 2,2158M).
One of them is relatively easy: 2,1144+ (exponent divisible by 11)
Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+,
2,1076+, 2,2150M, 2,2158L

They start to get quite a bit harder after that via SNFS. Of course the 2- table
was finished to index 1200, so the rest are all doable, but it would take
a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1136 with B1 = 3G
I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can?
Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?

paulunderwood 2019-04-10 14:23

[QUOTE=R.D. Silverman;513331]This last number is not currently part of the Cunningham project.

If people *really* want to help:

There are currently 69 unfinished numbers from the 1987 hardcover
edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index
< 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA. (2,2078M, 2,2098L)
Two of them are about to start sieving: (2,2102L, 2,2158M).
One of them is relatively easy: 2,1144+ (exponent divisible by 11)
Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+,
2,1076+, 2,2150M, 2,2158L

They start to get quite a bit harder after that via SNFS. Of course the 2- table
was finished to index 1200, so the rest are all doable, but it would take
a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1136 with B1 = 3G
I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can?
Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?[/QUOTE]

Is there a server to which I can attach an AMD 1090T or two? If so what do I need? Getting GMP-ECM under Debian is no problem. But what about client scripts?

lukerichards 2019-04-10 14:56

[QUOTE=paulunderwood;513346]Is there a server to which I can attach an AMD 1090T or two? If so what do I need? Getting GMP-ECM under Debian is no problem. But what about client scripts?[/QUOTE]

+1 to this

pinhodecarlos 2019-04-10 16:58

[QUOTE=paulunderwood;513346]Is there a server to which I can attach an AMD 1090T or two? If so what do I need? Getting GMP-ECM under Debian is no problem. But what about client scripts?[/QUOTE]

I’ll forward you the Linux ecm server so you can set up one for us.

xilman 2019-04-10 20:16

I could very easily set up a v2 ECMNET server for Cunningham project numbers if anyone wants one. Two are already running here, one for my GCW project and another for Jon's HCN project.

I do not know whether GPU-enabled clients exist but GMP-ECM cpu clients have been available for many years.

swellman 2019-04-10 21:43

[QUOTE=R.D. Silverman;513331]This last number is not currently part of the Cunningham project.

If people *really* want to help:

There are currently 69 unfinished numbers from the 1987 hardcover
edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index
< 1200 for 2,n+ and index < 2400 for 2LM.

Two of them have been sieved and are waiting for LA. (2,2078M, 2,2098L)
Two of them are about to start sieving: (2,2102L, 2,2158M).
One of them is relatively easy: 2,1144+ (exponent divisible by 11)
Several more are "within reach" of NFS@Home: 2,1063+, 2,2126M, 2,1072+,
2,1076+, 2,2150M, 2,2158L

They start to get quite a bit harder after that via SNFS. Of course the 2- table
was finished to index 1200, so the rest are all doable, but it would take
a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1136 with B1 = 3G
I will finish the rest of 2,4k+ up to 1200 in about 6 months.

How about a very large ECM effort to pick off as many of the rest as we can?
Note that because they are base 2, they are particularly efficient for GMP-ECM.

Perhaps yoyo might tackle these with B1 = 850M?[/QUOTE]

Yoyo has indicated his interest in this effort. He’s looking for some composites to run - any suggested list to get things started? I’m assuming we are limiting ECM to B1=850M for now.

Obviously some coordination will be required with the ECMNET effort to avoid overlap.

swellman 2019-04-10 23:58

Here is a list of 29 composites I found over at cownoise, all remaining composites of form 2^n+1 where n<1200.

1037 - sieved, awaiting LA (16f)
1052* - sieved, awaiting LA (16f)
1063 - within reach of NFS@Home
1072* - within reach of NFS@Home
1076* - within reach of NFS@Home
1084*
1087 - ECM being run to t65 (yoyo@Home) - p64 found by yoyo
1091
1097
1108*
1109
1115
1123
1124*
1129
1135
1136*
1139
1144* - divisible by 11, relatively easy (quintic)
1147
1151
1153
1157
1159
1163
1165
1168*
1180*
1187

*All n=4K being ECM’d by RD Silverman for 1000 curves @B1=3G.

frmky 2019-04-11 05:31

I'll prioritize these for linear algebra. I'll do 2,1037+ next. 2,2078M is sieved and ready for LA if anyone wants to try their hand at a 70M x 70M 35GB matrix. :smile:

lukerichards 2019-04-11 11:43

[QUOTE=frmky;513396]I'll prioritize these for linear algebra. I'll do 2,1037+ next. 2,2078M is sieved and ready for LA if anyone wants to try their hand at a 70M x 70M 35GB matrix. :smile:[/QUOTE]

My understanding from previous posts in this thread is that the LA stage involves a huge amount of processing resource and is practically impossible for us mere mortals - is that correct?

VBCurtis 2019-04-11 15:33

[QUOTE=lukerichards;513401]My understanding from previous posts in this thread is that the LA stage involves a huge amount of processing resource and is practically impossible for us mere mortals - is that correct?[/QUOTE]

If you peruse the 15e thread in the NFS@ home subforum, you'll get an idea of the combination of hardware & time required for various matrices.

The 35GB of data has to fit in memory with some room to spare; it's possible this job would fit on a 48GB-RAM machine, but not guaranteed. The largest matrix I've solved is 41M x 41M, and that took ~800hr on a 10-core Xeon. Time required scales roughly with the square of the dimension, so 800 * (70/41)^2 ≈ 2330 hours, or about 100 days on a 10-core machine, to solve this one. The job doesn't scale perfectly with the number of cores, so a quad-core with 64GB ought to take slightly less than 250 days to solve it.
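The same scaling arithmetic in Pari-GP, for anyone wanting to plug in a different matrix size (constants taken from the 41M run quoted above):

[code]800 * (70/41)^2 / 24.  \\ ~97 days on the same 10-core machine[/code]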
I believe swellman solved a 48M matrix in 2017 on a quad-core w/32GB; it took 5 or 6 months. I don't think he enjoyed it.
Thus, frmky's joke.

DukeBG 2019-04-11 17:49

[QUOTE=swellman;513385]Here is a list of 29 composites I found over at cownoise, all remaining composites of form 2^n+1 where n<1200.

1037 - sieved, awaiting LA (16f)
1052* - sieved, awaiting LA (16f)
1063 - within reach of NFS@Home
1072* - within reach of NFS@Home
1076* - within reach of NFS@Home
1084*
1087
1091
1097
1108*
1109
1115
1123
1124*
1129
1135
1136*
1139
1144* - divisible by 11, relatively easy (quintic)
1147
1151
1153
1157
1159
1163
1165
1168*
1180*
1187

*All n=4K being ECM’d by RD Silverman for 1000 curves @B1=3G.[/QUOTE]
Not that anyone asked, but since I have the data at hand, I can make a post with all the remaining digit sizes.

The current Cunningham Project limit for base 2 is 1300 (2600 for the 2LM table)

[FONT="Courier New"]Table 2- Factorizations of 2^n-1, n odd, n<1300

2,1207- [url=http://factordb.com/index.php?id=1100000000002356256]C337[/url] (7121450524...71)
2,1213- [url=http://factordb.com/index.php?id=1100000000773077743]C297[/url] (6022881435...11)
2,1217- [url=http://factordb.com/index.php?id=1100000000630500108]C248[/url] (1599862690...13)
2,1229- [url=http://factordb.com/index.php?id=1100000000642992619]C284[/url] (5339295584...87)
2,1231- [url=http://factordb.com/index.php?id=1100000000189314197]C329[/url] (1050967524...39)
2,1237- [url=http://factordb.com/index.php?id=1100000000226529793]C303[/url] (9323469976...37)
2,1243- [url=http://factordb.com/index.php?id=1100000000000211664]C337[/url] (7124875134...91)
2,1249- [url=http://factordb.com/index.php?id=1100000000216804210]C326[/url] (8547356648...69)
2,1253- [url=http://factordb.com/index.php?id=1100000000026593009]C268[/url] (2761303291...69)
2,1255- [url=http://factordb.com/index.php?id=1100000000475540293]C220[/url] (7728972831...21)
2,1259- [url=http://factordb.com/index.php?id=1100000000630500118]C309[/url] (8826461643...09)
2,1265- [url=http://factordb.com/index.php?id=1100000000035238098]C223[/url] (3177637419...71)
2,1277- [url=http://factordb.com/index.php?id=1000000000000001277]C385[/url] (2601983048...71)
2,1283- [url=http://factordb.com/index.php?id=1100000000007485007]C347[/url] (3451567269...53)
2,1291- [url=http://factordb.com/index.php?id=1100000000013035391]C348[/url] (5077368744...11)
2,1297- [url=http://factordb.com/index.php?id=1100000000670267433]C302[/url] (4344219763...47)


Table 2+ Factorizations of 2^n+1, n odd, n<1300

2,1037+ [url=http://factordb.com/index.php?id=1100000000298972614]C209[/url] (9704276083...31)
2,1063+ [url=http://factordb.com/index.php?id=1100000000016576802]C281[/url] (1096893725...57)
2,1087+ [url=http://factordb.com/index.php?id=1100000000017307785]C276[/url] (5346580396...69)
2,1091+ [url=http://factordb.com/index.php?id=1100000000017307795]C307[/url] (2117208798...47)
2,1097+ [url=http://factordb.com/index.php?id=1100000000017307831]C288[/url] (4601819937...49)
2,1109+ [url=http://factordb.com/index.php?id=1100000000303118455]C225[/url] (1264518768...61)
2,1115+ [url=http://factordb.com/index.php?id=1100000000193459927]C253[/url] (7846343024...91)
2,1123+ [url=http://factordb.com/index.php?id=1100000000000421608]C338[/url] (3798077969...03)
2,1129+ [url=http://factordb.com/index.php?id=1100000000001662828]C330[/url] (4588925133...73)
2,1135+ [url=http://factordb.com/index.php?id=1100000000212379315]C223[/url] (6339171561...91)
2,1139+ [url=http://factordb.com/index.php?id=1100000000212379330]C248[/url] (8822461361...43)
2,1147+ [url=http://factordb.com/index.php?id=1100000000001501351]C317[/url] (6354612555...19)
2,1151+ [url=http://factordb.com/index.php?id=1100000000379549018]C236[/url] (2374137574...11)
2,1153+ [url=http://factordb.com/index.php?id=1100000000017308105]C306[/url] (2151805224...51)
2,1157+ [url=http://factordb.com/index.php?id=1100000000212379365]C270[/url] (1867094354...71)
2,1159+ [url=http://factordb.com/index.php?id=1100000000002266129]C318[/url] (1654131320...79)
2,1163+ [url=http://factordb.com/index.php?id=1100000000017308167]C297[/url] (5719568944...97)
2,1165+ [url=http://factordb.com/index.php?id=1100000000019397463]C217[/url] (3213773553...11)
2,1187+ [url=http://factordb.com/index.php?id=1100000000017308297]C334[/url] (1118268083...61)
2,1201+ [url=http://factordb.com/index.php?id=1100000000000481647]C325[/url] (1526708784...07)
2,1205+ [url=http://factordb.com/index.php?id=1100000000508735056]C232[/url] (2468529119...71)
2,1213+ [url=http://factordb.com/index.php?id=1100000000216843466]C282[/url] (2891280194...57)
2,1223+ [url=http://factordb.com/index.php?id=1100000000627386690]C297[/url] (1606513949...59)
2,1231+ [url=http://factordb.com/index.php?id=1100000000017308470]C358[/url] (2169916752...99)
2,1241+ [url=http://factordb.com/index.php?id=1100000000212377711]C279[/url] (5643972046...79)
2,1249+ [url=http://factordb.com/index.php?id=1100000000026596219]C334[/url] (1792058696...99)
2,1259+ [url=http://factordb.com/index.php?id=1100000000003757108]C379[/url] (3308592540...63)
2,1261+ [url=http://factordb.com/index.php?id=1100000000193100201]C302[/url] (4863988444...93)
2,1271+ [url=http://factordb.com/index.php?id=1100000000627386683]C314[/url] (6283241362...79)
2,1273+ [url=http://factordb.com/index.php?id=1100000000026622049]C329[/url] (2870340808...79)
2,1283+ [url=http://factordb.com/index.php?id=1100000000004495011]C371[/url] (1173232881...51)
2,1285+ [url=http://factordb.com/index.php?id=1100000000189635930]C292[/url] (2684912289...11)
2,1289+ [url=http://factordb.com/index.php?id=1100000000193100141]C334[/url] (5364703591...53)
2,1291+ [url=http://factordb.com/index.php?id=1100000000032219944]C284[/url] (1783306340...37)
2,1297+ [url=http://factordb.com/index.php?id=1100000000627386559]C330[/url] (1711626162...41)

Table 2LM Factorizations of 2^n+1, n=4k-2, n<2600

2,2078M [url=http://factordb.com/index.php?id=1100000000002471198]C313[/url] (1178136172...93)
2,2098L [url=http://factordb.com/index.php?id=1100000000003064013]C299[/url] (1565791882...09)
2,2102L [url=http://factordb.com/index.php?id=1100000000026611363]C282[/url] (1237617434...41)
2,2126M [url=http://factordb.com/index.php?id=1100000000193496017]C219[/url] (7433401681...13)
2,2150M [url=http://factordb.com/index.php?id=1100000000193461294]C228[/url] (2311441782...01)
2,2158M [url=http://factordb.com/index.php?id=1100000000193495810]C193[/url] (1199517686...77)
2,2158L [url=http://factordb.com/index.php?id=1100000000193495812]C296[/url] (8048225928...81)
2,2162M [url=http://factordb.com/index.php?id=1100000000615982464]C236[/url] (2209092210...57)
2,2162L [url=http://factordb.com/index.php?id=1100000000193461289]C258[/url] (9181494251...09)
2,2174L [url=http://factordb.com/index.php?id=1100000000004659064]C273[/url] (1081193285...97)
2,2174M [url=http://factordb.com/index.php?id=1100000000004659082]C309[/url] (4703499248...37)
2,2194M [url=http://factordb.com/index.php?id=1100000000216821031]C301[/url] (1377276963...17)
2,2194L [url=http://factordb.com/index.php?id=1100000000216821032]C304[/url] (4253349343...13)
2,2206L [url=http://factordb.com/index.php?id=1100000000216844440]C243[/url] (3853916779...29)
2,2206M [url=http://factordb.com/index.php?id=1100000000216845375]C256[/url] (9617295416...81)
2,2210M [url=http://factordb.com/index.php?id=1100000000193461046]C211[/url] (1035494967...61)
2,2222L [url=http://factordb.com/index.php?id=1100000000193495835]C228[/url] (4981822942...73)
2,2222M [url=http://factordb.com/index.php?id=1100000000193495836]C289[/url] (3843155399...57)
2,2230M [url=http://factordb.com/index.php?id=1100000000193495838]C225[/url] (1566334536...61)
2,2246M [url=http://factordb.com/index.php?id=1100000000921160018]C221[/url] (2402338719...37)
2,2246L [url=http://factordb.com/index.php?id=1100000000212379294]C253[/url] (2764013567...57)
2,2266L [url=http://factordb.com/index.php?id=1100000000193460804]C255[/url] (8936973078...81)
2,2278M [url=http://factordb.com/index.php?id=1100000000193460678]C234[/url] (1247586626...89)
2,2278L [url=http://factordb.com/index.php?id=1100000000001983375]C289[/url] (1585908218...13)
2,2302L [url=http://factordb.com/index.php?id=1100000000026616046]C293[/url] (1555795129...13)
2,2306L [url=http://factordb.com/index.php?id=1100000000026616132]C287[/url] (6466232365...21)
2,2318M [url=http://factordb.com/index.php?id=1100000000193460529]C296[/url] (4674619356...97)
2,2330L [url=http://factordb.com/index.php?id=1100000000193495859]C207[/url] (3343774377...21)
2,2330M [url=http://factordb.com/index.php?id=1100000000193495862]C210[/url] (2019167005...61)
2,2342M [url=http://factordb.com/index.php?id=1100000000265572960]C291[/url] (6587796139...73)
2,2350M [url=http://factordb.com/index.php?id=1100000000193460415]C248[/url] (8840362519...01)
2,2354M [url=http://factordb.com/index.php?id=1100000000193495883]C271[/url] (5618608313...33)
2,2354L [url=http://factordb.com/index.php?id=1100000000193495892]C314[/url] (5780062512...97)
2,2374L [url=http://factordb.com/index.php?id=1100000000026617819]C309[/url] (8460204308...81)
2,2378L [url=http://factordb.com/index.php?id=1100000000001838308]C305[/url] (2735008348...13)
2,2386L [url=http://factordb.com/index.php?id=1100000000626848817]C248[/url] (1327560990...73)
2,2390M [url=http://factordb.com/index.php?id=1100000000193495905]C260[/url] (1310349067...21)
2,2390L [url=http://factordb.com/index.php?id=1100000000193495906]C273[/url] (8659017743...01)
2,2398M [url=http://factordb.com/index.php?id=1100000000193460223]C326[/url] (3341217650...81)
2,2402L [url=http://factordb.com/index.php?id=1100000000748540236]C231[/url] (2790265208...53)
2,2402M [url=http://factordb.com/index.php?id=1100000000032222825]C340[/url] (7111773792...41)
2,2410L [url=http://factordb.com/index.php?id=1100000000032408882]C290[/url] (3118500483...01)
2,2414L [url=http://factordb.com/index.php?id=1100000000626070368]C269[/url] (7683343371...21)
2,2414M [url=http://factordb.com/index.php?id=1100000000475540367]C312[/url] (1831972383...69)
2,2426L [url=http://factordb.com/index.php?id=1100000000189633011]C355[/url] (1089480867...17)
2,2426M [url=http://factordb.com/index.php?id=1100000000004657771]C366[/url] (1410537837...21)
2,2434L [url=http://factordb.com/index.php?id=1100000000032222982]C323[/url] (7621004892...81)
2,2434M [url=http://factordb.com/index.php?id=1100000000032222981]C324[/url] (1127841823...61)
2,2438M [url=http://factordb.com/index.php?id=1100000000475540377]C233[/url] (4718965258...33)
2,2438L [url=http://factordb.com/index.php?id=1100000000748546164]C256[/url] (2515912720...49)
2,2446L [url=http://factordb.com/index.php?id=1100000000001686264]C359[/url] (4789884597...73)
2,2462M [url=http://factordb.com/index.php?id=1100000000625994652]C292[/url] (6451621604...29)
2,2474L [url=http://factordb.com/index.php?id=1100000000748526538]C294[/url] (1873722136...41)
2,2482M [url=http://factordb.com/index.php?id=1100000000475540392]C292[/url] (3294126177...13)
2,2482L [url=http://factordb.com/index.php?id=1100000000475540390]C292[/url] (3916976792...69)
2,2494M [url=http://factordb.com/index.php?id=1100000000755075795]C221[/url] (7598363413...61)
2,2494L [url=http://factordb.com/index.php?id=1100000000005344831]C340[/url] (8607642141...01)
2,2498M [url=http://factordb.com/index.php?id=1100000000755075797]C271[/url] (2060995209...89)
2,2498L [url=http://factordb.com/index.php?id=1100000000625994646]C318[/url] (1853502750...17)
2,2506M [url=http://factordb.com/index.php?id=1100000000032304300]C309[/url] (1510975913...21)
2,2510M [url=http://factordb.com/index.php?id=1100000000032409442]C293[/url] (4174790960...61)
2,2510L [url=http://factordb.com/index.php?id=1100000000032409444]C295[/url] (2449903066...61)
2,2518L [url=http://factordb.com/index.php?id=1100000000003239406]C371[/url] (7020354441...13)
2,2522L [url=http://factordb.com/index.php?id=1100000000035435504]C312[/url] (4293040311...21)
2,2534M [url=http://factordb.com/index.php?id=1100000000475540423]C265[/url] (2620357157...21)
2,2534L [url=http://factordb.com/index.php?id=1100000000032305773]C314[/url] (1957448198...81)
2,2542M [url=http://factordb.com/index.php?id=1100000000034999199]C264[/url] (8011977902...09)
2,2542L [url=http://factordb.com/index.php?id=1100000000000517620]C354[/url] (6954628346...61)
2,2546M [url=http://factordb.com/index.php?id=1100000000002573996]C334[/url] (1496566985...77)
2,2546L [url=http://factordb.com/index.php?id=1100000000418486850]C334[/url] (1698885675...49)
2,2554M [url=http://factordb.com/index.php?id=1100000000475540425]C329[/url] (7956086727...49)
2,2554L [url=http://factordb.com/index.php?id=1100000000000641062]C379[/url] (8489756590...41)
2,2558L [url=http://factordb.com/index.php?id=1100000000475540427]C344[/url] (1229865189...81)
2,2558M [url=http://factordb.com/index.php?id=1100000000475540429]C364[/url] (1815491966...53)
2,2566L [url=http://factordb.com/index.php?id=1100000000905222122]C221[/url] (6490858084...89)
2,2570M [url=http://factordb.com/index.php?id=1100000000755075801]C252[/url] (8691610207...61)
2,2570L [url=http://factordb.com/index.php?id=1100000000032409784]C290[/url] (1823682014...61)
2,2578L [url=http://factordb.com/index.php?id=1100000000475540433]C337[/url] (7911714954...41)
2,2582L [url=http://factordb.com/index.php?id=1100000000642813794]C327[/url] (1095239775...53)
2,2582M [url=http://factordb.com/index.php?id=1100000000475540437]C360[/url] (2136826433...41)
2,2586L [url=http://factordb.com/index.php?id=1100000000475540439]C224[/url] (1151763494...21)
2,2594M [url=http://factordb.com/index.php?id=1100000000000819303]C390[/url] (5456753954...97)

Table 2+(4k) Factorizations of 2^n+1, n=4k, n<=1300

2,1052+ [url=http://factordb.com/index.php?id=1100000000001686553]C300[/url] (1254621486...77)
2,1072+ [url=http://factordb.com/index.php?id=1100000000019399363]C271[/url] (1432415504...21)
2,1076+ [url=http://factordb.com/index.php?id=1100000000017307719]C238[/url] (1786536523...93)
2,1084+ [url=http://factordb.com/index.php?id=1100000000001960238]C318[/url] (2160891904...97)
2,1108+ [url=http://factordb.com/index.php?id=1100000000017307886]C271[/url] (1601823292...93)
2,1124+ [url=http://factordb.com/index.php?id=1100000000017307953]C311[/url] (4366370736...77)
2,1136+ [url=http://factordb.com/index.php?id=1100000000049642113]C247[/url] (1373091589...09)
2,1144+ [url=http://factordb.com/index.php?id=1100000000019398026]C274[/url] (1007628438...41)
2,1168+ [url=http://factordb.com/index.php?id=1100000000019397534]C326[/url] (1150350247...73)
2,1180+ [url=http://factordb.com/index.php?id=1100000000019397323]C249[/url] (1085900753...61)
2,1208+ [url=http://factordb.com/index.php?id=1100000000017308384]C330[/url] (3232559893...21)
2,1216+ [url=http://factordb.com/index.php?id=1100000000026595686]C328[/url] (1666221151...57)
2,1240+ [url=http://factordb.com/index.php?id=1100000000658874001]C216[/url] (3279089858...41)
2,1256+ [url=http://factordb.com/index.php?id=1100000000625994672]C312[/url] (4853845816...81)
2,1276+ [url=http://factordb.com/index.php?id=1100000000658874134]C287[/url] (2860268892...37)
2,1288+ [url=http://factordb.com/index.php?id=1100000000032307277]C284[/url] (3345785657...93)
2,1292+ [url=http://factordb.com/index.php?id=1100000000189536207]C320[/url] (3855715629...01)[/FONT]

DukeBG 2019-04-11 19:16

[QUOTE=DukeBG;513436]The current Cunningham Project limit is 1300[/QUOTE]

I should stress that while I've posted numbers up to 1300, R.D. Silverman proposed we focus on the previous limit – 1200, so [B]not[/B] all the numbers in my list.

VBCurtis 2019-04-11 23:46

I think 2,1165+ is GNFS at C217. This would make a nice size for a forum-team-sieve, if we can round up enough CADO users willing to pledge CPU cycles.
Rather than clutter this thread, anyone interested can start a new thread for 2,1165+ specifically and we can coordinate ECM / poly select / sieve pledges / etc.
(I didn't check whether Greg already has plans for this number, it may already be reserved to NFS@home)

frmky 2019-04-12 05:09

I don't have 2,1165+ reserved. The forum is welcome to do it as a team project. If you prefer, I'm happy to run the sieving on NFS@Home. Entirely up to y'all.

pinhodecarlos 2019-04-12 05:53

I did point my cores to NFS@Home; there’s a mersenneforum.org team in there.

ET_ 2019-04-12 11:52

[QUOTE=VBCurtis;513465]I think 2,1165+ is GNFS at C217. This would make a nice size for a forum-team-sieve, if we can round up enough CADO users willing to pledge CPU cycles.
Rather than clutter this thread, anyone interested can start a new thread for 2,1165+ specifically and we can coordinate ECM / poly select / sieve pledges / etc.
(I didn't check whether Greg already has plans for this number, it may already be reserved to NFS@home)[/QUOTE]

I could be interested as well.

swellman 2019-04-12 12:15

I will help poly search for 2,1165+.

Max0526 2019-04-12 13:02

C217 2,1165+ poly select
 
Could we please drop an intro message on [URL]https://mersenneforum.org/showthread.php?t=18368&page=159[/URL] about this C217? And post msieve/CADO and ranges for everybody involved? Thank you!

VBCurtis 2019-04-12 16:37

It's not clear that 2,1165+ has been ECM'ed enough to jump into poly select.
That said, since frmky is willing to sieve it in case we don't have enough forum interest for a team sieve, we have every reason to poly select it once ECM is nearly done.
I'm not personally up on how much ECM has been done on which Cunningham numbers; this project gets a lot of attention so I wouldn't be surprised if a t65 has already been done, but we should be sure of that before we spend cycles on poly select.

Using 0.31 * GNFS size as a heuristic for ECM depth, we get 67.27, which is just over 2*t65. I wonder what the Bayesian tool suggests?

Mr Silverman kindly PMed me to point out that there are also 2,2210M C211, 2,2330L C207, and 2,2330M C210 as GNFS targets in the Cunningham project. I personally like 2,1165+ more than these for a forum project (for one, it ought to be a degree 6 poly), but at C217 it's more than twice as hard. I'm willing to get behind whichever candidate gets the best support from the forum.

Max0526 2019-04-12 17:32

next C200+
 
I would vote for 2,2330L C207. It is the smallest, it could potentially be degree 5 or degree 6 for the poly, and we already have a goal for the poly score from two years ago on 5- or 6-side: E = 1.6e-15.

swellman 2019-04-12 19:25

[QUOTE=Max0526;513524]I would vote for 2,2330L C207. It is the smallest, it could potentially be degree 5 or degree 6 for the poly, and we already have a goal for the poly score from two years ago on 5- or 6-side: E = 1.6e-15.[/QUOTE]

Ditto. It’s the smallest GNFS job left in the bunch though I would not call it easy.

Adding the four GNFS composites for ECM to t65. Chasing t65 beyond ~70K curves is not currently planned.

BTW, yoyo just found a [url=http://www.rechenkraft.net/yoyo//y_factors_ecm.php]p64 factor for 2,1087+[/url], which is now fully factored!

lukerichards 2019-04-12 19:33

[QUOTE=VBCurtis;513516]I personally like 2,1165+ more than these for a forum project (for one, it ought to be degree 6 poly).[/QUOTE]

Me too. :tu:

xilman 2019-04-12 19:44

[QUOTE=swellman;513542]BTW, yoyo just found a [url=http://www.rechenkraft.net/yoyo//y_factors_ecm.php]p64 factor for 2,1087+[/url], which is now fully factored![/QUOTE]Yay!

VBCurtis 2019-04-12 22:28

[QUOTE=Max0526;513524]I would vote for 2,2330L C207. It is the smallest, it could potentially be degree 5 or degree 6 for the poly, and we already have a goal for the poly score from two years ago on 5- or 6-side: E = 1.6e-15.[/QUOTE]

Pretty sure the transition from deg 5 to deg 6 lies between 215 and 220 digits. Our poly search for that C206 from XYYX had deg 5 and deg 6 scoring similarly, but the deg 5 performed something like 30% faster; another reminder of jasonp's admonition that scores from different degrees cannot be compared.

We can do C207 as a 34LP project on CADO, and I can host server tasks running I=16 for those with memory to spare and I=15 for those with lower-memory machines (say, 12GB and 3GB available for CADO client). I think I know how to merge the relations files from two runs into something CADO can handle; if not, msieve can likely put up with ~1.2G unique relations for postprocessing.
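
For anyone curious what that merge amounts to, a toy sketch is below; it drops duplicates on the leading "a,b" key of each relation line. (This is my illustration only: at ~1.2G relations a Python set won't fit in RAM, which is why CADO's dup1/dup2 or msieve's own filtering do this in practice.)
[CODE]# Toy duplicate-removal across relation files from two CADO runs.
# CADO relations look like "a,b:primes:primes"; lines starting with '#'
# are comments. Memory-bound: fine for toy inputs, not for 1.2G relations.
import sys

seen = set()
for path in sys.argv[1:]:
    with open(path) as f:
        for line in f:
            if line.startswith('#'):
                continue
            key = line.split(':', 1)[0]   # the "a,b" coordinate pair
            if key not in seen:
                seen.add(key)
                sys.stdout.write(line)
[/CODE]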

I'll set a core or two doing high-bound ECM on 2,2330L; I'll also PM user bdodson to see how much ECM he may have done on these numbers (thanks again to Mr Silverman for the suggestion).

RichD 2019-04-12 22:45

You can count me in for whatever number is picked (I don't care). I have small personal resources. I need to find my CADO install folder or maybe just start over with a new GitHub download. What is the latest release or stable developer version?

VBCurtis 2019-04-12 23:42

I follow the instructions from the CADO-NFS download page for the current development release via git here: [url]http://cado-nfs.gforge.inria.fr/download.html[/url]

There are small issues; for example, the starting Q-value is now tasks.qmin rather than tasks.sieve.qmin, but many of the params files still use the latter, causing an error. Edit the line to delete ".sieve" and the params files run fine.
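
If you have several params files to fix, something like this hypothetical helper (the script is mine, not part of CADO) does the rename in place:
[CODE]# Rename the obsolete tasks.sieve.qmin key to tasks.qmin in the CADO
# params files given on the command line (hypothetical convenience script).
import pathlib
import sys

for name in sys.argv[1:]:
    path = pathlib.Path(name)
    text = path.read_text()
    patched = text.replace("tasks.sieve.qmin", "tasks.qmin")
    if patched != text:
        path.write_text(patched)
        print(f"patched {path}")
[/CODE]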

I'm surprised there is no "official" 3.0 release, but the current git is at minimum 15% faster than 2.3.0 for an entire factorization; I don't have a good way to measure how the las siever application compares now versus 2.3.0.

swellman 2019-06-30 00:13

What next?
 
2,1165+ and 2,2210M are being worked to t65 thanks to Yoyo@Home. They should be fully ECMd in a few weeks. Any preferences on our next ECM target?

3,748+ c204 has been [url=https://www.mersenneforum.org/showthread.php?t=24548]suggested[/url]. Seems like a worthy candidate to me, but the effort so far has been focused on factoring all remaining 2+ (n<1200) and 2LM (n<2400). Do we limit our efforts to only that list? Scope creep has been the downfall of many projects...

I have asked Greg Childers where he is heading with the 16f queue so that we don’t step on each other. He may want our help in some sector. Perhaps he will chime in here.

xilman 2019-06-30 08:16

[QUOTE=swellman;520361]2,1165+ and 2,2210M are being worked to t65 thanks to Yoyo@Home. They should be fully ECMd in a few weeks. Any preferences on our next ECM target?

3,748+ c204 has been [url=https://www.mersenneforum.org/showthread.php?t=24548]suggested[/url]. Seems like a worthy candidate to me, but the effort so far has been focused on factoring all remaining 2+ (n<1200) and 2LM (n<2400). Do we limit our efforts to only that list? Scope creep has been the downfall of many projects...

I have asked Greg Childers where he is heading with the 16f queue so that we don’t step on each other. He may want our help in some sector. Perhaps he will chime in here.[/QUOTE]FWIW, my view is to finish off the traditional Cunninghams first, and then move to the extensions. That is, do the 2+ and 2LM candidates and resist mission creep.

R.D. Silverman 2019-06-30 14:50

[QUOTE=swellman;520361]2,1165+ and 2,2210M are being worked to t65 thanks to Yoyo@Home. They should be fully ECMd in a few weeks. Any preferences on our next ECM target?
[/QUOTE]

Yes: 2,1135+ and 2,1109+, the next two base-2 numbers ordered by size of the cofactors. I skip 2,2126M because it is the same size as 2,1063+ and hence doable by SNFS.

[QUOTE]

3,748+ c204 has been suggested..... <snip> Scope creep has been the downfall of many projects...
[/QUOTE]

Allow me to repeat (with updates) a post I made about 2 months ago:

There are currently 65 unfinished numbers from the 1987 hardcover
edition of the Cunningham book.

It would be nice to finish them. They are all from base 2, with index
< 1200 for 2,n+ and index < 2400 for 2LM.

Three of them have been sieved and are waiting for LA. (2,2102L, 2,2098L, 2,1052+)
Two of them are sieving: (2,2330L, 2,1063+)
One is about to start sieving (2,1072+)
One of them is relatively easy: 2,1144+ (exponent divisible by 11)
Several more are "within reach" of NFS@Home: (2,2126M, 2,1076+, 2,2150M, 2,2158L)
Several are within reach of GNFS (2,2330M, 2,2210L, 2,1165+)

They get quite a bit harder after that via SNFS. Of course the 2- table
was finished to index 1200, so the rest are all doable, but it would take
a massive effort.

I have run an additional 1000 ECM curves on 2,4k+ up to 1144 with B1 = 3G. I will finish the rest of 2,4k+ in about 4 months. (Two more to do; 2,1168+ is in progress.)

How about a very large ECM effort to pick off as many of the rest as we can?

Perhaps yoyo might tackle these with B1 = 850M?
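
For anyone following along, running curves at that level with GMP-ECM looks roughly like this (a sketch only; the curve count is arbitrary and N is a placeholder for the actual decimal cofactor):
[CODE]# Drive GMP-ECM at B1 = 850M from Python. Assumes the "ecm" binary is on
# PATH; -c sets the curve count and the number is read from stdin.
import subprocess

N = "..."  # placeholder: decimal cofactor of, e.g., 2,1135+
subprocess.run(["ecm", "-c", "100", "850e6"], input=N.encode(), check=True)
[/CODE]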

VBCurtis 2019-06-30 16:40

[QUOTE=swellman;520361]3,748+ c204 has been [url=https://www.mersenneforum.org/showthread.php?t=24548]suggested[/url]. Seems like a worthy candidate to me, but the effort so far has been focused on factoring all remaining 2+ (n<1200) and 2LM (n<2400). Do we limit our efforts to only that list? Scope creep has been the downfall of many projects...[/QUOTE]

I view GNFS targets as exceptions to the fear of mission creep, when they're easier than the next target in the main mission. If 3,748+ was under 200 digits, I would favor doing it immediately.

There has been quite a lot of interest in the forum-team 2330L factorization, with sieving likely to take about 110 days start-to-finish. If there isn't enough interest to do a C210+ next, we should do 3,748+ next, and thus should have yoyo do the ECM. Using the C207 as a baseline, that's a 5-month sieving project for 2330M versus 10 weeks or so for the C204.
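
Those estimates are consistent with the folk rule that GNFS effort roughly doubles every ~5 digits at this size; a quick back-of-envelope check (my sketch, nothing more):
[CODE]# Back-of-envelope scaling from the C207 team sieve quoted above.
baseline_days, baseline_digits = 110, 207

def est_days(digits, doubling=5.0):
    # GNFS effort roughly doubles per `doubling` extra digits (folk rule).
    return baseline_days * 2 ** ((digits - baseline_digits) / doubling)

print(est_days(210))   # ~167 days, about 5 months for 2,2330M (C210)
print(est_days(204))   # ~73 days, about 10 weeks for 3,748+ (C204)
[/CODE]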

swellman 2019-06-30 17:17

Mr. Silverman, thank you for the update. My records now reflect the new information. A few clarifications:

- Why skip ECM on 2,2246M? It’s a C221 that seems beyond SNFS.

- Isn’t 2,1157+, with a SNFS difficulty of 322, also within reach of NFS@Home?

- 2,2162L and M both seem similar in difficulty to 2,2158L, i.e. SNFS 325. That is scary difficult, but is either a feasible candidate for NFS@Home?

- Lastly, 2,1139+ pops up as an octic(!) with difficulty 323. Another reachable candidate perhaps, or a complete nonstarter?

Apologies if the above issues have been kicked around for years/decades by the experts, just trying to plan efficiently.

BTW, I had suggested 2,2398M to Ryan Propper for ECM back in April. He was working it but I’ve not been able to contact him since. I would suggest we not put this one through Yoyo@Home, at least not at the t65 level. It appears to be the highest difficulty composite from the 1987 list.

As to 3,748+, a C204 and good GNFS candidate, there is sometimes more to a project than pure analytics; keeping volunteers engaged is one example. It will take years to chew through the remaining list of 40-odd 2+ and 2LM composites. A 2% increase in that time, spent actively doing work other than watching Yoyo slowly run curves, is a huge multiplier in my estimation, especially when that work advances the overall Cunningham project. Just my two cents.

pinhodecarlos 2019-06-30 18:07

Sean, once again, the problem is not the sieving; it is on the linear algebra side.

swellman 2019-06-30 18:19

[QUOTE=pinhodecarlos;520396]Sean, once again, the problem is not the sieving; it is on the linear algebra side.[/QUOTE]

Agreed. But I was only discussing ECM priorities.

pinhodecarlos 2019-06-30 18:26

[QUOTE=swellman;520398]Agreed. But I was only discussing ECM priorities.[/QUOTE]

Ok.

Batalov 2019-06-30 18:27

Here is what we have for the visible future. (ECM can be run for all 147 base-2 targets, regardless)
[CODE]digits  number   SNFS-diff  digits/diff  notes
224 2,2586L 259.4 0.861 /quartic/resvd
228 2,2150M 258.8 0.88 /5q
211 2,2210M 266.1 0.792 /5q
225 2,2230M 268.5 0.837 /5q
253 2,1115+ 268.5 0.942 /5q
223 2,1135+ 273.3 0.815 /5q
299 2,2098L 315.7 0.946 /resvd
282 2,2102L 316.3 0.891 /resvd
300 2,1052+ 316.6 0.947 /resvd
274 2,1144+ 317.9 0.875 /13
219 2,2126M 319.9 0.684
281 2,1063+ 319.9 0.878
270 2,1157+ 321.5 0.839 /13
271 2,1072+ 322.7 0.839
268 2,1253- 323.3 0.827 /7
309 2,2506M 323.3 0.954 /7
238 2,1076+ 323.9 0.734
296 2,2158L 324.8 0.911
236 2,2162M 325.4 0.725
258 2,2162L 325.4 0.792
318 2,1084+ 326.3 0.974
265 2,2534M 326.9 0.809 /7
314 2,2534L 326.9 0.958 /7
273 2,2174L 327.2 0.834
309 2,2174M 327.2 0.944 [/CODE]

VBCurtis 2019-06-30 18:47

It was a surprise to me to learn that the smaller tasks sieved by Greg on 16f are so well-sieved that the matrices can be solved on single machines. For instance, the C206 I am post-processing had both filtering and LA fit in a 32GB machine. That means that with a little patience, anyone with a 48GB+ machine can solve some of Greg's easier matrices for him, allowing him to use his cluster time to solve the ones sized 60M+. I hope there are such jobs available to us and that they're not all too big already.

I plan to do just this with my 64GB box this fall. Perhaps Mr Womack will also run one matrix for NFS@home this year, reducing Greg's backlog by 2 jobs. I still plan to expand my box to 128GB memory around the time 2330L finishes sieving, which would make patience rather than capacity the limiting factor for my own matrix-solving.

Again, C206 LA requires just 6 weeks on my machine; Mr Womack's best machine is likely twice as fast. I'm personally fine with 10 weeks per matrix, which is likely somewhere around a 55M matrix size. There's no reason to think the cluster Greg uses is limited to 100M matrices; at 120M to 140M, SNFS-330 should be covered?

I personally enjoy variety in projects a bit, but mainly I enjoy efforts to optimize tools for peak speed. I've learned quite a bit during the 2330L project, and I'd like to try to apply that knowledge to a C201-C205 before tackling 2330M. Perhaps that means Mr Silverman's advice to leave these C210-C217 GNFS jobs to NFS@home is what we must follow; but I very much enjoy the team-sieve happening now, and I think it has attracted contributors who would not have sieved for NFS@home. With that in mind, I think a CADO team-sieve has merit both for attracting CPU cycles and for advancing the development of CADO parameters for faster factorizations.

R.D. Silverman 2019-06-30 18:48

[QUOTE=swellman;520388]Mr. Silverman, thank you for the update. My records now reflect the new information. A few clarifications:

- Why skip ECM on 2,2246M? It’s a C221 that seems beyond SNFS.
[/QUOTE]

It was (accidentally) listed out of order in my file...

[QUOTE]
- Isn’t 2,1157+, with a SNFS difficulty of 322, also within reach of NFS@Home?
[/QUOTE]

Yep. I missed that 1157 is divisible by 13.

[QUOTE]
- 2,2162L and M both seem similar in difficulty to 2,2158L, i.e. SNFS 325. That is scary difficult, but is either a feasible candidate for NFS@Home?
[/QUOTE]

One must choose a cutoff. I chose 1080 bits. One can usually find a number that is
"just a little bit bigger" to do. The cutoff is a pure guess, based on some informal
remarks made by Greg about the LA.

[QUOTE]
- Lastly, 2,1139+ pops up as an octic(!) with difficulty 323. Another reachable candidate perhaps, or a complete nonstarter?
[/QUOTE]

I think it's out of reach; the LA is problematic, as is sieving with an octic.

R.D. Silverman 2019-06-30 18:54

[QUOTE=VBCurtis;520403]It was a surprise to me to learn that the smaller tasks sieved by Greg on 16f are so well-sieved that the matrices can be solved on single machines. For instance, the C206 I am post-processing had both filtering and LA fit in a 32GB machine. That means that with a little patience, anyone with a 48GB+ machine can solve some of Greg's easier matrices for him, allowing him to use his cluster time to solve the ones sized 60M+. I hope there are such jobs available to us and that they're not all too big already.

[/QUOTE]

I get the impression that the current set of NFS@Home numbers waiting for LA require
> 64GB and a few dozen cores minimum.

I would be [b]very[/b] pleased to find that it is not true and that others might help Greg with the LA.

VBCurtis 2019-06-30 18:55

[QUOTE=R.D. Silverman;520404]One must choose a cutoff. I chose 1080 bits. One can usually find a number that is "just a little bit bigger" to do. The cutoff is a pure guess, based on some informal remarks made by Greg about the LA.[/QUOTE]

Thank you; this addresses my questions about SNFS-330 and what candidates are reasonable for Greg on SNFS.

R.D. Silverman 2019-06-30 19:07

[QUOTE=Batalov;520401]Here is what we have for the visible future. (ECM can be run for all 147 base-2 targets, regardless)
[CODE]digits  number   SNFS-diff  digits/diff  notes
224 2,2586L 259.4 0.861 /quartic/resvd
228 2,2150M 258.8 0.88 /5q
211 2,2210M 266.1 0.792 /5q
225 2,2230M 268.5 0.837 /5q
253 2,1115+ 268.5 0.942 /5q
223 2,1135+ 273.3 0.815 /5q
299 2,2098L 315.7 0.946 /resvd
282 2,2102L 316.3 0.891 /resvd
300 2,1052+ 316.6 0.947 /resvd
274 2,1144+ 317.9 0.875 /13
219 2,2126M 319.9 0.684
281 2,1063+ 319.9 0.878
270 2,1157+ 321.5 0.839 /13
271 2,1072+ 322.7 0.839
268 2,1253- 323.3 0.827 /7
309 2,2506M 323.3 0.954 /7
238 2,1076+ 323.9 0.734
296 2,2158L 324.8 0.911
236 2,2162M 325.4 0.725
258 2,2162L 325.4 0.792
318 2,1084+ 326.3 0.974
265 2,2534M 326.9 0.809 /7
314 2,2534L 326.9 0.958 /7
273 2,2174L 327.2 0.834
309 2,2174M 327.2 0.944 [/CODE][/QUOTE]

Let's finish the original tables before starting index > 1200...

swellman 2019-06-30 19:22

[QUOTE=R.D. Silverman;520404]
I think it's out of reach; the LA is problematic, as is sieving with an octic.[/QUOTE]

I believe Greg recently [url=https://www.mersenneforum.org/showpost.php?p=511744&postcount=259]updated msieve[/url] to better handle octics but I have no idea if those changes scale to composites of this difficulty.

Thanks again for the additional information - I’ve updated my records accordingly, including plans to enqueue 2,2246M for ECM by Yoyo@Home.

Got some good news: Yoyo is allowing me to enqueue the t65 work in blocks of 12000 curves rather than 9000. Hoping 6 submissions complete a bit faster than 8 would in the BOINC environment.

R.D. Silverman 2019-06-30 19:36

[QUOTE=swellman;520409]I believe Greg recently [url=https://www.mersenneforum.org/showpost.php?p=511744&postcount=259]updated msieve[/url] to better handle octics but I have no idea if those changes scale to composites of this difficulty.
[/QUOTE]

When the NFS norms are unbalanced, the result is a much larger matrix. Norms are unbalanced when the degree is smaller or larger than optimal.
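
To illustrate with textbook NFS heuristics (my own sketch; the sieve-region size below is purely illustrative): with m ≈ N^(1/(d+1)) and sieve coordinates of size about A, rational norms grow like A*m while algebraic norms grow like A^d, so a degree far from optimal blows one side up:
[CODE]# Log10 norm sizes for an SNFS-323 number (e.g. 2,1139+) at several
# degrees. logA is an assumed, illustrative sieve-coordinate size.
# Rational norm ~ A * N^(1/(d+1)); algebraic norm ~ A^d (SNFS has tiny
# coefficients). Near-balance at d = 6; the octic d = 8 is lopsided.
logN = 323    # SNFS difficulty quoted above
logA = 9      # assumed log10 of sieve-coordinate size

for d in (5, 6, 7, 8):
    rational = logA + logN / (d + 1)   # log10 |a - b*m|
    algebraic = d * logA               # log10 |F(a,b)|
    print(d, round(rational, 1), algebraic)
[/CODE]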

sweety439 2019-07-01 12:55

[QUOTE=swellman;520361]2,1165+ and 2,2210M are being worked to t65 thanks to Yoyo@Home. They should be fully ECMd in a few weeks. Any preferences on our next ECM target?

3,748+ c204 has been [url=https://www.mersenneforum.org/showthread.php?t=24548]suggested[/url]. Seems like a worthy candidate to me, but the effort so far has been focused on factoring all remaining 2+ (n<1200) and 2LM (n<2400). Do we limit our efforts to only that list? Scope creep has been the downfall of many projects...

I have asked Greg Childers where he is heading with the 16f queue so that we don’t step on each other. He may want our help in some sector. Perhaps he will chime in here.[/QUOTE]

Also, 10,323- c271 has been [URL="https://mersenneforum.org/showthread.php?t=24550"]suggested[/URL], not only 3,748+ c204.

R.D. Silverman 2019-07-01 15:55

[QUOTE=sweety439;520455]Also, 10,323- c271 has been [URL="https://mersenneforum.org/showthread.php?t=24550"]suggested[/URL], not only 3,748+ c204.[/QUOTE]

Suggested for what? NFS@Home will get to it, as I [b]already[/b] pointed out. It doesn't need more ECM. It is beyond the capabilities of the mersenneforum. So I therefore ask: what are you suggesting we do with it?

VBCurtis 2019-07-01 15:59

[QUOTE=sweety439;520455]Also, 10,323- c271 has been [URL="https://mersenneforum.org/showthread.php?t=24550"]suggested[/URL], not only 3,748+ c204.[/QUOTE]

You suggested this one, and were told it's going to wait until easier ones are done first. Please don't keep asking.

sweety439 2019-07-02 12:42

[QUOTE=R.D. Silverman;520471]Suggested for what? NFS@Home will get to it, as I [b]already[/b] pointed out. It doesn't need more ECM. It is beyond the capabilities of the mersenneforum. So I therefore ask: what are you suggesting we do with it?[/QUOTE]

Currently, there is [B]NO[/B] "10,323- c271" in the page [URL="http://homes.cerias.purdue.edu/~ssw/cun/who"]http://homes.cerias.purdue.edu/~ssw/cun/who[/URL].

R.D. Silverman 2019-07-02 12:54

[QUOTE=sweety439;520532]Currently, there is [B]NO[/B] "10,323- c271" in the page [URL="http://homes.cerias.purdue.edu/~ssw/cun/who"]http://homes.cerias.purdue.edu/~ssw/cun/who[/URL].[/QUOTE]

And?

If you bothered to pay attention over the last few years you would have observed that
only a few numbers get queued at a time. 10,323- has not been queued yet.

[b]SO WHAT?[/b]

You would also have observed that Greg is generally (with a few exceptions) doing
numbers in order of increasing SNFS difficulty. A small amount of effort would reveal
that there are about a dozen Cunningham numbers smaller than 10,323- that have yet
to be queued.

What is your <censored> obsession with this one number?

R.D. Silverman 2019-07-02 12:57

[QUOTE=VBCurtis;520472]You suggested this one, and were told it's going to wait until easier ones are done first. Please don't keep asking.[/QUOTE]

See his next post. He can't stop. It's an obsession.

swellman 2019-07-03 11:27

Greg has been in contact about the current effort. He asks us to focus the ECM effort on 2_1084+, 2_2126M, and 2_2150M. He will queue them in a couple of months but they need more ECM.

R.D. Silverman 2019-07-03 14:34

[QUOTE=swellman;520647]Greg has been in contact about the current effort. He asks us to focus the ECM effort on 2_1084+, 2_2126M, and 2_2150M. He will queue them in a couple of months but they need more ECM.[/QUOTE]

Greg should add 2,1157+ to his list as well. We'll need to run more ECM. (The exponent is divisible by 13, so a sextic works very well.)
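
The "/13" reduction is the standard reciprocal-polynomial trick: with 1157 = 13*89 and x = 2^89, (x^13 + 1)/(x + 1) is a reciprocal polynomial of degree 12, so it equals x^6 * g(x + 1/x) for a sextic g. A quick check of that claim (my sketch, assuming sympy is available):
[CODE]# Recover the sextic hiding in (x^13 + 1)/(x + 1) = x^12 - x^11 + ... + 1.
# Uses x^k + x^(-k) = y*(x^(k-1) + x^(1-k)) - (x^(k-2) + x^(2-k)), y = x + 1/x.
from sympy import symbols, expand

y = symbols('y')
c = [2, y]                       # c[k] = x^k + x^(-k) written in y
for k in range(2, 7):
    c.append(expand(y * c[k - 1] - c[k - 2]))

g = expand(1 + sum((-1) ** k * c[k] for k in range(1, 7)))
print(g)   # y**6 - y**5 - 5*y**4 + 4*y**3 + 6*y**2 - 3*y - 1
[/CODE]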

R.D. Silverman 2019-07-03 14:54

[QUOTE=swellman;520647]Greg has been in contact about the current effort. He asks us to focus the ECM effort on 2_1084+, 2_2126M, and 2_2150M. He will queue them in a couple of months but they need more ECM.[/QUOTE]

You might want to ask Bruce Dodson how much work he did with these numbers.

R.D. Silverman 2019-07-03 15:06

[QUOTE=R.D. Silverman;520668]You might want to ask Bruce Dodson how much work he did with these numbers.[/QUOTE]

One data point: I know that Arjen ran 20K curves with B1 = 10^9 on all of the base-2 numbers several years ago. You may want to check with him as well.

swellman 2019-07-03 21:44

[QUOTE=R.D. Silverman;520666]Greg should add 2,1157+ to his list as well. We'll need to run more ECM. (The exponent is divisible by 13, so a sextic works very well.)[/QUOTE]

Certainly, it’s all just a matter of prioritization. Even yoyo likely won’t be able to complete a full t65 on the three composites in two months. But we will do what we can.

We can weave the others into the tapestry.

swellman 2019-07-31 13:00

[QUOTE=swellman;520647]Greg has been in contact about the current effort. He asks us to focus the ECM effort on 2_1084+, 2_2126M, and 2_2150M. He will queue them in a couple of months but they need more ECM.[/QUOTE]

These are now starting to get some ECM, or at least 2,1084+ is so far. I estimate it will take until mid-October to complete these three at the t65 level. Hoping this meshes with Greg’s queuing plans, but there it is.

Planning on queuing 2,1157+, 2,1144+ and 2,2158L next.

pinhodecarlos 2019-07-31 17:30

Under yoyo, is it possible to choose to ECM only the Cunningham numbers, or is it still sending work from all subprojects?

R.D. Silverman 2019-07-31 18:26

[QUOTE=swellman;522719]These are now starting to get some ECM, or at least 2,1084+ is so far. I estimate it will take until mid-October to complete these three at the t65 level. Hoping this meshes with Greg’s queuing plans, but there it is.

Planning on queuing 2,1157+, 2,1144+ and 2,2158L next.[/QUOTE]

Finishing the sieving on 2,1072+ and 2,1076+ should take another 3-4 weeks.
I am guessing that Greg might then queue 2,2126M (same difficulty as 2,1063+).
This should take us to the 2nd/3rd week of September before he would be ready
for one of the currently queued YoYo ECM candidates.

DukeBG 2019-08-01 12:25

[QUOTE=pinhodecarlos;522748]Under yoyo, is it possible to choose to ECM only the Cunningham numbers, or is it still sending work from all subprojects?[/QUOTE]

It will still send workunits for all available ECM numbers if you select the ECM subproject.

R.D. Silverman 2019-09-22 03:36

[QUOTE=DukeBG;522827]It will still send workunits for all available ECM numbers if you select the ECM subproject.[/QUOTE]

YoYo can stop further ECM work on 2,2126M. NFS@Home has started sieving it.

pinhodecarlos 2019-09-22 06:58

[QUOTE=R.D. Silverman;526253]YoYo can stop further ECM work on 2,2126M. NFS@Home has started sieving it.[/QUOTE]

Isn’t it worth keeping it active for a further couple of days, since most NFS@Home users have queues set up for at least 3 days, sometimes more? In my case it will take 2 days before I start processing 2,2126M.

R.D. Silverman 2019-09-22 08:18

[QUOTE=pinhodecarlos;526259]Isn’t it worth keeping it active for a further couple of days, since most NFS@Home users have queues set up for at least 3 days, sometimes more? In my case it will take 2 days before I start processing 2,2126M.[/QUOTE]

No. Further ECM is pointless.

pinhodecarlos 2019-09-22 12:26

[QUOTE=R.D. Silverman;526261]No. Further ECM is pointless.[/QUOTE]

Ok, noted. Thank you.

swellman 2019-09-22 12:27

I have let yoyo know about 2,2126M. Only he can kill it.

Please let me know when sieving starts on 2_1084+1 and 2_2150M and I will ask yoyo to kill those jobs as well.

R.D. Silverman 2019-09-22 14:04

[QUOTE=swellman;526274]I have let yoyo know about 2,2126M. Only he can kill it.

Please let me know when sieving starts on 2_1084+1 and 2_2150M and I will ask yoyo to kill those jobs as well.[/QUOTE]

I don't know what Greg will do next. 2,2126M should take ~2 to 3 weeks. He may then
do 2,2330M via GNFS. I believe that the forum has selected a polynomial for it???

2,2150M will take a month to sieve and 2,1084+ should take 6-7 weeks.

We don't really need to manually kill the current work on 2,2126M since there are only
a couple of hundred curves left. It should suffice to merely remove it from the queue.

swellman 2019-09-22 15:45

[QUOTE=R.D. Silverman;526282]I don't know what Greg will do next. 2,2126M should take ~2 to 3 weeks. He may then
do 2,2330M via GNFS. I believe that the forum has selected a polynomial for it???

2,2150M will take a month to sieve and 2,1084+ should take 6-7 weeks.

We don't really need to manually kill the current work on 2,2126M since there are only
a couple of hundred curves left. It should suffice to merely remove it from the queue.[/QUOTE]

It’s all good. “Killing it” simply refers to stopping new WU from being distributed. I suppose there is a vanishingly small chance of a factor appearing before the job finishes completely.

No worries on the other two Cunningham numbers. I don’t have visibility on Greg’s queue (other than [url=https://escatter11.fullerton.edu/nfs/numbers.html]here[/url]) so I would appreciate any news. Until then, I will keep up with yoyo’s ECM efforts.

pinhodecarlos 2019-09-22 16:34

The only workaround to see what is being sieved, and its current boundary, is to go to one of the hosts at [URL]https://escatter11.fullerton.edu/nfs/top_hosts.php[/URL] and look at its tasks. Right now the sieve wave for 2,2126M is at S2M2126_102082_0 among the delivered workunits.

R.D. Silverman 2019-09-28 04:12

[QUOTE=swellman;526292]

<snip>

I don’t have visibility on Greg’s queue (other than [url=https://escatter11.fullerton.edu/nfs/numbers.html]here[/url]) so I would appreciate any news. Until then, I will keep up with yoyo’s ECM efforts.[/QUOTE]

Sam Wagstaff just got 2,2386L via ECM.

[Just in case anyone was wondering whether the remaining composites are worth
attacking with ECM]

