#177
Nov 2003
2²×5×373 Posts
Quote:
12^319-1 has an algebraic factor. We are not factoring 12^319-1; we are factoring (12^319-1)/(12^29-1) ~ 12^290 ~ C313. The resulting polynomial is reciprocal, so we can do this number with a quintic. However, quintic polynomials for numbers this size result in matrices that are significantly larger than those for numbers of similar size done with sextics. Greg is LA constrained right now, so he skipped 12^319-1 for the time being. He did C314, C315, C316, C317 and is now working on C318's via 3^667-1 etc. Greg may indeed do R323 before he does 12^319-1. I think he will.

R323 might well be done by a reciprocal octic to take advantage of the algebraic factor 10^19-1. Whether the octic would be easier than the obvious sextic might be an interesting experiment. It might also be interesting to see if a septic would be any better. I think a septic will be slightly better in general for numbers of this size.

Let's do a "back of the envelope" look at the norms. Take (10^6, 10^6) == (a,b) as a 'typical lattice point'. For a sextic, an algebraic norm is ~ a^6 ~ 10^36 and a linear norm is ~ b * 10^(324/6) ~ 10^6 * 10^54 ~ 10^60. For a septic, an anorm is ~ a^7 ~ 10^42 and a linear norm is ~ b * 10^(322/7) ~ 10^6 * 10^46 ~ 10^52. The norms are closer for the septic and their product is slightly smaller. A septic seems slightly superior. For the reciprocal octic, an anorm is a^8 ~ 10^48 and a linear norm is b * 10^38 ~ 10^44, which seems even better still.

Note that one also needs to adjust these estimates by the special-q. The estimates also ignore the effect of variance on the norms. Since we want smooth numbers, we are more concerned with the tails of the distributions of the norms than with the means. However, it does give a quick comparison. NFS works best when the norms are as nearly equal as possible, other things being equal.

This very rough estimate is based on the assumption that (10^6, 10^6) is a typical lattice point. Adjust the analysis if this assumption is not a good enough estimate. I do not know what sieve areas the lasievef siever uses.

No one has been calling for him to do 12^319-1. It is possible that Greg missed the reciprocal octic for R323. He will get to it. Doing R323 seems to be a compulsion with you.
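For readers who want to redo this arithmetic, here is a minimal Python sketch of the norm comparison above. The lattice point (a, b) = (10^6, 10^6), the difficulties 324 and 322, and the octic's b * 10^38 estimate are taken from the post; everything else is just exponent arithmetic.

```python
# Back-of-the-envelope NFS norm comparison from the post above.
# All inputs (lattice point, difficulties, octic estimate) come from
# the post; this just redoes the exponent arithmetic.
from math import log10

a = b = 10**6  # the post's 'typical lattice point' (a, b)

def log_norms(degree, difficulty):
    """log10 of algebraic norm ~ a^degree and of linear norm ~ b * m,
    where m ~ 10^(difficulty/degree)."""
    return degree * log10(a), log10(b) + difficulty / degree

for label, deg, diff in [("sextic", 6, 324), ("septic", 7, 322)]:
    an, ln = log_norms(deg, diff)
    print(f"{label}: anorm ~ 10^{an:.0f}, lnorm ~ 10^{ln:.0f}, product ~ 10^{an + ln:.0f}")

# Reciprocal octic: anorm ~ a^8, linear norm ~ b * 10^38 (per the post).
an, ln = 8 * log10(a), log10(b) + 38
print(f"octic: anorm ~ 10^{an:.0f}, lnorm ~ 10^{ln:.0f}, product ~ 10^{an + ln:.0f}")
```

The products drop from ~10^96 (sextic) to ~10^94 (septic) to ~10^92 (octic), matching the ranking in the post.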
#178
Jul 2003
So Cal
807₁₆ Posts
#179
Jul 2003
So Cal
3×5×137 Posts
Quote:
#180
Nov 2003
2²·5·373 Posts
Quote:
A degree 7 polynomial would be better (than degree 6) for Greg to use moving forward for numbers that NFS@Home is about to undertake.
#181
"Bo Chen"
Oct 2005
Wuhan, China
2²×41 Posts
Quote:
Deg 6 needs 102 CPU years to collect 1200M raw relations, while deg 7 needs 182 CPU years, on an i3 CPU. I attach the poly and test files.
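Totals like these are normally extrapolated from a short test sieve. A minimal Python sketch of the conversion, using only the figures reported above; the per-relation costs it prints are derived, not independently measured:

```python
# Convert the reported totals into per-relation costs, to make the
# deg-6 vs deg-7 comparison concrete. Figures are from the post above.
SECONDS_PER_CPU_YEAR = 365 * 24 * 3600
TARGET_RELATIONS = 1_200_000_000  # 1200M raw relations, per the post

for label, cpu_years in [("deg 6", 102), ("deg 7", 182)]:
    sec_per_rel = cpu_years * SECONDS_PER_CPU_YEAR / TARGET_RELATIONS
    print(f"{label}: ~{sec_per_rel:.2f} CPU-seconds per relation")
```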
#182
Nov 2003
2²×5×373 Posts
Quote:
Increase the algebraic and decrease the linear.
#183
Nov 2003
2²·5·373 Posts
#184
"Bo Chen"
Oct 2005
Wuhan, China
244₈ Posts
I guess you mean increase alim and decrease rlim, and use option -a.

But when I test again with these changes, the situation is the same. When I use alim=800M, rlim=200M and -a with the compiled lasieve5_f binary, it needs 100 CPU years to collect 1200M raw relations on an i7 CPU, while with alim=rlim=400M and -r, the same binary on the same processor needs 40 CPU years to collect 1200M raw relations. Though I don't know why, it is a little strange.
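The same two results restated as yield per CPU-year, in a quick Python sketch using only the timings reported above, to make the 2.5x gap explicit:

```python
# Relation yield per CPU-year for the two tested configurations.
# CPU-year figures are from the post; nothing is measured here.
TARGET_RELATIONS = 1_200_000_000  # 1200M raw relations

configs = [
    ("alim=800M rlim=200M, -a", 100),  # CPU years reported
    ("alim=rlim=400M, -r", 40),
]

for name, cpu_years in configs:
    rate = TARGET_RELATIONS / cpu_years / 1e6
    print(f"{name}: ~{rate:.0f}M relations per CPU-year")
```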
#185
"Max"
Jun 2016
Toronto
2C9₁₆ Posts
Quote:
Kurt Beschorner's team (http://kurtbeschorner.de/) is days away from cracking R459/C221 by SNFS and is 40% into the GNFS sieving for R1740M/C204.

Slowly but surely we are selecting our next repunit cofactor to work on. One of the candidates is R337/C202. If nobody is actively working on this number, could we please reserve it for Kurt's team?

Since the discussion in late February 2020 (see wreck's message above), did anybody try polyselect for this C202? I ran CADO with standard parameters for a day and found a mediocre baseline poly (2.39e-15); the 2018 record belongs to fivemack (3.665e-15).

If nobody minds our reservation, we would really appreciate your help in polyselecting, especially on the msieve side. I will run CADO with improved parameters and spin up all good candidates as always.

Please let us know, here or via PM.

Stay safe,
Max
#186
"Curtis"
Feb 2005
Riverside, CA
2×3×769 Posts
Fine with me; we're not in a big hurry to grab a 201-202 digit composite for a 15e/home-CADO hybrid.

I wager there's less than a 10% chance these days of msieve finding a winning poly for a composite at 200+ digits. I can do a little CADO poly select, but not a ton. If Kurt's group wants firepower, please reserve a range of c5 values (admin/admax) for them, and some of us will add our efforts in nonoverlapping ranges. Might not be worth their time to coordinate, though.
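To illustrate the nonoverlapping-range coordination suggested above, a hedged Python sketch; the bounds and contributor count are invented for illustration, and admin/admax simply mirror the CADO-NFS parameter names mentioned in the post:

```python
# Split a c5 (leading-coefficient) search range into nonoverlapping
# blocks, one per contributor, in the spirit of CADO-NFS's admin/admax
# polyselect parameters. All concrete numbers are illustrative.
def split_range(admin, admax, workers):
    """Yield (lo, hi) blocks tiling [admin, admax) with no overlap."""
    step = (admax - admin) // workers
    for i in range(workers):
        lo = admin + i * step
        hi = admax if i == workers - 1 else lo + step
        yield lo, hi

# Invented example: five contributors covering c5 values 0..20M.
for i, (lo, hi) in enumerate(split_range(0, 20_000_000, 5), start=1):
    print(f"contributor {i}: admin={lo} admax={hi}")
```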
#187
"Max"
Jun 2016
Toronto
23×31 Posts
Quote:
Kurt cracked R459 this morning: http://kurtbeschorner.de/

Gimarel found an exceptional poly for R337/C202: https://mersenneforum.org/showpost.p...postcount=1930

And I started the spinup process: https://mersenneforum.org/showpost.p...postcount=1931