2005-01-18, 14:10  #1
Aug 2002
Termonfeckin, IE
2^{4}×173 Posts 
10- table
Code:
Size Base Index Mod  Diff  Ratio
 328   10   353   -   353  0.929
 288   10   365   -   292  0.986 /5q
 311   10   377   -   348  0.893 /13
 230   10   383   -   383  0.600
 270   10   389   -   389  0.694
 312   10   391   -   391  0.797

Last fiddled with by Batalov on 2022-05-01 at 05:49 Reason: 10,323- is done
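For readers scanning the table: the Ratio column appears to be the composite's size divided by its SNFS difficulty, a rough measure of how attractive GNFS is relative to SNFS for that entry. A short sketch that reproduces the column from the listed sizes and difficulties, assuming (from the 0.893 and 0.797 entries) that the published values are truncated rather than rounded to three decimals:

```python
import math

# Entries from the 10- table: (composite size, base, exponent, SNFS difficulty).
# The difficulty reductions match the annotations: 365*(4/5) = 292 for the "/5q"
# (quartic) entry, 377*(12/13) = 348 for the "/13" entry.
entries = [
    (328, 10, 353, 353),
    (288, 10, 365, 292),   # /5q
    (311, 10, 377, 348),   # /13
    (230, 10, 383, 383),
    (270, 10, 389, 389),
    (312, 10, 391, 391),
]

for size, base, index, diff in entries:
    # Truncate (not round) to three decimals; an assumption about the table's
    # convention, but it is the only reading consistent with 0.893 and 0.797.
    ratio = math.floor(1000 * size / diff) / 1000
    print(f"{size} {base},{index}- diff {diff} ratio {ratio:.3f}")
```

The lower the ratio, the further below its SNFS difficulty the remaining cofactor has already been pushed.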
2006-10-09, 21:43  #2
Jun 2005
lehigh.edu
2^{10} Posts 
Quote:
found near the end of testing to p50. Prime cofactor! Bruce

2006-12-21, 09:00  #3
Jun 2005
lehigh.edu
2^{10} Posts 
late p44
Quote:
finished last year, this is definitely late: p44 = 37633698993045258670863410188544865190871951. Can't say that I'd encourage anyone else to go looking for further p44's, but suppose I can't (yet) rule out another one or two. A nice complete factorization, 10,329- c255 = p44*p212 (iirc). Bruce

2007-02-14, 16:05  #4
Jun 2005
lehigh.edu
2000_{8} Posts 
early p54, 262 = 54 + 209, complete
Quote:
630/3155 (=t45) + 28.09% (new curves) = 47.748% of t50; 10,395- c262 finished early, with p54 = 388603184868446209952357338208961774763421470820867551

xilman/Paul_L mentions that (hard) factorizations by ecm factors of 56 digits are a remarkable advance (since 1985!). Actually, measuring the difficulty of ecm factorizations (as compared with the ecm-difficulty of finding a prime factor of a given size) hasn't always been uncontroversial; as for example in Nov. 2003, when Paul_Z was listing Backstrom's p58 as the best ecm factor while Brent was still listing my June 2003 p57 as the ecm "champ". In fact, it was on that occasion that Richard introduced his restriction that the ratio of the pxx found to the cxxx being factored satisfy r >= 2.2, for r = size(cxxx)/size(pxx). Brent's point was perhaps founded on his p40 factorization (complete!) of F10, which was surely a harder factorization than many of the following p41-p42's (say). So my p57 was the first factor of the Mersenne M997 = 2^997-1 = c301, which also had a prime cofactor (and was then on George's Most Wanted list; now consisting of M1061 only). By contrast, the p58 was a factor of a c110, with c110 = p58*p52, and Richard's champs03 observes:

Quote:
preferred to simply rely on the estimated ecm runtime, since ecm had no way of knowing the size of the cofactor (except through the multi-precision arithmetic, a smaller o-term (O-term?)).

Anyway, a more current comparison, of the current top 3: the p64 of Aoki-Shimoyama gave the complete kilobit factorization of the c311 = R311 = 111...1 = (10^311 - 1)/9, with 311 1's, and is arguably a harder factorization (by snfs difficulty, say) than the other ecm candidates, the p67, p66 and Alex's p63.

An awfully long way around, but the Brent ratio for today's p54 is 262/54 = 4.851; and restricted to pxx's having prime cofactor (and so giving the complete factorization, rather than leaving a hard composite cofactor, perhaps an snfs that's no easier than before the ecm factor was found), one of the more difficult ecm factorizations. Just scanning the "Notes" at the end of Richard's list, the above p57 has r = 5.28, the above p64 has r = 4.86, and today's 4.851 is above any of the other champs' ratios. Not bad, considering that I didn't know where it fell when I started today's post. Bruce
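The Brent-ratio arithmetic in the post can be checked mechanically. The formula r = size(cxxx)/size(pxx) and the r >= 2.2 cutoff are as stated above; the three example entries simply restate sizes quoted in the post:

```python
# Brent's ratio for ranking ecm factorizations: the size of the composite
# that ecm was run on, divided by the size of the prime factor found.
def brent_ratio(c_digits: int, p_digits: int) -> float:
    return c_digits / p_digits

# Sizes as quoted in the post above.
examples = [
    ("M997 = 2^997-1, c301, p57", 301, 57),
    ("R311 = (10^311-1)/9, c311, p64", 311, 64),
    ("10,395- c262, p54", 262, 54),
]
for name, c, p in examples:
    r = brent_ratio(c, p)
    print(f"{name}: r = {r:.3f}  (meets r >= 2.2: {r >= 2.2})")
```

This reproduces the 5.28, 4.86 and 4.851 figures in the post (the last two differing only in how many decimals are kept).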

2007-06-13, 02:39  #5
Jun 2005
lehigh.edu
2^{10} Posts 
10,293- 268 -> 218 with p50
Quote:
during the last 1100/7830 curves to finish t50. Composite cofactor of 218 digits will be done to t50 in a few more hours. Will be one of the last factors before t50 finishes on the complete Cunningham list. Bruce

ps: cofactor is still easier by snfs (293 prime, difficulty 293).

Last fiddled with by bdodson on 2007-06-13 at 02:48 Reason: added difficulty note

2007-06-14, 13:48  #6
Jun 2005
lehigh.edu
2^{10} Posts 
t50 update
Quote:
below 100,000. There are six numbers that have 2300/7830 curves remaining to finish t50, and three more that have 1300/7830 curves left. These are the last nine numbers from 2- or 2+ (with n < 1200), under 18K curves total. The other range has 74 numbers with 1100/7830 curves to go, 81.4K curves total. (These being "generic" c251-c299's; not 2- or 2+ with n < 1200.) All gmp-ecm curves with B1=43M and default B2.

Might sound like a lot, but it's under 13 t50's (13*7830 > 99.1K), out of c. 800 Cunninghams. That might sound like 13 chances for a t50 to find a factor, but with the exception of those last six base-2's with two more passes (instead of one), the other 77 have nearly complete t50's. Very little chance of a late p44-p47 left, and the probability curve (for a known p50) tails off near the end. So very little chance for p44-p47; a chance that's tailing off for p48-p51; and a small chance for an early p52-p54. Hard to give a real-time estimate (calendar date); I'll update these counts if/when there's a next factor. Bruce
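The curve bookkeeping above can be sketched in a few lines; the 7830 curves per t50 (at B1=43M) and the per-number remainders are all figures from the post:

```python
# One full t50 at B1=43M, per the post.
T50_CURVES = 7830

remaining = (
    6 * 2300 +   # six base-2 numbers needing 2300 more curves each
    3 * 1300 +   # three more needing 1300 each  (these nine: "under 18K total")
    74 * 1100    # 74 "generic" c251-c299's needing 1100 each ("81.4K total")
)
print(remaining)                     # 99100, i.e. the "99.1K" in the post
print(remaining < 13 * T50_CURVES)   # True: under 13 t50's (13*7830 = 101790)
```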

2007-09-30, 01:22  #7
Jun 2005
lehigh.edu
2^{10} Posts 
c219 -> c165 via p54, still snfs
Quote:
of difficulty 220-229 to get its 3rd t50; 10,339- c219 -> c165 via p54 = 777734075184513369134763199249605543798943174359980119. At difficulty 226, the snfs is still easier than c165 gnfs. Not a lot of chance for a p52-p70 from the c165 after 3*t50; seems likely to be around for a while unless someone's sieving repunits.

I'm starting 3rd t50's on c155-c169 of difficulty under 230 anyway (along with finishing the 2nd t50 on difficulty below 220, the last ones below c251), so this c165 is having an early start. The rest of c147-c154 as well, despite their being too small for a large factor to meet Brent's ratio. The c149 of difficulty 256.91 is already past 3*t50; seems ready for gnfs. Bruce

2007-09-30, 23:19  #8
(loop (#_fork))
Feb 2006
Cambridge, England
2×7×461 Posts 
The c149 of difficulty 256.91 is presumably 7,380+; why are you mentioning it on the 10- table thread?
What's the status of 10,375+ (difficulty 200 by one metric, but to take out the factor 15 requires using a quartic, and a 200-digit quartic is deeply enough suboptimal that 149-digit GNFS might well be easier ... difficulty 225 using a sextic and only the factor three is almost certainly harder than 149-digit GNFS)? I'm probably within a week of finishing 11,251+ by gnfs, and 7,380+ might be a nice one to do next, though I'll have to check with Wagstaff when I submit 11,251+.
2007-10-01, 03:46  #9
Jun 2005
lehigh.edu
2000_{8} Posts 
Quote:
to finish testing to 3*t50 (finished already since then; no factor found; not a surprise!). That brought to mind the status of the other c16x's, which I've been running as c155-c169 split by difficulty as < 220, in 220-229, and over 229. I spent quite a bit of time over the past week or two looking over the smallest Cunninghams (7 in c149-c154) and the ones of smallest difficulty (below 220). I generally spend a lot of time looking over input files; what's running now (3 or 4 or 5 ranges); what ought to be run next.

The immediate issue was Bob's report of an "almost ecm miss" of a p53. His number was from the range c155-c169 of difficulty below 220, above, which I've subsequently identified as "most undertested" at 1.668*t50 ... last I'd looked there were 23 numbers, but updating an input file, there's 17 now. (And, as I responded to Bob's email report on the "almost miss", finding his p53 to (1 - 1/e) probability would have taken nearer to 3*t50; so it would at worst have been a "bdodson miss", not an ecm miss; it not being ecm's fault that I stopped early in favor of more curves on the harder numbers.)

One can almost never win on these "ecm miss" arguments, and I usually end up convincing myself that, while it may have been plausible to stop early at the time, with hindsight, and with other more urgent ranges finishing, it's time to go back for another pass. I started some time ago with the smallest numbers of difficulty 230 and up; c149-c189, including the hardest c155-c169's. I just recently finished that pass, with most numbers tested past 3*t50. The smallest being the c149 you identify; which it seemed to me would be nice to have off of that input file before I get back for another round. I've switched those cpus (xps with just 250Mb available for condor grid jobs) to a file with the other six numbers from c149-c154; the 17 numbers most undertested from c155-c169, and 11 more of that size with difficulty in 220-229.
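The probability bookkeeping here is the usual Poisson model for ECM: completing the full expected curve count for a given factor size finds such a factor with chance about 1 - 1/e. A sketch for a p50, using the 7830-curves-per-t50 figure from the t50 update earlier in the thread (note this is for a p50 only; a p53 needs considerably more curves at this B1, which is why (1 - 1/e) for the p53 is put nearer to 3*t50):

```python
import math

T50_CURVES = 7830   # full t50 at B1=43M, figure from earlier in the thread

def p50_find_chance(curves_done: int) -> float:
    """Poisson model: chance a p50 factor has been found after this many curves."""
    return 1 - math.exp(-curves_done / T50_CURVES)

# A full t50 gives the classic 1 - 1/e chance, about 63%.
print(f"{p50_find_chance(T50_CURVES):.1%}")

# The "most undertested" range sits at 1.668*t50: ~81% for a p50.
print(f"{p50_find_chance(round(1.668 * T50_CURVES)):.1%}")
```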
That was when I noticed 6,284+ c168 (also not from the 10 thread) on Sam's "who's doing what" list; took it off of the input file; checked with Richard & Sam to confirm that it is the next NFSNET number (unless Paul decides otherwise), and then followed Richard's suggestion and put it back into the input file, at the top of the list, for one more t50 (to 3*t50). Halfway done already. Quote:
c149 7,380+ 17712381340... ; perhaps a larger chance of an ecm miss? It's in the above input file, which I expect to run up to 3*t50. On other numbers of current interest, the Opteron run that found the above p54 (which is from 10- ...) is about to do a 3rd t50 on some of the other 768-bit numbers (from the poll that was to follow the poll which our gerbils removed; with some reason, as it turns out ...), not that we ought to expect anything; just further enhancing the probabilities that they won't give an ecm miss or other unwantedly small prime factor.


2007-10-01, 12:06  #10
"Bob Silverman"
Nov 2003
North of Boston
1110100111010_{2} Posts 
Quote:
short while ago. 10,375+ will be easier with SNFS. The LA for 2,1582L is about 50% done, and sieving for 2,1962M is about 80% done. I then plan to do 2,1630M and 2,1914M via SNFS. These will be relatively hard, as the best polynomials are quartics. Depending on how hard 2,1630M is, I may then do 2,1690M. When these are done, I plan on tackling 2,1059 via SNFS unless someone else does it sooner. This number is a toss-up between SNFS and GNFS. There are not a lot of numbers left with SNFS difficulty under 210. Most of these will require a quartic.

2007-10-01, 12:24  #11
"Bob Silverman"
Nov 2003
North of Boston
2·3·29·43 Posts 
Quote:
I try to be careful in my choice of language. I really meant "almost" miss. I don't regard any factor over 50 digits as an ECM miss, since an optimal ECM effort will still miss it with probability (1/e)^k after a k * t50 effort.

My recommendation is not to bother with ECM on any of the composites with SNFS difficulty under 210, except for perhaps 2,2190L, 2,2226L, and 2,2370L. Actually the SNFS difficulty for these is at least 220, because there doesn't seem to be any way to take advantage of the fact that they have more than one algebraic factor. The index is divisible by 3, but this leads to numbers with difficulty above 220 and requires a quartic [yech!!!!]. There seems to be no way to take advantage of the additional algebraic factors. One gets an expression with high Hamming weight (i.e. too many nonzero terms) that does not yield a good polynomial. They are also not reciprocal.

*Definitely* do not waste ECM time on 7,366+, 10,309, 7,369+, 10,312+ or 5,447. They are too easy with SNFS.
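Bob's miss probability can be made concrete. The (1/e)^k after a k * t50 effort is the formula he quotes; the particular k values below are just illustration:

```python
import math

def miss_probability(k: float) -> float:
    """Chance an optimal ecm run still misses a p50 after k * t50 effort."""
    return math.exp(-k)   # (1/e)^k

# Roughly 36.8%, 13.5% and 5.0% for one, two and three full t50's.
for k in (1, 2, 3):
    print(f"k = {k}: miss probability {miss_probability(k):.1%}")
```

So even a number tested well past t50 can still be hiding a p50-p53, which is the sense in which a factor that size is "not an ECM miss".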

Similar Threads
Thread  Thread Starter  Forum  Replies  Last Post
7+ table  garo  Cunningham Tables  87  2022-03-25 19:16
5+ table  garo  Cunningham Tables  100  2021-01-04 22:36
6+ table  garo  Cunningham Tables  80  2021-01-04 22:33
5- table  garo  Cunningham Tables  82  2020-03-15 21:47
6- table  garo  Cunningham Tables  41  2016-08-04 04:24