mersenneforum.org  

Go Back   mersenneforum.org > Factoring Projects > Cunningham Tables

Old 2005-01-18, 14:10   #1
garo

10- table

Code:
Size	Base	Index	Mod	Diff	Ratio
271	10	323	-	323	0.839
202	10	337	-	337	0.600	/gnfs
328	10	353	-	353	0.929
288	10	365	-	292	0.986	/5q
255	10	371	-	318	0.801	/7
311	10	377	-	348	0.893	/13
230	10	383	-	383	0.600
270	10	389	-	389	0.694
312	10	391	-	391	0.797

Last fiddled with by Batalov on 2020-03-16 at 15:27 Reason: 10,337- is ecm'd
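For readers of the table: Size is the cofactor's decimal length, Diff the SNFS difficulty, and Ratio appears to be Size/Diff (my inference from the numbers; it isn't stated in the post). A quick sanity check:

```python
# (size, snfs difficulty, ratio) rows copied from the table above
rows = [(271, 323, 0.839), (328, 353, 0.929), (288, 292, 0.986),
        (255, 318, 0.801), (311, 348, 0.893), (230, 383, 0.600),
        (270, 389, 0.694), (312, 391, 0.797)]

for size, diff, ratio in rows:
    # each listed Ratio agrees with Size/Diff to three decimals
    assert abs(size / diff - ratio) < 1e-3, (size, diff)
print("Ratio column matches Size/Diff")
```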
Old 2006-10-09, 21:43   #2
bdodson

Quote:
Originally Posted by garo
Code:
Base	Index	Size	11M(45digits)	43M(50digits)	110M(55digits)	260M(60digits)	Decimal
...
10	393-	C217	0(0.267423)	0(0.0522979)	165(0.00921839)	0(0.00148122)	9552561281598698059450125009714586765573478465037641744474941129718710126192246540255861267712062571354798865189507832431821533150133896454255639413028630537593388980921184589800719288091100283350080536075868128426279
...
p49 = 1100517845115354201024243897527295703743726722437
found near the end of testing to p50. Prime cofactor! -Bruce
Old 2006-12-21, 09:00   #3
bdodson

late p44

Quote:
Originally Posted by garo
Code:
Base	Index	Size	11M(45digits)	43M(50digits)	110M(55digits)	260M(60digits)	Decimal
...
10	329-	C255	0(0.267423)	0(0.0522979)	165(0.00921839)	0(0.00148122)	
458227547730583661200840849804986359353223739887924399591623992617000702067777246428151335318500214134376893906810478196697575337815845442688279952191646422455261499080595864133160204395830607269408074505417807365655778463938747929779437817328848232728437
Since I'm closing in on 2*p45 (along the way towards t50), with t45
finished last year, this is definitely late:

p44= 37633698993045258670863410188544865190871951

Can't say that I'd encourage anyone else to go looking for further p44's,
but suppose I can't (yet) rule out another one or two. A nice
complete factorization, 10,329- c255 = p44*p212 (iirc). -Bruce
Old 2007-02-14, 16:05   #4
bdodson

early p54, 262 = 54 + 209, complete

Quote:
Originally Posted by bdodson
Since I'm closing in on 2*p45 (along the way towards t50), ... this [p44] is definitely late:

... A nice complete factorization, 10,329- c255 = p44*p212 ...
For contrast, since 2*t45 is complete, at t45 + 2200/7830 =
630/3155 (=t45) + 28.09% (new curves) = 47.748% of t50,
10, 395- C262 finished early, with

p54 = 388603184868446209952357338208961774763421470820867551

xilman/Paul_L mentions that (hard) factorizations via ecm factors
of 56 digits are a remarkable advance (since 1985!). Actually, measuring the
difficulty of ecm factorizations (as compared with the ecm-difficulty of
finding a prime factor of a given size) hasn't always been uncontroversial;
as for example in Nov. 2003 when Paul_Z was listing Backstrom's p58 as the
best ecm factor; while Brent was still listing my June 2003 p57 as the
ecm "champ". In fact, it was on that occasion that Richard introduced
his restriction that the ratio of the pxx found to the cxxx being factored
satisfy r >= 2.2 for r = size(cxxx)/size(pxx). Brent's point was perhaps
founded on his p40 factorization (complete!) of F10, which was surely a
harder factorization than many of the following p41-p42's (say). So my p57
was the first factor of the Mersenne M997 = 2^997-1 = c301, which also
had a prime cofactor (and was then on George's Most wanted list; now
consisting of M1061 only). By contrast, the p58 was a factor of a c110,
with c110 = p58*p52, and Richard's champs03 observes:

Quote:
Of course, as far as factoring c110 goes, finding p58 is no better
than finding p52, and the c110 is small enough that it could have been
factored in a few days by GNFS.
Perhaps by way of tie-breaker (between Paul_Z and Richard), Arjen
preferred to simply rely on the estimated ecm runtime, since ecm had
no way of knowing the size of the cofactor (except through the
multi-precision arithmetic, a smaller o-term (O-term?)). Anyway, a more
current comparison, of the current top3: the p64 of Aoki-Shimoyama
gave the complete kilo-bit factorization of the c311 = R311 = 111...1
= (10^311 - 1)/9, with 311 1's, and is arguably a harder factorization (by
snfs difficulty, say) than the other ecm candidates p67, p66 and Alex's p63.

An awfully long way around, but the Brent ratio for today's p54
is 262/54 = 4.851; and, restricted to pxx's having a prime cofactor
(and so giving the complete factorization, rather than leaving a hard
composite cofactor, perhaps an snfs that's no easier than before the
ecm factor was found), this is one of the more difficult ecm factorizations.
Just scanning the "Notes" at the end of Richard's list, the above p57
has r = 5.28; and the above p64 has r = 4.86; and today's 4.851
is above any of the other champs ratios. Not bad, considering that
I didn't know where it fell when I started today's post. -Bruce
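The Brent ratios quoted above are easy to check; a minimal sketch (sizes taken from the post, the helper name is my own):

```python
def brent_ratio(composite_digits, factor_digits):
    """Brent's ratio r = size(cxxx)/size(pxx); he required r >= 2.2
    for an ECM factor to qualify for his champions list."""
    return composite_digits / factor_digits

# Ratios quoted in the post:
print(brent_ratio(262, 54))  # today's p54 from the c262: ~4.851
print(brent_ratio(301, 57))  # p57 of M997 = c301: ~5.28
print(brent_ratio(311, 64))  # p64 of R311 = c311: ~4.86
```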
Old 2007-06-13, 02:39   #5
bdodson

10,293- 268 --> 218 with p50

Quote:
Originally Posted by garo
Code:
Base	Index	Size	11M(45digits)	43M(50digits)	110M(55digits)	260M(60digits)	Decimal
 ...
10	293-	C268	0(0.267423)	0(0.0522979)	165(0.00921839)	0(0.00148122)
1997508528280766886566710899999100472545203601686867972121010697112469440820807186019533598425515394273663114089834088648226419381447814618370242813221448721398286715795642674396726996546311532430627945933342876346400425978508829663723170164898879922182724872688246759
...
That's p50 = 72753092406224481018194292737207864791672573037959

during the last 1100/7830 curves to finish t50. Composite cofactor
of 218-digits will be done to t50 in a few more hours. Will be one of
the last factors before t50 finishes on the complete Cunningham list.
-Bruce

ps - cofactor is still easier by snfs (293 prime, difficulty 293).

Last fiddled with by bdodson on 2007-06-13 at 02:48 Reason: added difficulty note
Old 2007-06-14, 13:48   #6
bdodson

t50 update

Quote:
Originally Posted by bdodson
That's p50 = 7275309... Will be one of the last factors
before t50 finishes on the complete Cunningham list. -Bruce
The number of curves remaining has dropped (or will shortly drop)
below 100,000. There are six numbers that have 2300/7830
curves remaining to finish t50, and three more that have
1300/7830 curves left. These are the last nine numbers from
2- or 2+ (with n < 1200), under 18K curves total. The other
range has 74 numbers with 1100/7830 curves to go, 81.4K
curves total. (These being "generic" c251-c299's; not 2-
or 2+ with n < 1200.) All gmp-ecm curves with B1=43M and
default B2.

Might sound like a lot, but it's under 13 t50's (13*7830 > 99.1K),
out of c. 800 Cunninghams. That might sound like 13 chances
for t50's to find a factor, but with the exception of those last
six base-2's with two more passes (instead of one), the other
77 have nearly complete t50's. Very little chance of a late p44-p47
is left, and the probability curve (for a known p50) tails off near
the end. So very little chance for p44-p47; a chance that's
tailing off for p48-p51; and a small chance for an early p52-p54.

Hard to give a real-time estimate (calendar date); I'll update
these counts if/when there's a next factor. -Bruce
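As a back-of-envelope check on the counts above (all figures taken from the post itself; this is just the arithmetic, not an ECM model):

```python
# Curves still needed to finish t50, per the counts in the post:
base2_curves = 6 * 2300 + 3 * 1300   # last nine base-2 numbers (n < 1200)
generic_curves = 74 * 1100           # 74 "generic" c251-c299's

total = base2_curves + generic_curves
t50_curves = 7830                    # one t50 = 7830 curves at B1=43M

print(base2_curves)                  # 17700, i.e. "under 18K curves total"
print(generic_curves)                # 81400, i.e. "81.4K curves total"
print(total, total / t50_curves)     # 99100 curves, just under 13 t50's
```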
Old 2007-09-30, 01:22   #7
bdodson

c219 -> c165 via p54, still snfs

Quote:
Originally Posted by garo
Code:
Base	Index	Size	11M(45digits)	43M(50digits)	110M(55digits)	260M(60digits)	Decimal
...
10	339-	C219	0(0.267423)	0(0.0522979)	165(0.00921839)	0(0.00148122)	
221829569096283860037698160631360671742526500814014695277268438601335285344809552992788121062067625054700398869532114632736858023286026624916567126193273936691571250559537359349239001462588897922528374386555282490750317
...
First factor from the Opterons for a while, one of the last of the c190-c233's
of difficulty 220-229 to get its 3rd t50; 10, 339- c219 -> c165 via

p54 = 777734075184513369134763199249605543798943174359980119

At difficulty 226, the snfs is still easier than c165 gnfs. Not a lot of chance
for a p52-p70 from the c165 after 3*t50; seems likely to be around for
a while unless someone's sieving repunits. I'm starting 3rd t50's on
c155-c169 of difficulty under 230 anyway (along with finishing the 2nd
t50 on difficulty below 220, the last ones below c251), so this c165 is
having an early start. The rest of c147-c154 as well, despite their being
too small for a large factor to meet Brent's ratio. The c149 of difficulty
256.91 is already past 3*t50; seems ready for gnfs. -Bruce
Old 2007-09-30, 23:19   #8
fivemack

The c149 of difficulty 256.91 is presumably 7,380+; why are you mentioning it on the 10- table thread?

What's the status of 10,375+ (difficulty 200 by one metric, but to take out the factor 15 requires using a quartic, and 200-digit quartic is deeply enough sub-optimal that 149-digit GNFS might well be easier ... difficulty 225 using a sextic and only the factor three is almost certainly harder than 149-digit GNFS)?

I'm probably within a week of finishing 11,251+ by gnfs and 7,380+ might be a nice one to do next, though I'll have to check with Wagstaff when I submit 11,251+.
Old 2007-10-01, 03:46   #9
bdodson

Quote:
Originally Posted by fivemack
The c149 of difficulty 256.91 is presumably 7,380+; why are you mentioning it on the 10- table thread?
Uhm, that was "10, 339- c219 -> c165" with the cofactor about
to finish testing to 3*t50 (finished already since then; no factor
found; not a surprise!). That brought to my mind the status of
the other c16x's, which I've been running as c155-c169 split by
difficulty as < 220, in 220-229, and over 229. I spent quite a bit
of time over the past week or two looking over the smallest Cunninghams
(7 in c149-c154) and the ones of smallest difficulty (below 220). I
generally spend a lot of time looking over input files; what's running
now (3 or 4 or 5 ranges); what ought to be run next.

The immediate issue was Bob's report of an "almost ecm miss" of
a p53. His number was from the range c155-c169 of difficulty below
220, above, which I've subsequently identified as "most undertested"
at 1.668*t50 ... last I'd looked there were 23 numbers, but updating
an input file, there's 17 now. (And, as I responded to Bob's email
report on the "almost miss", finding his p53 to (1-1/e) probability would
have taken nearer to 3*t50; so it would at worst have been a
"bdodson miss", not an ecm miss --- it not being ecm's fault that
I stopped early in favor of more curves on the harder numbers.) One
can almost never win on these "ecm miss" arguments, and I usually
end up convincing myself that --- while it may have been plausible to
stop early at the time; with hindsight, and with other more urgent
ranges finishing; it's time to go back for another pass. I started
some time ago with the smallest numbers of difficulty 230 and up;
c149-c189, including the hardest c155-c169's. I just recently finished
that pass, with most numbers tested past 3*t50. The smallest being
the c149 you identify; which it seemed to me would be nice to have
off of that input file before I get back for another round. I've
switched those cpus (xps with just 250Mb available for condor grid
jobs) to a file with the other six numbers from c149-c154; the 17
numbers most under-tested from c155-c169, and 11 more of that
size with difficulty in 220-229. That was when I noticed 6,284+ c168
(also not from the 10- thread) on Sam's "who's doing what" list; took
it off of the input file; checked with Richard & Sam to confirm that it
is the next NFSNET number (unless Paul decides otherwise), and
then followed Richard's suggestion and put it back into the input file,
at the top of the list, for one more t50 (to 3*t50). Halfway done
already.

Quote:
What's the status of 10,375+ (difficulty 200 by one metric, but to take out the factor 15 requires using a quartic, and 200-digit quartic is deeply enough sub-optimal that 149-digit GNFS might well be easier ... difficulty 225 using a sextic and only the factor three is almost certainly harder than 149-digit GNFS)?
Well, for one thing, it's had less ecm pre-testing than the harder difficulty
c149 7, 380+ 17712381340... ; perhaps a larger chance of an ecm miss?
It's in the above input file; which I expect to run up to 3*t50. On
other numbers of current interest, the Opteron run that found the above
p54 (which is from 10- ...) is about to do a 3rd t50 on some of the other
768-bit numbers (from the poll that was to follow the poll which our
gerbils removed; with some reason, as it turns out ...), not that we
ought to expect anything; just further enhancing the probabilities that
they won't give an ecm miss or other unwelcome small prime factor.

Quote:
I'm probably within a week of finishing 11,251+ by gnfs and 7,380+ might be a nice one to do next, though I'll have to check with Wagstaff when I submit 11,251+.
Glad to hear; hope for no factors under p60! -Bruce
Old 2007-10-01, 12:06   #10
R.D. Silverman

Quote:
Originally Posted by fivemack
The c149 of difficulty 256.91 is presumably 7,380+; why are you mentioning it on the 10- table thread?

What's the status of 10,375+ (difficulty 200 by one metric, but to take out the factor 15 requires using a quartic, and 200-digit quartic is deeply enough sub-optimal that 149-digit GNFS might well be easier ... difficulty 225 using a sextic and only the factor three is almost certainly harder than 149-digit GNFS)?

I'm probably within a week of finishing 11,251+ by gnfs and 7,380+ might be a nice one to do next, though I'll have to check with Wagstaff when I submit 11,251+.
10,375+ C149 is surely easier by GNFS; I just did 10,345- by SNFS a
short while ago. 10,375- will be easier with SNFS.

The LA for 2,1582L is about 50% done, and sieving for 2,1962M is about
80% done. I then plan to do 2,1630M and 2,1914M via SNFS. These
will be relatively hard, as the best polynomials are quartics. Depending
on how hard 2,1630M is, I may then do 2,1690M.

When these are done, I plan on tackling 2,1059- via SNFS unless someone
else does it sooner. This number is a toss-up between SNFS and GNFS.

There are not a lot of numbers left with SNFS difficulty under 210. Most
of these will require a quartic.
Old 2007-10-01, 12:24   #11
R.D. Silverman

Quote:
Originally Posted by bdodson
Uhm, that was "10, 339- c219 - > c165" with the cofactor about
to finish testing to 3*t50 (finished already since then; no factor
found; not a surprize!). That brought to my mind the status of
the other c16x's, which I've been running as c155-c169 split by
difficulty as < 220, in 220-229, and over 229. I spent quite a bit
of time over the past week or two looking over the smallest Cunninghams
(7 in c149-c154) and the ones of smallest difficulty (below 220). I
generally spend a lot of time looking over input files; what's running
now (3 or 4 or 5 ranges); what ought to be run next.

The immediate issue was Bob's report of an "almost ecm miss" of
a p53.
Hi Bruce,

I try to be careful in my choice of language. I really meant "almost" miss.
I don't regard any factor over 50 digits as an ECM miss, since an optimal
ECM effort will still miss it with probability (1/e)^k with a k * t50
effort.
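
Bob's (1/e)^k rule of thumb is easy to tabulate; a minimal sketch (the heuristic is exactly as stated above: one full t50 misses a ~50-digit factor with odds 1/e):

```python
import math

def miss_probability(k):
    """Probability that an ECM effort of k * t50 misses a ~50-digit
    factor, under the heuristic that one t50 misses with odds 1/e."""
    return math.exp(-k)

for k in (1, 2, 3):
    print(f"{k} * t50: miss probability {miss_probability(k):.3f}")
# tails off quickly: ~0.368 at 1*t50, ~0.135 at 2*t50, ~0.050 at 3*t50
```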

My recommendation is not to bother with ECM on any of the composites
with SNFS difficulty under 210 except for perhaps 2,2190L, 2,2226L,
and 2,2370L. Actually the SNFS difficulty for these is at least 220
because there doesn't seem to be any way to take advantage of the fact
that they have more than one algebraic factor. The index is divisible by 3,
but this leads to numbers with difficulty above 220 and requires a quartic.
[yech!!!!]. There seems to be no way to take advantage of the additional
algebraic factors. One gets an expression with high Hamming weight (i.e. too
many non-zero terms) that does not yield a good polynomial. They are
also not reciprocal.

*Definitely* do not waste ECM time on 7,366+, 10,309-, 7,369+, 10,312+ or 5,447-. They are too easy with SNFS.