mersenneforum.org > Factoring Projects > Factoring
2008-03-16, 21:20   #1
fivemack

2^1188+1

I'm coordinating sieving for this number over on the ElevenSmooth forum; the relations are coming in rather slowly, and it's an lp=2^30 number, so we need eighty million of them. I would be very appreciative if anyone from here were prepared to join in; there's probably a CPU-year's worth of sieving left.
2008-03-17, 03:23   #2
bdodson

Quote:
Originally Posted by fivemack
I'm coordinating sieving for this number over on the ElevenSmooth forum; the relations are coming in rather slowly, and it's an lp=2^30 number so we need eighty million of them. I would be very appreciative if anyone from here was prepared to join in; there's probably a CPU-year's worth of sieving left.
That's "c195 218557157...77702177 divides 2^1188 + 1 (SNFS 238.42)".
For ECM pre-testing it's a typical number (no extra attention), so I have
4*t50 here. Expect p47s and below removed, and most p48-p52s to have
been found; but we've already seen p53s and p54s remaining.

If I'm recalling your previous benchmarking correctly, it's certainly not
worth further ECM on machines that could be sieving. For numbers in c190-c233 with
difficulty in 230-249, 4*t50 is borderline for grid ecm; harder numbers
of this size/difficulty have been getting t55's. Numbers just a bit larger
or harder are typically way under-tested. By contrast, c155-c189
Cunninghams are now all tested well past t55 (at c. 5.7*t50; most nearing
7*t50).

Good luck on this one; with its large index (exponent), I'll be interested
to see the factors. -Bruce
2008-03-17, 08:48   #3
Andi47

Quote:
Originally Posted by bdodson
By contrast, c155-c189
Cunninghams are now all tested well past t55 (at c. 5.7*t50; most nearing
7*t50).
Are you doing B1 = 43M or 110M?

According to my calculations, 1*t55 ~ 49000 curves at B1 = 43M, which also gives ~14.1% of t60 (and ~1.7% of t65).

If you do ~7000 curves at B1 = 43M and ~15100 curves at B1 = 110M, this should be ~100% of t55 and ~15.7% of t60 (and ~2.2% of t65), thus giving a slightly higher chance of finding larger factors (if my calculations are correct).
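The suggested split can be checked by treating curve counts as additive fractions of a t-level. A minimal Python sketch, assuming the t55 curve counts quoted in this thread (~49000 at B1 = 43M from this post; ~17900 at B1 = 110M from bdodson's tally in this thread):

```python
# Sketch: how many B1=110M curves are still needed to finish a t55
# after partial work at B1=43M. The per-B1 t55 curve counts are
# assumptions taken from figures quoted in this thread.
T55_CURVES = {43_000_000: 49000, 110_000_000: 17900}

def remaining_curves(done, b1_done, b1_next, table=T55_CURVES):
    """Curves still needed at b1_next, counting effort as additive fractions."""
    return (1 - done / table[b1_done]) * table[b1_next]

print(round(remaining_curves(7000, 43_000_000, 110_000_000)))  # 15343
```

This lands close to the ~15100 B1 = 110M curves suggested above; the small gap presumably comes from differences in the B2 defaults behind the curve-count tables used.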

2008-03-17, 18:40   #4
bdodson

Quote:
Originally Posted by Andi47
Are you doing B1 = 43M or 110M?

According to my calculations, 1*t55 ~ 49000 curves at B1 = 43M, which also gives ~14.1% of t60 (and ~1.7% of t65).

If you do ~7000 curves at B1 = 43M and ~15100 curves at B1 = 110M, this should be ~100% of t55 and ~15.7% of t60 (and ~2.2% of t65), thus giving a slightly higher chance of finding larger factors (if my calculations are correct).
Neither. One of the c16x's, done before the curves reported above,
was MWN-10:

evalf(8800/17900 + 5700/8000) >= 1.204*t55 on 11/11/07

which, if I recall correctly, was 2,787+. The curves /17900 were B1 = 110M,
the ones /8000 were B1 = 260M. I dropped the B1 = 43M
curves. The gruesome details were 6*t50 as

1st t50: c166-c174: 2500/9000 + 1150/1538 (1.025*t50)
2nd t50: c155-c169: 4000/3155, diff >= 230
3rd+extra: c149-c169, diff 230 up: 4800/3133
three new t50's: 4550/1575

so 4000+4800 B1=110M's, 1150+4550 B1=260M's. The t50 counts
for 260M were not-so-efficient, but were better considered as t55
curves (better yet as t60, of course!).
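The evalf tally above is just a sum of completed curves over expected curves at each B1. A hedged Python equivalent (the /17900 and /8000 divisors are the t55 curve counts for B1 = 110M and 260M quoted above):

```python
# Sketch: tally mixed-B1 ECM work as fractions of one t-level.
# Each run is (curves_done, curves_for_one_full_t_level at that B1).
def effort(runs):
    return sum(done / needed for done, needed in runs)

# The MWN-10 tally: 8800 B1=110M curves (t55 = 17900 curves)
# plus 5700 B1=260M curves (t55 = 8000 curves).
print(round(effort([(8800, 17900), (5700, 8000)]), 3))  # 1.204
```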

More generally, the first t50's in this range depended upon the sizes:

First t50 run, c155 - c250:

1. c155-c174:

(a) c155-c165: 2500/9000 + 3000/7771 + 600/1538, 1.053*t50

(b) c166-c174: 2500/9000 + 1150/1538 (1.025*t50)

2. c175-c194:

(a) c175-c185: 1000/9000 + 590/3155 + 300/1538 + 340/660
(1.008*t50)

(b) c186-c194: 3000/7771 + 630/3155 + 710/1538 (1.012*t50)

3. c195-c233: 620/3155 (= t45) + 1335/1538 (1.064*t50)

4. c234-c250: 620/3155 (= t45) + 4925/7830 + 285/1538, at t50

The /9000 was B1 = 43M with ecm5.0 default B2; the /7771 B1 = 43M
with ecm6 default B2 (various versions); /7830 was B1 = 43M on the
grid PCs (one of ATH's versions); /3155 was B1 = 110M,
/1538 was B1 = 260M, and the (few) /660's were B1 = 850M.
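With that legend, each line of the table above is again a sum of curves-done over curves-per-t50. A small Python check of line 1(a), using the legend's divisors as the assumed per-B1 t50 curve counts:

```python
# t50 curve counts per B1, read off the legend above (assumed values):
T50_CURVES = {"43M-ecm5": 9000, "43M-ecm6": 7771, "43M-grid": 7830,
              "110M": 3155, "260M": 1538, "850M": 660}

def t50_fraction(runs):
    """Sum curves_done / curves_per_t50 over runs at mixed B1."""
    return sum(done / T50_CURVES[b1] for done, b1 in runs)

# Line 1(a): 2500/9000 + 3000/7771 + 600/1538
frac = t50_fraction([(2500, "43M-ecm5"), (3000, "43M-ecm6"), (600, "260M")])
print(round(frac, 3))  # 1.054 (the post truncates this to 1.053)
```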

Second t50's were similarly mixed, /7771 or /3155 in c155-c189, split
by difficulty at 220 (and c190-c233 was /3155 or /1538, also split at
difficulty 220). [That was the end of the B1 = 43M's in this range; the
first t50 on c251-c384 was 620/3155 + 6300/7830 = 1.0011*t50, which
ran later.] So the 3rd and 4th t50's on c155-c189 were /3155 or /1538.

New curves are supposed to add 3*t50, to get 7*t50 >> t55. Those
are done on c180-c189 with B1 = 260M; are at 8000 curves of 9500 needed
on c170-c179 with B1 = 110M (except that the ones of difficulty below
220 had one new t50 done with B1 = 260M, and are already at 7*t50); and
on c155-c169 there are 4800 curves done with B1 = 110M (1.5*t50),
2100 curves done with B1 = 260M (1.33*t50), with one more run of
525 curves to go [for 1.5 + 1.66 > 3.0]. So, for short (as if!), the ones
not already at 7*t50 are at 6.55*t50 or at 6.83*t50.

And as I was saying, these "are now all tested well past t55". I'm not
disagreeing with your computation; the figure of 5.7*t50 came from
5.7*3133 = 17858 = c. t55 for B1 = 110M curves, where the /3133 was
from the -v output of the grid binary I was looking at at the time.

I hope that we're not distracting prospective contributors from the
ElevenSmooth project (which also ran some ECM, I expect). -Bruce
2008-03-18, 00:38   #5
wblipp ("William")

Quote:
Originally Posted by bdodson
11-smooth project (which also ran some ecm, I expect)
ElevenSmooth has run 1592 curves at B1 = 43M through the ECM Server - presumably most with ecm6 defaults.

In a quick scan, it appears this is the last ElevenSmooth composite that overlaps the current Cunningham range.

William
2008-03-18, 15:00   #6
bdodson

2nd attempt at a reply

Quote:
Originally Posted by Andi47
Are you doing B1 = 43M or 110M?

According to my calculations, 1*t55 ~ 49000 curves at B1 = 43M, which also gives ~14.1% of t60 (and ~1.7% of t65).

If you do ~7000 curves at B1 = 43M and ~15100 curves at B1 = 110M, this should be ~100% of t55 and ~15.7% of t60 (and ~2.2% of t65), thus giving a slightly higher chance of finding larger factors (if my calculations are correct).
A needed clarification on my first (somewhat non-responsive) reply (above)
is that I do agree --- entirely --- with Andi47's point here. It's perhaps
misleading for me to refer to the seven passes through the list of 100 smallest
Cunninghams as t50's, since most of the curves on c155-c189 are done
with B1 = 110M (p55-optimal) or B1 = 260M (p60-optimal). The last
five "t50's" on c155-c189 were exclusively done with B1 = 110M or 260M.
For the larger and/or harder numbers, that was the case for the last
six "t50's"; only the numbers in c155-c169 below difficulty 230 had
B1 = 43M curves in the 2nd t50. If I had actually been doing t55's
with p50-optimal curves (Andi47's 49K curves), that would have been
a less optimal use of the cycles and --- worse --- less effective on
the larger prime factors that we'd prefer to have ecm find (rather than
sieving), which _was_ the point Andi47 was making. A worst case
for the numbers at/below c169 would have been 2*t50 = c. 15.5K curves
at B1 = 43M then 5*t50 = c. 15.6K curves at B1 = 110M (at
2*7771 and 5*3133). Ah! That's clearly past the better of the two t55's
Andi47 describes, by an extra t50.
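That worst case converts to t55 units the same way, summing curves over the per-B1 t55 counts. A sketch, assuming the t55 curve counts quoted in the thread (~49000 at B1 = 43M, ~17900 at B1 = 110M):

```python
# Sketch: express the worst-case mix (2*t50 at B1=43M, 5*t50 at B1=110M)
# as a fraction of one t55, using assumed t55 curve counts from the thread.
curves_43m = 2 * 7771    # = 15542, "c. 15.5K"
curves_110m = 5 * 3133   # = 15665, "c. 15.6K"
t55_equiv = curves_43m / 49000 + curves_110m / 17900
print(round(t55_equiv, 2))  # 1.19, i.e. past one full t55
```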

The low-memory P4's are finishing their share of the c155-c169's today,
switching to small BMtR numbers. The core2duos (running the B1 = 260M
curves) will finish their share in another 2-3 days, and will switch to
adding 3*t50 to c190-c195 (or so), since there are no longer 100 numbers
in c155-c189 (plus difficulty below 220). That leaves just the
larger-memory P4's grinding away on c170-c179 (above diff 220), on the
last 1000 curves with B1 = 110M. -Bruce