mersenneforum.org  

Old 2010-03-23, 11:37   #45
bdodson (Jun 2005, lehigh.edu)
RE: Recent Cunningham Factors

Quote:
Originally Posted by xilman View Post
I plead guilty to mis-using the term "ECM miss". In my case, I was just teasing Bruce. He was so much ahead of the curve at that time.

Paul
I should be so lucky (twice?). Thorsten et al. seem to be further
ahead of the current curve (a jump from the record p68 to a new
record p73) than I was when my p66 broke the p59 record. The
reason is not the number of digits (plus 5 vs. plus 7), but rather
my impression (based on no math whatsoever) that the ratio
of likelihoods for p73 over p68 represents a more substantial jump
than the corresponding likelihood ratio of p66 over p59.

In an email reply to Phil McLaughlin, PaulZ offers a substantial
assessment of the current ecm status:
Quote:
Originally Posted by PaulZimmermann
I believe the rate of snfs/gnfs factors will decrease, unless
Sam extends the table, and people will go back to ECM (for example the p58
from 5,442+ c168 gnfs might be considered as an ECM miss).
As the person with more experience than anyone else in raising
t55's up towards 3t55's (Aoki and Thorsten hold the range above 3t55),
I'm doubtful. Aoki, Thorsten and PaulZ himself have demonstrated
having the resources to run ecm above 2t55, but my experience
suggests that people with moderate-to-middling resources will find
the search for factors in this range too unrewarding. Again, the
formulation is that we're running ecm on numbers for which a full
test (to 62%) for p55 factors is already complete (resp. 2t55 and
3t55; probabilities 1 - e^-2 and 1 - e^-3).
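The t-level arithmetic here can be sketched numerically: treating each completed t-level as one expected hit, running k t-levels finds a factor of the target size with probability 1 - e^(-k). A minimal sketch (my framing of the convention, not code from the thread):

```python
import math

# Poisson model for ECM t-levels: k full t-levels against a factor of
# the target size is k expected "hits", so P(found) = 1 - exp(-k).
def p_found(k: float) -> float:
    return 1.0 - math.exp(-k)

for k in (1, 2, 3):
    print(f"{k}t55 -> {p_found(k):.1%}")
# 1t55 -> 63.2% (the "full test to ~62%"), 2t55 -> 86.5%, 3t55 -> 95.0%
```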

On the topic of ecm misses, I'm still most interested in the question
raised by Silverman-Wagstaff of allocating resources most effectively
between ecm and sieving. As Bob's agreed, our current extremely
heterogeneous environment massively complicates their analysis, which
assumed a setting where a reasonable comparison is possible because the
same hardware is used for both ecm and sieving. Post-Kansas (or Indiana,
maybe?) we've had the sieving hardware distinct from the ecm hardware:
first a grid of PCs not suitable for sieving (at the time); then
(briefly!) running ecm under Yoyo/BOINC; and now (for the fortunate few)
on GPUs (cell or ...).

We currently have the Batalov+D range of sieving (both snfs and gnfs)
at difficulty below snfs 250, and Greg's NFS@Home range
of mid-250 to (mostly) just below 270. I'm still holding firm on something
like 1.5t55 (for B+D) and somewhere between 2t55 and 3t55 (for Greg).

Regards, Bruce

PS -- Uhm, there are actually (at least) two fronts in the ecm attack
on the dwindling current Cunningham list. The one above is where both
ecm and sieving are plausible. The two most recent Aoki factors of
p50/p51 came from numbers of c234-c366 that aren't near-term
sieving candidates. These numbers need to have ecm testing raised
from 2t50 to 3t50 (to start ...). There's a different difficulty here:
too many numbers have no prime factors in ecm range. We can already
see lots of these at diff below 250 (resp. below 270), and this
will get worse with the rest of c234-c366 (the "Largest Cunninghams").

PaulZ's email seems to suggest (to me at least ...) a third front of
smaller numbers that have passed (failed?) a larger ecm test. A moving
target here: this was c154-c189.99; currently c161-c209.99; and
working toward c176-c233.99 (the "Smallest Cunninghams"). If I'm
reading PaulZ correctly, he seems to suggest more extended ecm effort
even on numbers for which sieving is not so difficult; a sort of
"yes, we can" factor these numbers by ecm!
Old 2010-05-24, 04:49   #46
frmky (Jul 2003, So Cal)

NFS@Home has finished 5,427+ by SNFS as usual.

Code:
prp54 factor: 563755283862496929168659991708057447948580426252052767
prp145 factor: 2723528882195134624435365406241362651957822673755720745626178581944304857470575651366951292535191262712823827955524329324491207238166486348539107
This is the project's 50th factorization. I would have preferred it not to be an ECM miss, but c'est la vie!
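As a quick arithmetic check (my sketch; the digits are copied from the post above), the two reported probable primes multiply back to a 199-digit cofactor, the c199 discussed in the following posts, and both pass a base-2 Fermat probable-prime test:

```python
# Check the reported 5,427+ split: prp54 * prp145 should be a 199-digit
# cofactor, and both factors should pass a base-2 Fermat PRP test.
# (Digits copied from the post; a Fermat test is only a probable-prime check.)
p54 = int("563755283862496929168659991708057447948580426252052767")
p145 = int(
    "2723528882195134624435365406241362651957822673755720745626178581944304"
    "857470575651366951292535191262712823827955524329324491207238166486348539107"
)
assert len(str(p54)) == 54 and len(str(p145)) == 145
assert len(str(p54 * p145)) == 199          # the c199 cofactor
assert pow(2, p54 - 1, p54) == 1            # Fermat base-2 PRP test
assert pow(2, p145 - 1, p145) == 1
print("5,427+ c199 split checks out")
```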
Attached: 5p427.zip (10.7 KB)
Old 2010-05-24, 10:21   #47
R.D. Silverman (Nov 2003)

Quote:
Originally Posted by frmky View Post
NFS@Home has finished 5,427+ by SNFS as usual.

Code:
prp54 factor: 563755283862496929168659991708057447948580426252052767
prp145 factor: 2723528882195134624435365406241362651957822673755720745626178581944304857470575651366951292535191262712823827955524329324491207238166486348539107
This is the project's 50th factorization. I would have preferred it not to be an ECM miss, but c'est la vie!
This certainly qualifies!
Old 2010-05-24, 14:13   #48
bdodson (Jun 2005, lehigh.edu)

Quote:
Originally Posted by frmky View Post
NFS@Home has finished 5,427+ by SNFS as usual.

Code:
prp54 factor * prp145 factor
This is the project's 50th factorization. I would have preferred it not to be an ECM miss, but c'est la vie!
No serious dispute here: ecm had a more than plausible chance of
finding this p54. The C199 was from the c190-c209.99 range; initially
from c190-c233, over diff 249.99 (at diff 255.8). So that was an
initial 3t50; then the +3t50 for c190-c233 was
Code:
 1236333 Aug  8  2009 c9009/aug07-cu3140-p55-1t50-h2
 1503832 Sep 29  2009 c9009/sep27-cu3800leaf-p60-2t50-h2
So when Greg sent me this for pretesting, it got noted
Code:
 Mar 12 13:02 c9009/c99.h2-5p427needs4
and then
Code:
3036044 Mar 22 09:49 c9009/mar18-cu7593-p60-4t50-c9h2-fin
That was 6t50 plus 4t50, just short of 2t55 (c. 11.4t50?). None
of the pre-history curves had limits under t50 (perhaps one of the t50's);
the rest had t55 or t60 limits, all of which were suitable (within 5 digits
of optimal runtime) for finding this p54. Then came 3140 curves with t55
limits (B1 = 110M) and 3800+7593 curves with t60 limits (B1 = 260M). That
part alone, without counting the initial 3t50 miss, was a solid t55.

So this one meets my view of an ecm miss: the curves were actually
run; ecm had its chance to find this p54. The only slight quibble is
a review of what a t55 (resp. 2t55) means. That's a 62% chance of
finding a known p55 (resp. an 80% chance). The probabilistic test finds
just short of 2-out-of-3 p55's (resp. 4-out-of-5). Since the above
count is a bit short of 2t55, perhaps ecm ought to have found 3-out-of-4
of these p54's. So if we look at the pool of Cunninghams from Nov 2003
(when I started these Condor searches) of size c190-c209.99; drop the
ones with a factor below p50; wait until the rest of that initial pool are
factored, and drop the ones with smallest factor p60-or-larger; _then_
we could check whether ecm had done better or worse than finding
3-of-4 of the p54/p55's. ("Probability with sample size one is meaningless.")

For another view, suppose we knew this C199 had this p54 factor, and
that 10t50 had missed. Would we try to remove this factor (to prob 80%?)
using ecm? Note that the previous curves have already missed; we don't
get to count them, so this would be a new 2t55 (or t55). That's more
favorable than the actual situation. All we actually know is that 10t50
"missed", not that there's a lurking p54 in this c199; rather, it's
nearly certain that some p53-p57's haven't yet been found among a pool
of near-term sieving candidates. I placed my bet on this one, contributing
c. 10% of the NFS@Home sieving; and I'll continue with subsequent
diff < 260's that have passed 10t50. Better that than adding another
3t50 on a pool of candidates, most (far-and-away most) having their
factors above p60 and hard for ecm to find. (This isn't a Mersenne
number; PS3 standards don't apply.)

I'm not sure that having the first clearcut miss occur on the 50th
factorization matters that much to me (a random NFS@Home contributor...);
rather, I'm still glad not to have gotten one from the ones sieved with
the 16e siever, diff 270-and-up. Then I'd have to think about bumping the
curve counts up (depending upon what the Condor PC pool looks like; most
of the Core 2's seem to have dropped out on the switch to Windows 7 ---
skipping past Vista, which we never used here).

-Bruce
Old 2010-08-06, 08:39   #49
frmky (Jul 2003, So Cal)

NFS@Home has completed 5,448+ by SNFS. A 16.1M matrix was solved using 64 computers (256 cores) in a bit under 41 hours. The log is attached.

Code:
prp65 factor: 23371863775658144623538828456854573496104607906605333794952273409
prp141 factor: 698965867837568984299398033395117263633121275562035049049008890847786442548362928940407820059804253774285034574979465129314235059775249213313
Attached: 5p448.zip (4.2 KB)
Old 2010-10-30, 05:15   #50
Raman ("Mr. Tuch", Dec 2007, Chennai, India)

5,415+ c204: a lovely p102.p102 split, a wonderful natural-RSA-like number!

The previous Venus transit was on June 8, 2004;
the next Venus transit will be on June 6, 2012.
Yesterday Venus crossed the position of inferior conjunction;
5,415+ was started even before its greatest eastern elongation.

One of my hardest jobs ever:
5,415+ c204, SNFS difficulty 232.0580 nominally (= 332*log10(5)),
done with a quartic polynomial:
Algebraic polynomial = x^4 - x^3 + x^2 - x + 1
Rational polynomial = x - 5^83

Sieved the special-q range 110M to 300M (a range of 190M) on the
rational side with the gnfs-lasieve4I15e lattice siever, using the
following parameters:
As,
Code:
n: 409070648357903876177895795121676558808515790800199330391475022633084608638318069844299258901345227277398314550159052254251424043657457443453213674736966457975676435972876537687720951817693078563918161701
m: 10339757656912845935892608650874535669572651386260986328125
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
skew: 1
type: snfs
rlim: 200000000
alim: 33554431
lpbr: 31
lpba: 29
mfbr: 62
mfba: 58
rlambda: 2.6
alambda: 2.6
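The polynomial selection above can be sanity-checked: since x^5 + 1 = (x + 1)(x^4 - x^3 + x^2 - x + 1), taking m = 5^83 makes the quartic vanish at m modulo any primitive factor of 5^415 + 1, and the SNFS difficulty is deg * log10(m) = 332*log10(5). A sketch of that check (mine, with all digits copied from this post):

```python
import math

# Sanity checks on the SNFS setup for 5,415+ (values copied from the post).
# x^5 + 1 = (x + 1)(x^4 - x^3 + x^2 - x + 1), so with m = 5^83 the quartic
# f(x) = x^4 - x^3 + x^2 - x + 1 satisfies f(m) ≡ 0 modulo the c204.
n = int(
    "4090706483579038761778957951216765588085157908001993303914750226330846"
    "0863831806984429925890134522727739831455015905225425142404365745744345"
    "3213674736966457975676435972876537687720951817693078563918161701"
)
m = 5**83
assert m == int("10339757656912845935892608650874535669572651386260986328125")
assert (m**4 - m**3 + m**2 - m + 1) % n == 0   # algebraic side checks out

# SNFS difficulty for this quartic: deg * log10(m) = 4 * 83 * log10(5).
difficulty = 332 * math.log10(5)
print(f"difficulty = {difficulty:.7f}")  # ~232.0580414

# The reported p102 * p102 split multiplies back to the c204.
p1 = int(
    "5423866998092065211676643776792354821135033709420172944923424977250647"
    "44615307084697173312788543096361"
)
p2 = int(
    "7542047924512183508218092458656924485928093189947401437929271322041783"
    "44974350929553423826459243126941"
)
assert p1 * p2 == n
assert len(str(n)) == 204 and len(str(p1)) == len(str(p2)) == 102
```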
The matrix had dimensions 14336149 x 14336374 and was solved with msieve
in 36 days of calendar time, using 4 threads (with hyperthreading) on a
single machine, without using MPI or the cluster computing facilities at all.

Ultimately, the factors of 5,415+ are:
Code:
prp102 factor: 542386699809206521167664377679235482113503370942017294492342497725064744615307084697173312788543096361
prp102 factor: 754204792451218350821809245865692448592809318994740143792927132204178344974350929553423826459243126941
A fantastic split for a c204, just like a natural RSA number; the nicest
one I have ever encountered.
Attached: 5_415P_LA.zip (42.2 KB)
Old 2010-11-03, 08:42   #51
Batalov ("Serge", Mar 2008, Phi(4,2^7658614+1)/2)

I was looking forward to using this illustration one day. Nice split!

[Attached image: nice_split.jpg (54.0 KB)]

Congratulations!
Old 2010-11-03, 19:11   #52
Raman ("Mr. Tuch", Dec 2007, Chennai, India)

Thank you, but why so late? Five days after I posted the factors?
Had you been saving this picture, planning to post it as soon as you saw
a nice split of factors for some number? Or was it a leisurely thought
later on?

By the way, what is the "nice split" over there? A jumping athlete,
photographed with the left and right halves of the body split apart as
he jumps over the ground in his own style?

For everyone: who maintains the odd perfect number search page at
http://www.oddperfect.org/? Is it wblipp?

And the Fermat number factors page at
http://www.prothsearch.net/fermat.html:
is it Mr. Wilfred Keller, or has someone else from within this forum
taken charge of that page?

Old 2010-11-03, 20:23   #53
wblipp ("William", May 2003, New Haven)

Quote:
Originally Posted by Raman View Post
For everyone: who maintains the odd perfect number search page at
http://www.oddperfect.org/? Is it wblipp?
Yes.
Old 2010-11-04, 05:50   #54
Raman ("Mr. Tuch", Dec 2007, Chennai, India)

Quote:
Originally Posted by wblipp View Post
Yes.
Then why has the status of 3,607-, which has now been factored, still
not been updated, even after you read this post?!

Old 2010-11-04, 06:40   #55
frmky (Jul 2003, So Cal)

Patience! Real life interferes.