Old 2015-06-23, 16:58   #1277
jyb
 

Quote:
Originally Posted by R.D. Silverman
The 2/9 rule is a decent approximation. However, its value should be reduced as the numbers get larger. This should be clear from Dickman's function. This "2/9" value should be a slowly decreasing function of N (the composite). I have never analyzed the exact nature of this function, so I cannot say how accurate it is for (say) 100, 150, 200, 250, ... digits.

As the composites get larger, once one has done an "initial ECM pass" to (say) the 50-digit level, the probability that there is a factor within ECM reach gets SMALLER.

There is no "general rule" that applies uniformly to composites of all sizes. Instead, use the Bayesian methods I gave in my paper.
Thank you.
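As an aside, the Dickman function mentioned above can be evaluated numerically. Here is a rough Python sketch (a simple forward-Euler integration of the delay equation u·ρ'(u) = −ρ(u−1); illustrative only, not production code):

```python
# Rough numerical sketch of Dickman's rho function: rho(u) is the
# asymptotic probability that a random integer near x has no prime
# factor exceeding x^(1/u).  Forward-Euler on u*rho'(u) = -rho(u-1);
# illustrative only.

def dickman_rho(u, steps_per_unit=10000):
    """Approximate rho(u) for u >= 0 by stepping the delay ODE."""
    if u <= 1:
        return 1.0  # rho is identically 1 on [0, 1]
    h = 1.0 / steps_per_unit
    n = int(u * steps_per_unit)
    table = [1.0] * (steps_per_unit + 1)  # rho(t) for t in [0, 1]
    for i in range(steps_per_unit, n):
        t = i * h
        delayed = table[i - steps_per_unit]  # rho(t - 1)
        table.append(table[i] - h * delayed / t)  # Euler step
    return table[n]

# Sanity check against the known closed form rho(2) = 1 - ln 2:
print(dickman_rho(2.0))  # ~0.3069
```

The known value ρ(2) = 1 − ln 2 ≈ 0.3069 makes a handy sanity check; the rapid decay of ρ with u is what makes large factors so unlikely to fall to ECM.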

Quote:
Originally Posted by R.D. Silverman
"one example did make an impression on me".

This suggests that your understanding of statistics is inadequate.
Or perhaps merely that it's useful to have a reminder now and then that things which happen only rarely do still happen. It's easy to just "plow through" composites with NFS and forget that doing pretesting can sometimes be worthwhile, depending on prior work.

Quote:
Originally Posted by R.D. Silverman
Furthermore "saving a lot of computation" is an exaggeration. How much time was spent on ECM? How much
time would SNFS have taken? Subtract. There is your actual savings. But the EXPECTED savings is much
less because such small factors will be RARE.
No, it's not an exaggeration, and really you have no standing to claim that it was. Fortunately, I have the logs which bear this out. The total time I spent on ECM was 4.4 CPU-hours. This composite had SNFS difficulty 247, which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time.

Furthermore, your formula for actual savings is talking about time; I was careful to specify computation, not time. The actual time taken would have been much more reasonable, since it would have been NFS@Home doing the sieving. But the number of CPU-hours would still have been vastly greater with NFS than it turned out to be with ECM.
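To make the realized-vs-expected distinction concrete, here is a hypothetical back-of-the-envelope sketch in Python. The 4.4 CPU-hours and the roughly three-orders-of-magnitude SNFS estimate echo the discussion above; the 2% success probability is purely an assumed illustrative figure:

```python
# Hypothetical comparison of realized vs. expected savings from an ECM
# pretest.  The ECM and SNFS times echo the figures discussed above;
# the 2% probability of a remaining ECM-reachable factor is an assumed
# illustrative number, not a measurement.

def savings(t_ecm, t_snfs, p_hit):
    """Return (realized savings when ECM succeeds, expected savings)."""
    realized = t_snfs - t_ecm           # the lucky case that happened
    expected = p_hit * t_snfs - t_ecm   # averaged over many composites
    return realized, expected

realized, expected = savings(t_ecm=4.4, t_snfs=4400.0, p_hit=0.02)
print(realized, expected)  # realized dwarfs expected, as argued above
```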

Quote:
Originally Posted by R.D. Silverman
Stop FIXATING on this 2/9 "rule".
I'm not sure where you get the idea that I'm fixating on it. Fixating would be insisting that it's correct, possibly even in the face of contrary evidence. Whereas I have merely asked, three times now, whether it is in fact a good approximation. And according to you, the answer is a qualified yes. Thank you for answering.

Quote:
Originally Posted by R.D. Silverman
Let me also add:

People get fixated on the ECM successes. They (perhaps) forget about all of the lost time spent when a factor
was NOT found.

However, SNFS succeeds with certainty. If one spends time to run SNFS, the time is never "lost".

Suppose you spend time T with SNFS and get 3 factorizations.

Suppose you spend the same time T with ECM and are able to test (say) 50 candidates to (say) t55.

Unless you expect to find at least 3 factors with ECM, you have wasted that time. One needs to assess the probability of success at level t55 given the amount of effort already spent. If one failed at t50, it becomes less likely that one will succeed at t55, especially as the composites get larger.

When one has already made a reasonable ECM effort (YMMV regarding 'reasonable') it is better to succeed
with certainty via SNFS than waste further time with ECM. The exception to this guideline is of course the
case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing".
I must be imagining things. I could have sworn that I wrote:

"And yes, I know that we're talking about probability and expected values over many composites/factors here, so one example should not guide our policy...."
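The time-allocation argument quoted above reduces to comparing expected ECM yield against guaranteed SNFS yield for the same time budget. A minimal Python sketch, in which the per-candidate success probability is an assumption, not a measured value:

```python
# Sketch of the time-allocation argument: time T spent on SNFS yields k
# factorizations with certainty; the same T spent on ECM tests n
# candidates, each with some (conditional) success probability p.  In
# expectation, ECM only wins if n*p > k.  The value of p here is an
# assumption for illustration.

def ecm_beats_snfs(n_candidates, p_success, k_certain):
    """True if the expected ECM factor count exceeds the certain SNFS count."""
    return n_candidates * p_success > k_certain

# 50 candidates to t55 at an assumed 5% hit rate vs. 3 certain
# SNFS factorizations in the same time: 2.5 expected < 3.
print(ecm_beats_snfs(50, 0.05, 3))  # False
```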
Old 2015-06-23, 20:12   #1278
xilman

Quote:
Originally Posted by R.D. Silverman
I must be imagining things. I could have sworn that I wrote:

"The exception to this guideline is of course the
case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing". "
Mea culpa. I missed that.
Old 2015-06-23, 20:53   #1279
R.D. Silverman
 

Quote:
Originally Posted by jyb
No, it's not an exaggeration, and really you have no standing to claim that it was. Fortunately, I have the logs which bear this out. The total time I spent on ECM was 4.4 CPU-hours.

Idiot.

You got very very very very lucky. Ask yourself instead: what would the time have been to run a full t55 on the number?


Quote:
This composite had SNFS difficulty 247, which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time.

Quote:
Furthermore, your formula for actual savings is talking about time; I was careful to specify computation, not time.
???????

Would someone parse this please? What is "computation" and how does one measure it???????


Quote:

"And yes, I know that we're talking about probability and expected values over many composites/factors here, so one example should not guide our policy...."
But it is clear from your writing that it DOES seem to guide your policy.

Old 2015-06-23, 21:36   #1280
wblipp
 

Quote:
Originally Posted by R.D. Silverman
Would someone parse this please? What is "computation" and how does one measure it???????
I think it is left as an exercise for the reader. It is a catchphrase for the thing we are trying to optimize in a problem that hasn't yet been fully structured.

Your observations about "unless you can't do SNFS" and Xilman's observations about subtleties arising from various machine capabilities are both windows into this problem: how do we construct a relevant optimization problem to decide how much ECM to run before SNFS/GNFS? Should the available computing resources of various capabilities be considered fixed? Are BOINC resources free but limited?

Old 2015-06-23, 22:06   #1281
R.D. Silverman
 

Quote:
Originally Posted by wblipp
I think it is left as an exercise for the reader. It is a catchphrase for the thing we are trying to optimize in a problem that hasn't yet been fully structured.

Your observations about "unless you can't do SNFS" and Xilman's observations about subtleties arising from various machine capabilities are both windows into this problem: how do we construct a relevant optimization problem to decide how much ECM to run before SNFS/GNFS? Should the available computing resources of various capabilities be considered fixed? Are BOINC resources free but limited?
Good questions. The answers depend on the model.
Old 2015-06-23, 23:00   #1282
jyb
 

Quote:
Originally Posted by R.D. Silverman
Idiot.
Hee-hee, you really just can't help yourself, can you Bob? How sad.

Quote:
Originally Posted by R.D. Silverman
You got very very very very lucky. Ask yourself instead: what would the time have been to run a full t55 on the number?
Yes, I did get quite lucky. However, for this example we weren't talking about expected time; we were talking about actual time for the factorization that actually happened. You asked "How much time was spent on ECM?" in order to assess how much time was saved. The expected time is irrelevant to that specific question.

But in any case, why should the time to run a full t55 have any bearing on this? The composite in question had a 49-digit factor.

Quote:
Originally Posted by R.D. Silverman
???????
Would someone parse this please? What is "computation" and how does one measure it???????
Bob, everybody reading this forum, yourself included, knew exactly what I meant by "computation". It is clear from the context that I was referring to CPU time, as opposed to wall clock time, an important distinction when dealing with the massive parallelism that NFS@Home provides. You even quoted me when I said "...which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time."


Quote:
Originally Posted by R.D. Silverman
But it is clear from your writing that it DOES seem to guide your policy.
Really? How is that clear? The only thing that I can see guiding my policy based on what I've written is that I use the 2/9 rule, which you have said gives a "decent approximation" of the optimum amount of ECM to use before starting SNFS. I also mentioned a particular example, which I took pains to point out should not be used to guide policy. You can only interpret this as you have if you are either deeply confused or grasping at straws to make me look bad. Knowing your behavior in this forum, I assume it's the latter, but I can't completely rule out the former.

BTW, since I know your next response will consist of nothing but further insults and invective (something else I know from your behavior in this forum), I will choose to exit this dialogue. Feel free to rant, as per usual.
Old 2015-06-24, 02:00   #1283
jyb
 

Quote:
Originally Posted by jyb
But in any case, why should the time to run a full t55 have any bearing on this? The composite in question had a 49-digit factor.
Correction: if the expected time were relevant, then you would have been right to consider a t55. The number may have had a 49-digit factor, but I obviously didn't know that when I was considering whether to run ECM. And the 2/9 "rule" would suggest almost exactly t55. My mistake on that point.
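The arithmetic behind that correction is simple. A one-line Python sketch of the 2/9 rule as used in this thread (ECM pretest depth as a fraction of SNFS difficulty):

```python
# The "2/9 rule" as used in this thread: run ECM to roughly 2/9 of the
# SNFS difficulty (in decimal digits) before starting to sieve.

def ecm_pretest_level(snfs_difficulty):
    """ECM pretest depth (t-level, in digits) suggested by the 2/9 rule."""
    return 2.0 / 9.0 * snfs_difficulty

# For the SNFS-247 composite discussed above:
print(round(ecm_pretest_level(247)))  # 55, i.e. almost exactly t55
```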
Old 2015-06-24, 07:22   #1284
xilman

Quote:
Originally Posted by jyb
Quote:
Originally Posted by R.D. Silverman
Idiot.
Hee-hee, you really just can't help yourself, can you Bob? How sad.

...

BTW, since I know your next response will consist of nothing but further insults and invective (something else I know from your behavior in this forum), I will choose to exit this dialogue. Feel free to rant, as per usual.
Old 2015-09-15, 16:10   #1285
jyb
 
Just curious

For the past 6 weeks, there's been a number on the reservation page reserved by someone going by the name "3^n+2^n". At first I figured, well, why not? One name's as good as another. But it recently occurred to me that it's possible someone made a reservation accidentally, intending to enter that name into e.g. the factorDB. It further occurred to me that it's even remotely possible that that someone could have been me.

Tom, is there anything that looks like a valid email address associated with that reservation?
Old 2015-09-15, 18:03   #1286
R.D. Silverman
 

Quote:
Originally Posted by jyb
For the past 6 weeks, there's been a number on the reservation page reserved to someone going by the name "3^n+2^n". At first I figured well, why not? One name's as good as another. But it recently occurred to me that it's possible someone made a reservation accidentally, intending to enter that into e.g. the factorDB. It further occurred to me that it's even remotely possible that that someone could have been me.

Tom, is there anything that looks like a valid email address associated with that reservation?
I am finally making progress with 11,9,223+ and 11,9,223-. The LA is about 35% done for the former (with a single-threaded app) and I have built a matrix for the latter. Meanwhile, I continue sieving with the resources available. [not a lot]
Old 2015-10-10, 08:51   #1287
pinhodecarlos
 

Is the ecmserver still up and running? If so, can the IP be shared so I can point some cores at it?

Thank you in advance,

Carlos
