mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Factoring (https://www.mersenneforum.org/forumdisplay.php?f=19)
-   -   Factoring humongous Cunningham numbers (https://www.mersenneforum.org/showthread.php?t=5722)

jyb 2015-06-23 16:58

[QUOTE=R.D. Silverman;404640]The 2/9 rule is a decent approximation. However, its value should be reduced as the numbers get larger.
This should be clear from Dickman's function. This "2/9" value should be a slowly decreasing function of N (the composite).
I have never analyzed the exact nature of this function, so I cannot say how accurate it is for (say) 100, 150, 200, 250, ....
digits etc.

As the composites get larger, once one has done an "initial ECM pass" to, say, the 50-digit level, the probability that
there is a factor within ECM reach gets SMALLER.

There is no "general rule" that applies uniformly to composites of all sizes. Instead, use the Bayesian methods
I gave in my paper.[/QUOTE]
Thank you.
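Dickman's function, which the quoted advice appeals to, can be approximated numerically: rho(u) is roughly the probability that a random integer near x has no prime factor above x^(1/u), and it satisfies u·rho'(u) = -rho(u-1). A minimal sketch using trapezoidal integration (the step size h is an arbitrary accuracy/speed trade-off, not a recommended value):

```python
# Numeric sketch of Dickman's rho via its delay differential equation.
# Not a production routine; rho(t) = 1 on [0, 1] seeds the recursion.

def dickman_rho(u, h=0.001):
    """Approximate Dickman's rho(u) on a uniform grid of step h."""
    n = int(round(u / h))
    one = int(round(1.0 / h))      # grid offset corresponding to u - 1
    rho = [1.0] * (n + 1)          # rho(t) = 1 for 0 <= t <= 1
    for i in range(one + 1, n + 1):
        t = i * h
        # trapezoidal step of rho'(t) = -rho(t - 1) / t
        f_now = rho[i - one] / t
        f_prev = rho[i - one - 1] / (t - h)
        rho[i] = rho[i - 1] - h * (f_now + f_prev) / 2.0
    return rho[n]

print(round(dickman_rho(2.0), 4))  # 0.3069  (exact value is 1 - ln 2)
print(round(dickman_rho(3.0), 4))  # 0.0486
```

The rapid decay of rho is the reason the optimal ECM fraction should drift downward as the composite grows: large smooth cofactors become much rarer.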

[QUOTE=R.D. Silverman;404640]
"one example did make an impression on me".

This suggests that your understanding of statistics is inadequate.[/QUOTE]

Or perhaps merely that it's useful to have a reminder now and then that things which happen only rarely do still happen. It's easy to just "plow through" composites with NFS and forget that doing pretesting can sometimes be worthwhile, [I]depending on prior work[/I].

[QUOTE=R.D. Silverman;404640]Furthermore, "saving a lot of computation" is an exaggeration. How much time was spent on ECM? How much
time would SNFS have taken? Subtract. There is your actual savings. But the EXPECTED savings is much
less because such small factors will be RARE.[/QUOTE]

No, it's not an exaggeration, and really you have no standing to claim that it was. Fortunately, I have the logs which bear this out. The total time I spent on ECM was 4.4 CPU-hours. This composite had SNFS difficulty 247, which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time.

Furthermore, your formula for actual savings is talking about time; I was careful to specify computation, not time. The actual time taken would have been much more reasonable, since it would have been NFS@Home doing the sieving. But the number of CPU-hours would still have been vastly greater with NFS than it turned out to be with ECM.
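The computation-versus-time distinction above reduces to simple arithmetic. In this sketch, the 4.4 CPU-hour ECM total and the "~3 orders of magnitude" SNFS estimate come from the post; the core count is a purely hypothetical illustration of how parallel sieving shrinks wall-clock time but not CPU time:

```python
# "Computation" = total CPU-hours; "time" = wall-clock hours, which
# massive parallelism (e.g. NFS@Home) divides down.

ecm_cpu_hours = 4.4
snfs_cpu_hours = ecm_cpu_hours * 1000.0    # "~3 orders of magnitude more"

cores = 500                                # hypothetical sieving clients
snfs_wall_hours = snfs_cpu_hours / cores   # wall-clock looks modest...
cpu_saved = snfs_cpu_hours - ecm_cpu_hours # ...but the CPU-time saved is large

print(round(snfs_wall_hours, 1))  # 8.8
print(round(cpu_saved, 1))        # 4395.6
```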

[QUOTE=R.D. Silverman;404640]
Stop FIXATING on this 2/9 "rule".[/QUOTE]
I'm not sure where you get the idea that I'm fixating on it. Fixating would be insisting that it's correct, possibly even in the face of contrary evidence. Whereas I have merely asked, three times now, whether it is in fact a good approximation. And according to you, the answer is a qualified yes. Thank you for answering.
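The "2/9 rule" under discussion amounts to one multiplication: budget roughly 2/9 of the expected SNFS running time for ECM pretesting. A minimal sketch (the 9000 CPU-hour SNFS figure is an invented example, and per the quoted advice the fraction should be nudged downward for larger composites):

```python
# ECM pretest budget implied by the 2/9 rule.

def ecm_pretest_budget(snfs_cpu_hours, fraction=2.0 / 9.0):
    """CPU-hours of ECM suggested before starting SNFS."""
    return fraction * snfs_cpu_hours

print(round(ecm_pretest_budget(9000.0), 1))  # 2000.0
```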

[QUOTE=R.D. Silverman;404643]Let me also add:

People get fixated on the ECM successes. They (perhaps) forget about all of the lost time spent when a factor
was NOT found.

However, SNFS succeeds with certainty. If one spends time to run SNFS, the time is [b]never[/b] "lost".

Suppose you spend time T with SNFS and get 3 factorizations.

Suppose you spend the same time T with ECM and are able to test (say) 50 candidates to (say) t55.

Unless you expect to find at least 3 factors with ECM, you have wasted that time. One needs to assess
the probability of success at level t55 given the amount of effort [b]already[/b] spent. If one failed at t50,
it becomes less likely that one will succeed at t55, especially as the composites get larger.

When one has already made a reasonable ECM effort (YMMV regarding 'reasonable') it is better to succeed
with certainty via SNFS than waste further time with ECM. The exception to this guideline is of course the
case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing".[/QUOTE]
I must be imagining things. I could have sworn that I wrote:

"And yes, I know that we're talking about probability and expected values over many composites/factors here, so one example should not guide our policy...."
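The break-even argument quoted above (in time T, SNFS yields 3 factorizations with certainty, while ECM tests 50 candidates to t55) reduces to expected-value arithmetic. In this sketch the per-candidate success probability p is a free parameter, not a value from the thread:

```python
# ECM pays off only if its expected factor count matches what SNFS
# would have delivered with certainty in the same time.

def expected_ecm_factors(candidates, p_success):
    """Expected factors found, treating candidates as independent."""
    return candidates * p_success

def ecm_beats_snfs(candidates, p_success, snfs_factorizations=3):
    return expected_ecm_factors(candidates, p_success) >= snfs_factorizations

print(ecm_beats_snfs(50, 0.10))  # True  (expected 5 factors)
print(ecm_beats_snfs(50, 0.04))  # False (expected 2 factors)
```

This is also where the conditional probability matters: having already failed at t50, the effective p for the t55 run is lower than the unconditional figure.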

xilman 2015-06-23 20:12

[QUOTE=R.D. Silverman;404655]I must be imagining things. I could have sworn that I wrote:

"The exception to this guideline is of course the
case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing". "[/QUOTE]Mea culpa. I missed that.

R.D. Silverman 2015-06-23 20:53

[QUOTE]No, it's not an exaggeration, and really you have no standing to claim that it was. Fortunately, I have the logs which bear this out. The total time I spent on ECM was 4.4 CPU-hours.[/QUOTE]

Idiot.

You got very very very very lucky. Ask yourself instead: what would the time have been to run a full t55
on the number?


[QUOTE]This composite had SNFS difficulty 247, which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time.[/QUOTE]

[QUOTE]
Furthermore, your formula for actual savings is talking about time; I was careful to specify computation, not time.
[/QUOTE]

???????

Would someone parse this please? What is "computation" and how does one measure it???????


[QUOTE]

"And yes, I know that we're talking about probability and expected values over many composites/factors here, so one example should not guide our policy...."[/QUOTE]

But it is clear from your writing that it DOES seem to guide your policy.

wblipp 2015-06-23 21:36

[QUOTE=R.D. Silverman;404674]Would someone parse this please? What is "computation" and how does one measure it???????[/QUOTE]

I think it is left as an exercise for the reader. It is a catch phrase for the thing we are trying to optimize in a problem that hasn't yet been fully structured.

Your observations about "unless you can't do SNFS" and Xilman's observations about subtleties in various machine capabilities are both windows into this problem of how we construct a relevant optimization problem to decide "how much ECM before SNFS/GNFS." Should the available computing resources of various capabilities be considered fixed? Are BOINC resources free but limited?

R.D. Silverman 2015-06-23 22:06

[QUOTE=wblipp;404677]I think it is left as an exercise for the reader. It is a catch phrase for the thing we are trying to optimize in a problem that hasn't yet been fully structured.

Your observations about "unless you can't do SNFS" and Xilman's observations about subtleties in various machine capabilities are both windows into this problem of how we construct a relevant optimization problem to decide "how much ECM before SNFS/GNFS." Should the available computing resources of various capabilities be considered fixed? Are BOINC resources free but limited?[/QUOTE]

Good questions. The answers depend on the model.

jyb 2015-06-23 23:00

[QUOTE=R.D. Silverman;404674]Idiot.[/QUOTE]
Hee-hee, you really just can't help yourself, can you, Bob? How sad.

[QUOTE=R.D. Silverman;404674]You got very very very very lucky. Ask yourself instead: What would the time have been to run a full t55
on the number.[/QUOTE]
Yes, I did get quite lucky. However, for this example we weren't talking about expected time, we were talking about actual time for the factorization that actually happened. You asked "How much time was spent on ECM?" in order to assess how much time was saved. The expected time is irrelevant to that specific question.

But in any case, why should the time to run a full t55 have any bearing on this? The composite in question had a 49-digit factor.

[QUOTE=R.D. Silverman;404674]
???????
Would someone parse this please? What is "computation" and how does one measure it???????
[/QUOTE]
Bob, everybody reading this forum, yourself included, knew exactly what I meant by "computation". It is clear from the context that I was referring to CPU time, as opposed to wall clock time, an important distinction when dealing with the massive parallelism that NFS@Home provides. You even quoted me when I said "...which by even the most optimistic estimate would have taken about 3 orders of magnitude more CPU time."


[QUOTE=R.D. Silverman;404674]But it is clear from your writing that it DOES seem to guide your policy.[/QUOTE]
Really? How is that clear? The only thing that I can see guiding my policy based on what I've written is that I use the 2/9 rule, which you have said gives a "decent approximation" of the optimum amount of ECM to use before starting SNFS. I also mentioned a particular example, which I took pains to point out should [I]not[/I] be used to guide policy. You can only interpret this as you have if you are either deeply confused or grasping at straws to make me look bad. Knowing your behavior in this forum, I assume it's the latter, but I can't completely rule out the former.

BTW, since I know your next response will consist of nothing but further insults and invective (something else I know from your behavior in this forum), I will choose to exit this dialogue. Feel free to rant, as per usual.

jyb 2015-06-24 02:00

[QUOTE=jyb;404684]But in any case, why should the time to run a full t55 have any bearing on this? The composite in question had a 49-digit factor.
[/QUOTE]

Correction: if the expected time were relevant, then you would have been right to consider a t55. The number may have had a 49-digit factor, but I obviously didn't know that when I was considering whether to run ECM. And the 2/9 "rule" would suggest almost exactly t55. My mistake on that point.

xilman 2015-06-24 07:22

[QUOTE=jyb;404684][QUOTE=R.D. Silverman;404674]Idiot.[/QUOTE]Hee-hee, you really just can't help yourself, can you Bob? How sad.

...

BTW, since I know your next response will consist of nothing but further insults and invective (something else I know from your behavior in this forum), I will choose to exit this dialogue. Feel free to rant, as per usual.[/QUOTE]

:popcorn:

jyb 2015-09-15 16:10

Just curious
 
For the past 6 weeks, there's been a number on the reservation page reserved to someone going by the name "3^n+2^n". At first I figured well, why not? One name's as good as another. But it recently occurred to me that it's possible someone made a reservation accidentally, intending to enter that into e.g. the factorDB. It further occurred to me that it's even remotely possible that that someone could have been me.

Tom, is there anything that looks like a valid email address associated with that reservation?

R.D. Silverman 2015-09-15 18:03

[QUOTE=jyb;410345]For the past 6 weeks, there's been a number on the reservation page reserved to someone going by the name "3^n+2^n". At first I figured well, why not? One name's as good as another. But it recently occurred to me that it's possible someone made a reservation accidentally, intending to enter that into e.g. the factorDB. It further occurred to me that it's even remotely possible that that someone could have been me.

Tom, is there anything that looks like a valid email address associated with that reservation?[/QUOTE]

I am finally making progress with 11,9,223+ and 11,9,223-. The LA is about 35% done for the former
(with a single-threaded app) and I have built a matrix for the latter. Meanwhile, I continue
sieving with the resources available. [not a lot]

pinhodecarlos 2015-10-10 08:51

Is the ecmserver still up and running? If so, can the IP be shared so I can point some cores at it?

Thank you in advance,

Carlos

