mersenneforum.org > Math A real GIMPS math thought

2011-05-28, 01:15   #1
davieddy

"Lucan"
Dec 2006
England

14463₈ Posts

A real GIMPS math thought

(That was a hint about the location of this subforum.)

Suppose we have one reliable CPU which TFs then LLs, no DCs. The universally(?) accepted wisdom (see "The Math" page) is that having TFed to bit level X, one should TF to bit level X+1 unless the time to TF from X to X+1 exceeds T/X, where T is the time of the LL test (1/X being the probability of a factor between 2^X and 2^(X+1)). Thus (1 + 1/X)T < total time per exponent < (1 + 2/X)T (each extra bit takes twice as long).

BUT: by TFing to X+1 instead of X, the probability of the exponent being prime is enhanced by a factor (X+1)/X. This compensates for the extra time (T/X) spent. Am I right?

David

No :)) The extra time is 2T/X. Silly me!

I know that "bit level" is the natural way of organizing a TF program (not to mention one's thinking), but if we were to consider each potential factor and ask "what is the optimum cutoff point?", would not my thought about the probability of being prime shift the answer slightly?

Last fiddled with by davieddy on 2011-05-28 at 02:08
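The break-even rule in the post above can be sketched numerically. A minimal Python sketch; the LL cost T, the per-bit TF cost, and the starting bit level below are all made-up illustrative numbers, not real GIMPS timings:

```python
# The post's break-even rule: keep TFing one more bit (X -> X+1) while the
# cost of that bit stays below T/X, the expected LL time saved by a factor
# at that level. All numbers below are made up for illustration.

def optimal_tf_depth(T, first_bit_cost, start_bit):
    """Last bit level worth trial-factoring to.

    T: cost of one LL test; first_bit_cost: cost of TFing start_bit -> start_bit+1;
    each further bit level is assumed to cost twice the previous one.
    """
    X, cost = start_bit, first_bit_cost
    while cost < T / X:   # worth another bit while TF cost < chance-weighted LL saving
        X += 1
        cost *= 2         # one more bit doubles the candidate factors to try
    return X

print(optimal_tf_depth(1000, 1.0, 60))   # stops at bit 64 with these toy numbers
```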
2011-05-28, 01:46   #2
Christenson

Dec 2010
Monticello

6B0₁₆ Posts

Uhhh...the chances of an exponent being prime are, well, minuscule...about 1 in 500,000, more or less...and they don't depend on how much TF or P-1 or LL testing or anything else is done to them. [Well, there's some evidence that some people have an idea that the exponents of Mersenne primes have certain properties with higher than normal frequencies, and test those exponents first, but once the exponent is chosen, its primality doesn't depend on the testing for primality that is done.]

What does change, for the overwhelming majority of exponents, is the effort to prove that they are not in fact primes. The question is how to allocate the effort so that the maximum number of exponents get proven to be indeed composite for a fixed amount of effort.

For example, if I test a group of 100 exponents with P-1, and find 10 of them composite (because we have a factor) at a cost of 600 GHz-days, then there are 20 LL tests (first tests and double-checks) that no longer have to be carried out. If one LL test costs 60 GHz-days, 20 would cost 1200 GHz-days, so the 600 GHz-days of P-1 have saved 600 GHz-days of total effort.

Same with TF: if, on average, finding a factor costs less (takes fewer GHz-days) than the LL testing, then TF'ing to that level reduces the total cost.

Note that I'm ignoring that specialist hardware (GPUs) significantly ease trial factoring, so they can accomplish lots and lots more TF than CPUs can...meaning that GHz day isn't exactly the correct unit of measure for the cost of the computation.
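Christenson's P-1 cost/benefit arithmetic, spelled out (all figures are his illustrative ones):

```python
# Christenson's P-1 cost/benefit sums, using his illustrative GHz-day figures.
pm1_cost = 100 * 6        # 100 exponents P-1 tested at 6 GHz-days each = 600
ll_saved = 10 * 2 * 60    # 10 factors found, each killing an LL test plus its
                          # double-check at 60 GHz-days apiece = 1200
net_saving = ll_saved - pm1_cost
print(net_saving)         # 600 GHz-days saved overall
```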
2011-05-28, 02:24   #3
davieddy

"Lucan"
Dec 2006
England

6,451 Posts

Quote:
 Originally Posted by Christenson Uhhh...the chances of an exponent being prime are, well, miniscule...about 1 in 500,000, more or less...and they don't depend on how much TF or P-1 or LL testing or anything else is done to them.
I like to take one point at a time these days.

For a 50M exponent, the chance would be ~1/2,000,000 with no TF or P-1.

If a reliable LL test had been done, the probability would be 0 or 1.

Mr Silverman enjoys this nicety.

David

PS: to be pedantic, the chance of GIMPS testing 2^exponent - 1
is 0 unless the exponent is prime :)

Last fiddled with by davieddy on 2011-05-28 at 02:52

2011-05-28, 08:10   #4
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

2²×3×599 Posts

Quote:
 Originally Posted by davieddy By TFing to X+1 instead of X, the probability of the exponent being prime is enhanced by (X+1)/X.
No, because this ignores the likely number of prime factors that 2^exponent-1 has. The probabilities 1/X, 1/(X+1), 1/(X+2), ... 1/(exponent/2) sum to more than 1 (for the numbers we're usually considering) because the Mersenne number is likely to have more than just one prime factor below its square root.

So, TFing to X+1 instead of X affects the probability of finding a factor in the range 1-X (and the probability of finding a factor in range X+1 to exponent/2), but that's not simply inversely proportional to the probability that the Mersenne number is prime, because of the possible existence of multiple factors.

Last fiddled with by cheesehead on 2011-05-28 at 08:11
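cheesehead's point about the probabilities summing past 1 can be checked with the standard harmonic-series approximation. A quick sketch; the 50M exponent and bit-64 TF depth are illustrative choices, not taken from the thread:

```python
# cheesehead's observation, numerically: the per-level factor "probabilities"
# 1/X + 1/(X+1) + ... + 1/(exponent/2) form a harmonic tail well above 1,
# so they cannot be exclusive events. Illustrative sizes: p = 50M, TFed to bit 64.
import math

p, X = 50_000_000, 64
# sum_{k=X}^{p/2} 1/k ~= ln((p/2)/(X-1)) by the harmonic-series approximation
tail = math.log((p // 2) / (X - 1))
print(tail)   # roughly 12.9 -- far above 1
```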

2011-05-28, 09:06   #5
xilman
Bamboozled!

"πΊππ·π·π­"
May 2003
Down not across

10109₁₀ Posts

Quote:
 Originally Posted by Christenson Note that I'm ignoring that specialist hardware (GPUs) significantly ease trial factoring, so they can accomplish lots and lots more TF than CPUs can...meaning that GHz day isn't exactly the correct unit of measure for the cost of the computation.
Really?

Why should 100 ALUs clocked at 1GHz not be awarded 100 GHz days just because all 100 are located in a single chip, when you seem to be OK with 4 ALUs, also on a single chip, clocked at 2.5GHz being awarded 10 GHz days?

OK, the numbers are only illustrative and the GPUs are under-rewarded by this simple metric, but the question is a serious one.

Paul

2011-05-28, 10:37   #6
davieddy

"Lucan"
Dec 2006
England

6,451 Posts

Quote:
 Originally Posted by cheesehead No, because this ignores the likely number of prime factors that 2^exponent-1 has. The probabilities 1/X, 1/(X+1), 1/(X+2), ... 1/(exponent/2) sum to more than 1 (for the numbers we're usually considering) because the Mersenne number is likely to have more than just one prime factor below its square root. So, TFing to X+1 instead of X affects the probability of finding a factor in the range 1-X (and the probability of finding a factor in range X+1 to exponent/2), but that's not simply inversely proportional to the probability that the Mersenne number is prime, because of the possible existence of multiple factors.
See the "Math" page. (Near the bottom).

The probability of no factor between 2^X and 2^(X+1) is 1 - 1/X = (X-1)/X.

So the probability of no factor up to exponent/2 is the telescoping product
(X-1)/X * X/(X+1) * ... * (exponent/2 - 1)/(exponent/2)
= (X-1)/(exponent/2).

Now replace 1/2 with Euler's gamma (because it tastes better),
and you have the formula which is generally used,
and gives results consistent with Wagstaff's conjecture.

Another way of looking at it is that it seems ~5/8 of
candidates get factored, but finding which ones does
not alter the expected primes in a given range.
Why? Because the probability of the remaining
candidates being prime is enhanced by 8/3.
The same presumably applies to P-1 or ECM or whatever.

David

Last fiddled with by davieddy on 2011-05-28 at 11:00
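The telescoping of the product in the post above can be verified mechanically. A small sketch, with a deliberately small stand-in value for exponent/2 so the loop is instant:

```python
# A mechanical check of the telescoping product in the post above:
# prod_{k=X}^{N} (k-1)/k collapses to (X-1)/N.
from fractions import Fraction

def no_factor_prob(X, N):
    """Probability of no factor from bit level X up to N (= exponent/2),
    taking 1/k as the chance of a factor at bit level k, per the Math page."""
    prob = Fraction(1)
    for k in range(X, N + 1):
        prob *= Fraction(k - 1, k)   # "no factor at level k" has probability (k-1)/k
    return prob

# Small stand-in for exponent/2, purely for a quick check.
print(no_factor_prob(64, 1000))      # 63/1000, i.e. (X-1)/N
```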

2011-05-28, 12:31   #7
davieddy

"Lucan"
Dec 2006
England

6451₁₀ Posts

Physics instinct kicks in

Take the curve y = 10 + (x - 4)^2. You want to know the minimum value of y. Evaluate it anywhere near x = 4 and you won't be far out. You want to know "at what precise x does the minimum y occur?" Be bloody careful about what y you choose.

David

Or, more practically: "Suck it and see", as an elderly female lab assistant used to advise me whenever I asked her a physics question! I.e. try a few values of x near 4, and note the smallest y.

Last fiddled with by davieddy on 2011-05-28 at 13:07
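The flat-minimum point is easy to demonstrate. A toy sketch of the "suck it and see" approach on the curve from the post:

```python
# davieddy's point: near a smooth minimum the curve is flat, so the minimum
# *value* of y is easy to pin down even when the exact minimising x is not.
def y(x):
    return 10 + (x - 4) ** 2

# "Suck it and see": sample a few x near 4 and keep the smallest y.
samples = [y(4 + dx) for dx in (-0.3, -0.1, 0.0, 0.1, 0.3)]
print(min(samples))   # 10.0 -- any x within 0.3 of 4 gets y right to within ~0.1
```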
2011-05-28, 12:54   #8
Christenson

Dec 2010
Monticello

2⁴×107 Posts

Quote:
 Originally Posted by xilman Really? Why should 100 ALUs clocked at 1GHz not be awarded 100 GHz days just because all 100 are located in a single chip, when you seem to be OK with 4 ALUs, also on a single chip, clocked at 2.5GHz being awarded 10 GHz days? Ok, the numbers are only illustrative and the GPUs are under-rewarded by this simple metric but the question is a serious one. Paul
Paul
You can measure everything in GHz-days if you want, but if I understand it correctly, my mid-range GPU earns perhaps 50 GHz-days per calendar day of TF, and perhaps 10 GHz-days per day of LL. My CPU core earns perhaps 3 GHz-days per day of either LL or TF.

GHz-Days thus measure progress on the computation, not the effort that goes in...and I claim that it is effective effort we want to minimize for the maximum amount of progress.

2011-05-28, 14:08   #9
davieddy

"Lucan"
Dec 2006
England

6,451 Posts

Quote:
 Originally Posted by xilman Really? Why should 100 ALUs clocked at 1GHz not be awarded 100 GHz days just because all 100 are located in a single chip, when you seem to be OK with 4 ALUs, also on a single chip, clocked at 2.5GHz being awarded 10 GHz days? Ok, the numbers are only illustrative and the GPUs are under-rewarded by this simple metric but the question is a serious one. Paul
Quote:
 Originally Posted by Christenson Paul You can measure everything in GHz-Days if you want, but if I understand it correctly, my mid-range GPU earns perhaps 50GHz-Days per calendar day of TF, perhaps 10GHz days per day of LL. My CPU core earns perhaps 3 GHz-Days per day of either LL or TF. GHz-Days thus measure progress on the computation, not the effort that goes in...and I claim that it is effective effort we want to minimize for the maximum amount of progress.
Hey you two.

I started this thread stating one CPU (P4/computer/core/ALU or whatever)
with the sole intention of avoiding this digression, which has been discussed in many other threads.

Richard has at least partly responded to my point.

Further debate along the line of thought thereby defined(?)
is more than welcome.

David.

2011-05-28, 15:01   #10
davieddy

"Lucan"
Dec 2006
England

6,451 Posts

Quote:
 Originally Posted by davieddy Take the curve y = 10 + (x - 4)^2
OTOH think of the normal distribution and/or Dirac's delta function.

I love statistical mechanics, but it is an acquired taste!

Maxwell's Demon

Last fiddled with by davieddy on 2011-05-28 at 15:02

2011-05-29, 05:27   #11
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

7188₁₀ Posts

Quote:
 Originally Posted by davieddy See the "Math" page. (Near the bottom). The probability of no factor between 2^X and 2^(X+1) is 1 - 1/X =(X-1)/X. So the probability of no factor up to exponent/2 is the product (X-1)/X * X/(X+1) *.....* (exponent/2 - 1)/(exponent/2) = (X-1)/(exponent/2). Now replace 1/2 with Euler's gamma (because it tastes better), and you have the formula which is generally used, and gives results consistent with Wagstaff's conjecture. Another way of looking at it is that it seems ~5/8 of candidates get factored, but finding which ones does not alter the expected primes in a given range. Why? Because the probability of the remaining candidates being prime is enhanced by 8/3. The same presumably applies to P-1 or ECM or whatever. David
Okay. You're right. My intuition failed me there. I'll try to remember.

