mersenneforum.org  

Old 2018-08-11, 09:11   #1
preda

"Mihai Preda"
Apr 2015

Default TF vs. PRP

For a Mersenne candidate with exponent in the 332M range, that has been TF'd unsuccessfully to 80 bits, what is the probability of it having a factor between 80 and 81 bits?

This page https://www.mersenne.org/various/math.php says:
Quote:
Looking at past factoring data we see that the chance of finding a factor between 2^X and 2^(X+1) is about 1/x
Is that statement accurate? Is anything more precise available?

Should I take the best estimate for my question to be p = 1/80? Does this match empirical data?

For example, if I can do a PRP test in 32 days but TF from 80 to 81 bits in 15 hours, it seems to me that the TF 80-81 is not worth it if the probability is 1/80 (though it would be worth it with a factor probability of 2%). Is that so?
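The break-even reasoning above can be sketched numerically (a minimal sketch using the 32-day PRP and 15-hour TF timings given above, and ignoring double-check savings, which would lower the threshold):

```python
# Break-even factor probability for doing TF 80->81 before a PRP test.
# Assumption: TF is worth it when p_factor * time_saved_by_factor > tf_time.
prp_hours = 32 * 24   # one PRP test: 32 days, as stated above
tf_hours = 15         # TF from 80 to 81 bits

# Probability of a factor in [2^80, 2^81) per the 1/x rule of thumb:
p_factor = 1 / 80

expected_saving = p_factor * prp_hours   # hours of PRP work saved on average
breakeven = tf_hours / prp_hours         # probability at which TF breaks even

print(f"expected saving: {expected_saving:.1f} h vs TF cost: {tf_hours} h")
print(f"break-even probability: {breakeven:.2%}")   # ~1.95%, i.e. about 2%
```

With these numbers the expected saving (9.6 h) is less than the 15 h TF cost, matching the conclusion that TF 80-81 is not worth it at p = 1/80.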
Old 2018-08-11, 10:48   #2
science_man_88

"Forget I exist"
Jul 2009
Dumbassville

Default

Quote:
Originally Posted by preda View Post
For a Mersenne candidate with exponent in the 332M range, that has been TF unsuccessfully to 80bits, what is the probability of it having a factor in 80-to-81 bits? [...]
See later on the same page:
Quote:
This simple approach isn't quite right. It would give a formula of how_far_factored divided by (exponent divided by 2). However, more rigorous work has shown the formula to be (how_far_factored-1) / (exponent times Euler's constant (0.577...)). In this case, 1 in 91623. Even these more rigorous formulas are unproven.
Old 2018-08-11, 11:24   #3
preda

"Mihai Preda"
Apr 2015

Default

Quote:
Originally Posted by science_man_88 View Post
see later on:
Thanks for pointing that line out.

Let's see: we're talking about M(p) = 2^p - 1, and γ = 0.577... is Euler's constant (the constant in the page's formula, not e = 2.718). TF(k) means "has no factor less than 2^k".

The page says:
if M(p) is TF(k), then the probability that M(p) is prime is (k-1)/(γ*p).

Let x be the probability of not finding a factor in TF from k to k+1. Surviving that step means M(p) is TF(k+1), with prime probability k/(γ*p), so:
(k - 1)/(γ*p) == x * k/(γ*p), so
x == (k - 1)/k.

So, the probability of *finding* a factor in TF from k to k+1 is (1 - x) == 1/k. That is, the rule we started from.
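That algebra can be checked numerically (a sketch; γ is Euler's constant 0.577..., and the (k-1)/(γ*p) prime-probability formula is the one quoted from the math page):

```python
GAMMA = 0.5772156649015329   # Euler's constant

def prob_prime(k, p):
    """Probability that M(p) is prime given no factor below 2^k (quoted formula)."""
    return (k - 1) / (GAMMA * p)

p = 332_000_000   # exponent in the 332M range
k = 80

# x = probability of NOT finding a factor when TFing from k to k+1:
# prob_prime(k, p) == x * prob_prime(k + 1, p)
x = prob_prime(k, p) / prob_prime(k + 1, p)

assert abs(x - (k - 1) / k) < 1e-15
assert abs((1 - x) - 1 / k) < 1e-15   # chance of finding a factor: 1/k
print(f"P(factor in 80..81 bits) = {1 - x:.4f}")   # 0.0125 = 1/80
```

Note that γ and p cancel in the ratio, which is why the answer reduces to exactly 1/k regardless of the exponent.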
Old 2018-08-11, 14:58   #4
kriesel

"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

Default What's worthwhile depends on what's saved.

P-1 factoring involves the same effort/reward tradeoff.

Factoring effort is adjusted to maximize expected effort saved, based on an assumption about how many primality tests a found factor saves, the probability of finding a factor as a function of factoring effort expended, and the effort expended per primality test. Prime95 has stated that PRP will be double-checked, as LL is. (Both the reliability of the computation and of the user is checked.) LL has an error rate of ~2% (without the Jacobi check), so on average there will be ~2.04 LL tests per unfactored exponent; perhaps ~2.02 with the Jacobi check. (Actually a little more, because the ~2% of third checks also have an error rate that may lead to a fourth check, and so on: an infinite geometric series summing to 2.040816... or 2.0202... respectively. But that 2% is so imprecisely known that the series sums carry unjustified precision.) PRP with the Gerbicz error check (GEC) will need 2 tests plus a nonzero but much smaller delta, probably small enough to ignore.
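The infinite series mentioned above is geometric (a sketch; the 2% and 1% error rates are the rough figures discussed, not measured values):

```python
def expected_tests(error_rate, base_tests=2):
    """Expected primality tests per unfactored exponent: each erroneous test
    triggers another, so base * (1 + e + e^2 + ...) = base / (1 - e)."""
    return base_tests / (1 - error_rate)

print(expected_tests(0.02))  # 2.0408... (LL without Jacobi check)
print(expected_tests(0.01))  # 2.0202... (roughly, with Jacobi check)
```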

There's sometimes discussion of whether the double checks (and triple, etc.) should be counted, and whether equally. One side claims the double checks happen years later on average, on faster hardware, and so are not equally relevant; the other side claims all computing effort is essentially equally valuable or costly at any given point, so all factoring effort and primality checks ought to be counted in the tradeoff, at the current GHz-days/day rates of current hardware for TF, P-1, first primality tests, and primality DC. The latter makes more sense to me.

Any attempt at optimization is complicated, since the ratio of TF to primality-testing throughput differs somewhat among GPUs and among CPUs, as well as drastically between GPUs as a whole and CPUs as a whole; there may not be good statistical data about the mix of devices being applied in GIMPS, and those values vary for a single device according to exponent and other variables. (In my little sample of hardware, for example, the TF/LL GHz-days/day ratios range from 11.3 to 15.5 on GPUs, 22.5 on an iGP, and 0.72 to 1.25 on CPUs. The P-1/LL ratio also varies.)

Efficiency gains in primality testing, such as those Preda makes periodically in OpenCL in gpuOwL, or Prime95 makes on CPUs, also enter into the tradeoff computations, shifting the optimal point, as would efficiency gains in factoring.

Last fiddled with by kriesel on 2018-08-11 at 15:36