mersenneforum.org > Math > prime density vs. sieving vs. prp time

2005-05-20, 23:26   #1
jasong

"Jason Goatcher"
Mar 2005

DB3₁₆ Posts

prime density vs. sieving vs. prp time

I'm not totally sure what information is required to figure this out, but here's the problem: in the Sierpinski problem and the Riesel problem we are basically trying to combine the k with an n that makes it prime. I'm hoping you can help me come up with a formula that incorporates the sieving efficiency, the odds of finding a prime, the time it takes to do a test, etc., and gives a reasonable (within 5%) answer as to where time would probably be best spent in the project at that moment. In the Seventeen or Bust forum (SoB is related to the Sierpinski problem) people keep claiming sieving is way behind in efficiency, but since the main goal is to find the last 10(?) primes I'm not certain this is true.

Known statistics (for Seventeen or Bust):
- approximately 15 values sieved out for every 1,000 billion of sieve range
- about 45 billion numbers processed in a 24-hour period on an average computer (this is a fairly wild guess based on my opinion of what an "average" computer amounts to)
- an actual primality test takes about 8.68 days (12,500 minutes)
- the odds of a number x being prime are about 1/log x

I'm sure I'm about to hate the limited editing feature in these forums. Link to SoB stats.

Last fiddled with by jasong on 2005-05-20 at 23:39
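
[Editor's note] The trade-off jasong asks about can be sketched numerically. The figures below are the post's own estimates (sieve removal rate, sieve throughput, PRP test time), and the break-even rule used — keep sieving while eliminating a candidate by sieving is cheaper than PRP-testing it — is the common rule of thumb, not an official SoB policy:

```python
# Rough break-even estimate from the numbers in the post (all of them
# are the poster's own guesses, not measured project figures):
#   - sieving removes ~15 candidates per 10^12 of sieve range
#   - one machine sieves ~45e9 of range per day
#   - one PRP test takes ~12,500 minutes (~8.68 days)
SIEVE_RATE = 45e9                  # sieve range covered per machine-day
REMOVALS_PER_RANGE = 15 / 1e12     # candidates eliminated per unit of range
PRP_DAYS = 12500 / (60 * 24)       # days per primality test

# Days of sieving needed to eliminate one candidate.
days_per_removal = 1 / (SIEVE_RATE * REMOVALS_PER_RANGE)

# Sieving stays worthwhile while eliminating a candidate costs less
# than testing it directly.
print(f"days to sieve out one candidate: {days_per_removal:.2f}")
print(f"days for one PRP test:           {PRP_DAYS:.2f}")
print("keep sieving" if days_per_removal < PRP_DAYS else "switch to PRP tests")
```

On these (guessed) inputs, sieving out a candidate costs about 1.5 machine-days versus about 8.7 for a test, so sieving would still be ahead; the conclusion is only as good as the input estimates.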
2006-03-22, 02:35   #2
paulunderwood

Sep 2002
Database er0rr

2·1,811 Posts

Quote:
 In the Seventeen or Bust forum (SoB is related to the Sierpinski problem) people keep claiming sieving is way behind in efficiency, but since the main goal is to find the last 10(?) primes I'm not certain this is true.
The problem with SOB is that the search is open-ended. They will probably never solve it, because the last few numbers to be found will be very, very large. This means they can sieve 'til the cows come home and they will not have done enough. Sure, finding a prime will eliminate a k and sieving efficiency will go up. But imagine the last prime has about 100 million digits and each test takes over ten years on current hardware: they'd need to sieve extremely deeply indeed to save time.

SOB have done a great job finding big primes that partially solve the S-problem, but I am not holding my breath for the complete solution.

2006-03-22, 12:05   #3
R.D. Silverman

Nov 2003

2²·5·373 Posts

Quote:
 Originally Posted by paulunderwood The problem with SOB is that the search is open-ended. They will probably never solve it because the last few numbers to be found will be very, very large. This means they can sieve 'til the cows come home and they will not have done enough. Sure finding a prime will eliminate a number and sieving efficiency will go up. Imagine they know the last number is about 100 million digits and each test takes over ten years on current hardware, they'll need to sieve extremely deeply indeed to save time. SOB have done a great job finding big primes that partially solve the S-problem but I am not holding my breath for the complete solution
Bold claims. Please give some mathematical justification for them.

In fact, *if* a prime does exist for each multiplier, then the problem
is not "open-ended". It is definitely finite in duration. Of course, if a prime
doesn't exist for one (or more) multiplier, then the project
is open-ended.

I expect that a prime under 10M digits will be found for each multiplier.
Maybe 1 multiplier will require over 10M.

Following Sam Wagstaff's heuristic for Mersenne primes gives better
than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M
for large M. The current bound is M ≈ 1.3 million; 8 times this bound is ≈ 10 million,
i.e. 3 doublings. The (very rough) probability that a prime will not be found below that
is therefore less than (1/2)^3 for each k. There are 9 remaining candidates.
(For Mersenne primes, the probability is roughly .56)

Determining the probability that k*2^n + 1 is prime for some n between
M and 2M would be an interesting exercise. (heuristic, not proof)

I agree that this is a VERY rough analysis. But it is better than just
"I have a feeling".

2006-03-22, 12:18   #4
alpertron

Aug 2002
Buenos Aires, Argentina

10101000100₂ Posts

Quote:
 Originally Posted by R.D. Silverman In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended". It is definitely finite in duration. Of course, if a prime doesn't exists for one (or more) multiplier, then of course the project is open ended.
That's not right. Suppose that the last prime occurs for an exponent about 10^10 (which is highly possible). If computer hardware does not advance enough to attack exponents in this range, nobody will be able to find this prime. So the problem will remain open forever, even if the search range is finite.

Quote:
 Originally Posted by R.D. Silverman I expect that a prime under 10M digits will be found for each multiplier. Maybe 1 multiplier will require over 10M.
From current trends (using extrapolation), *several* multipliers will require exponents above 10^7. See http://www.mersenneforum.org/showthread.php?t=5630 for a thread where Bob failed completely to predict the number of terms of a sequence.

2006-03-22, 12:29   #5
alpertron

Aug 2002
Buenos Aires, Argentina

2²·337 Posts

Quote:
 Originally Posted by R.D. Silverman Following Sam Wagstaff's heuristic for Mersenne primes gives better than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M for large M. The current bound is M ~ 1.3M. 8 times this bound is ~ 10M. The (very rough) probability that a prime will not be found less than 10M is less than (1/2)^3 for each K. There are 9 remaining candidates. (For Mersenne primes, the probability is roughly .56)
According to http://www.seventeenorbust.com/stats/ the exponents currently being tested are between 9M and 10M, so the numbers have about 3,000,000 digits.
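
[Editor's note] The digit count quoted here is easy to check: the decimal length of k·2^n + 1 is about n·log₁₀(2), since k contributes only a handful of digits:

```python
from math import log10

# Decimal digits of k*2^n + 1 for exponents at the bottom and top of
# the range quoted in the post (k adds only a few digits, ignored here).
for n in (9_000_000, 10_000_000):
    print(f"n = {n:,}: ~{round(n * log10(2)):,} digits")
```

Exponents of 9M–10M indeed give roughly 2.7 to 3.0 million digits.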

2006-03-22, 12:45   #6
R.D. Silverman

Nov 2003

2²×5×373 Posts

Quote:
 Originally Posted by alpertron That's not right. Suppose that the last prime occurs for an exponent about 10^10 (which is highly possible). If computer hardware does not advance enough to attack exponents in this range, nobody will be able to find this prime. So the problem will remain open forever, even if the search range is finite. From current trends (using extrapolation), *several* multipliers will require exponents above 10^7. See http://www.mersenneforum.org/showthread.php?t=5630 for a thread where Bob failed completely to predict the number of terms of a sequence.
"failed completely to predict". And you know this how?

In fact, I did not predict the number of terms. I only said that I had
a heuristic that suggests infinitely many. I did *not* predict the
sequence density. So your claim that I "failed to predict the number of terms"
is complete nonsense. Indeed, I have seen no mathematics that
predicts the number of such pairs is finite. We've only seen some
limited computation, and 2^62 is a LONG way from infinity.

If you think I am wrong I would be very happy to hear your
mathematical reasoning.

A (moderately straightforward) application of the "large sieve" shows
that for any integer D, we should have x and x+D with the same
number of prime factors (indeed the same number of total factors)
infinitely often.

{This was a *theorem* in the dissertation of my ex}
The *number* of factors is not *bounded* by the theorem. [this is
important]

Now, when D is small relative to x (D = 1 is small), then if x is smooth
infinitely often (and there are infinitely many smooth numbers by direct
construction), then x+D should have the same number of factors as x
infinitely often. If x has enough factors to be smooth (~log(x) factors)
then so will x + D, since log(x) ~ log(x + D) unless D is large.

Note that such pairs become very SCARCE. As John Selfridge said:
we know loglog x goes to infinity, but no one has ever seen it do so.

If the number of pairs (x, x+1) < M that are smooth grows as loglog M
(or worse!) then finding them will be hard indeed.
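
[Editor's note] The scarcity of smooth consecutive pairs is easy to see empirically. This small experiment (my own illustration, not Silverman's computation) counts pairs (x, x+1) with both members 7-smooth; by Størmer's theorem there are only finitely many for any fixed smoothness bound:

```python
# Count pairs (x, x+1) with both numbers B-smooth below a small bound.
def is_smooth(n, B):
    """True if every prime factor of n is <= B (plain trial division)."""
    for p in range(2, B + 1):
        while n % p == 0:
            n //= p
    return n == 1

B, N = 7, 10_000
pairs = [x for x in range(2, N) if is_smooth(x, B) and is_smooth(x + 1, B)]
print(len(pairs), "pairs below", N, "- first few:", pairs[:10])
```

Nearly all such pairs sit far below the bound; the largest 7-smooth pair is (4374, 4375) = (2·3⁷, 5⁴·7), and nothing new appears after it no matter how far you search.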

This heuristic may be wrong. But it is a lot better than the "compute first,
think later" approach of some people in this group.

Apply some *reasoning* BEFORE you compute. I could be very wrong.
My heuristic might be wrong. But I at least applied some reasoning to the
problem. You have shown NONE.

2006-03-22, 12:59   #7
alpertron

Aug 2002
Buenos Aires, Argentina

2²×337 Posts

Citing a simpler example: heuristics say that the set of pairs (p, p+D) where both p and p+D are prime should be infinite (by adding probabilities for individual pairs), but when D is odd there is at most one pair (the one with p=2). Maybe your heuristic works for the majority of values of D, but not for particular values, such as D=1 in our case. This shows that some computing is necessary to avoid gross errors.
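
[Editor's note] alpertron's obstruction is pure parity: if D is odd, then p and p+D have opposite parity, so one of them is even, and the only even prime is 2. A quick search confirms it for D = 1:

```python
def is_prime(n):
    """Naive trial-division primality test, fine for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# For odd D, one member of (p, p+D) is always even, so p = 2 is the
# only possible start of a pair.
D = 1
pairs = [(p, p + D) for p in range(2, 100_000) if is_prime(p) and is_prime(p + D)]
print(pairs)   # -> [(2, 3)]
```

So the naive "add up the probabilities" heuristic overcounts badly whenever it ignores a congruence obstruction like this one.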
2006-03-22, 13:46   #8
Mystwalker

Jul 2004
Potsdam, Germany

3·277 Posts

Quote:
 Originally Posted by R.D. Silverman I expect that a prime under 10M digits will be found for each multiplier. Maybe 1 multiplier will require over 10M.
William Lipp created a prediction model - unfortunately, I can only find his predictions, not the model itself.
So far, the predictions have not been that bad. According to that model, it is unlikely that there are more than 2 more primes below 10M digits.

But I welcome every heuristic that indicates an earlier finish.

Quote:
 There are 9 remaining candidates.
Since last December: 8.

2006-03-22, 14:43   #9
R.D. Silverman

Nov 2003

2²·5·373 Posts

Quote:
 Originally Posted by Mystwalker William Lipp created a prediction model - unfortunately, I only find his predictions, not the model itself. So far, the predictions were not that bad. According to that model, it is unlikely that there are more than 2 more primes below 10M digits. But I welcome every heuristic that indicates an ealier finish. Since last december: 8
I am (VERY VERY!) guilty of a cognitive error. The SoB web page shows
current test limits of ~13 million. I had either misread or mentally
translated that into 1.3 million.

I do expect that the remaining primes (with perhaps 1 exception) will
be found within 3 more doublings of the current lower-limit test bound.
I think the smallest untested n is about 5 million (??), which means
testing n to 40 million (about 12 million digits, since 40,000,000 · log_10(2) ≈ 12,000,000).

2006-03-22, 15:45   #10
paulunderwood

Sep 2002
Database er0rr

2×1,811 Posts

Quote:
 Bold claims. Please give some mathematical justification for them.
I have, personally, no mathematical reasoning for this. Rather, I rely on greater minds to do the predictions.

Quote:
 In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended". It is definitely finite in duration. Of course, if a prime doesn't exist for one (or more) multiplier, then the project is open-ended.
I really meant that the limit to which sieving should be done is "undetermined", because the primes to be found to solve the S-problem have an unknown size. For efficiency, does SoB sieve for a (last) exponent (of 2) of 40 million (Silverman) or 18 trillion (Lipp)?

2006-03-22, 16:27   #11
paulunderwood

Sep 2002
Database er0rr

111000100110₂ Posts

Quote:
 Following Sam Wagstaff's heuristic for Mersenne primes gives better than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M
This is okay for Mersenne primes, but for the Sierpinski primes the "weights" are much lower.

For 321, I predict a prime at 1.33 times the size of the previous one. How? By plugging known 321 prime exponents into a spreadsheet and getting it to work out the average factor increase.
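
[Editor's note] The spreadsheet method described here is just a geometric mean of consecutive exponent ratios. The exponent list below is ILLUSTRATIVE ONLY — substitute the real exponents of known 3·2^n − 1 primes to reproduce paulunderwood's 1.33 figure:

```python
# Sketch of the "average factor increase" method from the post.
# HYPOTHETICAL data: replace with the actual 321 prime exponents.
exponents = [100, 140, 190, 260, 340, 460]

# Ratios between successive exponents; their geometric mean telescopes
# to (last/first)**(1/count).
ratios = [b / a for a, b in zip(exponents, exponents[1:])]
avg_ratio = (exponents[-1] / exponents[0]) ** (1 / len(ratios))
predicted_next = exponents[-1] * avg_ratio
print(f"average ratio ~ {avg_ratio:.2f}, next exponent ~ {predicted_next:.0f}")
```

Such extrapolation assumes the ratio is stable, which is exactly the kind of heuristic (rather than proof) being debated in this thread.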

