#1
"Jason Goatcher"
Mar 2005
3×7×167 Posts
I'm not totally sure what information is required to figure this out, but here's the problem:
In the Sierpinski problem and the Riesel problem we are basically trying to combine each k with an n that makes k*2^n + 1 (Sierpinski) or k*2^n - 1 (Riesel) prime. I'm hoping you can help me come up with a formula that incorporates the sieving efficiency, the odds of finding a prime, the time it takes to do a test, etc., and gives a reasonable (within 5%) answer as to where time would probably be best spent in the project at that time. In the Seventeen or Bust forum (SOB works on the Sierpinski problem) people keep claiming sieving is way behind in efficiency, but since the main goal is to find the last 10(?) primes I'm not certain this is true.

Known statistics (for Seventeen or Bust):
- approximately 15 candidates removed for every 1,000 billion sieved
- about 45 billion numbers processed in a 24-hour period on an average computer (this is a fairly wild guess based on my opinion of what an "average" computer amounts to)
- an actual primality test takes about 8.68 days (12,500 minutes)
- the odds of a number x being prime are about 1/log x

I'm sure I'm about to hate the limited editing feature in these forums. Link to SOB stats.

Last fiddled with by jasong on 2005-05-20 at 23:39
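For concreteness, here is a minimal sketch (in Python, not something from the project) of the break-even comparison being asked for. The sieve throughput, removal rate, and test time are the rough figures quoted above, and the exponent n = 5,000,000 is an assumed illustrative value, so treat the output as an order-of-magnitude answer only:

```python
import math

# All inputs are the rough figures quoted in the post, not measured values.
sieve_rate        = 45e9 / (24 * 3600)   # sieve range covered per second (~45 billion/day)
removals_per_unit = 15 / 1e12            # candidates eliminated per unit of sieve range
test_seconds      = 8.68 * 24 * 3600     # one primality test (~12,500 minutes)

# Seconds of sieving needed to eliminate one candidate from the test queue.
sieve_seconds_per_removal = 1 / (sieve_rate * removals_per_unit)

# A surviving candidate k*2^n + 1 is prime with probability roughly 1/ln(x),
# so an eliminated candidate almost certainly saves one full test.
n_assumed  = 5_000_000                   # illustrative exponent, not a project figure
prob_prime = 1 / (n_assumed * math.log(2))
saved_seconds_per_removal = test_seconds * (1 - prob_prime)

print(f"sieving cost per eliminated candidate: {sieve_seconds_per_removal / 3600:.1f} h")
print(f"testing time saved per elimination:    {saved_seconds_per_removal / 3600:.1f} h")
print("keep sieving" if sieve_seconds_per_removal < saved_seconds_per_removal
      else "switch effort to primality testing")
```

On these inputs sieving eliminates a candidate in roughly a day and a half of work versus the 8.68 days a test costs, so by this crude measure sieving would still be worthwhile; the conclusion is only as good as the estimates that go in.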
#2
Sep 2002
Database er0rr
10602₈ Posts
Quote:
SOB has done a great job finding big primes that partially solve the Sierpinski problem, but I am not holding my breath for the complete solution.
#3
"Bob Silverman"
Nov 2003
North of Boston
7506₁₀ Posts
Quote:
In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended"; it is definitely finite in duration. Of course, if a prime doesn't exist for one (or more) multiplier, then the project is open-ended.

I expect that a prime under 10M digits will be found for each multiplier. Maybe 1 multiplier will require over 10M. Following Sam Wagstaff's heuristic for Mersenne primes gives better than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M for large M. The current bound is M ~ 1.3 million; 8 times this bound is ~10 million. The (very rough) probability that a prime will not be found below that point is less than (1/2)^3 for each k, and there are 9 remaining candidates. (For Mersenne primes, the probability is roughly .56.)

Determining the probability that k*2^n + 1 is prime for some n between M and 2M would be an interesting exercise (heuristic, not proof).

I agree that this is a VERY rough analysis. But it is better than just "I have a feeling".
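A quick numeric check of that back-of-envelope argument (the 1/2-per-doubling chance, the three doublings to the ~10M-digit region, and the nine remaining multipliers are all the figures given above; nothing here is an independent estimate):

```python
# Each remaining k is given roughly a 1/2 chance of yielding a prime per
# doubling of n; three doublings reach the ~10M-digit region.
p_no_prime_per_k = (1 / 2) ** 3          # chance a given k still has no prime by then
remaining_k = 9

expected_stubborn = remaining_k * p_no_prime_per_k        # about 1.1 multipliers
p_all_resolved = (1 - p_no_prime_per_k) ** remaining_k    # about 0.30

print(f"expected multipliers with no prime below ~10M digits: {expected_stubborn:.2f}")
print(f"chance that every multiplier gets a prime by then:    {p_all_resolved:.2f}")
```

The expected value of roughly 1.1 is what lies behind the "maybe 1 multiplier will require over 10M" remark.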
#4
Aug 2002
Buenos Aires, Argentina
1492₁₀ Posts
Quote:
Quote:
#5
Aug 2002
Buenos Aires, Argentina
2²·373 Posts
Quote:
#6
"Bob Silverman"
Nov 2003
North of Boston
1D52₁₆ Posts
Quote:
In fact, I did not predict the number of terms. I only said that I had a heuristic that suggests infinitely many. I did *not* predict the sequence density. So your claim "failed to predict number of terms" is complete nonsense.

Indeed, I have seen no mathematics that predicts the number of such pairs is finite. We've only seen some limited computation, and 2^62 is a LONG way from infinity. If you think I am wrong I would be very happy to hear your mathematical reasoning.

A (moderately straightforward) application of the "large sieve" shows that for any integer D, we should have x and x+D with the same number of prime factors (indeed the same number of total factors) infinitely often. (This was a *theorem* in the dissertation of my ex.) The *number* of factors is not *bounded* by the theorem. [This is important.]

Now, when D is small relative to x (D = 1 is small), then if x is smooth infinitely often (and there are infinitely many smooth numbers by direct construction), then x+D should have the same number of factors as x infinitely often. If x has enough factors to be smooth (~log(x) factors) then so will x+D, since log(x) ~ log(x+D) unless D is large.

Note that such pairs become very SCARCE. As John Selfridge said: we know loglog x goes to infinity, but no one has ever seen it do so. If the number of pairs (x, x+1) < M that are smooth grows as loglog M (or worse!) then finding them will be hard indeed.

This heuristic may be wrong. But it is a lot better than the "compute first, and think about the mathematics later" approach already taken by people in this group. Apply some *reasoning* BEFORE you compute.

I could be very wrong. My heuristic might be wrong. But I at least applied some reasoning to the problem. You have shown NONE.
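As a purely illustrative sketch of how scarce such pairs are, the following counts consecutive pairs (x, x+1) that are both B-smooth; the smoothness bound 7 and the search limits are arbitrary choices, and trial division is used only because the demo is tiny:

```python
def largest_prime_factor(n):
    """Largest prime factor of n by trial division (fine for small n)."""
    p, largest = 2, 1
    while p * p <= n:
        while n % p == 0:
            largest = p
            n //= p
        p += 1
    return n if n > 1 else largest

def count_smooth_pairs(limit, bound):
    """Count x < limit such that x and x+1 are both `bound`-smooth."""
    return sum(1 for x in range(2, limit)
               if largest_prime_factor(x) <= bound
               and largest_prime_factor(x + 1) <= bound)

for limit in (10**3, 10**4, 10**5):
    print(limit, count_smooth_pairs(limit, 7))   # counts barely grow with the limit
```

For a fixed small smoothness bound the counts in fact stop growing altogether (Størmer's theorem), which is an extreme version of the scarcity being described; the heuristic above concerns smoothness measured relative to x, where the question stays open.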
#7
Aug 2002
Buenos Aires, Argentina
2²·373 Posts
Citing a simpler example, heuristics say that the set of pairs (p, p+D) where both p and p+D are prime should be infinite (by adding probabilities for individual pairs), but when D is odd one of p and p+D is even, so there is at most one pair (p = 2, and only when 2+D happens to be prime).

Maybe your heuristic works for the majority of values of D, but not for particular values, such as D = 1 in our case. This shows that some computing is necessary to avoid gross errors.
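A tiny brute-force check of that parity point (the search bound of 10,000 and the naive is_prime helper are just for the demonstration):

```python
def is_prime(n):
    """Trial-division primality check, adequate for a small demonstration."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def prime_pairs(D, limit):
    """All p < limit with both p and p + D prime."""
    return [p for p in range(2, limit) if is_prime(p) and is_prime(p + D)]

print(prime_pairs(1, 10_000))        # D odd: only [2], since 2 and 3 are both prime
print(len(prime_pairs(2, 10_000)))   # D even: plenty of twin-prime pairs
```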
#8
Jul 2004
Potsdam, Germany
831₁₀ Posts
Quote:
So far, the predictions were not that bad. According to that model, it is unlikely that there are more than 2 more primes below 10M digits. But I welcome every heuristic that indicates an earlier finish.
#9
"Bob Silverman"
Nov 2003
North of Boston
1110101010010₂ Posts
Quote:
Current test limits of ~13 million: I had either misread or mentally translated that into 1.3 million. I do expect that the remaining primes (with perhaps 1 exception) will be found within 3 more doublings of the current lower-limit test bound. I think the smallest untested n is about 5 million (??), which means testing n up to 40 million (about 12 million digits, since 40 * log_10(2) ~ 12).
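Spelling out that arithmetic (the 5 million starting point is the rough guess given above):

```python
import math

n_current = 5_000_000                        # the post's guess at the smallest untested n
n_after_three_doublings = n_current * 2**3   # 40,000,000
digits = n_after_three_doublings * math.log10(2)

print(n_after_three_doublings, round(digits))   # 40000000, about 12 million digits
```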
#10
Sep 2002
Database er0rr
2×3³×83 Posts
Quote:
Quote:
#11
Sep 2002
Database er0rr
2·3³·83 Posts
Quote:
For 321, I predict the next prime at about 1.33 times the exponent of the previous one. How? By plugging the known 321 prime exponents into a spreadsheet and getting it to work out the average factor of increase.
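A minimal sketch of that spreadsheet calculation; the exponent list below is a placeholder so the snippet runs, not the actual table of known 321 prime exponents, and the real values would have to be substituted to reproduce the 1.33 figure:

```python
def average_growth_factor(exponents):
    """Geometric mean of the ratios between consecutive prime exponents."""
    ratios = [b / a for a, b in zip(exponents, exponents[1:])]
    product = 1.0
    for r in ratios:
        product *= r
    return product ** (1 / len(ratios))

exponents = [18, 34, 55, 103, 143, 206]           # placeholder values only
factor = average_growth_factor(exponents)
print(f"average growth factor: {factor:.2f}")
print(f"predicted next exponent: {exponents[-1] * factor:.0f}")
```

Using the geometric mean of the ratios is one natural reading of "average factor increase"; an arithmetic mean of the ratios would give a slightly different prediction.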
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
|---|---|---|---|---|
| Sieving and LLR-in in same time | pepi37 | Software | 1 | 2017-05-15 17:17 |
| Prime Density Correction Terms? | wblipp | Math | 20 | 2011-09-07 21:45 |
| Thanks PG for Sieving, now time to pick a new n? | cipher | Twin Prime Search | 25 | 2009-08-09 19:32 |
| Prime k-value density rating and factorizations | gd_barnes | No Prime Left Behind | 25 | 2009-07-30 12:06 |
| Optimal Sieving Time | jasong | Sierpinski/Riesel Base 5 | 10 | 2009-01-25 15:56 |