prime density vs. sieving vs. prp time
I'm not totally sure what information is required to figure this out, but here's the problem:
In the Sierpenski problem and the Riesel problem we are basically trying to combine the k with an n that makes it prime. I'm hoping you can help me come up with a formula that incorporates the sieving efficiency, the odds of finding a prime, the time it takes to do a test, etc. and gives a reasonable(within 5%) answer as to where time would probably be best spent in the project at that time. In the Seventeen or Bust forum(SOB is related to the Sierpenski problem) people keep claiming sieving is way behind in efficiency, but since the main goal is to find the last 10(?) primes I'm not certain this is true. Known statistics(for Seventeen or Bust): -approximately 15 values sieved for every 1,000 billion -about 45 billion numbers processed in a 24 hour period on an average computer(this is a fairly wild guess based on my opinion of what an "average" computer amounts to) -an actual primality test takes about 8.68 days(12,500 minutes) -the odds of a number x being prime 1/log x I'm sure I'm about to hate the limited editing feature in these forums. [url=http://www.seventeenorbust.com/stats/]LInk to SOB stats.[/url] |
[QUOTE]In the Seventeen or Bust forum (SOB is related to the Sierpinski problem) people keep claiming sieving is way behind in efficiency, but since the main goal is to find the last 10(?) primes I'm not certain this is true.
[/QUOTE] The problem with SOB is that the search is open-ended. They will probably never solve it, because the last few numbers to be found will be very, very large. This means they can sieve 'til the cows come home and they will not have done enough. Sure, finding a prime will eliminate a k and sieving efficiency will go up. But imagine the last prime has about 100 million digits and each test takes over ten years on current hardware: they'll need to sieve extremely deeply indeed to save time. SOB have done a great job finding big primes that partially solve the S-problem, but I am not holding my breath for the complete solution :evil: |
[QUOTE=paulunderwood]The problem with SOB is that the search is open-ended. They will probably never solve it because the last few numbers to be found will be very, very large. This means they can sieve 'til the cows come home and they will not have done enough. Sure finding a prime will eliminate a number and sieving efficiency will go up. Imagine they know the last number is about 100 million digits and each test takes over ten years on current hardware, they'll need to sieve extremely deeply indeed to save time.
SOB have done a great job finding big primes that partially solve the S-problem but I am not holding my breath for the complete solution :evil:[/QUOTE] Bold claims. Please give some mathematical justification for them.

In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended". It is definitely finite in duration. Of course, if a prime doesn't exist for one (or more) multiplier, then of course the project is open-ended.

I expect that a prime under 10M digits will be found for each multiplier. Maybe 1 multiplier will require over 10M. Following Sam Wagstaff's heuristic for Mersenne primes gives better than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M for large M. The current bound is M ~ 1.3M. 8 times this bound is ~ 10M. The (very rough) probability that a prime will not be found below 10M is less than (1/2)^3 for each k. There are 9 remaining candidates. (For Mersenne primes, the probability is roughly .56.)

Determining the probability that k*2^n + 1 is prime for some n between M and 2M would be an interesting exercise (heuristic, not proof).

I agree that this is a VERY rough analysis. But it is better than just "I have a feeling". |
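The doubling argument above can be turned into a quick numerical sketch. Assuming, as the Wagstaff-style heuristic does, a fixed probability q that a prime appears in each doubling interval (M, 2M) of the exponent (q = 0.5 is the rough figure quoted; real per-k probabilities depend on each k's Proth weight):

```python
# Sketch of the doubling heuristic: if each interval (M, 2M) independently
# contains a prime with probability q, the chance of finding none in d
# successive doublings is (1 - q)**d.

q = 0.5          # assumed chance of a prime per doubling of n (rough figure)
doublings = 3    # 8x the current bound = 3 doublings

p_no_prime = (1 - q) ** doublings
print(f"chance a given k stays prime-free over {doublings} doublings: {p_no_prime:.3f}")

# With 9 remaining k values, the expected number still lacking a prime:
remaining_k = 9
print(f"expected stubborn k values: {remaining_k * p_no_prime:.2f}")
```

This reproduces the (1/2)^3 = 0.125 per-k figure, and with 9 candidates an expectation of about 1 holdout, matching "maybe 1 multiplier will require over 10M".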
[QUOTE=R.D. Silverman]In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended". It is definitely finite in duration. Of course, if a prime doesn't exists for one (or more) multiplier, then of course the project is open ended.[/QUOTE]
That's not right. Suppose that the last prime occurs for an exponent about 10^10 (which is highly possible). If computer hardware does not advance enough to attack exponents in this range, nobody will be able to find this prime. So the problem will remain open forever, even if the search range is finite. [QUOTE=R.D. Silverman]I expect that a prime under 10M digits will be found for each multiplier. Maybe 1 multiplier will require over 10M.[/QUOTE] From current trends (using extrapolation), *several* multipliers will require exponents above 10^7. See [url]http://www.mersenneforum.org/showthread.php?t=5630[/url] for a thread where Bob failed completely to predict the number of terms of a sequence. |
[QUOTE=R.D. Silverman]Following Sam Wagstaff's heuristic for Mersenne primes gives better
than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M for large M. The current bound is M ~ 1.3M. 8 times this bound is ~ 10M. The (very rough) probability that a prime will not be found less than 10M is less than (1/2)^3 for each K. There are 9 remaining candidates. (For Mersenne primes, the probability is roughly .56)[/QUOTE] According to [url]http://www.seventeenorbust.com/stats/[/url] the actual exponents being tested are between 9M and 10M, so the numbers have about 3,000,000 digits. |
[QUOTE=alpertron]That's not right. Suppose that the last prime occurs for an exponent about 10^10 (which is highly possible). If computer hardware does not advance enough to attack exponents in this range, nobody will be able to find this prime. So the problem will remain open forever, even if the search range is finite.
From current trends (using extrapolation), *several* multipliers will require exponents above 10^7. See [url]http://www.mersenneforum.org/showthread.php?t=5630[/url] for a thread where Bob failed completely to predict the number of terms of a sequence.[/QUOTE] "failed completely to predict". And you know this how? In fact, I did not predict the number of terms. I only said that I had a heuristic that suggests infinitely many. I did *not* predict the sequence density. So your claim "failed to predict number of terms" is complete nonsense.

Indeed, I have seen no mathematics that predicts the number of such pairs is finite. We've only seen some limited computation, and 2^62 is a LONG way from oo. If you think I am wrong I would be very happy to hear your mathematical reasoning.

A (moderately straightforward) application of the "large sieve" shows that for any integer D, we should have x and x+D with the same number of prime factors (indeed the same number of total factors) infinitely often. (This was a *theorem* in the dissertation of my ex.) The *number* of factors is not *bounded* by the theorem. [This is important.] Now, when D is small relative to x (D = 1 is small), then if x is smooth infinitely often (and there are infinitely many smooth numbers by direct construction), then x+D should have the same number of factors as x infinitely often. If x has enough factors to be smooth (~log(x) factors) then so will x + D, since log(x) ~ log(x + D) unless D is large.

Note that such pairs become very SCARCE. As John Selfridge said: we know loglog x goes to infinity, but no one has ever seen it do so. If the number of pairs (x, x+1) < M that are smooth grows as loglog M (or worse!) then finding them will be hard indeed.

This heuristic may be wrong. But it is a lot better than the "compute first, and think about the mathematics later" approach already taken by people in this group. Apply some *reasoning* BEFORE you compute. I could be very wrong. My heuristic might be wrong.
But I at least applied some reasoning to the problem. You have shown NONE. |
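The weaker claim here, that pairs x, x+1 with the same total number of prime factors occur often, is cheap to probe numerically (the smooth-pair question is far rarer and is not what this sketch tests). A brute-force count using Omega, the number of prime factors with multiplicity:

```python
# Count pairs (x, x+1) below N whose factor counts agree, where
# big_omega(n) is the number of prime factors counted with multiplicity.

def big_omega(n):
    count, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            count += 1
            n //= d
        d += 1
    return count + (1 if n > 1 else 0)  # leftover n > 1 is itself prime

N = 10_000
pairs = sum(1 for x in range(2, N) if big_omega(x) == big_omega(x + 1))
print(f"pairs (x, x+1) with equal Omega below {N}: {pairs}")
```

Such pairs turn out to be plentiful at this scale, consistent with the theorem cited; it says nothing either way about the loglog-slow growth conjectured for the smooth case.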
Citing a simpler example, heuristics say that the set of pairs (p, p+D) where both p and p+D are prime should be infinite (by adding up the probabilities for the individual pairs), but when D is odd there is at most one pair (p=2).
Maybe your heuristic works for the majority of values of D, but not for particular values, such as D=1 in our case. This shows that some computing is necessary to avoid gross errors. |
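The parity obstruction in this example can be checked directly: for odd D, p and p+D have opposite parity, so one of them is even, and the only possible prime pair is (2, 2+D). A tiny sketch with D = 3:

```python
# For odd D, p and p+D differ in parity, so one of the two is even;
# the naive "add up the probabilities" heuristic misses this obstruction.

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

D = 3  # any odd gap
pairs = [(p, p + D) for p in range(2, 1000) if is_prime(p) and is_prime(p + D)]
print(pairs)  # only (2, 5): every other candidate pair contains an even number
```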
[QUOTE=R.D. Silverman]I expect that a prime under 10M digits will be found for each multiplier.
Maybe 1 multiplier will require over 10M.[/quote] William Lipp created a prediction model - unfortunately, I can only find his [url=http://www.free-dc.org/forum/showpost.php?p=66799&postcount=3]predictions[/url], not the model itself. So far, the predictions have not been that bad. According to that model, it is unlikely that there are more than 2 more primes below 10M digits. But I welcome every heuristic that indicates an earlier finish. :wink: [quote]There are 9 remaining candidates.[/QUOTE] Since last December: 8 :smile: |
[QUOTE=Mystwalker]William Lipp created a prediction model - unfortunately, I only find his [url=http://www.free-dc.org/forum/showpost.php?p=66799&postcount=3]predictions[/url], not the model itself.
So far, the predictions were not that bad. According to that model, it is unlikely that there are more than 2 more primes below 10M digits. But I welcome every heuristic that indicates an ealier finish. :wink: Since last december: 8 :smile:[/QUOTE] I am (VERY VERY!) guilty of a cognitive error. The SoB web page shows current test limits of ~13 million. I had either misread that or mentally translated it into 1.3 million.

I do expect that the remaining primes (with perhaps 1 exception) will be found within 3 more doublings of the current lower-limit test bound. I think the smallest untested n is about 5 million (??), which means testing n up to 40 million, i.e. about 12 million digits (40 × log_10(2) ≈ 12). |
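The digit counts in this thread all come from the same conversion: a number near 2^n has about n·log10(2) ≈ 0.301·n decimal digits. A quick check of the figures quoted:

```python
import math

def digits_of_2exp(n):
    # k*2^n + 1 for small k has essentially the same digit count as 2^n.
    return math.floor(n * math.log10(2)) + 1

print(digits_of_2exp(10_000_000))  # n ~ 10M  -> about 3 million digits
print(digits_of_2exp(40_000_000))  # n = 40M  -> about 12 million digits
```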
[QUOTE]Bold claims. Please give some mathematical justification for them.[/QUOTE]
I have, personally, no mathematical reasoning for this. Rather, I rely on greater minds to do the predictions. [QUOTE]In fact, *if* a prime does exist for each multiplier, then the problem is not "open-ended". It is definitely finite in duration. Of course, if a prime doesn't exist for one (or more) multiplier, then of course the project is open ended.[/QUOTE] I really meant that the limit to which sieving should be done is "undetermined", because the primes needed to solve the S-problem have an unknown size. For efficiency, should SOB sieve for a (last) exponent (of 2) of 40 million (Silverman) or 18 trillion (Lipp)? :question: |
[QUOTE]Following Sam Wagstaff's heuristic for Mersenne primes gives better
than a 50% chance that k*2^n + 1 WILL have a prime for M < n < 2M[/QUOTE] This is okay for Mersenne primes, but for the Sierpinski primes the "weights" are much lower. For 321, I predict a prime at 1.33 times the size of the previous one. How? By plugging known 321 prime exponents into a spreadsheet and getting it to work out the average factor increase. |
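The spreadsheet method described here amounts to taking the geometric mean of the ratios between successive prime exponents. A sketch of that calculation; the exponent list below is a placeholder for illustration, not the actual 321 data, so substitute the real known exponents before drawing conclusions:

```python
import math

# Hypothetical exponent list for illustration only -- substitute the real
# list of known 321 (3*2^n - 1) prime exponents here.
exponents = [143, 206, 216, 306, 324, 391, 458, 470]

ratios = [b / a for a, b in zip(exponents, exponents[1:])]
# Geometric mean of the ratios: the average multiplicative step between primes.
avg_growth = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
print(f"average factor increase per new prime: {avg_growth:.2f}")
```

The geometric mean is the right average here because the next-exponent prediction is multiplicative: an arithmetic mean of ratios would systematically overestimate the growth factor.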