#1
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³·3·5·7² Posts
It is reasonably often commented that some bases are more prime than other bases. Base 3, for example, has often been thought to produce more primes than most others.

I have written a pfgw script to test the k's up to 100 for all bases, up to the common size limit of 2^1000. I am currently up to base 87 on the Riesel side and thought I would post my results to see what people think. Bases 3, 19 and 49 are currently winning, all with a prime 18.8% of the time. Any ideas for improvement? I know the results for some bases are currently slightly skewed by the lack of detection of algebraic factors. I will post my code tomorrow once I have tidied it up a little.
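The experiment can be sketched in Python. This is my own rough reconstruction, not the actual pfgw script: `is_prp` is a stand-in for pfgw's PRP test, and the loop structure and limits are assumptions.

```python
import math

def is_prp(m):
    """Miller-Rabin test with fixed bases; stands in for pfgw's PRP test."""
    if m < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in small:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

def fraction_with_prime(b, k_max=100, bit_limit=1000):
    """Fraction of k <= k_max for which k*b^n - 1 is prime for some n,
    with candidates capped at the common size limit of 2^bit_limit."""
    n_max = math.floor(bit_limit * math.log(2) / math.log(b))
    hits = sum(
        any(is_prp(k * b**n - 1) for n in range(1, n_max + 1))
        for k in range(1, k_max + 1)
    )
    return hits / k_max
```

With the full limits this is far slower than pfgw; it is only meant to show the shape of the search.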
#2
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
17·251 Posts |
Base 3 should have many more primes than larger bases due to the simple fact that 3^n grows much more slowly than 100^n. Are you accounting for this? (I know there are still other effects, but this is a major one, and from reading your post I don't see anything about it.)

Edit: Now I see that you said it's for less than 2^1000, not b^1000, which puts it much closer, but there still might be some minor effects from this. Something I'd like to see, in addition to this, is how many primes should be expected given the weights, etc. vs. how many primes are really there, and whether any trends are visible from that.

Last fiddled with by Mini-Geek on 2010-04-07 at 18:06
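Mini-Geek's suggestion could be sketched roughly like this. This is my own construction, not his: the sieve-survival weight `w` and the naive 1/ln density are placeholder assumptions.

```python
import math

# Rough expected-prime count for k*b^n - 1 over n = 1..n_max, using the
# naive density 1/ln(k*b^n) scaled by a sieve-survival weight w.
# Both w and the lack of any form-specific correction are simplifications.
def expected_primes(k, b, n_max, w=1.0):
    return sum(w / math.log(k * b**n) for n in range(1, n_max + 1))
```

Comparing `observed / expected_primes(...)` ratios across bases, with each base's weight plugged in, would be the kind of trend check Mini-Geek is asking about.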
#3
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³·3·5·7² Posts
Quote:
I have, however, taken note of that in my spreadsheet. All bases and k's have been tested in the same size range, up to whatever the equivalent of 2^1000 is in each base. Since this means the weight is lower for the higher bases, they stand just as good a chance of being good. In fact, I have just noticed that base 113 has beaten all the others by far, with a prime 19.7% of the time up to a size of 2^1000, which would be ~113^146. Here is an updated file with bases up to 129 covered.
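As a quick back-of-envelope check of the ~113^146 figure (my own, using base-2 logarithms):

```python
import math

# 113^146 should land just under the common 2^1000 size limit:
bits = 146 * math.log2(113)  # size of 113^146 in bits
# a little under 1000 bits, consistent with "~113^146"
```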
#4
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
16F8₁₆ Posts
Quote:
Any estimates we have of how many primes should be expected are based on something, and probably have similar flaws to what we are experiencing here. However, they are probably based on a large range of types of numbers, so they should be of some help. I will add that to the spreadsheet tomorrow unless better suggestions come up.
#5
May 2007
Kansas; USA
3³×5×7×11 Posts
How was the weight obtained?

If you only test bases to 2^1000, then you are unfairly negatively skewing high bases. That's because there will be fewer n-values where the form produces a value of <= 2^1000. To be more specific: q = k*b^n+/-1 will have many more primes with q <= 2^1000 when b=2 than when b=1024.

Gary
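Gary's b=2 vs. b=1024 comparison can be made concrete with a short sketch (ignoring k, which is a simplification; it only trims a candidate or two at the top of the range):

```python
import math

# Number of exponents n >= 1 with b^n <= 2^bit_limit (the k factor is
# ignored here for simplicity).
def candidate_count(b, bit_limit=1000):
    return math.floor(bit_limit / math.log2(b))

# b = 2 gets 1000 candidate n-values under the cap; b = 1024 gets only 100,
# so a flat 2^1000 limit gives high bases far fewer chances at a prime.
```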
#6
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³·3·5·7² Posts
Quote:
The weight was obtained by a counter in the pfgw script. Base 163 has also beaten base 3, with a prime 19.0% of the time.

Edit: I just snatched the formula from Gary's odds-of-prime spreadsheet, and about 7% of numbers are supposed to be prime. I think something must have gone wrong. Here is an updated spreadsheet with bases up to 271.

Last fiddled with by henryzz on 2010-04-08 at 07:33
#7
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³·3·5·7² Posts
Time to release my script to the world. There are lots of improvements that could be made, but it basically works.
#8
May 2007
Kansas; USA
289B₁₆ Posts
Quote:
On the odds-of-prime spreadsheet, you can't use a simple average of the n-value, because the odds of prime don't reduce in a linear fashion as n gets larger. You'd need to attach the spreadsheet that you used to come up with the 7% for me to see why there is a discrepancy.

One more thing: the spreadsheet is not accurate at very low n-ranges. I'd suggest using it only for n>1000. That may be the issue more than anything.

Last fiddled with by gd_barnes on 2010-04-08 at 21:42
#9
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³×3×5×7² Posts
Quote:
I have attached my latest spreadsheet, which contains all Riesel k's up to 1024 and my feeble attempt at an estimate of how often a prime should normally be found, based on Gary's odds-of-prime spreadsheet.
#10
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
2³×3×5×7² Posts
Quote:
I have attached it set up for base 2 as used on my spreadsheet. It gets the same answer. Any suggestions on how I can do better?
#11
May 2007
Kansas; USA
3³·5·7·11 Posts
Quote:
The best example that I can think of to demonstrate this is: let's say you wanted to approximate the number of primes among the numbers from 2 to 1,000,000. You decide to do this by using an approximation of the chance of prime at the average of the two, i.e. ~500,000, for the entire range. This estimate would come in way too low because the chance of prime is so much higher for, say, 2 to 100 or 2 to 1000 than it is for the rest of the range. The drop in the number of primes is not linear; it drops rapidly at first, so taking the average expectation from right in the middle of the entire range does not work.

Short of fairly complex statistical analysis or calculus, there is not an easy way. It's very manual, but here is the best approach that I've come up with using the spreadsheet, with the following example: let's use a range of n=10K to 100K, where I want to approximate the number of primes that I should expect from the entire range. What you want to do is break it up into smaller ranges like n=10K-20K, 20K-30K, etc. Be sure to divide your number of candidates by how many ranges you are breaking it into. Get the expected number of primes from each range and add those all together. That should give you a much more accurate result. I can say that you'll get a much higher estimate because the reduction is not linear. If you want to refine it further, you could do n=1K pieces, as in n=10K-11K, 11K-12K, etc. Does that make sense?

Now, one thing here: your percentage is WAY lower than it should be, i.e. less than half. I suspect using the above method will not more than double your percentage. Based on that, I think something else is wrong. The first thing I suspect is that the spreadsheet does not accurately estimate very low n-values. I suggest using it for n>1000 or higher. I'll play around with the spreadsheet that you attached and see if I see anything.

Gary

Last fiddled with by gd_barnes on 2010-04-09 at 22:32
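Gary's 2-to-1,000,000 example can be checked numerically. This is a sketch using the naive 1/ln x density for the chance that x is prime; the true count is pi(1,000,000) = 78,498.

```python
import math

N = 1_000_000

# (a) Midpoint shortcut: apply the density at the middle of the whole range
# to every candidate. This undershoots because prime density drops fastest
# at the low end.
midpoint_est = (N - 2) / math.log((2 + N) / 2)

# (b) Gary's method: break the range into subranges, estimate each with its
# own midpoint density, and sum. Finer pieces track the nonlinear drop.
def subrange_est(lo, hi, pieces):
    step = (hi - lo) / pieces
    total = 0.0
    for i in range(pieces):
        a = lo + i * step
        b = a + step
        total += (b - a) / math.log((a + b) / 2)
    return total
```

Here `subrange_est(2, N, 10)` already lands noticeably closer to 78,498 than the midpoint shortcut, and 1000 pieces closer still, which is exactly the effect Gary describes.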