how are odds computed?
I've been wondering this for a long time.
When you look at status there is a 1:###### chance of you finding a prime. How is that number computed? Because I noticed it changes from number to number. Is there a way to see what your team's total chances of finding a number are? |
If I read the source code in the commona.c module correctly, the estimate for a first-time LL test of 2^n-1 that has been (unsuccessfully) trial-factored to 2^b and P-1 factored to the default limits for exponent n is that it has
1 chance in ( n / ( (b-1) * 1.803) ) of being prime. E.g., if you're LL testing M29998799 after it's been (unsuccessfully) TF'd to 2^67 and P-1'd to B1,B2 = 340000,7650000 or thereabouts, the estimate is about 1:252,000 that it'll be prime. |
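The estimate above is a one-liner; here is a sketch in Python (assuming the formula exactly as quoted; the 1.803 constant is taken from the post, not re-verified against the current Prime95 source):

```python
def ll_prime_odds(n, b, k=1.803):
    """Estimated odds (1 in X) that 2^n - 1 is prime, given that it
    has been trial-factored to 2^b with no factor found.  The
    constant 1.803 is the post-P-1 value quoted above."""
    return int(n / ((b - 1) * k))

# M29998799, trial-factored to 2^67, P-1 done to default limits:
print(ll_prime_odds(29998799, 67))  # -> 252094, i.e. about 1:252,000
```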
okay
is there a simple way to choose B1 and B2? |
The simplest way is to let Prime95/mprime choose B1,B2. They will if you use Test= or Pfactor= in the worktodo.ini file.
- - - If you want to use Pminus1= in worktodo, so that you specify B1,B2 yourself:

Look in the Prime95 help file. Go to "How to use Prime95", then "Setting up available memory". In that section you can find the following table:

[font=monospace]Exponent     Minimum  Reasonable  Desirable
 6,000,000     12MB      23MB       33MB
10,000,000     19MB      36MB       53MB
33,000,000     65MB     125MB      185MB[/font]

Decide whether "Minimum", "Reasonable", or "Desirable" fits your situation, then set your Prime95 available memory number accordingly and remember the category (M/R/D) for the next step.

Next, look in the Pminus1.txt file (after downloading Pminus1.zip and unzipping it). Each line lists an exponent, B1 limit, and B2 limit. Go to where the exponents are similar to the one you want to try factoring. Look at the various B1,B2 combinations that have already been tried for exponents similar to yours.

If your available memory is "Minimum", look for lines where B1 = B2, such as
[code]29922019,435000,435000
29922619,430000,430000[/code]
and choose your B1,B2 similar to those.

If your available memory is "Reasonable", look for lines where B2 is roughly 8-12 times as large as B1, such as
[code]29908709,295000,2728750
29910631,310000,3487500[/code]
and choose your B1,B2 similar to those.

If your available memory is "Desirable" (big), look for lines where B2 is roughly 15-25 times as large as B1, such as
[code]29908399,345000,8711250
29909807,330000,6187500
29910637,340000,7735000[/code]
and choose your B1,B2 similar to those.

You may see "oddball" lines with very small B1 such as
[code]30000041,1000,1000
30597587,30,4000000[/code]
Such low B1 values mean that P-1 using those limits has very, very little chance of finding a factor. Try them if you want to experiment, but don't hold your breath waiting for a factor.

Among the small exponents you'll see lines with very large B1 (and B2) such as 4290000000. If you want to experiment with those on larger exponents, please just breathe normally ...
until you run out of patience waiting for your program to get even 1% done. :) |
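The rule of thumb above (B2 = B1 for minimal memory, roughly 8-12x B1 for reasonable, roughly 15-25x for desirable) can be sketched as a tiny helper. The category names match the Prime95 help-file table, but the function name and the midpoint ratios (1, 10, 20) are an illustrative choice within the quoted ranges, not values taken from Prime95 itself:

```python
# Hypothetical sketch of the B1,B2 rule of thumb described above.
# The ratios 1, 10, and 20 sit inside the quoted ranges (B2=B1,
# 8-12x, 15-25x); they are not Prime95's actual defaults.
B2_RATIO = {"Minimum": 1, "Reasonable": 10, "Desirable": 20}

def suggest_bounds(b1, memory_category):
    """Return a (B1, B2) pair from a chosen B1 and memory category."""
    return b1, b1 * B2_RATIO[memory_category]

print(suggest_bounds(435000, "Minimum"))    # -> (435000, 435000)
print(suggest_bounds(340000, "Desirable"))  # -> (340000, 6800000)
```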
This highlights one of the things you should do if you change the amount of memory in your computer. I just recently put 1 GB of memory in mine, up from 256 MB. Just after the memory add, I finished an LL test and started a new one. When my computer reached the P-1 stage, it chose very low bounds because Prime95 was still set to 64 MB. I upped it to 256 MB for Prime95 and now the bounds are more reasonable. Note that I had to stop Prime95 and then re-start it to use the higher bounds.
Fusion :ermm: |
so the more memory I provide for testing, the more accurate my odds are?
one last question: is it me, or do the odds get higher (less desirable) the larger your exponent gets? |
[QUOTE=crash893]so the more memory I provide for testing, the more accurate my odds are?[/quote]If you use Test= or Pfactor= in the worktodo.ini file, then the more memory you say is available to Prime95/mprime, the farther (higher B2) it will look for a factor, which gives you a higher chance of finding one.
If you use Pminus1= in the worktodo.ini file, then the more memory you say is available to Prime95/mprime, the faster it will get to the B1,B2 bounds that you specified.

[quote]is it me[/quote]Yes. Prime95/mprime checks the userID and if it's "crash893" it says the odds are higher than it says they are for anyone else ... [u][i]NOT[/i][/u]!

[quote]or do the odds get higher (less desirable) the larger your exponent gets?[/QUOTE]Yes (sincerely), the odds get higher for larger exponents. That's why GIMPS checked the small ones first!

As to whether higher odds are less desirable: the higher the odds, the rarer and more valuable if it is prime, and the greater the glory in finding one!! And isn't the Olympic Games motto something like "faster, higher, stronger" (in Latin)? |
odds are computed very carefully.
|
[QUOTE=cheesehead]Yes. Prime95/mprime checks the userID and if it's "crash893" it says the odds are higher than it says they are for anyone else ... [u][i]NOT[/i][/u]! ...[/QUOTE]
I KNEW IT. j/k

My thoughts on it were (and I'm sure you'll be able to tell I have no math background from this statement): it seems to me that you know a prime is going to be in a certain block of numbers, and as you complete numbers the odds should go up because you have eliminated candidates. I know that you really can't use historical prime data to predict where the next ones are (if you could, we would probably be doing that), but can't you get a rough idea of where they are within a few thousand exponents (or million, again I really don't know)? |
[QUOTE=crash893]it seems to me that you know a prime is going to be in a certain block of numbers and as you complete numbers the odds should go up because you have eliminated candidates.[/QUOTE]
Lots of people have this same intuition. It's so common that it has a name - the gambler's fallacy. But it's wrong.

Here's a simpler scenario that helps some people understand why it's wrong. Suppose we are rolling a pair of dice, and we are looking for box cars (double sixes). You know that the probability on any roll is 1/36 - one in thirty-six, or one to thirty-five, depending on how you like to express odds. Now suppose you have thrown the dice 24 times with no box cars. Do you think "there must be a box car in the next 12 rolls, so the odds are now 1 in 12"? If yes, then do you think that after 35 failures the next roll is guaranteed to be box cars? It's always 1 in 36. Even after 72 failures, it's 1 in 36.

A common error in reasoning is "it's got to average 1 in 36 in the long run, and it is below average now, so it's got to be above average from here." The error is that you do not return to the average by compensation; you return to the average by swamping out the early results. If you roll 1000 box cars in the next 36,000 rolls, the average of 1000 in 36,024 is already pretty close to 1 in 36. After 360,000 more rolls it is even closer, at 10,000 out of 360,024. You get to the long-run average without any change in results, because the long run is as long as needed. |
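The dice argument is easy to check numerically. A quick simulation (an illustrative sketch, not from the thread): even conditioned on a 24-roll dry streak, the next roll still comes up box cars about 1 time in 36.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
N = 1_000_000

hits = 0         # box cars overall
streak = 0       # consecutive non-box-car rolls so far
cond_trials = 0  # rolls made right after a 24-roll dry streak
cond_hits = 0    # ...that came up box cars anyway

for _ in range(N):
    roll = random.randint(1, 6) + random.randint(1, 6)
    if streak >= 24:
        cond_trials += 1
        cond_hits += roll == 12
    if roll == 12:
        hits += 1
        streak = 0
    else:
        streak += 1

overall = hits / N
conditional = cond_hits / cond_trials
print(round(overall, 4), round(conditional, 4))  # both hover near 1/36 ~ 0.0278
```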
[QUOTE=wblipp]Lots of people have this same intuition. It's so common that it has a name - the gambler's fallacy. But it's wrong. ...[/QUOTE]
I understand the gambler's fallacy (believe me, I understand after Atlantic City), but I think there is a key difference: with gambling you could win once and never win again forever; with primes we know that there are more out there, we just don't know how far apart they are. |
[QUOTE=crash893]
but I think there is a key difference: with gambling you could win once and never win again forever; with primes we know that there are more out there, we just don't know how far apart they are.[/QUOTE] As Pooh Bear said to Rabbit, "They're the same thing." You can look at it as "We know there are more box cars to be rolled, we just don't know how far apart they are." Or you can look at it as "although we have a pretty good heuristic, it's possible that we will never find another Mersenne prime." You can focus on the unlikely "never again" scenario, or you can focus on the "random distance between events" aspect, but both descriptions apply equally well (or equally badly) to BOTH dice rolling and prime hunting. |
[QUOTE=crash893]as you complete numbers the odds should go up because you have eliminated candidates.[/quote]Here we get confused with saying the odds go "up" or "down", or "higher" or "lower".
When you previously wrote "the odds get higher (less desirable)", you meant (I think) that the odds went from, for example, 1:252,000 to 1:400,000. That can be expressed in two opposite ways. You could say the odds went "higher", from 252000-to-1 "up" to 400000-to-1. But you could instead look at the odds as fractions: the odds went "lower", from 1/252000 "down" to 1/400000.

It seemed to me you were referring to the many-to-1 view, not the fraction view, when you wrote "get higher (less desirable)". Now, when you write "as you complete numbers the odds should go up", it seems to me that you are using the fraction view. The fraction's value goes up as its denominator goes down (from 1/252000 to 1/170000, for example). Am I right in thinking that here you're using the opposite view from the one you used earlier?

[quote]with gambling you could win once and never win again forever[/quote]Actually, at a legal regulated casino, the games aren't supposed to be rigged to keep you from ever winning again, assuming you have enough patience and stake to keep playing. (But that doesn't mean you'll make a long-term profit.)

[quote]with primes we know that there are more out there we just don't know how far apart they are.[/QUOTE]No, we don't [u]know[/u] that there are more Mersenne primes. (No regulatory agency :-) There are good arguments that they go on forever, but no one has actually [u]proven[/u] that. |
[QUOTE=crash893]it seems to me that you know a prime is going to be in a certain block of numbers
and as you complete numbers the odds should go up because you have eliminated candidates.[/QUOTE]If we knew a prime was going to be in a certain, fixed block of numbers, then you'd be right: the odds would keep improving as we eliminated the duds. But hunting primes in unexplored territory is like rolling dice - you just can't tell exactly how many of a particular type will turn up in a game.

In blackjack you have a different situation. You know exactly what cards there are at the start of a game. To some extent it is possible to watch the odds change by counting the cards. Casinos don't like people who do that. |
[QUOTE=cheesehead]The simplest way is to let Prime95/mprime choose B1,B2. They will if you use Test= or Pfactor= in the worktodo.ini file. ...[/QUOTE]
What I was more going after was that Prime95.exe seems to know the odds from the second you download the work unit. It is not doing any sort of long complex math to do it, either (or if it is, it's doing it very quickly). |
[QUOTE=crash893]what I was more going after was that Prime95.exe seems to know the odds from the second you download the work unit.[/QUOTE]
Sigh. It's always a disappointment when you try to teach somebody how to think about a problem, but they ignore the new tools you have shown them. Assuming that you are engaged in a discussion, you have to be arguing that "odds in prime hunting are different from odds in rolling box cars because ...". In this context, your unstated argument has to be "because it's easy to calculate odds in prime hunting and hard to calculate odds in rolling dice."

William |
[QUOTE=crash893]( or if it is its doing it very quickly)[/QUOTE]Yes, computers are fast.
|
[QUOTE=cheesehead]Yes, computers are fast.[/QUOTE]
Not everyone - mine hasn't moved at all in the last 2 years! Lame duck... |
[QUOTE=wblipp]Sigh. It's always a disappointment when you try to teach somebody how to think about a problem, but they ignore the new tools you have shown them. ...[/QUOTE]
I really don't think there is a need to talk down to anyone here. |
[QUOTE=cheesehead]If I read the source code in the commona.c module correctly, the estimate ... is that it has 1 chance in ( n / ( (b-1) * 1.803) ) of being prime. ...[/QUOTE]
so is it 1: 29998799/((34000-1)*1.803)?
I'm still confused over the variables in this. Also, I would like to point out that I'm not trying to find a pattern or anything like that. I have a programming class and we were told to make a simple program that uses variables (that the user types in) and produces a result. |
[QUOTE=crash893]so is it 1: 29998799/((34000-1)*1.803))[/QUOTE]
Not quite. Your 34000-1 interprets cheesehead's b-1, not unreasonably, as meaning B1 minus 1, whereas I think you'll find he meant the level to which it has been trial-factored, minus 1. So if your number has been trial-factored to the 67-bit level (your worktodo.ini file will tell you; the entry will say something like Test=29998799,67,1), then your calculation is 1: 29998799 / ((67-1) * 1.803), giving you odds of 1:252094. |
[QUOTE=Numbers]Not quite. Your 34000-1 interprets cheesehead's b-1, not unreasonably, as meaning B1 minus 1, whereas I think you'll find he meant the level to which it has been trial factored minus 1. So if your number has been trial factored to the 67 bit level (your worktodo.ini file will tell you; the entry will say something like Test=29998799,67,1) then your calculation is 1: 29998799 / ((67-1) * (1.803)) giving you odds of 1: 252094.[/QUOTE]
awesome, thanks |
Just out of interest, the 1.803 is the constant the program uses once the P-1 test has already been completed.
Before the P-1 test is done, the formula uses the alternative constant 1.733. This would also be the case if for some reason you skipped the P-1 test altogether and never ran it.

As an exercise, or for your project, you might like to run a loop over different values of the bit depth and look at how the odds vary for a given exponent depending on how far you trial-factored it. |
Here is an example for an exponent I am currently LL testing, 34843141:

[code]bitdepth  without P-1 (1.733)  after P-1 (1.803)
   32            648570             623390
   33            628302             603909
   34            609262             585608
   35            591343             568385
   36            574447             552145
   37            558491             536808
   38            543396             522299
   39            529096             508555
   40            515530             495515
   41            502641             483127
   42            490382             471343
   43            478706             460121
   44            467573             449420
   45            456947             439206
   46            446792             429446
   47            437079             420110
   48            427780             411172
   49            418868             402606
   50            410319             394389
   51            402113             386501
   52            394228             378923
   53            386647             371636
   54            379352             364624
   55            372327             357872
   56            365557             351365
   57            359029             345090
   58            352731             339036
   59            346649             333191
   60            340774             327543
   61            335094             322084
   62            329601             316804
   63            324285             311695
   64            319137             306747
   65            314151             301954
   66            309318             297309
   67            304631             292804[/code]

This is consistent with the way Prime95 calculates odds (I made them integers for readability), and it's telling me 1 in 292804, as expected. This is quoted as an estimate of the probability. I do not know in fact how accurate the formula is (possibly it is only valid for a certain rough range of exponent sizes, and may be unsuitable for 100-million-digit primes, which would be bigger exponents than current testing). |
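This table can be reproduced with the formula quoted earlier in the thread (a sketch, assuming the constants 1.733 and 1.803 as stated above):

```python
def odds(n, b, k):
    """1-in-X estimate that 2^n - 1 is prime after trial factoring
    to 2^b; k is 1.733 before P-1 and 1.803 after P-1."""
    return int(n / ((b - 1) * k))

n = 34843141
print("bitdepth  without P-1  after P-1")
for b in range(32, 68):
    print(f"{b:8d}  {odds(n, b, 1.733):11d}  {odds(n, b, 1.803):9d}")
```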