#12
Jun 2003
7×167 Posts
Note to the original author: We are not saying that an LL test is several million times faster than trial division as a method of primality testing. That would be adding just 6-7 zeros at the end of the speed number. If it were several billion times faster, that would be adding 9-10 zeros. Several trillion times faster would add 12-13 zeros.

Instead of using a single computer, we could distribute the task across many. Suppose every atom in the universe were a computer. Some estimates put that number at about 10^80. That would add about 80 zeros to the speed factor. The LL test is much faster than that.

Words cannot express how much faster adding several million zeros to the speed number is. The human mind cannot picture it. Now turn that around: if an LL test takes three weeks, think how long it would take to prove our numbers prime using TF alone.

Last fiddled with by Mr. P-1 on 2013-06-21 at 11:10
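To put a number on "several million zeros": taking the operation counts given in post #14 below (about 2^p trial divisions versus about p squarings), the speed factor 2^p / p has about p·log10(2) digits. A quick back-of-the-envelope check in Python, using the exponent from post #15 below:

[CODE]
import math

p = 60_892_163                                 # an exponent in the current test range
digits = p * math.log10(2) - math.log10(p)     # number of decimal digits in 2^p / p
print(f"2^p / p has about {digits:,.0f} digits")   # ~18.3 million zeros on the speed factor
[/CODE]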
#13
Jun 2003
7·167 Posts
What actually happens is that, unless the client has sufficient memory to make using the Brent-Suyama extension worthwhile (multiple gigabytes for numbers in the current test range), more memory increases the speed at which the client is able to complete stage 2 to particular limits. However, the client takes this into account when computing those limits, preferring to increase the limits (thereby increasing the chance of finding a factor) in exchange for spending more time, when the memory allocation is generous.

It could so happen, I suppose, that the effect of this exactly cancels out the time saving, but I don't think it does, and I can think of no reason why it should.
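For context on what those limits mean: stage 1 of P-1 raises a base to one huge exponent built from all prime powers up to B1, and a single gcd then exposes any factor q of N with q-1 B1-smooth; stage 2 (the memory-hungry part this post is about) extends the reach to q-1 having one additional prime factor up to B2. A toy stage-1 sketch in Python, on a small made-up semiprime rather than a Mersenne number, just to show the mechanism (the real GIMPS code uses vastly larger bounds and Mersenne-specific optimizations):

[CODE]
from math import gcd

def pminus1_stage1(n, B1, base=3):
    """Toy stage 1 of Pollard's P-1 method: compute a = base^E (mod n),
    where E is the product of all maximal prime powers <= B1; then
    gcd(a - 1, n) reveals any prime factor q of n with q - 1 B1-smooth."""
    a = base
    for q in range(2, B1 + 1):
        if all(q % d for d in range(2, int(q ** 0.5) + 1)):  # q is prime
            pk = q
            while pk * q <= B1:          # largest power of q not exceeding B1
                pk *= q
            a = pow(a, pk, n)
    return gcd(a - 1, n)

# 2311 - 1 = 2*3*5*7*11 is 11-smooth, while 2027 - 1 = 2*1013 is not,
# so stage 1 with B1 = 11 extracts exactly the factor 2311.
print(pminus1_stage1(2311 * 2027, 11))   # -> 2311
[/CODE]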
#14
Jun 2003
7×167 Posts
If we don't find a factor, then we need to use another method to determine whether the number is prime or composite. That method is the LL test.
The LL algorithm itself is quite simple:

1. Start with the number 4.
2. Square it and subtract 2.
3. Repeat step 2 p-2 times, where p is the exponent you are testing.
4. If the result is divisible by Mp, then Mp is prime. If not, then Mp is composite.

What makes this feasible (i.e., millions of zeros faster than TF) is that it requires only about p operations, while TF would require about 2^p operations. The algorithm is simple enough that any high-school programmer could implement it. (In fact, one did.) Implementing it efficiently is another matter, as is understanding why the algorithm works in the first place. A good explanation can be found here, though you will need a certain level of mathematical sophistication to be able to understand it.
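Those four steps translate almost line-for-line into code. A minimal Python sketch (illustrative only: the modular reduction inside the loop keeps the intermediate values to about p bits, and real GIMPS software does each squaring with FFTs):

[CODE]
def lucas_lehmer(p):
    """Return True iff M_p = 2**p - 1 is prime (p must be an odd prime)."""
    m = (1 << p) - 1            # the Mersenne number M_p
    s = 4                       # step 1: start with 4
    for _ in range(p - 2):      # step 3: repeat p-2 times
        s = (s * s - 2) % m     # step 2: square and subtract 2 (mod M_p)
    return s == 0               # step 4: divisible by M_p => prime

# M_p is prime for p = 3, 5, 7, 13 but not for p = 11 (2047 = 23 * 89)
for p in (3, 5, 7, 11, 13):
    print(p, lucas_lehmer(p))
[/CODE]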
Of course, we only do this as long as the odds are in our favour. Over millions of tests, the project comes out ahead. Roulette isn't really a gamble for the casino, is it?
#15
May 2013
East. Always East.
11·157 Posts
If you were to give a person just enough information about what a prime number is, and asked them to tell you how they would determine the primality of a number, they would probably give you the most naive method that exists: divide a number k by every number less than k and see if one of the divisions returns a whole number.

Next, one might realize that you don't need to divide a number by, for example, 4. If a division by 4 gives a whole number, then so does a division by 2. A much better way is to divide by every prime number up to k.

Next, a person might notice that if, for example, you were to divide the number 173 by 17, you would get a quotient smaller than 17, because 17 × 17 is 289. You don't even need to try 17, because if it DID give you a whole number, that number would be smaller than 17, and you already tried all the numbers smaller than 17! You can then eliminate all the prime numbers larger than the square root of k. We've gone from trying all numbers up to k to trying all the prime numbers up to the square root of k. This is essentially what trial factoring is. It's a very brute-force kind of method.

My video card, a GTX 670, can factor 2^60892163 - 1 to 2^74 in about 5 hours. That is checking every single prime number up to roughly 18 sextillion. It takes about twice as much time to check every factor between 2^x and 2^(x+1) as it does between 2^(x-1) and 2^x. So, going up to 2^75 would take another 5 hours. 2^76 would take 10 hours. Assuming the trend continues, the time it takes to trial factor is O(2^n).

To check 2^60892163 - 1, you would need to go up to 2^30446082, which is 2^30446008 times longer than 10 hours. It would take my video card around 2^30446011 hours to do that. The universe is about 2^47 hours old.

Last fiddled with by TheMawn on 2013-06-22 at 00:14
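A minimal sketch of trial division with that square-root cutoff. It tries every d rather than only primes, but a composite d can never divide first (its prime factors get tried earlier), so the first hit is always a prime factor:

[CODE]
def smallest_factor(k):
    """Trial division: return the smallest prime factor of k, or None if k is prime."""
    d = 2
    while d * d <= k:            # only go up to the square root of k
        if k % d == 0:
            return d             # first divisor found is always prime
        d += 1
    return None                  # no factor found: k is prime

print(smallest_factor(173))          # None -> 173 is prime
print(smallest_factor(2**11 - 1))    # 23   -> M11 = 2047 = 23 * 89
[/CODE]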
#16
May 2013
East. Always East.
11×157 Posts
It would take my video card around 2^30446011 hours to [try every prime factor for M60892163]. The universe is about 2^47 hours old.
I want to say this in its own post since people might not read the whole thing I posted just previous to this.

Last fiddled with by TheMawn on 2013-06-22 at 00:18
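Both exponents are easy to sanity-check (assuming an age of the universe of about 13.8 billion years, and taking "sextillion" as 10^21):

[CODE]
import math

hours = 13.8e9 * 365.25 * 24   # age of the universe in hours
print(math.log2(hours))        # ~46.8, so "about 2^47 hours old" checks out
print(2**74 / 1e21)            # ~18.9, so 2^74 is "roughly 18 sextillion"
[/CODE]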
#17
If I May
"Chris Halsall"
Sep 2002
Barbados
2×11^2×47 Posts
While a few hundred first-time LL tests were assigned without P-1 pre-done after the last Mersenne Prime was announced, we're back to the point where all LL tests are assigned with P-1 already done.
#18
Romulan Interpreter
"name field"
Jun 2011
Thailand
41·251 Posts
If you read the link I provided above, about how GIMPS intelligently chooses the B1 and B2 limits, you will see that the story is time-related. If you spend the time T to do an LL test, then you can save about two times T if you find a factor. So, how long (x% of T) should you spend doing TF and P-1? It depends on the chance of finding a factor. If you have a method with a higher chance of finding a factor, it makes sense to chase that method longer (in time). So, logically, it is the other way around from what one would expect.

But you can't chase that chance for too long, as your odds improve only logarithmically with the time you spend (on P-1, in our case). So, in case a factor is not found, you have lost precious time, which would have been better invested in doing the LL test from the start. Therefore there is a "cutoff" somewhere, and that cutoff (B1 and B2) has to be intelligently chosen, to maximize the chance of finding a factor in a limited time (x% of T), but the time is still the limit. So, in the end, the time variation is tiny: adding more memory for stage 2, you will work about the same time on that P-1 assignment, but you might get luckier.

Last fiddled with by LaurV on 2013-06-22 at 05:08
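A toy version of that breakeven logic, with illustrative numbers (these are not GIMPS's actual probabilities; the only grounded assumption is that a found factor saves roughly 2T, the first-time test plus its double-check):

[CODE]
T = 1.0                    # cost of one LL test (normalized)
saved = 2 * T              # a factor saves the first test and the double-check

for cost in (0.01, 0.02, 0.04):          # fraction of T spent on factoring
    breakeven = cost / saved             # success chance needed to break even
    print(f"spend {cost:.0%} of T -> pays off if P(factor) > {breakeven:.1%}")
[/CODE]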