#1
Jan 2016
1112 Posts
Hey there, I've got another question.
I was looking at M40292051. On 15 January 2008 someone found a non-zero residue, meaning this number is not prime. That seems fine to me, but then I see three more entries in its history:

2015-02-25 Gordon Spence NF no factor from 2^71 to 2^72
2014-12-25 Mark Rose NF no factor from 2^70 to 2^71
2014-12-22 Andrew B NF no factor from 2^69 to 2^70

Why were these done? Isn't a double check enough? Trial factoring this number would not seem to gather any real new information. So either there were three manual TF runs on this number, or it was assigned to these people after it was tested (unverified). Am I misunderstanding this, or is this extra work that is not necessary?
#2
"David"
Jul 2015
Ohio
10058 Posts
A double check needs to happen to confirm that the first residue is correct; however, if we find a factor, we know the number is not prime and can skip the second test entirely.
The DC may take several days, but a fast graphics card can search one more bit level for factors in about an hour. There is roughly a 1 in 72 chance of finding a factor at that level, so we are better off trying that first before doing a DC: across 72 hours of trial factoring we can expect to eliminate the need for one DC, which saves time overall.
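The trade-off above can be sketched as a quick expected-value calculation. This is a minimal illustration, not official GIMPS arithmetic: the 1/b heuristic (chance of a factor in [2^b, 2^(b+1)] is roughly 1/b) and the timings are assumptions chosen to match the figures in the post.

```python
# Back-of-the-envelope cost/benefit of one extra bit level of trial
# factoring (TF) before a double check (DC). The 1/b heuristic and the
# timings below are illustrative assumptions, not measured GIMPS figures.

def expected_dc_hours_saved(bit_level, tf_hours_per_bit=1.0, dc_hours=72.0):
    """Expected net hours saved by TF one more bit level before a DC."""
    p_factor = 1.0 / bit_level      # heuristic chance of finding a factor
    saved = p_factor * dc_hours     # the DC is skipped if a factor turns up
    return saved - tf_hours_per_bit # subtract the cost of the TF itself

# TF from 2^71 to 2^72 costs about 1 GPU-hour and saves 72 * (1/72) = 1
# CPU-hour of DC work in expectation, so it roughly breaks even in raw
# hours -- and GPU hours are much cheaper for TF than CPU hours are for LL.
print(expected_dc_hours_saved(72))  # roughly zero: the break-even point
print(expected_dc_hours_saved(60))  # positive: TF at lower bit levels pays off
```

Note that the break-even point shifts in TF's favour once you account for GPUs being orders of magnitude faster at TF than at LL testing, which is the point made in the following posts.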
#3
If I May
"Chris Halsall"
Sep 2002
Barbados
260316 Posts
This is done because it is more efficient to have GPUs do additional trial factoring up to a certain bit level. The goal is to find an actual factor before the candidate is assigned to a CPU for the confirming second LL run (the double check).
#4
Basketry That Evening!
"Bunslow the Bold"
Jun 2011
40<A<43 -89<O<-88
3×29×83 Posts
Since roughly 2010-2012, GPUs have become sufficiently general and powerful that trial factoring runs far faster on them than LL testing does. Around that time it became worthwhile to do additional trial factoring with GPUs beyond what CPUs had already done -- even on exponents that already had a first test, since finding a factor eliminates the need for the double check. Of course this "DCTF" doesn't take TF to as many bit levels as the TF done before a first test, but because those first tests were run before the GPU era, it became retroactively worth it for GPUs to do extra factoring, and that is what you see. Any exponent that truly already has two matching residues will not get any additional GPU trial factoring.
#5
Jan 2016
7 Posts
Thanks for the replies, guys, I really appreciate it. It seems logical.
#6
"Brian"
Jul 2007
The Netherlands
7·467 Posts
As an aside, a minor additional task of the project is to find factors of Mersenne numbers which are already known to be composite, ideally to factorise them completely. But such complete factorisations are generally feasible only for really tiny Mersenne numbers, and indeed as low as M1277 we encounter a composite Mersenne number for which no factor is known at all. This factorisation of known composites has nothing to do with prime hunting; I guess it is done because listing at least one of a number's factors is a more aesthetically satisfying way of demonstrating its compositeness than a non-zero residue from a primality test.
However, that was not the reason for the extra trial factoring on the number you gave. The answers above explain that.
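For anyone curious how trial factoring exploits the structure of Mersenne numbers: any factor of 2^p - 1 (p prime) must have the form 2kp + 1, which is why TF only needs to test a sparse set of candidates. The sketch below is illustrative only; real GIMPS TF software additionally filters candidates by their residue mod 8 and works at vastly larger sizes.

```python
# Minimal sketch of trial factoring a small Mersenne number M_p = 2**p - 1,
# using the fact that any factor of M_p (p prime) has the form 2*k*p + 1.
# Illustrative only; real TF code also requires q % 8 in (1, 7) and uses
# modular exponentiation rather than full division.

def smallest_factor(p, max_k=10**6):
    """Return the smallest factor 2*k*p + 1 of 2**p - 1, or None if prime."""
    m = 2**p - 1
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        if q * q > m:
            return None        # no factor below sqrt(m): M_p is prime
        if m % q == 0:
            return q
    return None                # search limit reached without a factor

print(smallest_factor(11))  # 23: M11 = 2047 = 23 * 89, so composite
print(smallest_factor(13))  # None: M13 = 8191 is prime
```

Finding any such q immediately proves compositeness, which is exactly why a successful TF run lets the project skip both LL tests.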