mersenneforum.org (https://www.mersenneforum.org/index.php)
-   PrimeNet (https://www.mersenneforum.org/forumdisplay.php?f=11)
-   -   Are all LL-Test Assignments always already factor tested (https://www.mersenneforum.org/showthread.php?t=16242)

MersenneLover 2011-11-18 12:05

Are all LL-Test Assignments always already factor tested
 
Hey Mathematicians,

I have a little question about Prime95.
When I get an assignment (let's say for the 100 million digit numbers),
can I assume that I only get numbers that have been factor tested before?

Or can it happen that I get an assignment for a Lucas-Lehmer test on a number that has not had any factor testing done before?


Daniel

davieddy 2011-11-18 12:58

Welcome to the Forum!

All Mersenne numbers with exponent < 1 Billion have been TFed to
at least 64 bits.
A ridiculous number of factors have been (and are being) found.
However, before starting a LL test, several more levels should
be done (by GPU these days), so the factoring effort >62M to date
is practically worthless: the time needed doubles with each bit level.
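In Python terms, the doubling claim looks like this (a sketch, not GIMPS code; it assumes the work at bit level b is proportional to the ~2^b candidate factors between 2^b and 2^(b+1)):

```python
# Sketch (not GIMPS code): relative TF cost, assuming the work at bit
# level b (testing candidates between 2^b and 2^(b+1)) scales as 2^b.
def tf_level_cost(b: int) -> int:
    """Relative cost of raising the TF limit from b to b + 1 bits."""
    return 2 ** b

# All the effort spent reaching 71 bits from scratch...
total_to_71 = sum(tf_level_cost(b) for b in range(71))
# ...roughly equals the single step from 71 to 72 bits:
print(total_to_71)        # 2^71 - 1
print(tf_level_cost(71))  # 2^71
```

So each extra bit level costs about as much as all the levels before it combined.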

David

Uncwilly 2011-11-18 13:54

To clarify things a bit.

Any assignment that you get for L-L testing has had [B]some[/B] Trial Factoring effort applied to it already. As davieddy indicated, all 'active' exponents (those with no known factor and not previously L-L tested) have been tested for factors up to [B]at least[/B] the 64 bit level.
For the exponents that will yield a 100M digit number, the first ~4300 that have no known factor are already at the 71 bit level [B]or higher[/B]. If you ask PrimeNet to assign you one of these exponents to L-L test, your machine will also finish any outstanding TF work on the number below 77 bits (unless you have changed one of the particular settings in Prime95). And, if needed, you may do P-1 testing as well.

davieddy's comment about >62M and GPU's does not fully apply as stated. Since most of the workers with big GPU firepower are working around the leading edge of the normal L-L test range, we don't have much in the way of GPU help for the 100M digit numbers. Currently most of the effort in that range is being put forward by those about to do an L-L test, or by myself (it appears).

axn 2011-11-18 14:14

[QUOTE=davieddy;279056]However, before starting a LL test, several more levels should
be done (by GPU these days), so the factoring effort >62M to date
is practically worthless: the time needed doubles with each bit level.
[/QUOTE]
No one asked about that. Quit impersonating RDS. Already!

Dubslow 2011-11-18 19:51

Just FYI, we do trial factoring to certain bit levels, as well as P-1 factoring. If we say trial factored to 71 bits, we mean there are no factors for the Mersenne number that are less than 2^71. That's why TF'ing from 2^71 to 2^72 takes half as much work as going from 2^72 to 2^73. P-1 factoring is a different method that finds factors with certain other properties, often finding factors higher than Trial Factoring can reasonably reach. At any range, you will find assignments that have had a variety of factoring efforts put into them. If you look at worktodo.txt, you can find out how far your LL test numbers have been factored. The file will have lines that look like this:

Test=ABC2349860879ABF9039407CABD,332479193,71,0

That means you're testing the exponent 332479193, which has no factors below 2^71, and has not had P-1 factoring done. Uncwilly said the preferred bound is 77 bits, and we always do P-1, so chances are your computer will automatically trial factor from 2^71 to 2^77, then perform a P-1 test to find factors. If those don't find a factor, then it will perform the LL test. If you don't want to do this factoring yourself, then the "First time LL test" work type option will typically give you exponents that require no more TF and have had P-1 completed, so you wouldn't need to do any factoring.
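Pulling those fields apart is easy enough (a hypothetical helper, not part of Prime95; the field layout is as described above: assignment ID, exponent, TF bit level, P-1 flag):

```python
# Hypothetical parser (not part of Prime95) for a worktodo.txt "Test="
# line with fields: assignment ID, exponent, TF bit level, P-1 flag.
def parse_test_line(line: str) -> dict:
    aid, exponent, tf_bits, p1_done = line[len("Test="):].split(",")
    return {
        "assignment_id": aid,
        "exponent": int(exponent),
        "tf_bits": int(tf_bits),           # no factors below 2^tf_bits
        "p_minus_1_done": p1_done == "1",  # 0 = P-1 not yet done
    }

info = parse_test_line("Test=ABC2349860879ABF9039407CABD,332479193,71,0")
print(info["exponent"], info["tf_bits"], info["p_minus_1_done"])
# 332479193 71 False
```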

petrw1 2011-11-18 20:12

[QUOTE=Dubslow;279097]Test=ABC2349860879ABF9039407CABD,332479193,71,0.[/QUOTE]

If this is a REAL Assignment ID please remove it

chalsall 2011-11-18 20:38

[QUOTE=petrw1;279099]If this is a REAL Assignment ID please remove it[/QUOTE]

Does that look like a real AID?

[CODE]mysql> select MD5(now() + rand());
+----------------------------------+
| MD5(now() + rand())              |
+----------------------------------+
| 52a2e1abe76810cd332b99e5a96d0c13 |
+----------------------------------+
1 row in set (0.00 sec)[/CODE]

Dubslow 2011-11-18 21:09

I should think it's pretty obvious where I switched between typing only letters and only numbers.

davieddy 2011-11-19 20:04

[QUOTE=davieddy;279056]Welcome to the Forum!

All Mersenne numbers with exponent < 1 Billion have been TFed to
at least 64 bits.
A ridiculous number of factors have been (and are being) found.
However, before starting a LL test, several more levels should
be done (by GPU these days), so the factoring effort >62M to date
is practically worthless: the time needed doubles with each bit level.

David[/QUOTE]

[QUOTE=axn;279061]No one asked about that. Quit impersonating RDS. Already![/QUOTE]

I shall take this to be teasing!
I will replace "ridiculous" with "huge" in the interests
of distancing my tone from that of RDS, but otherwise
my post stands as a succinct answer to the OP:

(S)He was asking whether a 100M digit number would be
assigned without adequate TF/P-1.
The answer (as UnWilly "clarified") is "Yes, invariably ATM.
71 bits is "practically worthless" as regards the effort needed
to prepare for an LL"

David

Uncwilly 2011-11-19 22:59

[QUOTE=davieddy;279209](S)He was asking whether a 100M digit number would be assigned without adequate TF/P-1.[/quote]You assumed 'fully' factored. (Which I believe to be the OP's intent.)

[QUOTE=davieddy;279209]The answer (as Un[B][COLOR="Red"]c[/COLOR][/B]Willy "clarified") is "Yes, invariably ATM. 71 bits is "practically worthless" as regards the effort needed to prepare for an LL"[/QUOTE]I would dispute your assertion. A few facts and figures. Let's deal with the range of exponents from 332192831 to 332399999. There are currently 4230 'live' exponents in that range. There are 6304 exponents that have been factored out (5 by P-1) (fewer than 115 at bit levels above 71, including the P-1's). There are 294 more expected to be removed by TF'ing all to 79 bits (maybe 1-2% more by P-1).

10570 total in range
6304 factored = 59.6% factored out (at 73.5 bit level average for the remaining exponents)

4230 - 294 = 3936 (left after TF)
3936 - 151 (those that remain that have had P-1) = 3785
3785 * 0.985 (figuring 1.5% removal by P-1) = 3728 LL's to be done.

6842 total exponents factored out (estimated) = 64.7% factored out (by going to 79 bits [2 higher than the original level, thanks to GPU's] and P-1)

71 bits was a minimum level for the range; the first 1500 exponents average over 76 bits, and the first 1000 are over 77.

I don't think that it is "practically worthless".

If 1/(bit level) = chance of factor holds true, 90% of the exponents to be removed will happen below 71 bits. The math above shows less than 10% removal from the existing pool.
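That 90% figure is easy to sanity-check in Python (a sketch of the heuristic only; starting at b = 30 is my assumption, since the smallest candidate factor 2p+1 for p ~ 332M is about 2^29.3):

```python
# Sketch of the heuristic above: take the chance of a factor between
# 2^b and 2^(b+1) to be about 1/b.  Starting at b = 30 is an assumption
# (the smallest candidate factor 2p+1 for p ~ 332M is about 2^29.3).
below_71 = sum(1 / b for b in range(30, 71))  # found on the way to 71 bits
tf_71_79 = sum(1 / b for b in range(71, 79))  # found TF'ing 71 -> 79 bits
fraction_below_71 = below_71 / (below_71 + tf_71_79)
print(round(fraction_below_71, 2))  # 0.89 -- roughly 90% of the removable
                                    # exponents go before the 71 bit level
```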
:farley:

davieddy 2011-11-19 23:52

[QUOTE=davieddy;279209]71 bits is "practically worthless" [B]as regards the[/B] [B]effort[/B] [B]needed[/B]
[B]to prepare for an LL[/B]"[/QUOTE]

Of course most factors are found at or below 71 bits, but
the time needed to do this is the same as that for raising
the limit from 71 to 72.

Please read my posts before "clarifying"(?) them.

David


All times are UTC. The time now is 10:20.

Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.