#1
Bemusing Prompter
"Danny"
Dec 2002
California
2,351 Posts
Some of us have been redoing P-1 on exponents with only stage 1 done. But because it's not (always) possible to register such assignments, there's a risk of stepping on toes. I've therefore created this thread to coordinate such factoring efforts. Feel free to share which ranges you're working on and any interesting factors you find.
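For anyone new to P-1: stage 1 computes x = 3^E mod M_p, where E is the product of all prime powers up to B1, then takes gcd(x - 1, M_p); it finds any factor q of M_p for which q - 1 is B1-smooth. Stage 2 extends this to factors where q - 1 has one additional prime between B1 and B2, which is why a stage-1-only run leaves real odds on the table. A minimal sketch of stage 1 (illustration only; Prime95/mprime works modulo 2^p - 1 with FFT multiplication and is far faster):

```python
from math import gcd, isqrt

def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(limit) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, alive in enumerate(sieve) if alive]

def p_minus_1_stage1(n, b1, base=3):
    """Return a factor q of n with q - 1 b1-smooth, or None (stage 1 only)."""
    a = base
    for p in primes_up_to(b1):
        pk = p
        while pk * p <= b1:  # largest power of p not exceeding b1
            pk *= p
        a = pow(a, pk, n)
    g = gcd(a - 1, n)
    return g if 1 < g < n else None

# Toy run: 2^67 - 1 = 193707721 * 761838257287, and
# 193707720 = 2^3 * 3^3 * 5 * 67 * 2677, so B1 = 2700 is enough:
print(p_minus_1_stage1(2**67 - 1, 2700))  # 193707721
```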
I'll start: I have three machines that are redoing P-1 factoring.
All three computers are alternating between normal P-1 factoring and rerunning P-1 on exponents without stage 2 done.

Last fiddled with by ixfd64 on 2018-03-12 at 04:04
#2
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
2³·3⁴·7 Posts
For me:
- the 50M-59M range
- any 0.1M range that has more than 1,999 unfactored exponents
- exponents whose current P-1 has B1=B2 (and not excessively large)

I am running Pminus1 with B1=1000000, B2=20000000 (1M, 20M); a sample worktodo entry is below. I expect to be at this all of 2018, but if anything changes... I'll post.
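For anyone queuing the same kind of work, this is roughly the shape of the worktodo.txt entry (syntax from memory, so double-check it against the Prime95 readme; the exponent is a placeholder, and k=1, b=2, c=-1 select the Mersenne number 2^p - 1):

```
Pminus1=1,2,<prime exponent p>,-1,1000000,20000000
```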
#3
If I May
"Chris Halsall"
Sep 2002
Barbados
24BA₁₆ Posts
I currently have four of my machines working in the 40M to 49M ranges (inclusive). They focus on a particular 0.1M range at a time. Currently they're working 45.7M, and will then focus on 44.2M (in about a week). I try to reserve them from Primenet so people see this activity (not always possible, since some of them have already had a DC).
For anyone who is interested, I'm letting mprime decide the bounds, based on four LL assignments being saved (doesn't make sense, I know, but it's my kit). For 45.7M I've so far run 198 tests, and found 7 factors.
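(For what it's worth, 7 factors in 198 tests pins down the underlying rate only loosely; a quick back-of-the-envelope normal-approximation interval:)

```python
factors, tests = 7, 198
rate = factors / tests
stderr = (rate * (1 - rate) / tests) ** 0.5
print(f"{rate:.1%} +/- {1.96 * stderr:.1%}")  # 3.5% +/- 2.6%
```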
#4
"Victor de Hollander"
Aug 2011
the Netherlands
2³×3×7² Posts
Most of my unreserved P-1 effort was in the range 1.5M-1.7M (B1=10e6, B2=200e6), which I'm currently running ECM on (B1=50,000).
Also, I know Jocelyn Larouche is doing P-1 in the region below 4M.
#5
Jul 2004
Milan, Ita
3·61 Posts
I've two older machines slooowly doing P-1 on expos having B1<150k & B2<1M
just for fun...
#6
Banned
"Luigi"
Aug 2002
Team Italia
2³·599 Posts
I am doing P-1 testing from time to time, taking exponents that have had poor or no stage 2, preferring smaller ones.
#7
Bemusing Prompter
"Danny"
Dec 2002
California
92F₁₆ Posts
Update: I'm done with the 43.6M and 77.2M ranges for the time being. The MacBook Pro is now redoing exponents in the 44.1M range.
Chris: I see that you've reserved a few exponents in the 44.1M range as well. Are you planning to do more?
#8
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
4536₁₀ Posts
As you may have noticed, I am on a P-1 binge recently.

I have full-time P-1 running on:
- 2 x 2-core PCs
- 6 x 4-core PCs

Some have as much as 16GB of RAM; a couple only 4GB. I have noticed over the past years of doing P-1 that more RAM makes a big difference in stage 2. Simple observations have shown that running 480 relative primes in batches of 10 takes noticeably longer than the same run in batches of 120, for example.

(I wouldn't be surprised if the following has been noted before and I missed it.) That got me thinking that, especially for the PCs with 4GB or 8GB of RAM, I should complete more total tests per week by running 2 workers of 2 cores each rather than 4 workers of 1 core each. Stage 1 may be slightly slower, but stage 2 should be enough faster to more than make up for it: with only 2 workers fighting for RAM, each gets a lot more and can process more relative primes per pass. Opinions? (Rough numbers in the sketch below.)
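To put rough numbers on that intuition, here's a toy model (my own simplification, not Prime95's actual stage 2, which pairs primes and is cleverer): each relative prime held in memory is one FFT-sized buffer, and stage 2 sweeps the primes in (B1, B2] once per batch, so the pass count is ceil(480 / buffers).

```python
from math import ceil

def stage2_passes(buffers, rel_primes=480):
    """Sweeps over the primes in (B1, B2] when only `buffers`
    relative-prime residues fit in a worker's RAM at once."""
    return ceil(rel_primes / buffers)

# The batch sizes observed above:
print(stage2_passes(10))   # 48 passes
print(stage2_passes(120))  # 4 passes

# Hypothetical box with RAM for 160 buffers in total: halving the
# worker count doubles each worker's share and halves its passes.
for workers in (4, 2):
    print(workers, "workers ->", stage2_passes(160 // workers), "passes each")
# 4 workers -> 12 passes each; 2 workers -> 6 passes each
```

Under this model the two-worker split wins in stage 2 whenever RAM, not cores, is the bottleneck, which matches the batches-of-10 vs batches-of-120 observation.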
#9
Dec 2002
3²·89 Posts
I work in the 10M to 25M range. Currently in the 11M range and the 22M range.
#10
Jul 2004
Milan, Ita
3·61 Posts
#11
If I May
"Chris Halsall"
Sep 2002
Barbados
2×3×1,567 Posts
Keep in mind that when you're redoing P-1 work, you should expect a lower probability of finding a factor than what Prime95/mprime reports. As a rough guide, I subtract the previous run's success probability from the newly reported one.
When I'm redoing poorly P-1'ed work (read: no stage 2), I tell mprime that the test will save four LL tests, and give it between 10GB and 12GB of RAM to use. Doesn't make sense, I know, but it's my kit and electrons. On average in the 4xM range I get about a 3% success rate.
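A back-of-the-envelope version of that subtraction, using the crude Dickman asymptotic rho(u) ~ u^-u for the chance that a number near 2^b is B1-smooth (the factor size and both B1 values below are made up, and this ignores stage 2 and the 2kp+1 structure of Mersenne factors, so it illustrates the bookkeeping only):

```python
import math

def rough_smooth_prob(factor_bits, b1):
    """Crude chance a random integer near 2**factor_bits is b1-smooth,
    via the Dickman asymptotic rho(u) ~ u**-u, u = factor_bits*ln2/ln(b1)."""
    u = factor_bits * math.log(2) / math.log(b1)
    return u ** -u

old = rough_smooth_prob(64, 5e5)  # what a first run at B1 = 500k covered
new = rough_smooth_prob(64, 1e6)  # what a redo at B1 = 1M would report
print(f"old {old:.3f}, new {new:.3f}, net gain {new - old:.3f}")
# roughly: old 0.016, new 0.024, net gain 0.007
```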
| Thread | Thread Starter | Forum | Replies | Last Post |
|---|---|---|---|---|
| Redoing factoring work done by unreliable machines | tha | Lone Mersenne Hunters | 23 | 2016-11-02 08:51 |
| Sieving reservations and coordination | gd_barnes | No Prime Left Behind | 2 | 2008-02-16 03:28 |
| Sieved files/sieving coordination | gd_barnes | Conjectures 'R Us | 32 | 2008-01-22 03:09 |
| P-1 factoring Q&A thread | Unregistered | Software | 27 | 2005-06-11 05:32 |
| 5.98M to 6.0M: redoing factoring to 62 bits | GP2 | Lone Mersenne Hunters | 0 | 2003-11-19 01:30 |