mersenneforum.org Coordination thread for redoing P-1 factoring

 2018-03-12, 04:03 #1 ixfd64 Bemusing Prompter     "Danny" Dec 2002 California 2·3·397 Posts Coordination thread for redoing P-1 factoring

Some of us have been redoing P-1 on exponents with only stage 1 done. But because it's not always possible to register such assignments, this creates the risk of stepping on toes. I've therefore created this thread to coordinate such factoring efforts. Feel free to share which ranges you're working on and any interesting factors you find.

I'll start: I have three machines redoing P-1 factoring:
- A dual-core laptop working on exponents around 43.9M and 47.9M
- A quad-core MacBook Pro working on exponents around 43.6M
- A quad-core desktop working on exponents around 47.6M and 77.2M

All three computers alternate between normal P-1 factoring and rerunning P-1 on exponents without stage 2 done.

Last fiddled with by ixfd64 on 2018-03-12 at 04:04
 2018-03-12, 17:31 #2 petrw1 1976 Toyota Corona years forever!     "Wayne" Nov 2006 Saskatchewan, Canada 2·3·773 Posts Repost from the prior thread

For me:
- the 50M-59M range
- any 0.1M range that has more than 1999 unfactored exponents
- exponents whose current P-1 has B1=B2 (and not excessively large)

I am running Pminus1 with B1=1000000, B2=20000000 (1M, 20M). I expect to be at this all of 2018, but if anything changes... I'll post.
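For anyone wanting to queue a rerun like this by hand: as I understand it, Prime95/mprime accepts a Pminus1 line in worktodo.txt of the general form k,b,n,c,B1,B2, where k=1, b=2, c=-1 for a Mersenne number 2^n-1. A sketch with petrw1's bounds (the exponent below is a made-up placeholder, not an actual assignment):

```
Pminus1=1,2,50123459,-1,1000000,20000000
```

Option details can vary between client versions, so check whatsnew.txt/undoc.txt for your build before relying on this.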
 2018-03-12, 17:53 #3 chalsall If I May     "Chris Halsall" Sep 2002 Barbados 100101011011002 Posts I currently have four of my machines working in the 40M to 49M ranges (inclusive). They focus on a particular 0.1M range at a time. Currently they're working 45.7M, and will then focus on 44.2M (in about a week). I try to reserve them from Primenet so people see this activity (not always possible, since some of them have already had a DC). For anyone who is interested, I'm letting mprime decide the bounds, based on four LL assignments being saved (doesn't make sense, I know, but it's my kit). For 45.7M I've so far run 198 tests, and found 7 factors.
 2018-03-12, 18:04 #4 VictordeHolland     "Victor de Hollander" Aug 2011 the Netherlands 49816 Posts Most of my unreserved P-1 effort was in the range 1.5M - 1.7M (B1=10e6 B2=200e6), which I'm currently running ECM on (B1=50,000). Also I know Jocelyn Larouche is doing P-1 in the region below 4M.
 2018-03-12, 23:23 #5 ric     Jul 2004 Milan, Ita 18410 Posts I've two older machines slooowly doing P-1 on exponents having B1<150k & B2<1M, in the range 12.2M to 12.4M and in the range 15M to 15.3M. Just for fun...
 2018-03-13, 11:03 #6 ET_ Banned     "Luigi" Aug 2002 Team Italia 113168 Posts I am doing P-1 testing from time to time, taking exponents that have had poor or no stage 2, preferring smaller ones.
 2018-03-22, 21:01 #7 ixfd64 Bemusing Prompter     "Danny" Dec 2002 California 45168 Posts Update: I'm done with the 43.6M and 77.2M ranges for the time being. The MacBook Pro is now redoing exponents in the 44.1M range. Chris: I see that you've reserved a few exponents in the 44.1M range as well. Are you planning to do more?
 2018-03-23, 02:27 #8 petrw1 1976 Toyota Corona years forever!     "Wayne" Nov 2006 Saskatchewan, Canada 2×3×773 Posts Opinions or observations, please.

As you may have noticed, I am on a P-1 binge recently. I have full-time P-1 running on:
- 2 × 2-core PCs
- 6 × 4-core PCs

Some have as much as 16GB of RAM; a couple only 4GB. I have noticed over the past years of doing P-1 that more RAM makes a big difference in stage 2. Simple observations have shown that running 480 relative primes in batches of 10 takes noticeably longer than the same run in batches of 120, for example.

(I wouldn't be surprised if the following has been noted before and I missed it.) That got me thinking that, especially for the PCs with 4GB or 8GB of RAM, I should complete more total tests per week by running 2 workers of 2 cores each rather than 4 workers of 1 core each. Stage 1 may be slightly slower, but stage 2 should be enough faster to more than make up for it: with only 2 workers competing for RAM, each gets much more and can process more relative primes at once. Opinions?
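The RAM effect petrw1 describes can be illustrated with a toy model (this is only a sketch of the batching idea, not mprime's actual stage-2 algorithm): the relative primes are processed in batches, and each batch needs its own pass over the primes between B1 and B2, so a bigger batch size means fewer passes.

```python
import math

# Toy model of P-1 stage 2 batching: 480 relative primes are processed
# in batches, and each batch requires a separate pass over the primes
# in (B1, B2]. More RAM -> larger batches -> fewer passes.
TOTAL_RELATIVE_PRIMES = 480

def stage2_passes(batch_size):
    """Number of passes over the B1..B2 primes for a given batch size."""
    return math.ceil(TOTAL_RELATIVE_PRIMES / batch_size)

# Low-RAM machine (batches of 10) vs. high-RAM machine (batches of 120):
print(stage2_passes(10))   # 48 passes
print(stage2_passes(120))  # 4 passes
```

Under this model, halving the number of workers so each gets double the RAM can cut the pass count enough to outweigh the slower stage 1, which matches petrw1's observation.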
 2018-04-04, 09:39 #9 tha     Dec 2002 3·271 Posts I work in the 10M to 25M range. Currently in the 11M range and the 22M range.
 2018-04-05, 13:12 #10 ric     Jul 2004 Milan, Ita 2³·23 Posts Quote: Originally Posted by ric "in the range 12.2M to 12.4M" 324 cands, expected prob ~2.5%, 3 new factors (0.9%). Meh! Right now working on 12.0M to 12.2M; will then extend from 12.4M to 13M.

 2018-04-05, 14:30 #11 chalsall If I May     "Chris Halsall" Sep 2002 2²×5×479 Posts Quote: Originally Posted by ric "324 cands, expected prob ~2.5%, 3 new factors (0.9%). Meh!" Keep in mind that when you're redoing P-1 work, you should expect a lower probability than what Prime95/mprime reports. As a rough guide, I subtract the previous run's success probability from what is reported as expected. When I'm redoing poorly P-1'ed work (read: no stage 2), I tell mprime that the test will save four LL tests, and give it between 10GB and 12GB of RAM to use. Doesn't make sense, I know, but it's my kit and electrons. On average in the 4xM range I get about a 3% success rate.
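Chalsall's rule of thumb can be sketched in a couple of lines (the numbers in the example are illustrative, not taken from the thread):

```python
def incremental_prob(reported_prob, previous_prob):
    """Rough net chance of finding a factor when redoing P-1.

    Prime95/mprime reports the probability as if no P-1 had been done;
    subtracting the previous run's probability (chalsall's rule of
    thumb) gives a crude estimate of the *additional* chance the rerun
    buys. Clamped at zero: a rerun with weaker bounds adds nothing.
    """
    return max(reported_prob - previous_prob, 0.0)

# If mprime reports a 4.0% chance but the old stage-1-only run already
# had a 2.5% chance, the rerun is only worth roughly 1.5%:
print(round(incremental_prob(0.040, 0.025), 4))
```

This is only a first-order estimate; the two probabilities are not independent, but as the thread's factor counts suggest, it is a usefully pessimistic guide.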

