mersenneforum.org  

2018-02-08, 19:09   #1
kruoli ("Oliver", Porta Westfalica, DE)

Intermediate P-1 results

Hey, everybody!

Since P-1 is a big part of how GIMPS finds factors, and the GCD is now powered by GMP, I was thinking about this: what would it be like to be able to do one or a few GCDs on intermediate results? Of course, this would not need to be a mandatory option. I am not sure how Prime95 internally computes \[b^a - 1 \bmod n, \text{ where } a = \prod_{p \text{ prime},\ p < B_1} p^{e_p} \text{ with } e_p = \lfloor \log_p B_1 \rfloor.\] Maybe one could find a factor by checking, for each \(c \in \mathbb{N}\) with \(c < c_\text{max}\), the result of \[\gcd(b^{a_c}-1,\ n), \text{ where } a_c = \prod_{p \text{ prime},\ p < \left\lceil \frac{B_1 \cdot c}{c_\text{max}} \right\rceil} p^{e_p} \text{ with } e_p = \lfloor \log_p B_1 \rfloor,\] for some predefined \(c_\text{max}\).
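[Editor's sketch] The idea can be illustrated with a toy stage 1 in Python that takes a GCD at each of c_max evenly spaced checkpoints instead of only at the end. This is only a sketch of the checkpointing scheme described above, not how Prime95 actually organizes stage 1:

```python
from math import gcd, log

def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def pminus1_stage1(n, b1, c_max=10, base=3):
    """Toy P-1 stage 1 on n, taking a GCD at each of c_max checkpoints.
    Returns (factor, checkpoint) on success, else (None, None)."""
    x = base
    done = 0  # primes up to this bound are already in the exponent
    for c in range(1, c_max + 1):
        bound = b1 * c // c_max
        for p in primes_up_to(bound):
            if p <= done:
                continue
            e = int(log(b1, p))          # e_p = floor(log_p B1)
            x = pow(x, p ** e, n)
        done = bound
        g = gcd(x - 1, n)                # intermediate GCD
        if 1 < g < n:
            return g, c                  # factor found before the end
    return None, None

# M67 = 2^67 - 1 = 193707721 * 761838257287.  The first factor minus one
# is 2^3 * 3^3 * 5 * 67 * 2677, i.e. B1-smooth for B1 = 3000, so the
# factor shows up at an intermediate checkpoint rather than the last one.
factor, checkpoint = pminus1_stage1(2**67 - 1, b1=3000)
```

Re-sieving at every checkpoint is of course wasteful; a real implementation would stream the primes once and only interleave the GCDs.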

This would be especially useful with huge bounds, where a factor might be found early on. It could increase users' willingness to choose higher bounds.

Greetings,
Oliver!

Last fiddled with by Nick on 2018-02-09 at 11:43 Reason: Amended at kruoli's request
2018-02-08, 19:18   #2
GP2

I'm not sure how long P−1 will continue to play a huge part.

The larger exponents get, the easier it is to do TF to higher bit depths. On the other hand, the larger exponents get, the longer P−1 takes.

At some point there will be diminishing returns to doing P−1 because most of the factors it might reasonably be expected to find will already have been found by TF.

I was doing P−1 in the 85M range, and had a long dry spell. I started again recently, but wonder if it's still worth it.
2018-02-09, 06:33   #3
S485122 ("Jacob", Brussels, Belgium)

When I look at the success rate of the P-1 my computers do (about 4 % in the current 85M range) and the time they spend on it compared to LL tests, it is still worthwhile. The kinds of factors are different: you would have to TF for a long time to find an 80- or 90-bit factor in that range (85 bits is the average for my work in the 80M-85M range).

So, yes P-1 is still useful. The whole project cannot be reduced to TF.
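[Editor's sketch] A back-of-the-envelope check of Jacob's point, using the usual GIMPS approximation that a Mersenne number has a factor between \(2^b\) and \(2^{b+1}\) with probability roughly \(1/b\):

```python
# Usual GIMPS heuristic: chance of a factor between 2^b and 2^(b+1) ~ 1/b.
# Expected number of factors TF would still find between 2^75 and 2^85:
expected = sum(1.0 / b for b in range(75, 85))
# ... about 0.13, i.e. roughly one exponent in eight -- and every extra
# bit level doubles the TF effort, which is why P-1 catching the deep
# 80- to 90-bit factors cheaply is valuable.
```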

Jacob
2018-02-09, 07:11   #4
kruoli ("Oliver", Porta Westfalica, DE)

Can a moderator please change \(\frac{B_1 \cdot c_\text{max}}{c}\) to \(\frac{B_1 \cdot c}{c_\text{max}}\)? Otherwise it would be useless.
2018-02-09, 07:41   #5
kruoli ("Oliver", Porta Westfalica, DE)

My experience with P-1 matches Jacob's (S485122): there is a measurable benefit to it (and, as far as I can tell, no decline is to be expected). Since TF requires double the work per bit level, I assume we can go a few levels higher in the coming years, but definitely not something like 10 levels. Take an exponent around M85,000,000 as an example. PrimeNet wants factoring to \(2^{71}\), GPU72 to \(2^{75}\). Using Jacob's numbers, you would need to go up to at least \(2^{85}\) to find half of the factors that P-1 should find. Doing that would require an estimated 92,185 GHzd in total. Going to larger exponents, e.g. M850,000,000, would only drop that cost by a factor of 10, and in that exponent range \(2^{85}\) is already GPU72's bit-level limit, so one would have to go even further.
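[Editor's sketch] The doubling-per-bit-level arithmetic is easy to sketch. The base cost of 45 GHzd for the bit level ending at \(2^{75}\) is a rough assumption chosen only for illustration, not an official PrimeNet figure:

```python
def tf_cost_ghzdays(base_level_cost, start_bits, target_bits):
    """Total cost of extending TF from 2^start_bits to 2^target_bits,
    assuming each additional bit level doubles the work, where the level
    ending at 2^start_bits cost `base_level_cost` GHz-days."""
    total, cost = 0.0, base_level_cost
    for _ in range(start_bits, target_bits):
        cost *= 2            # twice as many candidates per extra bit
        total += cost
    return total

# Going from 2^75 to 2^85 costs (2 + 4 + ... + 2^10) = 2046x the level
# ending at 2^75.  With a hypothetical 45 GHzd base level, that is about
# 92,000 GHzd -- the same ballpark as the 92,185 GHzd estimate above.
cost = tf_cost_ghzdays(45.0, 75, 85)
```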

I like the approach of finding an optimum between prefactoring and doing two LL tests. P-1 is a big part of reaching that optimum!

GP2: When I look at your P-1 stats page, it seems like you use really high bounds (an average of 258.6 GHzd per assignment). Of course, that is nice for a higher probability of finding a factor, but it is nearly the cost of an LL test in the 85M range. Still, a success rate of more than 20 % is really nice! So I do not completely understand how your "dry spell"* happened.

*= The German word for that, Durststrecke, is much more descriptive: it combines Durst (thirst) and Strecke (route), so it conveys both the urge (thirst) to reach something and the unsuccessful distance already covered.
2018-02-09, 16:49   #6
GP2

Quote:
Originally Posted by kruoli
GP2: When I look at your P-1 stats page, it seems like you use really high bounds (an average of 258.6 GHzd per assignment).
For a time I was doing P−1 on very small exponents. I experimented with doing stage 1 with mprime and doing stage 2 with GMP-ECM. So that probably skews the averages completely.

Right now I am only doing the automatically assigned P−1 exponents with automatically assigned B1 and B2 bounds, which are currently in the 85M range.
2018-06-24, 13:44   #7
kriesel ("TF79LL86GIMPS96gpu17", US midwest)

Quote:
Originally Posted by GP2
I'm not sure how long P−1 will continue to play a huge part.

The larger exponents get, the easier it is to do TF to higher bit depths. On the other hand, the larger exponents get, the longer P−1 takes.

At some point there will be diminishing returns to doing P−1 because most of the factors it might reasonably be expected to find will already have been found by TF.

I was doing P−1 in the 85M range, and had a long dry spell. I started again recently, but wonder if it's still worth it.
Take a look at the source code for prime95 or CUDAPm1. There is code in there to compute estimates of the odds of finding a factor and to adjust the bounds so as to maximize (probability of finding a factor) × (time saved if one is found by avoiding n primality tests), where n is the number at the right of the assignment line (usually 2, as PrimeNet issues them for manual assignments). In other words, it estimates the rational bettor's best bets, within the memory constraints.
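[Editor's sketch] The expected-value calculation described above can be outlined as follows. The probability and cost functions here are toy placeholders, as are the numeric values (e.g. ~290 GHzd per primality test); the real clients estimate the probability from Dickman's rho function and also respect stage-2 memory limits:

```python
from math import log10

def expected_savings(b1, prob_of_factor, pm1_ghzdays,
                     tests_saved=2, test_ghzdays=290.0):
    """Net expected GHz-days saved by running P-1 to bound b1:
    (chance of a factor) * (primality tests avoided) - (P-1 cost)."""
    return prob_of_factor(b1) * tests_saved * test_ghzdays - pm1_ghzdays(b1)

def pick_b1(candidates, prob_of_factor, pm1_ghzdays):
    """Choose the bound with the best expected net savings."""
    return max(candidates,
               key=lambda b1: expected_savings(b1, prob_of_factor, pm1_ghzdays))

# Toy placeholders: success odds grow slowly with B1, cost grows linearly.
prob = lambda b1: 0.02 * log10(b1)
cost = lambda b1: b1 / 1e5
best = pick_b1([100_000, 300_000, 1_000_000, 3_000_000], prob, cost)
```

The shape of the answer is the interesting part: because the probability has diminishing returns while the cost keeps growing, the expected savings peak at some intermediate bound rather than at the largest one.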