mersenneforum.org (https://www.mersenneforum.org/index.php)
-   PrimeNet (https://www.mersenneforum.org/forumdisplay.php?f=11)
-   -   P-1 factoring anyone? (https://www.mersenneforum.org/showthread.php?t=11101)

cheesehead 2011-11-03 05:06

[QUOTE=bcp19;276950]I have a P-1 question... I was running P95 with 2.5 gig available, figured I had more to spare so switched to 3.0 gig available, and noticed that the P-1 changed bounds when the memory was increased. It was B1=3290000, B2=76492500 at 2.5 and is now B1=3320000, B2=83000000, and when I calculate the amount of time left to finish, the completion time has jumped by over 20% (36 vs. 45 days). Does this change in memory after 12% completion, and the subsequent bound change, cause a problem in the P-1?[/QUOTE]No. During stage 1, as long as progress hasn't reached the lower B1 yet, extending it to a higher B1 is mostly just a matter of keeping on doing what it's been doing, for longer.

If your calculation had reached stage 2, extending B1 would require discarding all the stage 2 work so far and going back to extend stage 1. So, IIRC without looking at the source code (it's been a while), prime95 will not change the stage 1 bound if you bump up the available memory after stage 2 starts, but might extend B2. (I could be wrong.)
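(For readers new to P-1, here is a toy Python sketch of the idea -- illustrative only, nothing like prime95's actual implementation, and the function names are made up. Stage 1 computes x = 3^E mod N, where E is the product of every maximal prime power up to B1; extending B1 just means multiplying more prime powers into the exponentiation, which is why no stage 1 work is wasted:)

```python
from math import gcd

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i in range(2, n + 1) if sieve[i]]

def stage1(N, B1):
    """Compute 3^E mod N, E = product of all maximal prime powers <= B1."""
    x = 3
    for q in primes_up_to(B1):
        qk = q
        while qk * q <= B1:      # use the largest power of q that fits
            qk *= q
        x = pow(x, qk, N)
    return x

def extend_stage1(N, x, B1_old, B1_new):
    """Continue a finished stage 1 from B1_old up to a larger B1_new."""
    for q in primes_up_to(B1_new):
        qk_old, qk_new = 1, 1
        while qk_old * q <= B1_old:
            qk_old *= q
        while qk_new * q <= B1_new:
            qk_new *= q
        x = pow(x, qk_new // qk_old, N)  # multiply in only the missing part
    return x

# Example: M37 = 2^37 - 1 has the factor 223, and 222 = 2 * 3 * 37
# is 37-smooth, so stage 1 with B1 = 37 finds it:
N = 2 ** 37 - 1
x = stage1(N, 37)
print(gcd(x - 1, N))   # 223 divides this gcd
```

The point of `extend_stage1` is that restarting from the save file and continuing to a higher bound gives exactly the same result as running with the higher bound from scratch.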

[quote]Is it normal for more memory to cause longer times?[/quote]It is when prime95 is choosing its own bounds (Pfactor= or Test= in worktodo). In that case, it tries to optimize the probability of finding a factor, and going to higher bounds will do that, at the cost of a longer run time.

If, instead, the user specified the bounds (Pminus1= instead of Pfactor= or Test= in the worktodo), then allocating more memory will allow prime95 to use more stage 2 auxiliary work areas to eliminate some duplication of calculations, which will speed it up.

(If you're wondering why prime95 doesn't use the extra memory for more stage 2 workareas when it's choosing its own bounds -- well, it does ... but it also raises the B2 even more than that, so that it winds up spending more total time running with the extra (and thus, faster) workareas. There are tradeoffs involved.)
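(To see where the memory actually goes, here is a toy Python sketch of the standard stage 2 continuation -- again illustrative only, with made-up names, and far simpler than what prime95 really does. Once a table of x^2, x^4, x^6, ... is cached, stepping from one stage 2 prime to the next costs a single modular multiplication instead of a full modular exponentiation. Those cached powers play the role of the auxiliary work areas: more memory means cheaper work per extra prime, which is why a higher B2 becomes worth the time:)

```python
from math import gcd

def is_prime(n):
    """Trial division; fine for toy bound sizes."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def stage2(N, x, B1, B2, max_gap=100):
    """Toy P-1 stage 2: accumulate x^q - 1 for primes B1 < q <= B2."""
    # Cache x^2, x^4, ... once; gaps between odd primes are even.
    # These cached values stand in for prime95's stage 2 memory areas.
    gap_pow = {g: pow(x, g, N) for g in range(2, max_gap + 1, 2)}
    ps = [q for q in range(B1 + 1, B2 + 1) if is_prime(q)]
    cur = pow(x, ps[0], N)                   # one real powmod to start
    acc = (cur - 1) % N
    for prev, q in zip(ps, ps[1:]):
        cur = cur * gap_pow[q - prev] % N    # a single multiply per prime
        acc = acc * (cur - 1) % N
    return gcd(acc, N)

# Example: stage 1 on M37 = 2^37 - 1 with B1 = 13 misses the factor
# 223 (since 222 = 2 * 3 * 37 needs the prime 37), but stage 2 with
# B2 = 37 picks it up:
N = 2 ** 37 - 1
E = 8 * 9 * 5 * 7 * 11 * 13        # maximal prime powers <= 13
x = pow(3, E, N)                   # stage 1 result
print(stage2(N, x, 13, 37))        # 223 divides this
```

A bigger cache buys nothing in this toy version, but in real implementations the table size determines how much recomputation can be avoided, which is exactly the tradeoff described above.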

bcp19 2011-11-03 06:26

Thanks for the explanation, I'm still a bit new to this and don't really understand the processes involved that well. I'll leave it with the higher memory then.

Mr. P-1 2011-11-03 06:39

[QUOTE=cheesehead;276957]No. During stage 1, as long as progress hasn't reached the lower B1 yet, extending it to a higher B1 is mostly just a matter of keeping on doing what it's been doing, for longer.

If your calculation had reached stage 2, extending B1 would require discarding all the stage 2 work so far and going back to extend stage 1. So, IIRC without looking at the source code (it's been a while), prime95 will not change the stage 1 bound if you bump up the available memory after stage 2 starts, but might extend B2. (I could be wrong.)[/QUOTE]

You are. In fact, B1 is fixed from the start of the calculation. If you change your memory setting during stage 1, it will calculate a different bound, but it will still - without telling you - use the original bound stored in the save file.

As soon as stage 1 is complete, the stage 2 bound becomes fixed. Again, if you restart stage 2, whether due to a change in available memory or for any other reason, new bounds will be calculated, but the program will still revert to those stored in the save file. This time it will tell you if these bounds are different.

[QUOTE]It is when prime95 is choosing its own bounds (Pfactor= or Test= in worktodo). In that case, it tries to optimize the probability of finding a factor, and going to higher bounds will do that, at the cost of a longer run time.[/QUOTE]

Yes. With more memory, the cost per iteration is reduced, which means that it is worthwhile doing more iterations. The overall effect is to increase the running time, though this is worth it because you have a greater chance of finding factors.

cheesehead 2011-11-03 07:29

[QUOTE=Mr. P-1;276968]You are. In fact, B1 is fixed from the start of the calculation. If you change your memory setting during stage 1, it will calculate a different bound, but it will still - without telling you - use the original bound stored in the save file.[/QUOTE]I keep thinking I saw some code that did a "catch-up" (going back to include all the prime powers between old and new B1s) when B1 was increased.

petrw1 2011-11-03 15:21

[QUOTE=cheesehead;276973]I keep thinking I saw some code that did a "catch-up" (going back to include all the prime powers between old and new B1s) when B1 was increased.[/QUOTE]

I have on occasion seen a message similar to "New B1 value ignored. Using B1 from save file instead" in my worker windows on startup with a new memory allocation.

bcp19 2011-11-03 15:22

[QUOTE=Mr. P-1;276968]You are. In fact, B1 is fixed from the start of the calculation. If you change your memory setting during stage 1, it will calculate a different bound, but it will still - without telling you - use the original bound stored in the save file.

As soon as stage 1 is complete, the stage 2 bound becomes fixed. Again, if you restart stage 2, whether due to a change in available memory or for any other reason, new bounds will be calculated, but the program will still revert to those stored in the save file. This time it will tell you if these bounds are different.



Yes. With more memory, the cost per iteration is reduced, which means that it is worthwhile doing more iterations. The overall effect is to increase the running time, though this is worth it because you have a greater chance of finding factors.[/QUOTE]

You just went and confused me again :/ If the bounds are not changed (even though the program restarts the worker and reports them as changed), then the variations I saw make no sense.

Worker 1 is running a P-1 stage 1 on a 322M exponent; Workers 2, 3 and 4 were doing ECM. Worker 1 was completing 0.21-0.22% every 6700 sec. Worker 3 switched to 60M TF 8 min into the 150 min between W1's outputs, and Worker 2 switched to a P-1 on a 52M exponent 66 min in; Worker 1's next output showed 6550 sec, and the iteration after that completed in 6389 sec.

Using the above times and percentages, at 6700 sec per 'tick' the entire run should take 36.9 days; at 6389 sec, 35.2 days. As expected, switching the other workers from ECM to TF/P-1 reduced the time spent per tick.

Worker 4 switches from curve 2 stage 2 of its ECM on F22 to curve 3 stage 1 after 3 'ticks' from Worker 1, and W1's time per 'tick' drops to around 5850 sec.

Memory is changed from 2.5 to 3.0 gigs available and W1 restarts. W1 now shows 0.13-0.14% completion every 5750-5820 sec. Using 5785 sec and 0.14% per tick gives a total run time of 47.8 days.

The extended-bound theory would explain the extension of the completion time, but I am at a loss if, as you say, the bounds don't actually change.
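(The per-tick arithmetic above checks out -- a quick Python sanity check, where a 'tick' is one progress report and `est_days` is just a made-up helper name:)

```python
# One "tick" advances completion by pct_per_tick percent and takes
# sec_per_tick seconds, so the whole run needs 100 / pct_per_tick ticks.
def est_days(pct_per_tick, sec_per_tick):
    ticks = 100.0 / pct_per_tick
    return ticks * sec_per_tick / 86400.0   # 86400 seconds per day

print(round(est_days(0.21, 6700), 1))  # 36.9 (before the other workers switched)
print(round(est_days(0.21, 6389), 1))  # 35.2
print(round(est_days(0.14, 5785), 1))  # 47.8 (after the memory change)
```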

Jwb52z 2011-11-03 21:25

Does anyone here have any idea why my account would now show a TF where I'd only ever told it to "Do what seems best" or "P-1" specifically? I've never witnessed my client do a TF, ever, and I don't remember seeing this show up in my account information before today.

James Heinrich 2011-11-03 22:19

Do you happen to know which exponent it says you TF'd? It's [i]possible[/i] that PrimeNet somehow misinterpreted a found factor as coming from TF rather than P-1, although this is far less likely with automatic results submission than manual submission, and usually happens the other way (TF factor misinterpreted as P-1) if it did. Of course, it's also possible that "Whatever makes sense" actually did assign you a TF assignment at some point. Checking results.txt may shed some light on the matter.

KyleAskine 2011-11-03 22:50

I know my main Desktop PC did a big run of TFs a while ago when I had it on 'Whatever makes sense'. I thought it was cool at the time because I liked the relatively quick turnaround for these, but now that I know more, it was probably a waste of time.

I think that Primenet probably shouldn't hand out TF unless it is an old computer (too old to DC), or someone specifically requests it. But I could be wrong.

If you want to see the numbers I got assigned you can look at the blue here:
[url]http://mersenne-aries.sili.net/index.php?showuserexponents=kyleaskine&usercompid=339[/url]

It was mostly in late September.

Christenson 2011-11-03 23:05

On that old computer, the thought is either DCs or ECM if it doesn't have enough memory to do P-1. We're not unhappy about the TF effort, it's just that GPUs, even my relatively cheap GTX440, are significantly more effective than CPUs at it.

Or just retire it, get it an ubuntu or xubuntu disk, and use it for everything else you do on the computer.....

KyleAskine 2011-11-03 23:33

[QUOTE=Christenson;277057]On that old computer, the thought is either DCs or ECM if it doesn't have enough memory to do P-1. We're not unhappy about the TF effort, it's just that GPUs, even my relatively cheap GTX440, are significantly more effective than CPUs at it.

Or just retire it, get it an ubuntu or xubuntu disk, and use it for everything else you do on the computer.....[/QUOTE]

I don't know if you are talking to me or someone else, but I have an i5-2500k with 16gig of ram. This computer is certainly not old.

And yes, once I learned more I started dedicating two cores to P-1 for the cause. I may add a third as soon as the current LL's that it is doing on two cores finish.



Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.