
mersenneforum.org (https://www.mersenneforum.org/index.php)
-   PrimeNet (https://www.mersenneforum.org/forumdisplay.php?f=11)
-   -   P-1 factoring anyone? (https://www.mersenneforum.org/showthread.php?t=11101)

Xyzzy 2010-11-03 13:14

[QUOTE]Or you can try to set the memory limit for each worker individually, (this would probably be best: you don't have to worry about when they do it - they'll each just take 256 MB when needed, whether that means you use 0, 256, or 512 MB for stage 2 at that moment) but this might not be working yet (see [URL]http://www.mersenneforum.org/showthread.php?p=232097#post232097[/URL] and the link in it, and the post after it, for more info).[/QUOTE]We have given up on trying to allocate memory to each core and now run four separate instances of the client on each computer. It is a bit more work to set up but it works perfectly.

Mini-Geek 2010-11-03 13:52

[QUOTE=Xyzzy;235430]We have given up on trying to allocate memory to each core and now run four separate instances of the client on each computer. It is a bit more work to set up but it works perfectly.[/QUOTE]

From my tests and what was supposed to be fixed, I think it's working right in the latest version. Might be worth another try. :smile:
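For anyone retrying it, the per-worker setting lives in local.txt. The exact syntax should be verified against your version's readme.txt and undoc.txt; the section names and numbers below are illustrative only, not copied from a working install:

```ini
Memory=1024

[Worker #1]
Memory=256

[Worker #2]
Memory=256
```

The idea is that the global Memory line caps the program as a whole, while each [Worker #n] section overrides the limit for that worker, so each worker takes its own allowance only while it is actually in stage 2.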

lorgix 2010-11-13 08:27

[QUOTE=Xyzzy;235430]We have given up on trying to allocate memory to each core and now run four separate instances of the client on each computer. It is a bit more work to set up but it works perfectly.[/QUOTE]

Does it not work?

What's the problem? Seems like a pretty basic thing; if it's not working I'm assuming getting it fixed would have some priority.

Mr. P-1 2010-11-27 17:03

[QUOTE=davieddy;231836]Hmm: food for thought.

This may or may not be an oversimplification, but let's agree that the probability of finding a factor increases with the time spent on P-1. The rate of finding factors, however, increases up to a certain time spent per exponent and then decreases again. If our goal were simply to find as many factors as possible, we would operate at the peak of that curve. But for a given exponent, it is worth spending extra time as long as the time < probability of factor * duration of 2 LL tests.[/QUOTE]

In fact, B1 and B2 are chosen so as to minimise the expected time spent on that exponent. This happens where delta-time = delta-probability-of-factor * duration of 2 LL tests.

[QUOTE]Am I missing something?[/QUOTE]

Back in the days when P-1 was what you did before doing an LL test on the same exponent, this was the correct optimisation. For those taking P-1 assignments now, it isn't. Rather, you should maximise the expected time saving per exponent multiplied by the rate at which you do them. Here the "expected time saving per exponent" is how much less time overall is expected to be spent on the exponent, given that you did an efficient P-1, compared to the relatively inefficient P-1 the LL-testing machine would otherwise have to do. It's worth spending slightly less time on each exponent in order to be able to do more of them.
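The old "minimise expected time on this exponent" optimisation can be sketched numerically. Everything below is invented for illustration (the probability curve and the 100-hour LL test are toy numbers, not real GIMPS figures); it only shows the shape of the argument:

```python
# Toy sketch of the per-exponent optimisation described above.
# All constants are invented; real clients use far more accurate models.

import math

LL_TIME = 100.0  # assumed hours for one LL test

def factor_probability(p1_hours):
    # Hypothetical diminishing-returns curve: more P-1 time finds a
    # factor with probability that flattens out around 8%.
    return 0.08 * (1 - math.exp(-p1_hours / 2.0))

def expected_total_time(p1_hours):
    # Time spent on P-1, plus the expected cost of the two LL tests
    # (first test + double-check) that run only if no factor is found.
    return p1_hours + (1 - factor_probability(p1_hours)) * 2 * LL_TIME

# Minimise expected total time per exponent by scanning candidate budgets.
best = min((expected_total_time(t), t) for t in [x / 10 for x in range(1, 100)])
print(f"optimal P-1 budget ~{best[1]:.1f} h, expected total {best[0]:.1f} h")
```

At the optimum, spending one more unit of P-1 time buys exactly that unit's worth of expected LL time back, which is the delta-time = delta-probability-of-factor * duration-of-2-LL-tests condition from the post.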

petrw1 2010-11-27 20:05

[QUOTE=Mr. P-1;238908]It's worth spending slightly less time on each exponent in order to be able to do more of them.[/QUOTE]

I would have thought it would be a toss-up... spending less time on each slightly increases the number you can do in a fixed unit of time, but also slightly decreases the odds of finding a factor?

James Heinrich 2010-11-27 20:25

[QUOTE=petrw1;238946]I would have thought it would be a toss up.... spending less time on each slightly increases the number you can do in a fixed unit of time but also slightly decreases the odds of finding a factor????[/QUOTE]The key is "increase the number [i]you[/i] can do" -- you, as in a P-1 enthusiast with lots of memory allocated, vs random LL-tester who only has 8MB allocated and will do a poorer job of P-1.

Mr. P-1 2010-11-27 21:15

[QUOTE=petrw1;238946]I would have thought it would be a toss up.... spending less time on each slightly increases the number you can do in a fixed unit of time but also slightly decreases the odds of finding a factor????[/QUOTE]

It's a toss-up only if you limit consideration to that particular exponent. The bounds are chosen so that the cost of doing the last iteration exactly matches the expected benefit of doing that iteration (within the limits of our ability to calculate these costs and benefits, of course).

What's missing from the analysis is the opportunity cost of that iteration. Instead of doing it, we could have been doing a much more profitable earlier iteration of the next P-1 assignment.

[b]Edited[/b] to add: "Profitable" with respect to the metric I specified before, i.e., compared to the machine which would otherwise be doing the P-1 test.
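The contrast between the two targets — best result per exponent versus best result per hour of P-1 work — can also be sketched. Again, every number below is invented purely to show the shape of the argument (an S-shaped probability curve and a 100-hour LL test are assumptions, not GIMPS data):

```python
# Toy contrast of "optimise each exponent" vs "optimise throughput".
# All constants invented for illustration only.

import math

LL_TIME = 100.0  # assumed hours per LL test; a found factor saves two tests

def factor_probability(p1_hours):
    # Invented S-shaped curve: little payoff below ~3 h, flattening near 8%.
    return 0.08 / (1 + math.exp(3 - p1_hours))

budgets = [x / 100 for x in range(50, 1000)]  # candidate budgets, 0.50..9.99 h

# Optimising each exponent in isolation: maximise net expected saving.
per_exponent = max(budgets, key=lambda t: 2 * LL_TIME * factor_probability(t) - t)

# Optimising throughput: maximise expected LL time saved per hour of P-1,
# i.e. account for the opportunity cost of the next assignment's early,
# more profitable iterations.
throughput = max(budgets, key=lambda t: 2 * LL_TIME * factor_probability(t) / t)

print(f"per-exponent optimum ~{per_exponent:.2f} h")
print(f"throughput optimum  ~{throughput:.2f} h (shorter, as argued above)")
```

The throughput optimum comes out below the per-exponent optimum: the last, least profitable iterations on one exponent are worth less than the first iterations on the next one.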

lorgix 2010-12-07 08:50

Does anyone know (Mini-Geek, for example) how one can determine how much memory prime95 needs to use Suyama's extension to a particular level, e.g. E=6 or E=12?

[B]Specifically:[/B] let's assume Suyama's extension isn't a horrible waste, and that I have enough memory for E=12, which I want to use. [B]Given the exponent, how much memory is needed?[/B]

[B]P.S.[/B] Speaking of P-1: at some point my P-1 percentile for the last year rose above my "lifetime" percentile. In conclusion, it appears that roughly the top sixth of contributors has been slacking off over the last year or so.
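A back-of-envelope for the exponent-dependent part: stage 2 memory is roughly (number of residue buffers) × (FFT length in words) × 8 bytes per double. The packing density (~18 bits of exponent per FFT word) is an assumed average, and the buffer counts prime95 actually requires for each E value are NOT reproduced here — undoc.txt has the real rules — so the count below is only a placeholder:

```python
# Rough per-buffer arithmetic only. The buffer counts prime95 needs for
# each Brent-Suyama E value are not modelled; num_buffers is a placeholder.

def stage2_buffer_mb(exponent, bits_per_word=18.0):
    # One residue per buffer; each FFT word is a double (8 bytes), and an
    # exponent of p bits needs about p / bits_per_word words (assumed density).
    words = exponent / bits_per_word
    return words * 8 / 2**20

def stage2_memory_mb(exponent, num_buffers):
    return num_buffers * stage2_buffer_mb(exponent)

# e.g. a 53M exponent: ~22.5 MB per buffer, so 24 buffers is roughly 540 MB
print(f"{stage2_buffer_mb(53_000_000):.1f} MB per buffer")
print(f"{stage2_memory_mb(53_000_000, 24):.0f} MB for 24 buffers (illustrative count)")
```

The takeaway is that per-buffer cost scales linearly with the exponent, so whatever buffer count a given E requires, the memory needed grows in proportion to the exponent you are testing.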

KingKurly 2010-12-07 15:59

Looking over the top-500 contributor list to P-1 for the last 365 days, I've noticed some interesting entries. There is one person who has 320.235 GHz-days but only two assignments completed, and there are two people who have 283.642 GHz-days and still only two assignments completed. How large are these numbers that they're P-1ing that they are getting *SO* much credit per assignment? I have several cores set to the plain old P-1 large, and most assignments have been in the 53M range recently. I get about 4 GHz-days per assignment, and my most powerful machine turns out about one a day, with two cores running P-1 and 6GB assigned to GIMPS -- admittedly probably more RAM than is really necessary, but hey... are you guys really gonna complain? :)

Any thoughts, ideas, opinions, insight? Thanks in advance.

James Heinrich 2010-12-07 16:22

[QUOTE=KingKurly;240519]How large are these numbers that they're P-1ing that they are getting *SO* much credit per assignment?[/QUOTE]Probably manually-assigned 100M tests. For example, I'm currently working on M333000091, which gives [url=http://mersenne-aries.sili.net/credit.php?worktype=P-1&exponent=333000091&f_exponent=&b1=3365000&b2=93378750&numcurves=&factor=&frombits=&tobits=&submitbutton=Calculate]~207GHz-days credit[/url] (but takes a couple months to complete).

petrw1 2010-12-07 16:23

[QUOTE=KingKurly;240519]Looking over the top-500 contributor list to P-1 for the last 365 days, I've noticed some interesting entries. There is one person who has 320.235 GHz-days but only two assignments completed, and there are two people who have 283.642 GHz-days and still only two assignments completed. How large are these numbers that they're P-1ing that they are getting *SO* much credit per assignment? I have several cores set to the plain old P-1 large, and most assignments have been in the 53M range recently. I get about 4 GHz-days per assignment, and my most powerful machine turns out about one a day, with two cores running P-1 and 6GB assigned to GIMPS -- admittedly probably more RAM than is really necessary, but hey... are you guys really gonna complain? :)

Any thoughts, ideas, opinions, insight? Thanks in advance.[/QUOTE]

There is a group working in the 100 Million Digit Range: 332,19x,xxx.

According to these factoring limits:
[QUOTE][url]http://www.mersenne.org/report_factoring_effort/?exp_lo=332194531&exp_hi=332194531&bits_lo=0&bits_hi=999&txt=1&B1=Get+Data[/url][/QUOTE]and applying those parameters here:
[QUOTE][url]http://mersenne-aries.sili.net/credit.php?worktype=P-1&exponent=332195321&f_exponent=&b1=3255000&b2=67541250&numcurves=&factor=&frombits=&tobits=&submitbutton=Calculate[/url][/QUOTE]

That one P-1 test earns over 169 GHz-days.
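An order-of-magnitude sanity check on why those credits dwarf wavefront ones: credit tracks total arithmetic, and stage 1 alone does about 1.44·B1 squarings, each costing on the order of fftlen·log(fftlen), with fftlen growing roughly linearly in the exponent. The wavefront B1 below is an assumed typical value (the big job's exponent and B1 are the ones quoted earlier in the thread); constants cancel in the ratio, which is all this estimates:

```python
# Back-of-envelope work ratio, stage 1 only. fftlen ~ exponent/18 and the
# wavefront B1 of 700,000 are assumptions; the 333M job's B1 is from the
# credit link quoted above. Constants cancel in the ratio.

import math

def relative_stage1_work(exponent, b1):
    fftlen = exponent / 18                      # assumed ~18 bits per FFT word
    return 1.44 * b1 * fftlen * math.log2(fftlen)

small = relative_stage1_work(53_000_000, 700_000)    # typical wavefront job (assumed B1)
big = relative_stage1_work(333_000_091, 3_365_000)   # the 100M-digit job quoted above

print(f"~{big / small:.0f}x the stage-1 work of a wavefront assignment")
```

Bigger B1/B2 bounds times bigger, slower FFTs multiply out to tens of times the work, which is broadly consistent with one such assignment earning ~170-207 GHz-days against ~4 for a 53M one.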

