mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Data (https://www.mersenneforum.org/forumdisplay.php?f=21)
-   -   New milestone (https://www.mersenneforum.org/showthread.php?t=7082)

S485122 2010-07-20 06:34

[QUOTE=petrw1;222057]Though these are all DC why do some show LL,% in the stage,% column and some just a %?[/QUOTE]You can reserve a double check as a first-time test. There were some postings to that effect by a user who changes his "DoubleCheck=AID..." lines to "Test=AID..." so as to increase his LL-test standing instead of his DC one. For GIMPS it is the same: the exponent is assigned to that user for LL testing.

Jacob

davieddy 2010-07-20 08:50

I don't see why the latest milestone has provoked all this brouhaha
about double checks. These are progressing in the same orderly fashion
they always have (~2M increase in exponent per annum).
What is more alarming (see my thread "Is GIMPS grinding to a halt?")
is that the first time LL wavefront seems to have refused to jump
the 51M hurdle, despite the P-1 effort that has gone into that range.

BTW as an incentive for P-1 and the last bit (or 2?) of TF, why not
acknowledge the person who performed this valuable (if thankless) task,
should the number turn out to be prime?

David

PS Isn't it time to consider passing on intermediate residues
from returned partially completed tests?

PPS Can P-1 and/or TF be "torture tests"?

henryzz 2010-07-20 10:40

Maybe we should force PCs with trusted exponents to report in every week and not to have a days-to-go estimate of more than ~90 days.
IMO there is no way that the person who will take 763 days to finish 19632763 should have a trusted exponent. If he keeps to that estimate he will have taken 1,388 days to finish an exponent that will have been a trusted exponent for most of that time. Trusted exponents should be aiming to finish in 6 months or less.
I agree with cheesehead when he says that untrusted PCs should have assignments much further up, like 30M for DCs.

Primeinator 2010-07-20 14:42

[QUOTE=Prime95;222048]Assignment rules updated.

For now, 25% of all "whatever makes the most sense" assignments will be double-checks. It would be great if the eagle-eyed readers of this sub-forum would analyze our first-LL and double-check rates over the coming months to suggest any changes.

Since first LL tests take ~4 times longer than dchks, this change represents committing only 7% of "whatever makes sense"'s resources.[/QUOTE]


Awesome!

[QUOTE=Prime95;222053]I know I shouldn't do this:

[url]http://mersenne.org/assignments[/url] is a grossly inefficient (you'll have to ask SQLServer programmers why) report. DO NOT use this during the first 10 minutes of the hour. DO NOT use this for a large number of assignments. Acceptable ranges are 21 to 23 million or a range of a few thousand in the first LL or TF areas.


New expiration rules coming soon.[/QUOTE]

The first exponent on this list now has a verified LL. I'm wondering... is it possible for us lowly beings to generate this type of report or is that reserved for individuals with super powers such as yourself?

Prime95 2010-07-20 15:53

[QUOTE=Primeinator;222092]Is it possible for us lowly beings to generate this type of report or is that reserved for individuals with super powers such as yourself?[/QUOTE]

Only I can do it. You need to know the database layout, PHP, and have server access.

This report has actually been available since 2008 but it can put a nasty load on the server. It is very similar to the old v4 reports.

I just tweaked it to limit the output to 1000 assignments. This should help avoid loading the server down when there are a lot of assignments in the specified range.

Prime95 2010-07-20 15:56

[QUOTE=petrw1;222057]Though these are all DC why do some show LL,% in the stage,% column and some just a %?[/QUOTE]

Yes, this is the difference between a v4 and v5 client. A v4 client reporting 50% could be 50% through the LL or it could be 50% through the TF or P-1 preceding an LL test!

axn 2010-07-20 16:02

[QUOTE=Prime95;222110]Only I can do it. You need to know the database layout, PHP, and have server access.

This report has actually been available since 2008 but it can put a nasty load on the server. It is very similar to the old v4 reports.[/QUOTE]

It would be better if the query were executed against the full dataset and cached in a standalone table (say, once a day), and the report then run against the cache. No one really needs an up-to-the-minute version of this report. Or is that how it is done already?
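A minimal sketch of that daily-cache idea, using SQLite and a made-up schema (PrimeNet's real table layout is not public, so the table and column names here are purely illustrative):

```python
import sqlite3

# Hypothetical, simplified schema; the real PrimeNet database is not public.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE assignments (
        exponent INTEGER PRIMARY KEY, user TEXT, stage TEXT, pct REAL);
    CREATE TABLE report_cache (
        exponent INTEGER PRIMARY KEY, user TEXT, stage TEXT, pct REAL,
        refreshed TEXT);
""")
db.executemany("INSERT INTO assignments VALUES (?,?,?,?)",
               [(21000037, "alice", "LL", 42.0),
                (21000121, "bob", "DC", 13.5)])

def refresh_cache(db):
    """Run the expensive query once (e.g. from a daily cron job) and
    materialize the result; the public report then reads only the cache."""
    db.execute("DELETE FROM report_cache")
    db.execute("""INSERT INTO report_cache
                  SELECT exponent, user, stage, pct, datetime('now')
                  FROM assignments""")
    db.commit()

refresh_cache(db)
rows = db.execute(
    "SELECT exponent, user FROM report_cache ORDER BY exponent").fetchall()
print(rows)
```

The trade-off garo raises below is real: this pays the full query cost once a day regardless of how many people actually view the report.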

garo 2010-07-20 16:30

axn, the problem with your solution is that it would require the status of all assigned exponents to be generated once a day. With dynamic querying, as long as the number of queries per day is small, you end up with less load on the server.

axn 2010-07-20 16:54

[QUOTE=garo;222124]axn, the problem with your solution is that it would require the status of all assigned exponents to be generated once a day. With dynamic querying, as long as the number of queries per day is small, you end up with less load on the server.[/QUOTE]

Possibly. It would help if we had sample numbers.

When it comes to load on the server, you should optimize for worst-case behavior even at the cost of average-case behavior.

But even this can be addressed if the cache can be updated incrementally, i.e. find the new assignments (inserts), status updates (updates), and completed / unreserved assignments (deletes) since the last cache update.
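The incremental refresh described above might look like this in outline (same hypothetical schema as before; `INSERT OR REPLACE` stands in for a proper delta query keyed on a last-modified timestamp):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE assignments (exponent INTEGER PRIMARY KEY, pct REAL);
    CREATE TABLE report_cache (exponent INTEGER PRIMARY KEY, pct REAL);
""")
db.executemany("INSERT INTO assignments VALUES (?,?)", [(1, 10.0), (2, 20.0)])
db.execute("INSERT INTO report_cache SELECT * FROM assignments")  # full build, once

# Since the last refresh: exponent 1 progressed (update), 2 completed
# or was unreserved (delete), and 3 is a new assignment (insert).
db.execute("UPDATE assignments SET pct = 55.0 WHERE exponent = 1")
db.execute("DELETE FROM assignments WHERE exponent = 2")
db.execute("INSERT INTO assignments VALUES (3, 5.0)")

def sync_cache(db):
    """Apply only the deltas: upsert live rows, drop vanished ones."""
    db.execute("INSERT OR REPLACE INTO report_cache "
               "SELECT exponent, pct FROM assignments")
    db.execute("DELETE FROM report_cache WHERE exponent NOT IN "
               "(SELECT exponent FROM assignments)")
    db.commit()

sync_cache(db)
cached = db.execute(
    "SELECT exponent, pct FROM report_cache ORDER BY exponent").fetchall()
print(cached)
```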

chalsall 2010-07-20 16:59

[QUOTE=garo;222124]axn, the problem with your solution is that it would require the status of all assigned exponents to be generated once a day. With dynamic querying, as long as the number of queries per day is small, you end up with less load on the server.[/QUOTE]

Here's an alternative suggestion.

Create a new SQL table specifically for this query. Update it on an exponent-by-exponent basis when the client checks in. Thus the data would be "real-time", and the query would be light (we're talking less than 100,000 rows for both DC and LL).

Yeah, this would break the "fully normalized" structure of the database. But sometimes the ideal of full normalization doesn't actually map well to the real world. And the "full query" (or multiple sub-range queries) would only have to be run once to initially populate the new table.

The reason this query is so "heavy" has to be because of multiple joins through multiple tables.
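A sketch of that denormalized, updated-at-check-in approach (hypothetical function and column names; the real server side is PHP against SQL Server, so this SQLite snippet only illustrates the shape of the idea):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One flat row per active assignment: no joins needed at report time.
db.execute("""CREATE TABLE report_flat (
    exponent INTEGER PRIMARY KEY, user TEXT, work_type TEXT, pct REAL)""")

def on_checkin(db, exponent, user, work_type, pct):
    """Called from the client check-in path: keep the flat row current."""
    db.execute("INSERT OR REPLACE INTO report_flat VALUES (?,?,?,?)",
               (exponent, user, work_type, pct))
    db.commit()

def on_complete(db, exponent):
    """Assignment finished or unreserved: drop its row."""
    db.execute("DELETE FROM report_flat WHERE exponent = ?", (exponent,))
    db.commit()

on_checkin(db, 19632763, "slowpoke", "DC", 3.1)
on_checkin(db, 19632763, "slowpoke", "DC", 7.9)  # later check-in overwrites
on_checkin(db, 21000037, "alice", "LL", 42.0)
on_complete(db, 21000037)
remaining = db.execute("SELECT exponent, pct FROM report_flat").fetchall()
print(remaining)
```

As the post says, this trades normalization for a report query that touches a single small table.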

chalsall 2010-07-20 17:28

[QUOTE=chalsall;222126]The reason this query is so "heavy" has to be because of multiple joins through multiple tables.[/QUOTE]

Thinking about this a bit further, I don't fully understand why this query is necessarily so heavy. But then I don't know the PrimeNet DB Schema.

Before creating a new table as I suggested above, have the SQL developers looked at their indexes, and ensured this query is currently optimal?

Have they looked at the SQL "use index" option to specifically tell the DB engine what index(es) to use? (Sometimes the engine makes a sub-optimal choice if more than one index is available.)

Lastly, sometimes sub-queries can be your friend for speed....

(My apologies in advance if you've already looked at all these angles. Trying to help, not offend.)
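Checking whether an index is actually being used is done with the engine's query-plan tools; SQL Server has its own execution-plan viewer, but the idea can be illustrated with SQLite's `EXPLAIN QUERY PLAN` (toy table, not PrimeNet's schema):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE assignments (exponent INTEGER, user TEXT, pct REAL)")
db.executemany("INSERT INTO assignments VALUES (?,?,?)",
               [(21000000 + i, "u%d" % i, 0.0) for i in range(1000)])

def plan(db, sql):
    """Return the planner's description of how it would run the query."""
    return " ".join(row[3] for row in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM assignments WHERE exponent BETWEEN 21000100 AND 21000200"

# Without an index, the range query scans the whole table...
before = plan(db, query)
db.execute("CREATE INDEX idx_exp ON assignments(exponent)")
# ...with one, the planner switches to an index search.
after = plan(db, query)
print(before)
print(after)
```

The same before/after comparison against the real report query would show whether a missing or ignored index is the culprit.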

