mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Information & Answers (https://www.mersenneforum.org/forumdisplay.php?f=38)
-   -   P-1 Rankings (https://www.mersenneforum.org/showthread.php?t=13956)

Rodrigo 2010-09-23 15:38

P-1 Rankings
 
Hello,

I've been donating computer cycles to GIMPS for a little less than two months now.

In that time, the two PCs that are dedicated to LL (I have two others dedicated to TF and LL-D) have accumulated four P-1 results as a prelude to starting LL work on their numbers.

What puzzles me is that these four (count 'em, 4) results, adding up to a grand total of 11 GHz-days, have already put me at the 58th percentile in the P-1 rankings.

Is there really so little P-1 work being done, that one can jump over half the listed participants with just four results?? At the end of last month I had a mere three results and that had already zoomed me up to the 47th percentile.

What's up with P-1 ?

Rodrigo

Mini-Geek 2010-09-23 16:31

[QUOTE=Rodrigo;231107]Is there really so little P-1 work being done, that one can jump over half the listed participants with just four results?? At the end of last month I had a mere three results and that had already zoomed me up to the 47th percentile.[/QUOTE]
Yes. As your experience demonstrates. :smile:
Most of the users listed (on any of the lists, not just P-1) never bothered to do more than two or three assignments, and since those assignments are (mostly) older than yours, they were (usually) a bit less work and so earned a bit less credit apiece. A good number of the people even have 0 credit (as of this writing, 541 of them, all tied at rank 3903, or 541/(3903+541) = ~12.2% of the people), so a single result, no matter how small, will put you in at least the 12th percentile. The upper portions of the list (80th, 90th, 99th percentile) are much harder to reach.

petrw1 2010-09-23 16:59

[QUOTE=Mini-Geek;231113]The upper portions of it (80, 90, 99 percentile) are much harder to reach.[/QUOTE]

Yes, I am closing in on 1,000 P-1s lifetime and 800 for the past year.
This puts me in the 99th percentile in both; however, so would 250 lifetime and 275 for the last year.

But only about 20 results will get you to 90th lifetime, and about 10 will get you to 80th on both lists.

On the other hand, if you are looking for top 100 or top 10, you will need about 80 and 1,000 respectively for the last-365-days standings, AND about 650 and 5,000 respectively for the lifetime standings... MUCH more of a stretch.

ATH 2010-09-23 17:54

[QUOTE=petrw1;231116]On the other hand if you are looking for top 100 or top 10 you will need about 80 and 1,000 respectively for last 365 days standings AND about 650 and 5,000 respectively for the Lifetime standings......MUCH more of a stretch.[/QUOTE]

How much is that in "1976 Toyota Corona years"? :razz: Sorry, couldn't resist.

petrw1 2010-09-23 19:23

[QUOTE=ATH;231123]How much is that in "1976 Toyota Corona years"? :razz: Sorry couldn't resist.[/QUOTE]

Almost exactly: 2^43,112,609-1 :showoff:

Brian-E 2010-09-24 00:11

[QUOTE=Mini-Geek;231113][...]Most of the users listed (on any of the lists, not just P-1) never bothered to do more than two or three assignments[...][/QUOTE]
Of course it's not always a question of not "bothering" to do more. :smile: Some of us have dabbled with P-1 (or other categories) in the past and then come to the conclusion that our resources are better used for a different work-type from the point of view of the project as a whole. In my case I realised that the memory I was allowing for P-1 factoring was less than most users with the result that I could be missing factors which other contributors would find. So I switched to LL double-checks.

Rodrigo 2010-09-25 20:48

Mini-Geek,

Thanks for the explanation, it makes sense.

Rodrigo

Rodrigo 2010-09-25 22:14

[QUOTE=petrw1;231116]Yes I am closing in on 1,000 P-1s lifetime and 800 for the past year.
This puts me in the 99th Percentile in both; however so would 250 lifetime and 275 for the last year.

But only about 20 results will get you to 90th lifetime and about 10 will get you to 80th on both lists.

On the other hand if you are looking for top 100 or top 10 you will need about 80 and 1,000 respectively for last 365 days standings AND about 650 and 5,000 respectively for the Lifetime standings......MUCH more of a stretch.[/QUOTE]
petrw1,

Thanks for the different angle.

I was going to comment on this to Mini-Geek, but I had to make sure of my calculations first so I ended up leaving my reply to him kind of brief. What he said led me to wonder how many PCs it would take to catch up to "curtisc," the leader in the overall "top producers" rankings.

Fortunately, they have a website where they report having more than 850 computers participating, yielding 875,000 or so GHz-Days of work in the last year.

Now that my main computers have turned in their first LL results, I have numbers to work with for some back-of-the-envelope comparisons. Both are dual-core Pentiums, one a 2.20 GHz processor and the other a 2.00. Between the two of them they're averaging roughly 1.6 GHz-days per day per core, which would total 584 GHz-Days per year, per core. That's a starting point.

So if I were to go out and buy 160 PCs featuring Intel i7-980 six-core processors :smile:, with each core running at 3.33 GHz (i.e., about 1.57 times faster than my 2.2 and 2.0 GHz machines), I figure I could output about 917 GHz-days per core in a year. Multiplied by six, that's just about 5500 GHz-Days per year for each machine. Times 160, it's 880,000 GHz-Days, and presto! -- I'm in first place next year. :squash: Piece of cake.

Of course there are details to be taken into account and adjustments to be made to the numbers -- and I do recall reading something about memory bandwidth limitations for the 980 -- but I'm just musing over coffee. :coffee: One can dream...
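Spelled out, the back-of-the-envelope arithmetic above amounts to a few multiplications (a toy sketch; the 1.6 GHz-days/day per core and the ~1.57x i7-980 speed-up are the rough figures from this post, not measured benchmarks):

```python
# Back-of-the-envelope farm sizing, using the rough figures quoted above
# (1.6 GHz-days/day per core observed; ~1.57x assumed i7-980 core speed-up).
per_core_year_now = 1.6 * 365                  # ~584 GHz-days/year on the current Pentiums
per_core_year_i7 = per_core_year_now * 1.57    # ~917 GHz-days/year per i7-980 core
per_machine_year = per_core_year_i7 * 6        # six cores -> ~5500 GHz-days/year
farm_year = per_machine_year * 160             # 160 machines -> ~880,000 GHz-days/year

print(round(per_core_year_i7), round(per_machine_year), round(farm_year))
```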

Rodrigo

Mini-Geek 2010-09-26 12:51

I think you're underestimating i7-980s.
[url]http://www.mersenne.org/report_benchmarks/?exover=1&exbad=1&specific_cpu=4369094[/url] (benchmarks of i7-980s)
[url]http://mersenne-aries.sili.net/throughput.php?cpu=Intel(R)+Core(TM)+i7+CPU+965+%40+3.20GHz|256|8064&mhz=3600[/url] (the closest thing the site has to an i7-980, with a speed such that the 3072K benchmark runs at the same speed as an i7-980)
An i7-980 at 3.3 GHz can do about 3.6949 GHz-days per day at roughly the FFT size of current first-time LL candidates (3072K). Since that's for 25.11 benchmarks, we can add 10%-25% to that if we use v26 (for my i5, which has a similar architecture, 3072K was 25% faster). So 4.06439 to 4.618625 GHz-days per day. That comes to 1484.5 to 1687 GHz-days per year per core. Multiplied by six cores, 8907 to 10122 GHz-days per year per CPU. To hit your goal of 880,000 GHz-days per year, you'd need about 99 to 87 CPUs.
One i7-980 is $1000. Let's say you could build each system for $1400 (leaving a bit for a decent MB and RAM, but very barebones... actually I don't know if it could be done for that cheap). That means you can plan to spend $121,800 to $138,600 on the computers, plus plenty more time and money on setting up the computers, replacing DOA parts, and paying for the electricity to run and cool this monster farm.
This is, like your calculations, quite back-of-the-envelope, but I think it's a better estimate than yours. :smile: It'd probably be far more cost-effective to use a cheaper CPU, like a quad i5 or i7, but it's still amazing that with around 100 CPUs you might be able to match curtisc!
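The estimate above can be reproduced in a few lines (the 3.6949 GHz-days/day figure and the 10%-25% v26 speed-up are taken from this post; the 365.25-day year is an assumption that matches the per-year totals quoted):

```python
import math

# Reproduce the i7-980 throughput estimate from the figures quoted above.
bench = 3.6949                            # GHz-days/day per core at 3072K (v25.11 benchmark)
low, high = bench * 1.10, bench * 1.25    # assumed 10%-25% v26 speed-up

per_core_year = [x * 365.25 for x in (low, high)]   # ~1484.5 to ~1687 GHz-days
per_cpu_year = [x * 6 for x in per_core_year]       # six cores: ~8907 to ~10122
cpus_needed = [math.ceil(880000 / x) for x in per_cpu_year]  # 99 down to 87 CPUs

print([round(x) for x in per_cpu_year], cpus_needed)
```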

Rhyled 2010-09-26 22:56

A few statistics
 
[QUOTE=Rodrigo;231107]What's up with P-1 ?
Rodrigo[/QUOTE]

I'm another relative noob (4 months on Prime95 so far) and noticed the same thing. Having allocated just one core to P-1, I find myself up at the 97th percentile for P-1. That surprised me.

When I dug into it, I noticed that the first-time primality tests get the most GHz-Days, while the trial factoring gets the most results. No great surprises there. Here's the current distribution of assignments, and how many GHz-Days were spent over the last year per category:
[CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D  (Data retrieved 9-26-10)[/FONT]
[FONT=Courier New]GHz-Days      11%   4%    83%   12%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT]
[/CODE]

Not to my surprise, LL work gets the most cpu time, but trial factoring has the bulk of the assignments.

I like P-1 work because it puts my i7 processor to serious use, and still gives me results every day and a half or so. I got so tired of waiting 3 weeks just to see "LL done - not a prime". At least this way I get some timely feedback and the occasional semi-success (factor found).

Rodrigo 2010-09-27 04:31

[QUOTE=Mini-Geek;231513]I think you're underestimating i7-980s.
[/QUOTE]
Mini-Geek,

Wow, your numbers make it look even better than I thought. It IS amazing.

A hundred computers, neatly arranged side by side on a series of shelves. Maybe if I win the Publishers Clearing House sweepstakes...!

To go off on a bit of a tangent -- It sounds like you're happy with Prime95 v26. Would you say that it's ready for "prime" time (so to speak -- sorry, I couldn't resist) ?

Rodrigo

Rodrigo 2010-09-27 04:33

My own avatar!
 
Hey, I posted my reply to Mini-Geek and saw that I've received my very own avatar!

My sincere Thank You to the forum gods. :bow:

Rodrigo

Rodrigo 2010-09-27 04:45

[QUOTE=Rhyled;231565]
When I dug into it, I noticed that the first-time primality tests get the most GHz-Days, while the trial factoring gets the most results. No great surprises there. Here's the current distribution of assignments, and how many GHz-Days were spent over the last year per category:
[CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D  (Data retrieved 9-26-10)[/FONT]
[FONT=Courier New]GHz-Days      11%   4%    83%   12%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT]
[/CODE]

[/QUOTE]
Rhyled,

Very cool -- where did you find this information, or how did you derive it?

[QUOTE=Rhyled;231565]
I like P-1 work because it puts my i7 processor to serious use, and still give me results every day and a half or so. I got so tired of waiting 3 weeks just to see "LL done - not a prime". At least this way I get some timely feedback and the occasional semi-success (factor found).[/QUOTE]

I know what you're saying. If you're doing mostly LLs like I am, you're slowly drifting down in the rankings for several weeks, and then you suddenly leap up by hundreds of places.

Not that I planned it this way when I set them up, but my PCs have their assignments distributed such that over time the results will come in at a more steady rate. My little P233 notebook, doing TF work, gets me a little morsel of GHz credit on a daily basis, which helps to keep me going in-between the big feasts when an LL test is done.

Rodrigo

Rhyled 2010-09-28 00:49

Citing Sources
 
[QUOTE=Rhyled;231565][CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D  (Data retrieved 9-26-10)[/FONT]
[FONT=Courier New]GHz-Days      11%   4%    83%   12%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT]
[/CODE][/QUOTE]

I knew I should have cited my sources. The GHz-Days statistics come from the Top Producers - Totals Overall report. I backed out the GHz-Days for each type of work for each user, summed them up and normalized them. Spreadsheets are marvelous tools.

For the Assignment breakdown, I pulled the report from the PrimeNet Summary - Work Distribution Map, added up the active assignments and renormalized those. The time frames of the two rows of statistics don't match up - one is a year, the other a snapshot - but I feel that the average work distribution won't vary significantly over the year. Besides, it's the only data I can get at easily without running a report for each type of data.

The spreadsheet is too large (1.5 MB) for the forum's 244 KB upload limit but I can email it to you if you wish. Just send me a private message with your email address if you've got a really bad case of insomnia you wish to cure.

Rodrigo 2010-09-28 03:38

[QUOTE=Rhyled;231693]The GHz-Days statistics come from the Top Producers - Totals Overall report. I backed out the GHz-Days for each type of work for each user, summed them up and normalized them. Spreadsheets are marvelous tools.
[/QUOTE]
Rhyled,

Thanks for offering to send the spreadsheet, but it won't be necessary. I was mainly curious to learn how you came up with the figures, and you explained that well.

Chance to learn here -- Did you have to input the numbers yourself from those long columns of data, or is there an automated way to convert them into something that a spreadsheet program will read?

Rodrigo

cheesehead 2010-09-28 03:47

[QUOTE=Rhyled;231565]
[CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D  (Data retrieved 9-26-10)[/FONT]
[FONT=Courier New]GHz-Days      11%   4%    83%   12%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT]
[/CODE]Not to my surprise, LL work gets the most cpu time, but trial factoring has the bulk of the assignments.
[/QUOTE]P-1 should actually account for more than 1% of the assignments because it's often done as the first part of an LL assignment that hadn't yet had the default P-1 performed. The 1% for P-1-only assignments doesn't include those, but the GHz-Days figure is based on P-1 [I]result reports[/I] which include those generated from LL assignments. (That also explains P-1's misleadingly high 4-to-1 ratio of credit percentage to assignment percentage on that report.)

S485122 2010-09-28 05:11

[QUOTE=Rodrigo;231712]Chance to learn here -- Did you have to input the numbers yourself from those long columns of data, or is there an automated way to convert them into something that a spreadsheet program will read? Rodrigo[/QUOTE]Most spreadsheet programs have a feature to convert text to columns. In Excel 2003, for instance, you find that feature in the Data menu... There are also text-handling programs in *nix OSes that would do the trick. Another possibility is writing a program and using text-handling functions...
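As a sketch of the write-a-program route, here is the same idea in Python (the sample report lines are fabricated for illustration; the real PrimeNet layout differs):

```python
import csv, io

# Fabricated whitespace-delimited report lines, for illustration only.
report = """\
1  alice  1234.5  10  2
2  bob  987.6  8  1
"""

out = io.StringIO()
writer = csv.writer(out)
for line in report.splitlines():
    writer.writerow(line.split())   # runs of whitespace -> one column each

print(out.getvalue())
```

Note that naive whitespace splitting breaks if a user name itself contains spaces; a regular expression handles that case more robustly.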

Jacob

Mini-Geek 2010-09-28 12:00

[QUOTE=Rhyled;231693]The GHz-Days statistics come from the Top Producers - Totals Overall report. I backed out the GHz-Days for each type of work for each user, summed them up and normalized them.[/QUOTE]

Did you account for the fact that sometimes a user's long name pushes its data out of alignment with the rest? If so, how? If not, it probably didn't really have a significant effect on the data. I ask because I can't figure a good way to do it. I'm sure some regular expression could do it quite nicely, but I'm not too skilled at those, and don't know exactly how to go from a regex to columns in a spreadsheet.

Rodrigo 2010-09-28 14:53

[QUOTE=S485122;231720]Most Spreadsheet programs have a feature to convert text to columns. In Excel 2003 f.i. you find that feature in the Data menu... There are also text handling programs in *nix OS that would do the trick. Another possibility is writing a program and using text handling functions...

Jacob[/QUOTE]
Jacob,

Very neat. And I did learn something, thanks!

Rodrigo

chalsall 2010-09-28 15:14

[QUOTE=Mini-Geek;231746]I ask because I can't figure a good way to do it. I'm sure some regular expression could do it quite nicely, but I'm not too skilled at those, and don't know exactly how to go from a regex to columns in a spreadsheet.[/QUOTE]

Here's a code snippet in Perl to transform the data into CSV format, which can be imported into any spreadsheet. "$File" is the name of the status data in plain text (not HTML) format. (I *love* regular expressions!!! :smile:)

[CODE]open(IN, $File);

while (<IN>) {
  if (/^\s*(\d*)\s*(.*)\s+(\d+\.\d*)\s*(\d*)\s*(\d*)\s*\|(.*)/) {
    # print STDERR "Matched on $_";
    $Rank      = $1;
    $Name      = $2;
    $GHzDays   = $3;
    $Attempts  = $4;
    $Successes = $5;
    $Deltas    = $6;
    print "$1,\"$2\",$3,$4,$5\n";
  }
}
close(IN);[/CODE]

Obviously you don't need to do all the variable assignments (e.g. "$Rank = $1;") if you're just going to use this to convert to CSV (via the "print" to STDOUT). This is pulled from one of my scripts that processes the data internally rather than simply exporting it for processing by another tool.

Also, please note that extracting the 90, 30, 7 and 1 day range changes from $Deltas is a little tricky, and is left as an exercise to those interested.... :yucky:

Mini-Geek 2010-09-29 00:48

[QUOTE=chalsall;231768]Here's a code snippet in Perl to transform the data into CSV format which can be imported into any spreadsheet.[/QUOTE]

Cool snippet, you're obviously more skilled in the way of regex than I. :bow:
The deltas aren't being parsed for me. I didn't really care to try to use them, but I wasn't expecting it to be blank.
Also, it should be noted that on the Top Producers page it only gets the rank, name, and Ghz Days, not any of the other info. Not a big deal to me, just noting it. It's such a different format, it might be pretty hard to write for.

I made some minor modifications to it, and thought I'd share the different version.
[CODE]open(IN, $ARGV[0].'.txt');
open(OUT, '>'.$ARGV[0].'.csv');

print OUT "Rank,Name,\"Ghz Days\",Attempts,Successes,Deltas\n";
while (<IN>) {
  if (/^\s*(\d*)\s*(.*)\s+(\d+\.\d*)\s*(\d*)\s*(\d*)\s*\|(.*)/) {
    $Rank      = $1;
    $Name      = $2;
    $GHzDays   = $3;
    $Attempts  = $4;
    $Successes = $5;
    $Deltas    = $6;
    print OUT "$1,\"$2\",$3,$4,$5\n";
    $i++;
    if ($i % 500 == 0) {
      print "on line $i\n";
    }
  }
}
close(IN);
close(OUT);[/CODE]This is usable from the command line. I'll assume you call the file rankings.pl. Copy/paste the rankings into a file like ll.txt. Then run "perl rankings.pl ll" (or just "rankings ll", if you've got things set up so that triggers correctly). It will print a status update to the screen every 500 lines (so you know it's working and have an idea of where it is), and it will save the CSV version to ll.csv. You can then open that with any spreadsheet program. :smile:
[QUOTE=Rhyled;231565][CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D  (Data retrieved 9-26-10)[/FONT]
[FONT=Courier New]GHz-Days      11%   4%    83%   12%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT]
[/CODE][/QUOTE]

11+4+83+12=110%
Any idea why your data shows that? I recrunched the data (from the "hourly report generated Sep 28 2010 11:00PM UTC") using chalsall's parsing script, and I found some ratios for GHz-days...that add to 100% :smile::
P-1: 3.71%
LL: 74.61%
DC: 11.12%
TF: 9.80%
ECM: 0.77%

I also compared the number of tests, as given by the "Attempts" line, and got very different results from you. I think this may be because the reports I got the numbers from record every step of the way, whereas yours looked only at current assignments (perhaps).
P-1: 0.25%
LL: 0.14%
DC: 0.09%
TF: 98.33%
ECM: 1.19%

Oddly, I found a relatively small, but still significant, discrepancy between the summed GHz-Days for the Overall report vs the sums of the individual reports. The individual reports sum to 7812823.76 but the overall report sums to 7868772.93, a difference of 55949.17 GHz-Days. That's about 0.7%. Anyone have a guess as to the reason?

Rhyled 2010-09-29 01:47

Banging head on wall
 
[QUOTE=Mini-Geek;231831]11+4+83+12=110%
Any idea why your data shows that?[/QUOTE]

Sigh - because my denominator was only 3 out of the 4 categories. Stupid mistake. The corrected version, based on my dataset is:
[CODE]
[FONT=Courier New]              TF    P-1   LL    LL-D[/FONT]
[FONT=Courier New]GHz-Days      10%   4%    75%   11%[/FONT]
[FONT=Courier New]Assignments   57%   1%    31%   12%[/FONT] (Current distribution)
[/CODE]

As for text handling, I use some shortcuts, especially for one-shot efforts like this. Simply choosing "paste special - text" into Excel got rid of all the annoying arrows and kept it to one line per member. I noticed the right hand side of the report was then fixed format, so all I had to do was find the second "|" in each string and extract substrings based on that position.
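The "find the second pipe and slice from there" trick can be sketched like so (the sample line is invented for illustration; the real report's field widths differ):

```python
# Locate the second '|' in a report line and slice relative to its position
# (fabricated sample line, for illustration only).
line = "  12  Some User   123.4 | 5 2 | 1.0 0.5 0.2 0.1"

first = line.index("|")
second = line.index("|", first + 1)   # position of the second pipe

head = line[:second].rstrip()         # everything before the second pipe
tail = line[second + 1:].split()      # the fixed-format fields after it

print(tail)
```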

I decided not to include ECM figures because I don't associate those with Mersenne primes and it was somewhat difficult to back them out of the Work Distribution List report.

I'm glad you ran the individual reports - I wasn't so motivated. I knew the total attempts on trial factorings would be high, but I didn't expect quite 98%. At this rate, we should have all the trial factoring done for the 100M prime search a decade before we get serious numbers of LL candidates run.

Mini-Geek 2010-09-29 12:29

[QUOTE=Rhyled;231838]Sigh - because my denominator was only 3 out of the 4 categories. Stupid mistake.[/QUOTE]

Oh, I see. Now our numbers (for GHz-Days) match, as I'd expect.
[QUOTE=Mini-Geek;231831]The deltas aren't being parsed for me. I didn't really care to try to use them, but I wasn't expecting it to be blank.[/QUOTE]

I noticed why, and it's stupidly simple: The deltas are in $6, and are saved to $Deltas. The output line only goes up to $5. So they're, basically, intentionally being ignored.
I also noticed that a lot of the code you included, while useful if you're planning to use the data in more Perl code, was useless and unused for me. Here's an updated version of my modification:
[CODE]open(IN, $ARGV[0].'.txt');
open(OUT, '>'.$ARGV[0].'.csv');

print OUT "Rank,Name,GHz-Days,Attempts,Successes\n";
while (<IN>) {
  if (/^\s*(\d*)\s*(.*)\s+(\d+\.\d*)\s*(\d*)\s*(\d*)\s*\|(.*)/) {
    print OUT "$1,\"$2\",$3,$4,$5\n";
    $i++;
    if ($i % 500 == 0) {
      print "on line $i\n";
    }
  }
}
close(IN);
close(OUT);[/CODE]I decided to exclude the Deltas completely, including the column header for it. Same usage as before.

chalsall 2010-09-29 13:56

[QUOTE=Mini-Geek;231900]I also noticed that a lot of the code you included, while useful if you're planning to use the data in more Perl code, was useless and unused for me.[/QUOTE]

Yes, as I mentioned in my post. I left them in as they're useful if you're going to process the data further in the script, and I thought it was also a good way of documenting what the regex extracted into which temporary variables.

Also, you'd correctly commented that this doesn't work on the "Totals Overall" report. For anyone who's interested, here's code for that report:

[CODE]if (/^\s*(\d+)\s*(.*)\s+(\d+\.\d*)\s*\|(.*)\|(.*)$/) {
  $Rank        = $1;
  $Name        = $2;
  $GHzDays     = $3;
  $Deltas      = $4;
  $Percentages = $5;
}[/CODE]

Note that the $Percentages variable still needs to be broken down into the six possible values.

chalsall 2010-09-30 17:32

[QUOTE=Mini-Geek;231831]Oddly, I found a relatively small, but still significant, discrepancy between the summed GHz-Days for the Overall report vs the sums of the individual reports. The individual reports sum to 7812823.76 but the overall report sums to 7868772.93, a difference of 55949.17 GHz-Days. That's about 0.7%. Anyone have a guess as to the reason?[/QUOTE]

A thought just came to me, which might explain this...

Did you run your analysis from a full dataset of each work type (and overall) ("Customize"... "End Rank" = 10000 results in 6705 records with GHzDays > 0.000 for the overall report as of right now, for example), or only the reports' default top 1000?

If the latter, this might explain what you observed. If the former, I have no idea....

Mini-Geek 2010-09-30 19:41

[QUOTE=chalsall;232096]A thought just came to me, which might explain this...

Did you run your analysis from a full dataset of each work type (and overall) ("Customize"... "End Rank" = 10000 results in 6705 records with GHzDays > 0.000 for the overall report as of right now, for example), or only the reports' default top 1000?

If the latter, this might explain what you observed. If the former, I have no idea....[/QUOTE]

It was with all results, which is the default before you click Customize. When you click Customize, it changes to 1000. In checking that out, I just noticed the reason for the difference: I only took from the given links under Top Producers, but there's another category, visible under Customize: ECM on Fermat numbers! I guess I figured the ECM link included both, (or just forgot about ECM on Fermat) but it specifically says "ECM on small Mersenne numbers". When you click Customize, you get the option to see ECM on Fermat numbers. I'd have to rerun all the numbers to get a perfect record, but the current GHz-Days for the last year of ECM on Fermat numbers is 56187.37. That's a difference of just 238.2 from the last time I ran the report, which can probably be attributed to the recent work done. So I'd say it's almost certainly the only significant cause of the difference I observed.
So ECM on Fermat is about 0.71% of the total GHz-Days, which is just a little less than ECM on Mersenne.

chalsall 2010-09-30 20:04

[QUOTE=Mini-Geek]It was with all results, which is the default before you click Customize.[/QUOTE]

I'm not entirely sure you are correct here.

For empirical evidence, do all of the default queries provide more than 1000 records (other than, perhaps, ECM-F, which provides the full dataset in fewer than 1000 records)?

If they don't provide more than 1000 records, then your claim you're working from the full data sets is clearly false.

Mini-Geek 2010-09-30 20:08

[QUOTE=chalsall;232109]Mini-Geek: "It was with all results, which is the default before you click Customize."

I'm not entirely sure you are correct here.

For empirical evidence, do all of the data sets provide more than 1000 records (other than, perhaps, ECM-F)?

If they don't provide more than 1000 records, then your claim you're working from the full data sets is clearly false.[/QUOTE]

None contain exactly 1000; most have more, some less. I'm quite sure it's not limited to 1000, or any other obvious number. Here are the counts (from line counts of the text files, which equates to the number of users, not the rank at which all the ones with 0 credit tie), just to clarify/verify:
All: 7629
P-1: 4430
TF: 4048
LL: 3066
DC: 2394
ECM: 458
ECM-F: 139
As you can see, only the two ECMs have under 1001 people. With the now-marginal difference, I'm pretty darn sure there's nothing else being missed.
Now that all the reports are showing as the 7:00 PM report, I can recalculate. I'll do that now and either edit or post, hopefully I'll now see exactly 0 unaccounted for. :smile:

Mini-Geek 2010-09-30 20:29

[QUOTE=Mini-Geek;232111]Now that all the reports are showing as the 7:00 PM report, I can recalculate. I'll do that now and either edit or post, hopefully I'll now see exactly 0 unaccounted for. :smile:[/QUOTE]

Well, not exactly 0, but plenty close enough for my purposes: 0.03 GHz-Days apart this time! (7870898.66 in my sum vs 7870898.63 in the total report)

Here are the new GHz-Days percentages (out of all GIMPS work):
P-1: 3.68%
LL: 74.04%
DC: 11.08%
TF: 9.74%
ECM: 0.76%
ECM-F: 0.71%

And Attempts percentages (out of all GIMPS work):
P-1: 0.25%
LL: 0.14%
DC: 0.09%
TF: 98.29%
ECM: 1.19%
ECM-F: 0.04%

And something new: the ratio of Successes to Attempts in each category. This has a different meaning per category, but it's still fun to compare. :smile:
P-1: 4.70%
LL: 0.00%
DC: 93.39%
TF: 2.32%
ECM: 0.59%
ECM-F: 0.01%

And just for the record: none of these categories 'just happened' to have 1000 results. They're all the full rankings. Also, this was all based off of the Sep 30, 7:00 PM hourly report.

chalsall 2010-09-30 22:57

[QUOTE=Mini-Geek;232115]Well, not exactly 0, but plenty close enough for my purposes: 0.03 GHz-Days apart this time! (7870898.66 in my sum vs 7870898.63 in the total report)[/QUOTE]

Thanks for your work here, Mini-Geek. It fully answers a question many have had.

The minor difference you've found working on the full publicly available dataset is probably explained by the fact that PrimeNet rounds all individual records to 0.001 GHzDays.


All times are UTC.

Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.