How are GHz-Days calculated?
I know someone who was running Prime95 on their i7-920 (2.8GHz, four physical cores) for several weeks, doing LL double checks for 8 to 12 hours per day. They got roughly 450 GHz-Days, which made sense: that came to 40 days or so worth of non-stop work.
I use my laptop for school, but since I'm off for four months, instead of leaving it sitting around I've got it propped up to open the vent a bit and have been running some trial factoring on it. I've been getting about 3.5 GHz-Days on assignments that take about a day and a half (which adds up, since the processor runs at 2.3GHz).

Now, I've done a bit of reading here and there, and the important bits are, first, that Prime95 version 27 has some tweaks which make it run MUCH faster on third-generation Intel Core i-series processors, and second, that Ivy Bridge is considerably faster, clock for clock, than its predecessors. All in all, it would seem that my "latest tech" processor pulls a lot more weight at the same frequency. Not that I'm actually upset about this, but I thought it seemed a bit 'unfair' that someone running 100 GHz worth of Core 2 Duo could get five times the credit of someone running 20GHz worth of Ivy Bridge, even though they're accomplishing the same amount of work (by virtue of the newer tech just being a faster part).

A couple of days ago, I finished my first set of four LLs in the upper sixty millions. At 4.6GHz, running for 14 days on four cores, I would have expected ~250 GHz-Days, but to my (happy) surprise, I got about 500. Also, my GPU managed to complete about 300 GHz-Days per day doing trial factoring. I've decided that the CPU credit is probably not determined by the frequency times the number of days it took to run the work, but determined for the actual assignment (before it's even sent out).

How is the amount of credit for a job determined, then? Which hardware is it based on?
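For illustration, the naive clock-times-time arithmetic behind that ~250 GHz-Days expectation can be sketched in a few lines of Python; the function name is made up, and the figures are the ones quoted in the post above.

```python
# Naive GHz-days estimate: clock rate x number of cores x elapsed days.
# This is what you'd expect if credit were purely frequency-and-time-based.
def naive_ghz_days(clock_ghz: float, cores: int, days: float) -> float:
    return clock_ghz * cores * days

# 4.6 GHz on four cores for 14 days -> roughly 257.6 GHz-days expected,
# yet the credit actually awarded for those assignments was ~500.
print(naive_ghz_days(4.6, 4, 14))
```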
A GHz-Day is based on an Intel Core 2 day, I think.
[QUOTE=Unregistered;340634]I've decided that the CPU credit is probably not determined by the frequency times the number of days it took to run the work, but determined for the actual assignment (before it's even sent out).[/QUOTE]... (or after it's been reported). The point is that the CPU credit is indeed determined only by what the actual assignment is, unaffected by what type of equipment runs it or the elapsed wall clock time in which it was accomplished.
[QUOTE=Unregistered;340634]I thought it seemed a bit 'unfair' that someone running 100 GHz worth of Core 2 Duo could get five times the credit of someone running 20GHz worth of Ivy Bridge, even though they're accomplishing the same amount of work (by virtue of the newer tech just being a faster part).[/QUOTE]Here's a more appropriate analysis: GIMPS's progress is not measured by wall clock time spent on assignments; it's measured by how much work (arithmetical computation) has been done. Thus, basing "GHz-days credit" (for those who care about such) on what was actually accomplished (from GIMPS's point of view), rather than on how much wall clock time was spent in accomplishing it, is the "fair" way. GIMPS isn't in the business of rating how "good" your system is, or how fast or how slow. It's in the business of accomplishing a certain set of arithmetical calculations.

Per unit of wall clock time, a 100 GHz system does five times the work of a 20 GHz system, disregarding type of processor and assuming the type of work is the same for both. Therefore, it's quite fair to award five times as much "GHz-days credit" (for, say, a month's elapsed time of work) to the former as to the latter.

Now, each type of processor has a different set of characteristics. Some processors can accomplish a certain type of work, such as TF, faster than others at the same GHz clock rate. Some GIMPS participants find it useful to have information on those differences, so GIMPS tabulates the various benchmark results for them. Other participants won't care. Some participants care about making the most efficient use (in the sense of accomplishing assignments in a given elapsed time) of the processors at their disposal. They might not be as happy doing what their processors are relatively less efficient at [I](if they know that!)[/I] as doing what their processors are relatively more efficient at. Such participants are well-advised to do what makes them happiest.
Other participants (such as me) are not as concerned about a processor's relative efficiency as they are about what type of work is accomplished. (I do what I'm happiest doing, regardless of whether someone else might scorn my system as "slow". I take my processor's relative efficiency into account, but only as one, not-most-important, factor.)

[quote]How is the amount of credit for a job determined, then? Which hardware is it based on?[/quote]As kracker indicated, it's based on how long it takes an Intel Core 2 to accomplish that type of assignment. (BTW, every several years that definition is updated to whichever processor type is then more modern and widely used. The original GIMPS GHz-day, back in the 1990s, was based on the performance of the Intel Pentium CPU that our project's founder was himself using at that time.)
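In other words, the credit is a property of the assignment itself, normalized to a reference machine. A minimal sketch of that idea, assuming a hypothetical reference-timing table — the 2.4 GHz clock and the per-assignment day counts below are invented for illustration, not GIMPS's actual benchmark values:

```python
# Hypothetical sketch of assignment-based credit: the credit for a job
# is fixed by how long the *reference* machine (an Intel Core 2 here)
# would take to run it, not by the hardware that actually runs it.
# All timing numbers below are invented for illustration only.

CORE2_CLOCK_GHZ = 2.4  # assumed reference clock rate

# Days a single reference core would need, per assignment type (invented).
REFERENCE_DAYS = {
    "LL_66M": 52.0,
    "TF_row": 1.5,
}

def credit_ghz_days(assignment_type: str) -> float:
    # Credit = reference elapsed days x reference clock rate, so two
    # different machines finishing the same assignment get identical
    # credit, regardless of how long each actually took.
    return REFERENCE_DAYS[assignment_type] * CORE2_CLOCK_GHZ
```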
[QUOTE=Unregistered;340634]
How is the amount of credit for a job determined, then? Which hardware is it based on?[/QUOTE] Why does it matter how it is calculated? Why does the amount of "credit" matter? If one participates in these kinds of computations one should do it because of interest in the art of computation or interest in the subject. Participating merely to gather "cpu credit" is a poor motivation. Furthermore, measuring one's worth by such "credit" only assures that people with the most hardware get the most credit. Do you really believe that someone who has more hardware than you (especially if it is not their [i]own[/i] hardware) deserves more credit than you do? OTOH, if one participates because one wants to [i]learn[/i] some computer science or mathematics, that is a worthwhile motivation. But such people do not care about gathering "credits".
[QUOTE=R.D. Silverman;340681]Why does it matter how it is calculated?[/QUOTE]Why is it so important to you to intrude your sour opinion of "credit" into so many threads?
Are you unable to stem a compulsion? Is it because you want to persuade everyone else to adopt the same life guidelines that you personally follow? What harm, exactly, could "credits" do to GIMPS that would outweigh the harm done by your sour rants in driving people out of this project? -- Or would you rather see such folks leave GIMPS than have them participate for the "wrong" reasons?
In my opinion, credit is a nice way of giving an electronic "thank you". Most people appreciate thanks.
[QUOTE=Brian-E;340713]In my opinion, credit is a nice way of giving an electronic "thank you". Most people appreciate thanks.[/QUOTE]
Not like you can do anything with them, though. Oh wait, you "could" say: "Haha, look, I'm making/producing more than you!" :devil:
[QUOTE=R.D. Silverman;340681]Why does it matter how it is calculated?
Why does the amount of "credit" matter? If one participates in these kinds of computations one should do it because of interest in the art of computation or interest in the subject.[/QUOTE] If the credit is based on computation rates, then an interest in credit is an interest in computation and math: you could gauge the rate of change of credit/day, and since some systems might have a faster rate of change and therefore be better at the computations, the differences could then be quantified. It could also spark an interest in other areas.
[QUOTE=R.D. Silverman;340681]If one participates in these kinds of computations one should do it because of interest in the art of computation or interest in the subject.[/QUOTE]
Wrong. Your error is in assuming that everybody should have the same motivations as you have.
Not what I had in mind
Whoa ho ho there folks. Let's not get too carried away.
@Kracker: Thank you. The answer I was hoping for. I was glad to get the extra info too, but this answered the question perfectly.

@Cheesehead: I feel like there may have been a slight misunderstanding, but that must be on my end, because looking back I can't find what seemed to me like an accusation of being a credits hog. Haha. At any rate, I do agree that 100GHz of computing should get five times the credit of 20GHz of computing, assuming the same amount of work, clock for clock. I think I missed that the first time I read through (you mentioned it a bit further down), so it seemed like you were telling me that my 20 GHz worth of i5-3570K should make a fifth of the credit of 100GHz worth of Pentium, despite the former being leaps and bounds ahead of the latter, clock for clock, and making much more progress. Like I said, now that I've re-read, I see that we are in perfect agreement.

@R.D. Silverman: I don't want to start a nerd rage fest here, but I do want to throw in my two cents. There really isn't anything for you to take back, and I'm not asking you to backtrack, but perhaps you could chill out, just a little. You're being too defensive. I don't know how to quote, so I'll do it the old-fashioned way. I quote: "...one should do it because of..." DOES in fact sound to me like you're saying one's motivations are wrong. The claim that you're telling us to have the same motivations as you is only as much of a stretch as your own assumption; I quote: "Participating merely to gather 'cpu credit' is a poor motivation." On its own, it doesn't look like much, but paired with "Why does it matter how it is calculated?" and "Why does the amount of 'credit' matter?" it does seem like you've made the same "mistake" wblipp made. The cynic in me feels like you assumed that all I wanted was CPU credits and you felt the need to express your disdain for someone like that. Fair enough. I'll give you the benefit of the doubt.
To be honest, I pretty much do only care about maximizing CPU credits, but only because I now know that CPU credits are directly related to the actual amount of work that has been accomplished. Cheesehead hit the nail on the head. I'm just interested in getting the most out of my hardware, for the sake of the project.

When I found out my GPU does trial factoring a hundred times faster than both CPU cores of my laptop put together, I immediately set the laptop up to gather P-1 work instead. At least this is something no other hardware can do way, way faster. If the project just took a GHz for what it was, then I wouldn't really care what kind of work I do; I would choose whatever needed doing most. But because I've been given a meaningful metric for the quantity of work I accomplish, I can choose whatever kind of work my hardware does best and get the most mileage out of my parts. So, as you see, while what I want does really boil down to just credits, I do actually take this project seriously.

And for the record, yes, I do think that someone using more hardware or faster hardware should get more credit. They are moving the project along faster. As for people getting credit for using hardware that doesn't belong to them, well, let's just say I don't think Curtis Cooper is picking up very many ladies at the bar by lying to them about how much CPU credit HE PERSONALLY gets for GIMPS. He probably sees it like most of us: a metric for the quantity of work we accomplish.
Closing thread - it can only go downhill from here. This debate has been argued many times before.