2012-08-20, 18:29   #7
pinhodecarlos ("Carlos Pinho", Milton Keynes, UK)
Quote:
Originally Posted by VBCurtis
If the CUDA-LLR becomes more efficient per-watt than CPU, should we stop running it on CPUs because that is now a waste of energy? Why does this matter?
First question: Yes.
Second question: For you it doesn't matter, but for me it does. It's the principle of energy saving, of producing more with less energy; that's what we call energy efficiency.

Anyway, I am not criticizing you; you can do whatever you want, it's your money, not mine. I just want people to understand, when they look at the electricity bill at the end of the month, whether it was an advantage to run LLR on a GPU or on a CPU. You can compare two different months, one running only a GPU and one running only a CPU, taking into consideration the candidates tested. The watt-per-candidate ratio is still better for a CPU.
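
To make the comparison concrete, here is a rough sketch of the arithmetic I have in mind. The kWh figures and candidate counts below are placeholders I made up, not measurements, so plug in your own meter readings and LLR totals:

Code:
# Compare a CPU-only month against a GPU-only month by energy per candidate.
def kwh_per_candidate(kwh_used, candidates_tested):
    """Energy drawn from the wall per LLR candidate completed."""
    return kwh_used / candidates_tested

# Placeholder figures -- replace with your own meter readings and LLR stats.
cpu_month = kwh_per_candidate(kwh_used=70.0, candidates_tested=900)
gpu_month = kwh_per_candidate(kwh_used=160.0, candidates_tested=1500)

print(f"CPU month: {cpu_month:.3f} kWh per candidate")
print(f"GPU month: {gpu_month:.3f} kWh per candidate")
print("CPU cheaper per candidate" if cpu_month < gpu_month
      else "GPU cheaper per candidate")

The GPU month can still finish more candidates in total; the point is only what each candidate costs you in energy.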

Edit: Now I am trying to work out which of these three CPUs is more efficient: i7-3770K vs i7-2600K vs i7-3930K, because I want to help test some k=5 candidates. Everything points to the i7-3770K: better ROI and a better watt-per-candidate ratio.
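
For the CPU choice the same kind of arithmetic applies, just with the purchase price folded in. All of the wattage, throughput and price figures below are invented placeholders, not benchmarks, so treat this only as a template:

Code:
# Placeholder data (watts at load, LLR candidates per day, price in GBP).
cpus = {
    "i7-3770K": {"watts": 95,  "cands_per_day": 40, "price_gbp": 260},
    "i7-2600K": {"watts": 110, "cands_per_day": 34, "price_gbp": 230},
    "i7-3930K": {"watts": 160, "cands_per_day": 58, "price_gbp": 430},
}

price_per_kwh = 0.14  # assumed electricity tariff in GBP

for name, c in cpus.items():
    # Lower is better: watt-hours that go into one candidate.
    wh_per_candidate = c["watts"] * 24 / c["cands_per_day"]
    # Crude one-year ROI: candidates per pound of hardware plus electricity.
    year_kwh = c["watts"] * 24 * 365 / 1000
    year_cost = c["price_gbp"] + year_kwh * price_per_kwh
    cands_per_pound = c["cands_per_day"] * 365 / year_cost
    print(f"{name}: {wh_per_candidate:.1f} Wh/candidate, "
          f"{cands_per_pound:.1f} candidates per GBP over a year")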

Last fiddled with by pinhodecarlos on 2012-08-20 at 18:43