mersenneforum.org  

Go Back   mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > GPU Computing

Old 2014-08-03, 17:43   #12
TheMawn
 

Quote:
Originally Posted by Robert_JD
Any other PSU that has a non-Gold rating, and/or is rated at less than 800 watts, is potentially courting disaster in my opinion.
I would say this is a bit harsh. I have a 750 Watt 80+ Silver PSU, so I fail both of your tests, yet I have had zero issues. I use customer reviews more than anything, and I read some of the bad ones just to see why they get 1/5 or 2/5 reviews.

I would shy away from anything that doesn't have an 80+ certification at all, but 80+ Bronze should be okay. Check out http://en.wikipedia.org/wiki/80_Plus...certifications

I would stick to the well known manufacturers to start with. Corsair is my first stop, although I wouldn't be afraid to go Seasonic, Thermaltake, or Coolermaster.

It all depends on your wattage, though. For me, running my 750 W unit at 50%, which is 375 W, the difference between Silver and Gold is about 2%, so I would waste roughly 7.5 watts less with a Gold-rated PSU. That little bit of excess heat won't make much of a difference.

On the other hand, if you're looking at a 1300 W unit near 100% load, a Bronze PSU is losing about 234 W, which has to be dissipated somehow, whereas a Platinum PSU loses only about 143 W (and that 90 W difference does add up to a few dollars over the years).
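For anyone who wants to redo the arithmetic, here is a quick Python sketch. The 80 Plus efficiency percentages are typical published figures, assumed here for illustration, not measurements of any particular PSU:

```python
# Wasted watts at a given DC load for two efficiency tiers.
def wasted_watts(dc_load_w, efficiency):
    """AC drawn from the wall minus DC delivered, i.e. heat in the PSU."""
    return dc_load_w / efficiency - dc_load_w

# 375 W load: Silver (~88% at half load) vs Gold (~90%)
print(round(wasted_watts(375, 0.88) - wasted_watts(375, 0.90), 1))  # 9.5 W

# ~1300 W load: Bronze (~84.75%) vs Platinum (~90.1%), reproducing the
# 234 W and 143 W loss figures quoted above
print(round(wasted_watts(1300, 0.8475)))  # 234
print(round(wasted_watts(1300, 0.901)))   # 143
```

Note the 2%-of-load shortcut slightly understates the gap, since it is the AC input (not the DC output) that scales with efficiency; done exactly, the 375 W case comes out closer to 9.5 W than 7.5 W.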
Old 2014-08-04, 02:07   #13
Mark Rose
 

If you'll be running full out, the additional cost of a Gold or Platinum power supply will pay for itself over a Bronze or unrated PSU in a year or so. It's not such a big win for typical desktop use, but for anyone on this site it almost certainly would be.

Last fiddled with by Mark Rose on 2014-08-04 at 02:07
Old 2014-08-04, 02:52   #14
kladner
 

Quote:
Originally Posted by Mark Rose
If you'll be running full out, the additional cost of a Gold or Platinum power supply will pay for itself over a Bronze or unrated PSU in a year or so. It's not such a big win for typical desktop use, but for anyone on this site it almost certainly would be.
I switched from an overloaded 750 W Bronze to a moderately adequate 1000 W Gold, and consumption went down very noticeably. I can't give figures, as it was quite a while ago, but it did make me sit up and take notice of the Kill A Watt.

The change above took place when I added a GTX 570 to a 460 already in use. All of a sudden BSODs were all too common.

Stepping up to a kilowatt PSU was plenty then. Things got tighter when I swapped out the GTX 460 for a 580, especially with overclocks.
Old 2014-08-04, 03:28   #15
Robert_JD
 

Quote:
Originally Posted by TheMawn
I would say this is a bit harsh. I have a 750 Watt 80+ Silver PSU, so I fail both of your tests, yet I have had zero issues. I use customer reviews more than anything, and I read some of the bad ones just to see why they get 1/5 or 2/5 reviews.

I would shy away from anything that doesn't have an 80+ certification at all, but 80+ Bronze should be okay. Check out http://en.wikipedia.org/wiki/80_Plus...certifications

I would stick to the well known manufacturers to start with. Corsair is my first stop, although I wouldn't be afraid to go Seasonic, Thermaltake, or Coolermaster.

It all depends on your wattage, though. For me, running my 750 W unit at 50%, which is 375 W, the difference between Silver and Gold is about 2%, so I would waste roughly 7.5 watts less with a Gold-rated PSU. That little bit of excess heat won't make much of a difference.

On the other hand, if you're looking at a 1300 W unit near 100% load, a Bronze PSU is losing about 234 W, which has to be dissipated somehow, whereas a Platinum PSU loses only about 143 W (and that 90 W difference does add up to a few dollars over the years).
Can't really argue with what you just stated. Obviously, every rig is unique, especially when extrapolating overall power requirements. Nevertheless, going by my recent experience, I'm inclined to stick with high-wattage, 1 kW-plus PSUs (a 5-year warranty is a must), even if that means wattage overkill, e.g., a 3770K Ivy Bridge, 8 GB of memory, one 2 TB drive, and one Asus Titan drawing 250 watts.
Old 2014-08-04, 03:37   #16
LaurV

Quote:
Originally Posted by Robert_JD
and one Asus Titan drawing 250 watts.
That's only with DP disabled (mfaktc). If you enable DP (cudaLucas), it may go to 280 or 300 W, and an "unlocked" one may go to 330 W.

Last fiddled with by LaurV on 2014-08-04 at 03:37
Old 2014-08-04, 05:56   #17
Robert_JD
 

Quote:
Originally Posted by LaurV
That's only with DP disabled (mfaktc). If you enable DP (cudaLucas), it may go to 280 or 300 W, and an "unlocked" one may go to 330 W.
About 90 percent of the time, I run the latest CUDALucas 2.0.5 release on BOTH of my Titans (EVGA Black and Asus) with DP enabled at stock specs. So I surmise that a constant power draw of at least 300 W, going 24/7 or close to it, degraded the caps and other components of that 750 W PSU to the point where the unit *somehow* became overloaded and took out the card (it only lasted 8 months). As added insurance, I intend to purchase only high-capacity, 1 kW-plus units for future Titan-oriented machines, because the trauma of losing another $1100-1200 card is an experience I definitely don't want to repeat.
Old 2014-08-04, 06:05   #18
LaurV

(Past the edit limit, so a new post.) Someone's asking how to enable DP and unlock full DP performance: open the NVIDIA Control Panel and navigate to "Manage 3D Settings". In the "Global Settings" tab you will find an option titled "CUDA - Double Precision" (or "cuda DP").

Caution: due to power limiting, Titans run at reduced clock speeds when full double precision is enabled, to stay within their power budget. This is not good for gaming performance, and it carries a big penalty for TF (up to 20-30%, i.e. instead of 440 GHzD/D you will get only 400, or 370, or so**, depending on many other parameters: whether the card drives a display, whether it does physics, etc.).

Still, this is a great option if you are LL testing or running other CUDA DP applications (like CUDA FFT work). You will get a big gain for LL (up to 40%, i.e. instead of 3 ms/iter you will get 1.8 ms/iter).**
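A quick arithmetic check on those example ratios (illustrative numbers only, taken from the examples above):

```python
# LL: 3.0 ms/iter drops to 1.8 ms/iter with DP enabled
ll_time_saved_pct = (3.0 - 1.8) / 3.0 * 100
print(round(ll_time_saved_pct))  # 40 (% less time per iteration)

# TF: a 30% penalty on a 440 GHzD/D card
print(round(440 * (1 - 0.30)))   # 308 GHzD/D remaining
```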

So, depending on what you are using your card for, you can go one way or the other. Physically, the Titans are "power locked": they either run the internal SP cores at full capacity (good for TF) while using only a few DP cores (or emulating DP operations with SP cores), or run all DP cores at full capacity (good for LL) at the expense of the SP cores.

By flashing the card's firmware, or by physically replacing the shunt resistors on the PCB with smaller ones (yes, cards may use external shunt resistors to measure power consumption, though not all manufacturers do), one can "unlock" a Titan, i.e. allow it to draw more power and get higher performance. Of course, the lifetime of the Titan decreases considerably this way (and you instantly void the warranty). But that is another story.
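For the curious, here is a simplified sketch of why the shunt trick works: the controller infers current from the voltage drop across a shunt of assumed (nominal) value, so a physically smaller shunt produces a smaller drop for the same real current, and the reported power reads low. All resistor and current values below are hypothetical:

```python
# Simplified model of shunt-based power sensing.
R_NOMINAL = 0.005  # 5 mOhm nominal shunt value assumed by the controller

def reported_power(actual_current_a, actual_shunt_ohm, rail_v=12.0):
    v_drop = actual_current_a * actual_shunt_ohm  # real voltage drop (Ohm's law)
    inferred_i = v_drop / R_NOMINAL               # controller's current estimate
    return rail_v * inferred_i                    # P = V * I, as the card sees it

# 20 A of real draw (240 W on a 12 V rail):
print(round(reported_power(20, 0.005), 1))   # stock shunt: honest 240.0 W reading
print(round(reported_power(20, 0.0025), 1))  # half-value shunt: card believes 120.0 W
```

With the card believing it is at half power, the limiter allows roughly twice the real draw before throttling, which is exactly the "unlock" described above.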

-----------
** The numbers are given as examples of the ratios involved; your mileage may vary.

Last fiddled with by LaurV on 2014-08-04 at 06:19
Old 2014-08-04, 07:44   #19
Robert_JD
 

Quote:
Originally Posted by Robert_JD
About 90 percent of the time, I run the latest CUDALucas 2.0.5 release on BOTH of my Titans (EVGA Black and Asus) with DP enabled at stock specs. So I surmise that a constant power draw of at least 300 W, going 24/7 or close to it, degraded the caps and other components of that 750 W PSU to the point where the unit *somehow* became overloaded and took out the card (it only lasted 8 months). As added insurance, I intend to purchase only high-capacity, 1 kW-plus units for future Titan-oriented machines, because the trauma of losing another $1100-1200 card is an experience I definitely don't want to repeat.
Edit: oops, I neglected to add that first-time LL tests make up that 90 percent of usage, while the remaining 10 percent is TF with DP disabled, which gives me about 600 GHzD/D on Win 8.1 and the mid-500s GHzD/D on a Linux platform.
Old 2014-08-05, 00:40   #20
owftheevil
 

For me, running a Titan or Titan Black adds ~260 W for CL or CPm1 and ~200 W for mfaktc. I've been running three Titans (two vanilla, one Black) 19 hours a day for a few weeks. Electricity is $0.48 per kWh between 2 pm and 7 pm until October, so I shut the Titans down during those hours.
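A rough sketch of what that peak-hour shutdown avoids each day. Only the $0.48/kWh peak rate, the 2 pm to 7 pm window, and ~260 W per Titan come from this post; treating it as a flat 260 W per card for all three is an assumption:

```python
# Daily cost avoided by idling three Titans through the peak window.
titans_kw = 3 * 0.260      # three Titans at ~260 W each under CL-style load
peak_hours = 5             # 2 pm to 7 pm
peak_rate = 0.48           # dollars per kWh

savings_per_day = titans_kw * peak_hours * peak_rate
print(round(savings_per_day, 2))  # 1.87 dollars per day avoided
```

Nearly two dollars a day, which over a summer of LL testing is a very real number.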

A bit off topic: @Robert_JD, or anyone else running a Titan Black, what kind of memory speeds does yours run at? I can't get mine to reach the advertised 7000 MHz.
Old 2014-08-05, 01:48   #21
Robert_JD
 

Quote:
Originally Posted by owftheevil
For me, running a Titan or Titan Black adds ~260 W for CL or CPm1 and ~200 W for mfaktc. I've been running three Titans (two vanilla, one Black) 19 hours a day for a few weeks. Electricity is $0.48 per kWh between 2 pm and 7 pm until October, so I shut the Titans down during those hours.

A bit off topic: @Robert_JD, or anyone else running a Titan Black, what kind of memory speeds does yours run at? I can't get mine to reach the advertised 7000 MHz.
Funny you should mention that - I thought it was just MY imagination!

No matter whether I try to OC or run at stock, I seem to be stuck at the 6000 mark, no higher and no lower, unlike the more vanilla Asus I have, which, when overclocked, can correctly resolve M57885161 in just over 31 hours. The EVGA Black version, OTOH, does it in 33 hours but no faster without generating spurious residuals. Perhaps an updated driver is the solution; at this point I really don't know what other issues might be involved with the Black series.
Old 2014-08-05, 02:57   #22
owftheevil
 

Thanks for the response. That makes 3 out of 3 (a fellow over at the EVGA forums also reports only 6000 MHz on compute tasks). Although NVIDIA gave us Linux users back the ability to overclock or underclock in software, it is very limited and can't be used to adjust the memory clocks on the Titan Blacks.