mersenneforum.org > Great Internet Mersenne Prime Search > Hardware
Old 2013-11-11, 10:53   #1
Manpowre
 
"Svein Johansen"
May 2013
Norway

3×67 Posts
GTX 780 Ti

http://anandtech.com/show/7492/the-g...-780-ti-review

http://www.tomshardware.com/reviews/...arks,3663.html

2880 stream processors, but 1/24 the DP speed of SP. :(

Titan is still king for CUDALucas, but mfaktc will be faster.
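As a rough sanity check of why that's so, here's a back-of-envelope sketch. The core counts come from the reviews above; the clock speeds are assumed approximate base clocks, so treat these as ballpark figures only, not measurements:

```python
# Rough peak-throughput arithmetic: 2 FLOPs (one FMA) per core per cycle.
# Clock speeds below are assumptions (approximate base clocks).
def peak_gflops(cores, clock_ghz, dp_ratio):
    sp = 2 * cores * clock_ghz  # single-precision peak, GFLOPS
    dp = sp * dp_ratio          # double-precision peak, GFLOPS
    return sp, dp

for name, cores, clock_ghz, dp_ratio in [
    ("GTX 780 Ti", 2880, 0.875, 1 / 24),
    ("GTX Titan",  2688, 0.837, 1 / 3),
]:
    sp, dp = peak_gflops(cores, clock_ghz, dp_ratio)
    print(f"{name}: ~{sp:,.0f} GFLOPS SP, ~{dp:,.0f} GFLOPS DP")
```

Despite the extra ~200 cores, the 780 Ti's peak DP comes out around 210 GFLOPS versus roughly 1.5 TFLOPS for the Titan, which is why the Titan stays on top for CUDALucas while the SP-bound mfaktc benefits.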
Old 2013-11-11, 10:55   #2
Manpowre
 
"Svein Johansen"
May 2013
Norway

311₈ Posts

From Tomshardware.com:

Quote:
One might expect to see massive performance from Nvidia’s new offering here, but the GeForce GTX 780 Ti’s double-precision performance (1/24-rate) is much more limited than what you can achieve with GeForce GTX Titan (1/3-rate).

In many applications, this really doesn’t matter much, but the otherwise slower Titan is twice as fast in Blender. A look at a computational finance workload (Monte Carlo Price Options) shows a real-world double-to-single precision ratio of 1:25.8 for the GeForce GTX 780 Ti and 1:5.8 for the Titan. This is fairly close to the expected values. Clearly, you'll need to decide for yourself if lower compute performance is a problem before you spend $700 on a 780 Ti.
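One way to read those numbers is to compare the measured DP:SP ratio against each card's nominal hardware ratio, which shows how much of the card's DP capability that particular workload actually reaches. A small sketch using only the figures quoted above:

```python
# Measured DP:SP ratios from the quoted Monte Carlo benchmark vs. the
# cards' nominal hardware ratios (1/24 and 1/3).
cards = {
    "GTX 780 Ti": {"nominal": 1 / 24, "measured": 1 / 25.8},
    "GTX Titan":  {"nominal": 1 / 3,  "measured": 1 / 5.8},
}
for name, r in cards.items():
    # fraction of the nominal DP:SP ratio the benchmark actually achieved
    efficiency = r["measured"] / r["nominal"]
    print(f"{name}: {efficiency:.0%} of nominal DP:SP ratio")
```

The 780 Ti lands within a few percent of its 1/24 spec, while the Titan reaches only about half of its nominal 1/3 advantage in that workload.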
Old 2013-11-11, 11:01   #3
Manpowre
 
"Svein Johansen"
May 2013
Norway

311₈ Posts

The compute testing Anandtech did is very interesting; scroll down to the Folding@home double-precision results:

http://anandtech.com/show/7492/the-g...0-ti-review/14

Even the 290X, with its 1/8 DP-to-SP ratio, is slower than the 780 Ti, while the Titan is almost twice as fast as a single 290X.
Old 2013-11-11, 13:45   #4
LaurV
Romulan Interpreter
 
 
Jun 2011
Thailand

2·3·1,609 Posts

Yeah, by the way: since I saw that result on mersenne.ca, my question has been how real it is, and how many benchmarks (from different users/cards) back it up. The card has lousy DP performance, but it has about 200 more cores. Can the cudaLucas performance be real?
Old 2013-11-11, 15:30   #5
kracker
 
 
"Mr. Meeseeks"
Jan 2012
California, USA

2³×271 Posts

Yep, I figured nVidia would do something like this after the 290X.

Also, a DP ratio like "1/8" or "1/2" does not tell you the actual speed; it just means DP throughput is 1/8 (or 1/2) of single-precision throughput.

The Titan will be king in DP, but I think the 290X will be king in mfakto.
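kracker's point in numbers: the ratio only means something once it's multiplied by the card's SP rate. A sketch using approximate vendor peak SP figures (assumptions here, not measurements):

```python
# A DP ratio is relative: absolute DP rate = SP rate * ratio.
# The SP peak figures are approximate vendor numbers (assumptions).
cards = [
    ("R9 290X",    5600, 1 / 8),
    ("GTX 780 Ti", 5040, 1 / 24),
    ("GTX Titan",  4500, 1 / 3),
]
for name, sp_gflops, ratio in cards:
    print(f"{name}: ~{sp_gflops * ratio:.0f} GFLOPS peak DP")
```

So the 290X's 1/8 ratio is worth more than three 780 Tis in raw DP, yet still only about half a Titan; the ratio alone ranks nothing.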

Old 2013-11-11, 16:33   #6
Manpowre
 
"Svein Johansen"
May 2013
Norway

3·67 Posts

Quote:
Originally Posted by kracker View Post
Yep, I figured nVidia would do something like this after the 290X.

Also, a DP ratio like "1/8" or "1/2" does not tell you the actual speed; it just means DP throughput is 1/8 (or 1/2) of single-precision throughput.

The Titan will be king in DP, but I think the 290X will be king in mfakto.
Absolutely, the 290X will most probably be the single-precision king!
Old 2013-11-11, 16:42   #7
ixfd64
Bemusing Prompter
 
 
"Danny"
Dec 2002
California

5·479 Posts

I wonder if it's possible to modify the card to increase the DP performance. There's a huge thread on hacking Nvidia cards at EEVBlog here: http://www.eevblog.com/forum/chat/ha...l-counterparts
Old 2013-11-11, 18:04   #8
TheMawn
 
 
May 2013
East. Always East.

11×157 Posts

If AMD were to adopt something like Cuda or anything similar which would be useful for LL and P-1, I think our GPU end of things would turn red in a big hurry. The R9 290X just plain beats the GTX 780 Ti in DP computing but I think the issue is actually using it. Too bad.

Apparently it's good business practice to make the same chip and cripple its performance in different ways so it can be marketed as different GPUs? Unbelievable that some GTX 780 Ti is sitting in some dude's computer with idle hardware.

Obviously AMD and Nvidia aren't hurting all that much if they can afford to use drivers to disable a physical component on their GPU and sell it for $1000 less. Many poor sods who just don't know (hell, I didn't until now) are paying for a Tesla but are getting a Geforce with one or two switches pointing in other directions.
Old 2013-11-11, 18:49   #9
xilman
Bamboozled!
 
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

2²×5×7²×11 Posts

Quote:
Originally Posted by TheMawn View Post
Apparently it's a good business practice to make the same chip and cripple its performance in different ways to market them as different GPUs?
To me, that sounds rather naive.

For whatever reasons, none of which are intended, some chips don't make the grade.

Which would you rather do --- discard the inferior devices or market them as functional but of poorer performance than intended?

My preference ought to be clear.
Old 2013-11-11, 20:19   #10
kracker
 
 
"Mr. Meeseeks"
Jan 2012
California, USA

2168₁₀ Posts

Quote:
Originally Posted by TheMawn View Post
If AMD were to adopt something like Cuda or anything similar which would be useful for LL and P-1, I think our GPU end of things would turn red in a big hurry. The R9 290X just plain beats the GTX 780 Ti in DP computing but I think the issue is actually using it. Too bad.

Apparently it's a good business practice to make the same chip and cripple its performance in different ways to market them as different GPUs? Unbelievable that some GTX 780 Ti is sitting in some dude's computer with idle hardware.

Obviously AMD and Nvidia aren't hurting all that much if they can afford to use drivers to disable a physical component on their GPU and sell it for $1000 less. Many poor sods who just don't know (hell, I didn't until now) are paying for a Tesla but are getting a Geforce with one or two switches pointing in other directions.
What xilman said: the more expensive chips are binned higher so they will actually work and be stable for what they do. However, I think they are overpriced most of the time... But there is a reason there are cheap cards and expensive cards.
Old 2013-11-11, 22:09   #11
TheMawn
 
 
May 2013
East. Always East.

11·157 Posts

Quote:
Originally Posted by xilman View Post
To me, that sounds rather naive.

For whatever reasons, none of which are intended, some chips don't make the grade.

Which would you rather do --- discard the inferior devices or market them as functional but of poorer performance than intended?

My preference ought to be clear.
Quote:
Originally Posted by kracker View Post
What xilman said: the more expensive chips are binned higher so they will actually work and be stable for what they do. However, I think they are overpriced most of the time... But there is a reason there are cheap cards and expensive cards.
Neither of you understood my meaning, by the looks of things.

I'm not giving them crap for pricing a GTX 780 Ti higher than a GTX 780 despite it having the same chip. I fully agree that doing that is perfectly fine. It's analogous to the same silicon being binned as an i7-4770K or a plain i7-4770, depending on whether the luck of the draw made it better or worse than the standard for a 4770.

I'm giving them crap for this. And this. $3500 vs $730.


The GTX 780 Ti has its double-precision computing power crippled. The Tesla does not. Otherwise they're the same card. Go to the link in ixfd64's post, where the guy makes a few hardware and software hacks to fool his GTX 690 into thinking it's a Quadro.

Nvidia uses drivers that disallow the use of a number of the components based on the model. Driver says "You a 690 you no get fast DP" or "You a Quadro you get fast DP".

It's sort of similar to the story where you could flash the bios of your HD 6950 while shorting two pins and magically transform it into an HD 6970. The difference there was similar to the difference between the 780 and 780 Ti. The 780 has 1/15 of its hardware disabled because that portion of it didn't make the grade. The 780 Ti had all its hardware meet the cut but that was rare enough to justify binning it higher. Same as the 6950-6970 (though the two extra cores or whatever the hell it was DID actually function just fine).

I just can't see how only an eighth of the DP hardware "made the cut" on the GTX 780 Ti whereas the rest of the card is mint.
