ATI card under Linux, bad idea?
I want to buy a graphics card for my quad-core computer. I've heard good things about the crunching ability of ATI cards, but I've also heard they're a nightmare for Linux users because of lousy drivers.
I have a strong preference for ATI cards, but is my time better spent purchasing an NVIDIA card to avoid a lot of heartache? |
Hardly anything prime-related crunches on ATI, and nothing prime-related (that I know of) crunches [I]well[/I] on ATI (yet). I'd go NVIDIA. (And I did.)
|
[QUOTE=Ken_g6;250011]Hardly anything prime-related crunches on ATI, and nothing prime-related (that I know of) crunches [I]well[/I] on ATI (yet). I'd go NVIDIA. (And I did.)[/QUOTE]
ATI can sieve pretty well, and I have faith that some programmer will see the light when it comes to LLRing and graphics cards. ATI stuff is awesome, just under-utilized, which makes it seem very appropriate that AMD bought the company. |
[QUOTE=jasong;250027] ATI stuff is awesome [/QUOTE]
Go for it then... |
I'd be very grateful if someone could point me to how to make my Linux box use its GPU power (at least for Compiz), because CPU-rendered Compiz is really slow... (And yes, I have an ATI Mobility HD 3450.)
|
[QUOTE=jasong;250027]ATI can sieve pretty well, and I have faith that some programmer will see the light when it comes to LLRing and graphics cards. [/QUOTE]
This sentiment comes up regularly; see [url="http://mersenneforum.org/showthread.php?t=13918"]here[/url] for the last time. I can't add anything new that that thread doesn't already cover. The bottom line is that, right now and probably for the next six months to a year, if you want to do anything with a graphics card that does not involve graphics, you can either buy an Nvidia card, do it yourself, or do without. There is a fourth choice, which involves a lot of cursing the darkness; it is very popular. |
The only programs I know of that support ATI are psieve and tpsieve, both fairly specialized sieving programs. NVIDIA is much better supported by most programs and will be for quite some time.
|
Yep, ATI cards suck under Linux at the moment: more rendering artifacts, no CUDA, support for cards more than five years old gets dropped, and so on. The situation might get better, however, once the open-source Radeon drivers reach functionality and performance comparable to the proprietary ones.
ATI hardware may have more performance on paper than NVIDIA for the same price, but the actual performance at the moment is not impressive. |
Not fully related to GPGPU, but a Mozilla developer recently said that the only driver able to accelerate Firefox was the one from nVidia; see [URL="https://hacks.mozilla.org/2011/01/firefox-4-beta-9-a-huge-pile-of-awesome/comment-page-1/#comment-349829"]here[/URL]. I have kept away from ATI on Linux for years, and it doesn't look like I'll change my mind soon.
|
Radeons kick ass in some applications
[QUOTE=jasonp;250102]The bottom line is that right now and for probably the next six months to a year, if you want to do anything with a graphics card that does not involve graphics, you can either buy an Nvidia card, or do it yourself, or do without. [/QUOTE]
Radeon HD cards, especially the 5xxx and 6xxx series, are currently the best for mining Bitcoins due to their focus on integer performance. For the same reason, they are the preferred hardware for cracking passwords, as both operations involve a lot of hashing. [URL]https://en.bitcoin.it/wiki/Mining_hardware_comparison[/URL] [URL]http://golubev.com/gpuest.htm[/URL] I assume a suitable integer transform would make Radeons great for finding Mersenne primes. Unfortunately, many people with Radeons are busy making money, instead of contributing to science. |
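For the curious, the inner loop of Bitcoin mining is just two SHA-256 passes over an 80-byte block header, repeated for billions of nonce values, which is why raw 32-bit integer throughput decides the race. Here is a minimal CPU-side sketch, assuming OpenSSL is installed, with a toy trailing-zero-bytes test standing in for the real 256-bit target comparison:
[CODE]/* Toy sketch of Bitcoin's proof-of-work test: double-SHA256 the
 * 80-byte block header and count trailing zero bytes (Bitcoin stores
 * the hash little-endian, so its "leading" zeros sit at the end).
 * Real miners compare against a full 256-bit target; the zero-byte
 * count here is a stand-in.  Build: cc pow.c -lcrypto
 */
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <openssl/sha.h>

static int meets_toy_target(const uint8_t header[80], int zero_bytes)
{
    uint8_t h1[SHA256_DIGEST_LENGTH], h2[SHA256_DIGEST_LENGTH];

    SHA256(header, 80, h1);               /* first SHA-256 pass  */
    SHA256(h1, sizeof(h1), h2);           /* second SHA-256 pass */

    for (int i = 0; i < zero_bytes; i++)
        if (h2[SHA256_DIGEST_LENGTH - 1 - i] != 0)
            return 0;
    return 1;
}

int main(void)
{
    uint8_t header[80] = {0};  /* dummy header; real ones come from the network */

    /* A miner sweeps the 4-byte nonce at offset 76 -- pure 32-bit
     * integer work, which is where the 5xxx/6xxx Radeons shine.
     * Two zero bytes is a toy difficulty; the real one is far higher. */
    for (uint32_t nonce = 0; nonce < 1000000; nonce++) {
        memcpy(header + 76, &nonce, 4);   /* little-endian host assumed */
        if (meets_toy_target(header, 2)) {
            printf("nonce %u meets the toy target\n", nonce);
            return 0;
        }
    }
    printf("no nonce found in range\n");
    return 0;
}[/CODE]
A GPU miner runs this same double hash in parallel across thousands of work items, one nonce each. |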
[QUOTE=TeknoHog;256441]Radeon HD cards, especially the 5xxx and 6xxx series, are currently the best for mining Bitcoins due to their focus on integer performance. For the same reason, they are the preferred hardware for cracking passwords, as both operations involve a lot of hashing.
[URL]https://en.bitcoin.it/wiki/Mining_hardware_comparison[/URL] [URL]http://golubev.com/gpuest.htm[/URL] I assume a suitable integer transform would make Radeons great for finding Mersenne primes. Unfortunately, many people with Radeons are busy making money, instead of contributing to science.[/QUOTE] There are a lot of truths out there that hold depending on the perspective you see them from; unlike relativity theory, there is no objective truth here. One should always buy what one feels happiest with. I'd say place your bets now on how efficient OpenCL is.

Funny that this thread started around 27 Jan, whereas on 28 Jan 2011 AMD released a great Linux driver that supports GPGPU and also acts as a driver for the 6000-series cards. Note that starting with version 2.5 of their GPGPU pack (which gets a new name every few months), only OpenCL will be supported. Given that their flagship GPUs, the 5970 and 6990, were released quite some time ago, it's interesting to report that until recently OpenCL didn't work on the second GPU ("result undefined"). I assume that might get fixed now, as otherwise there is no way to do GPGPU on both GPUs of the dual-GPU cards.

As for OpenCL, you can expect that a whole lot of code will get written in it. How fast OpenCL is, I hope to be able to answer within a few weeks from now :) Such code would of course also work on NVIDIA (I'm not sure whether they support OpenCL yet; I assume they will). For Larrabee and/or AVX, I'm not sure how fast OpenCL would be; throughput might not be very good for indirect addressing with AVX, so we will have to sit and wait.

Yet if OpenCL also gets fixed to work well on other cards, AMD will of course rule from the price/performance viewpoint. It is always easier to get support for generic projects than for specialized ones, which would benefit AMD here. But if you want to run just one specific program and only a CUDA version of it exists, I'd advise buying an NVIDIA card, as CUDA will *never* work on AMD.

Regards, Vincent |
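On that portability point: the same OpenCL source string is handed to whichever vendor's driver is installed and compiled at run time, so a kernel written once runs on AMD and NVIDIA alike. Here is a minimal sketch using only core OpenCL 1.x calls (error handling mostly elided; assumes an OpenCL SDK and at least one GPU device):
[CODE]/* Minimal OpenCL host + kernel sketch: the same source string builds
 * unchanged against AMD's and NVIDIA's runtimes at run time.
 * Build: cc ocl.c -lOpenCL
 */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void scale(__global float *v, float a) {\n"
    "    size_t i = get_global_id(0);\n"
    "    v[i] *= a;\n"
    "}\n";

int main(void)
{
    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;

    clGetPlatformIDs(1, &plat, NULL);
    err = clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    if (err != CL_SUCCESS) { fprintf(stderr, "no GPU device\n"); return 1; }

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    /* The vendor's driver compiles the kernel source here. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", &err);

    float data[1024];
    for (int i = 0; i < 1024; i++) data[i] = (float)i;

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, &err);
    float a = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(a), &a);

    /* One work item per array element. */
    size_t n = 1024;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[3] = %g (expect 6)\n", data[3]);

    clReleaseMemObject(buf); clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}[/CODE] |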