GTX 1660 Ti
Has anyone bought the new GTX 1660 Ti? Any benchmarks on mfaktc? Power usage? Any driver/CUDA stability issues?
At this time I'm thinking of replacing my 750 Ti with either a 1060 or a 1660... both are in the same price range.
[URL]https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming[/URL] or [URL]https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html[/URL] From what I read, it has 5.4 teraflops (FP32) compared to 4.4 for the 1060.
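For reference, those teraflop figures follow directly from the shader counts and NVIDIA's reference boost clocks, under the usual assumption of 2 FLOPs per core per clock (one FMA):

[CODE]
GTX 1660 Ti: 1536 cores x 2 x 1.770 GHz = 5.44 TFLOPS FP32
GTX 1060:    1280 cores x 2 x 1.708 GHz = 4.37 TFLOPS FP32
[/CODE]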
In games etc., the 1660 Ti should be roughly on par with a GTX 1070 (some titles are faster, some slower). Some newer games have reportedly been programmed to take advantage of Turing's concurrent FP32/INT32 execution, and some others can use FP16, which the Pascal-generation cards weren't particularly good at. Even though the tensor cores were removed, the GTX 1660 Ti has dedicated hardware for FP16. There is some speculation that this is because some AMD-sponsored titles use FP16 extensively, since the Vega cards can run packed FP16 at twice the FP32 rate. So it could be a reaction to one of the few advantages AMD still had.
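To make the "packed FP16" point concrete, here is a minimal CUDA sketch (an illustration only, not mfaktc code or anything from an actual game): a __half2 packs two 16-bit floats, so a single __hadd2 instruction performs two FP16 additions. Turing's dedicated FP16 units run these at twice the FP32 rate, while Pascal GeForce cards execute them at a small fraction of it.

[CODE]
#include <cuda_fp16.h>
#include <cstdio>

// One __hadd2 = two FP16 adds: a __half2 packs a pair of 16-bit floats.
// Turing runs this at 2x the FP32 rate; Pascal GeForce supports it,
// but only at a severely reduced rate.
__global__ void packed_add(const float2 *a, const float2 *b, float2 *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    __half2 ha = __floats2half2_rn(a[i].x, a[i].y);  // pack two floats
    __half2 hb = __floats2half2_rn(b[i].x, b[i].y);
    c[i] = __half22float2(__hadd2(ha, hb));          // two adds, one op
}

int main()
{
    const int n = 1 << 20;
    float2 *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float2));
    cudaMallocManaged(&b, n * sizeof(float2));
    cudaMallocManaged(&c, n * sizeof(float2));
    for (int i = 0; i < n; i++) {
        a[i] = make_float2(1.0f, 2.0f);
        b[i] = make_float2(3.0f, 4.0f);
    }
    packed_add<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();
    printf("%g %g\n", c[0].x, c[0].y);  // prints: 4 5
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
// build: nvcc -arch=sm_75 packed.cu   (sm_53 or newer needed for __hadd2)
[/CODE]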
The one outlier is again factoring with mfaktc. The clock speeds seem to be about the same as on RTX cards, so the core count is all that matters. 20% down from RTX 2060... I'd still expect it to be a bit faster in mfaktc than a GTX 1080 Ti. Not planning to buy one, though...
[QUOTE=nomead;509756]The one outlier is again factoring with mfaktc. The clock speeds seem to be about the same as on RTX cards, so the core count is all that matters. 20% down from RTX 2060... I'd still expect it to be a bit faster in mfaktc than a GTX 1080 Ti.[/QUOTE]
This is what interests me as well: whether it will perform like an RTX 20-series card or a GTX 10-series card (or somewhere in between). Doesn't look like the Linux drivers are ready yet.
[QUOTE=axn;509764]Doesn't look like Linux drivers are ready yet.[/QUOTE]
Well, Phoronix did Linux benchmarks using driver version 418.43: [url=https://www.phoronix.com/scan.php?page=article&item=nvidia-gtx1660ti-linux&num=1]games[/url] and [url=https://www.phoronix.com/scan.php?page=article&item=nvidia-gtx1660ti-opencl&num=1]OpenCL[/url]. The INT performance [url=https://openbenchmarking.org/embed.php?i=1902287-SP-OPENCLGTX81&sha=4dab7f8&p=2]looks promising[/url], coming in at 83% of the RTX 2060 (more than double a GTX 1080).
[QUOTE=Mark Rose;509776]Well, Phoronix did Linux benchmarks using driver version 418.43: [url=https://www.phoronix.com/scan.php?page=article&item=nvidia-gtx1660ti-linux&num=1]games[/url] and [url=https://www.phoronix.com/scan.php?page=article&item=nvidia-gtx1660ti-opencl&num=1]OpenCL[/url].[/QUOTE]
Ok. Looks like a SNAFU on NVIDIA's part. If you go [URL="https://www.nvidia.com/Download/index.aspx?lang=en-us"]here[/URL] and select the 16 series, it doesn't offer Linux as a platform choice, but if you select the RTX 20 or GTX 10 series, Linux shows up, and selecting it lists the 418 driver, which says it supports the 16 series!
Just wondering if someone has hard figures for the TF performance of this card. It might be an interesting (read: more affordable while delivering decent performance... :smile:) alternative to the 2xxx series.
[QUOTE=lycorn;515757]Just wondering if someone has hard figures for the TF performance of this card. It might be an interesting (read: more affordable while delivering decent performance... :smile:) alternative to the 2xxx series.[/QUOTE]
I actually bought this sucker, though I'm not using it for GIMPS TF work. Nonetheless, I can offer some preliminary numbers. These were run on a system with P95 running on all 4 cores and with the GPU also handling Xorg; any screen activity makes the numbers tank. And I'm making no claims that the mfaktc parameters are optimal. YMMV.

barrett76_mul32_gs: 1440 / 1650 / 1690
barrett87_mul32_gs: 1330 / 1530 / 1550

The first number is GD/d (GHz-days per day) at a 70 W power limit, the second at the default 120 W limit, and the third at the max 150 W limit. Personally, I run at the 70 W limit. The GPU is a Gigabyte GeForce GTX 1660 Ti OC (GV-N166TOC-6GD).
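A note for anyone wanting to reproduce the power-limit part of this: the limit can be changed with nvidia-smi (e.g. sudo nvidia-smi -pl 70) or programmatically through NVML. Below is a minimal sketch, with GPU index 0 assumed; NVML works in milliwatts, and lowering the limit needs root:

[CODE]
/* Sketch: set a 70 W power limit via NVML (what `nvidia-smi -pl 70` does).
   Assumes GPU index 0; requires root. Build: nvcc pl.cu -o pl -lnvidia-ml */
#include <nvml.h>
#include <stdio.h>

int main(void)
{
    nvmlDevice_t dev;
    unsigned int lo, hi;
    nvmlReturn_t r;

    if (nvmlInit() != NVML_SUCCESS) return 1;
    nvmlDeviceGetHandleByIndex(0, &dev);

    /* Query the allowed range first; on this card it should be
       roughly 70000..150000 mW, matching the 70/150 W figures above. */
    nvmlDeviceGetPowerManagementLimitConstraints(dev, &lo, &hi);
    printf("allowed limit range: %u..%u mW\n", lo, hi);

    r = nvmlDeviceSetPowerManagementLimit(dev, 70000);  /* 70 W in mW */
    if (r != NVML_SUCCESS)
        fprintf(stderr, "set failed: %s\n", nvmlErrorString(r));

    nvmlShutdown();
    return 0;
}
[/CODE]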
[QUOTE=axn;515784]I actually bought this sucker, though I'm not using it for GIMPS TF work. Nonetheless, I can offer some preliminary numbers. These were run on a system with P95 running on all 4 cores and with the GPU also handling Xorg; any screen activity makes the numbers tank. And I'm making no claims that the mfaktc parameters are optimal. YMMV.

barrett76_mul32_gs: 1440 / 1650 / 1690
barrett87_mul32_gs: 1330 / 1530 / 1550

The first number is GD/d (GHz-days per day) at a 70 W power limit, the second at the default 120 W limit, and the third at the max 150 W limit. Personally, I run at the 70 W limit. The GPU is a Gigabyte GeForce GTX 1660 Ti OC (GV-N166TOC-6GD).[/QUOTE]

Wow, those are pretty impressive numbers for GPU prices nowadays. :thumbs-up:

Do you guys have any take on NVIDIA's Creator Ready driver vs. the Game Ready driver? My driver's from January, probably should update lol
@axn: Thanks for your answer. Those are very interesting figures, way higher than I was expecting. Just for completeness, what were the bit levels and exponent sizes used for the benchmarks posted?
Factor=bla,90027299,75,76 (for barrett76)
Factor=bla,90027299,76,77 (for barrett87)

("bla" is just a dummy assignment key; the other fields are the exponent and the from/to bit levels.)

Note that I did not run these to completion. I just used them as dummy worktodo entries for the benchmark and eyeballed an average rate from what the program itself was reporting.

Some additional info:
OS: Ubuntu 18.04 LTS (4.15.0-48)
Driver: 418.56
CUDA: 10.1
mfaktc: 0.21, compiled for cc 7.5