There is something I feel I need to pass on. I received a new GPU today: a GTX 1650. The older CUDA 10 build of [I]mfaktc[/I] will [U]not[/U] run with it. The newer 2047 build runs very well.
Off-topic: I suspect [I]gpuOwl[/I] will run with it. I have my doubts about [I]CUDALucas[/I] and [I]CUDAPm1[/I]; I will have to try each. It is running around 950 GHz-d/day at 64°C. The clock speeds are slightly lower than my 1080's. Everything on the back of the machine is relatively cool to the touch, so it appears everything will be fine. If my 1080 decides to die, I know what I will replace it with. For the cost, I have no complaints at all.
Hi,
seems like mfaktc runs fine with CUDA 11 on Ampere (no specific changes for Ampere except the Makefile). :smile:
[CODE]mfaktc v0.22-pre8 (64bit built)
[...]
CUDA version info
  binary compiled for CUDA  11.0
  CUDA runtime version      11.0
  CUDA driver version       11.0

CUDA device info
  name                      [COLOR="Red"][B]A100-SXM4-40GB[/B][/COLOR]
  compute capability        8.0
  max threads per block     1024
  max shared memory per MP  167936 byte
  number of multiprocessors 108
  clock rate (CUDA cores)   1410MHz
  memory clock rate:        1215MHz
  memory bus width:         5120 bit
[...]
Starting trial factoring M66362159 from 2^74 to 2^75 (57.65 GHz-days)
 k_min = 142321062303420
 k_max = 284642124610180
Using GPU kernel "barrett76_mul32_gs"

   Date    Time | class   Pct |  time     ETA | GHz-d/day  Sieve   Wait
Jul 19 21:19 |     0   0.1% | 0.829  13m15s |   6259.18  82485  n.a.%
Jul 19 21:19 |     4   0.2% | 0.779  12m26s |   6660.92  82485  n.a.%
Jul 19 21:19 |     9   0.3% | 0.780  12m26s |   6652.38  82485  n.a.%
[...]
Jul 19 21:31 |  4617 100.0% | 0.780   0m00s |   6652.38  82485  n.a.%
no factor for [COLOR="red"][B]M66362159 from 2^74 to 2^75[/B][/COLOR] [mfaktc 0.22-pre8 barrett76_mul32_gs CUDA 11.0 arch 8.0] 51D74917
tf(): total time spent: [COLOR="red"][B]12m 32.323s[/B][/COLOR][/CODE]
New absolute performance champion and, I guess, best performance per watt, too! :smile:

Older benchmark data for Turing (RTX 2080 Ti): [URL="https://mersenneforum.org/showpost.php?p=497430&postcount=2912"]https://mersenneforum.org/showpost.php?p=497430&postcount=2912[/URL]

Oliver
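As a sanity check, the reported throughput can be recovered from the assignment's credit and the total run time quoted above (a quick sketch; the small difference from the per-class 6652.38 figure comes from startup overhead and rounding):

```python
# Cross-check the A100 benchmark: 57.65 GHz-days of TF credit
# completed in 12m 32.323s of wall time.
credit_ghz_days = 57.65           # from "Starting trial factoring ... (57.65 GHz-days)"
elapsed_s = 12 * 60 + 32.323      # from "tf(): total time spent: 12m 32.323s"

rate = credit_ghz_days * 86400 / elapsed_s   # GHz-d/day
print(f"{rate:.0f} GHz-d/day")               # ~6621, close to the steady-state 6652.38
```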
[QUOTE=TheJudger;551813]New absolute performance champion and I guess best performance per watt, too! :smile:[/QUOTE]Oooh, juicy new benchmark! It is indeed the current [url=https://www.mersenne.ca/mfaktc.php]performance champion[/url], but (according to my numbers) the Tesla T4 is still the efficiency champion (70W vs 400W is a big difference).
If you have access to this GPU again and can run a quick [url=https://www.mersenne.ca/cudalucas.php]CUDAlucas and/or GPUowl[/url] benchmark I would appreciate it.
Hi James,
I can't remember a T4 hitting 2500 THz-d/d. And while that A100 has a TDP of 400 W, it reports "just" 290 to 300 W during mfaktc. But you still might be correct that the T4 has a better performance-per-watt ratio when looking just at the power consumption of the GPU itself. I tend to ignore those smaller GPUs, sorry.

Oliver
[QUOTE=TheJudger;551816]I tend to ignore those smaller GPUs, sorry.[/QUOTE]Those "smaller" GPUs that still have [url=https://www.mersenne.ca/mfaktc.php?filter=t4|rx%20480]nearly 5x the TF performance[/url] of my RX 480 :down::lol:
[QUOTE=TheJudger;551816]I can't remember a T4 hitting 2500 THz-d/d.[/QUOTE]Neither can I. Not even 2500 [b]GHz[/b]-d/d :whistle:
I actually looked back and found one actual benchmark for a T4 that put it at around 1700 GHz-d/day. On my chart it was being lumped in with the other CUDA 7.5 cards; I have adjusted my data, so the numbers should be a little more accurate. Still more efficient than the A100, but only 50% better, not 100% better.
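Using the round numbers quoted in this exchange (T4 at ~1700 GHz-d/day and a 70 W TDP, A100 at ~6652 GHz-d/day and a 400 W TDP), the per-watt comparison can be sketched as follows; note that against the A100's measured draw of ~290-300 W, rather than its TDP, the gap nearly vanishes:

```python
# Perf-per-watt sketch from the figures quoted in this thread.
t4_rate, t4_tdp = 1700.0, 70.0        # GHz-d/day, watts (TDP)
a100_rate, a100_tdp = 6652.0, 400.0

t4_eff = t4_rate / t4_tdp             # ~24.3 GHz-d/day per watt
a100_eff = a100_rate / a100_tdp       # ~16.6 GHz-d/day per watt
print(f"T4 vs A100 at TDP: {t4_eff / a100_eff:.0%}")   # 146%, i.e. ~50% better

# Against the A100's reported draw during mfaktc (~295 W) instead of TDP:
print(f"T4 vs A100 at measured draw: {t4_eff / (a100_rate / 295):.0%}")   # 108%
```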
I guess for many of us an RX 480 is far more enjoyable than a T4 for home usage (PC games :smile:)
I only use the two I have now for this project; I don't play games with them. I watch the power consumption: 200 W for one and 75 W for the other, or 6.6 kWh in 24 hours if I keep them running continuously, which I do not. In simpler terms, about $25 USD a month for both at the current rate. That's no back-breaker.
Utility costs here are quite steady, averaging around $0.13 USD per kWh. There are members here who pay far more. If I had to pay what some do, there would be a lot of oil lamps and candles in use. I don't see how they manage.
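The arithmetic in the post above checks out; a quick sketch (assuming a 30-day month):

```python
# Rough monthly cost of running both GPUs 24/7, per the figures above.
watts = 200 + 75                  # both GPUs combined
kwh_per_day = watts * 24 / 1000   # 6.6 kWh in 24 hours
rate_usd = 0.13                   # $/kWh
monthly = kwh_per_day * 30 * rate_usd
print(f"{kwh_per_day} kWh/day -> ${monthly:.2f}/month")   # 6.6 kWh/day -> $25.74/month
```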
[QUOTE=storm5510;551833]Utility costs here are quite steady. An average would be around $0.13 USD per kWh. There are members here to pay far more. If I had to pay what some are, there wold be a lot of oil lamps and candles used. I don't see how they manage.[/QUOTE]Price paraffin candles on a $/Btu and $/lumen-hour basis, and electricity even at $1/kw-hr won't look so bad. The same goes for kerosene lamps.
Have you tried using nvidia-smi to reduce power use on your GPUs? I find running RTX 20xx or GTX 1650 GPUs at 50% power still provides 80% of TF throughput. It's a lot easier on the air conditioner too, so power and cost savings are considerable, even at my ~$0.12/kWh.
[CODE]:d0 gtx1080 90 to 291 W
"c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -i 0 -pl 90
:d1 gtx1650 45 to 75 W
"c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -i 1 -pl 45
:d2 gtx1650 45 to 90 W
"c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -i 2 -pl 45
:d3 rtx2080 125 to 258 W
"c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -i 3 -pl 125
:d4 gtx1080ti 125 to 300 W
"c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -i 4 -pl 125[/CODE]
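The 50%-power/80%-throughput rule of thumb above implies a sizeable efficiency gain; a quick sketch, using the RTX 2080 entry from the batch file (stock ~258 W, limited to 125 W) as the example:

```python
# Efficiency effect of power-limiting, per the rule of thumb above:
# ~50% of stock power still yields ~80% of TF throughput.
power_frac = 0.50
throughput_frac = 0.80
print(f"work per kWh: {throughput_frac / power_frac:.1f}x stock")   # 1.6x

# Applied to the RTX 2080 entry (stock ~258 W, limited to 125 W):
stock_w, limited_w = 258, 125
print(f"saves ~{stock_w - limited_w} W at the wall per GPU")   # ~133 W
```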
[QUOTE=kriesel;551866]Price paraffin candles on a $/Btu and $/lumen-hour basis, and electricity even at $1/kw-hr won't look so bad. The same goes for kerosene lamps.
Have you tried using nvidia-smi to reduce power use on your gpus? I find running RTX20xx or GTX1650 gpus at 50% power still provides 80% of TF throughput. It's a lot easier on the air conditioner too, so power and cost savings are considerable, even at my ~$.12/kwhr. [/QUOTE] Two years ago, when I started using this 1080, my UPS would begin to "whistle" after running for a few seconds; the load was at, or beyond, the capacity of the UPS, which is really not large enough with a maximum capacity of 300 W. I used [I]MSI Afterburner[/I] to throttle the GPU back to 85% of capacity. Now I no longer need to throttle. The GPU's overall performance has dropped around 10%, but it still runs above 1,000 GHz-d/day with [I]mfaktc[/I].

My average monthly utility cost is $115 USD during the warm-weather months. Winter is when I have to be cautious: my furnace has two 20-ampere heating elements at 240 VAC. It can eat a decent-size hole in my pocket if I fail to pay attention.
[QUOTE=storm5510;551884]In the winter is where I have to be cautious. My furnace has two 20 ampere heating elements. It is 240 VAC. It can eat a decent size hole in my pocket if I fail to pay attention.[/QUOTE]
Isn't that what GPUs are really for? Space heaters??? :wink:
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.