mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > GPU Computing
2017-10-11, 21:35   #23
Lemonlurker
Do tell how fast it is when you use it.
2017-12-09, 14:37   #24
VictordeHolland
The (new) ASUS GTX 1080 Ti STRIX has been running smoothly for about 6 weeks now. I'm very happy with the card. I installed the latest driver available at the time and the ASUS monitoring utility (GPU Tweak). I haven't played with the clocks yet; I only turned off the lights.

TF from 68 to 69 bits with mfaktc 0.21 (CUDA 8.0):
~1450 GHz-days/day
Temps: average 67C (max 70C in the last week)
The fan is on auto (42-45%) and barely audible over the four case fans. Under 60C the fan turns off. It is less noisy than my HD7950s, 280X MATRIX, or GTX 580/480 cards.

I don't quite understand how its boost works. According to GPU-Z:
Base clock: 1493 MHz
Boost: 1607 MHz
But the sensor tab shows clocks of 1759-1784.5 MHz. I'd always thought of the boost clock as the maximum achievable, if temperature and power consumption allow it?

Measured at the wall socket, the PC (with an i7-3770K) draws:
375W (running mfaktc and Prime95)
322W (only mfaktc)
114W (only Prime95)
63W (idle)

So the GPU is using 260W+. The PC has an 80+ Gold PSU, and the card is close to its TDP of 250W (see GPU-Z power consumption and PerfCap Reason).
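The GPU's share can be read straight off those wall-socket deltas; a quick sanity check in Python (the 90% PSU efficiency figure is an assumption, typical for an 80+ Gold unit at this load, not a measurement from the post):

```python
# Wall-socket readings (watts) quoted above
idle = 63          # system idle
mfaktc_only = 322  # mfaktc loading the GPU
prime95_only = 114 # Prime95 loading the CPU

# Load attributable to each component, measured at the wall
gpu_at_wall = mfaktc_only - idle    # 259 W
cpu_at_wall = prime95_only - idle   # 51 W

# An 80+ Gold PSU is roughly 90% efficient at this load (assumed figure),
# so the DC-side draw of the card is somewhat below the wall delta:
psu_efficiency = 0.90
gpu_dc_estimate = gpu_at_wall * psu_efficiency

print(gpu_at_wall)             # 259
print(round(gpu_dc_estimate))  # 233
```

That puts the card's actual draw near, but plausibly just under, its 250W TDP, consistent with GPU-Z flagging the power limit as the PerfCap reason.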

Is there a special assignment/exponent or bit range that I could run for James' performance charts? Do I somehow need to limit it to the default clock before analysis?
[Attached: GTX1080TI-GPU-Z.png]
2017-12-09, 14:44   #25
James Heinrich
Quote:
Originally Posted by VictordeHolland View Post
Is there a special assignment/exponent or bit range that I could run for James' performance charts? Do I somehow need to limit it to the default clock before analysis?
Any assignment is fine, whatever you're currently working on. Just fill in the form here:
http://www.mersenne.ca/mfaktc.php#benchmark
Don't attempt to limit the clock speed; just let it do its thing, and report whatever the average measured speed is in the sensors tab (~1780 MHz in your case).

Last fiddled with by James Heinrich on 2017-12-09 at 14:44
2017-12-13, 09:18   #26
bayanne
Quote:
Originally Posted by VictordeHolland View Post
I've made my decision and ordered the:
ASUS ROG STRIX GTX 1080 Ti

as it was on 'sale' for €819 and comes with a game voucher for Middle-earth: Shadow of War. Its predecessor (Middle-earth: Shadow of Mordor) was actually quite decent, so that is a nice extra.
What GHz-days/day does this run at?

Please ignore, just spotted posting above ...

Last fiddled with by bayanne on 2017-12-13 at 09:20
2017-12-13, 17:28   #27
aurashift
Quote:
Originally Posted by LaurV View Post
The 1080 is quite good for TF, but also good, more than decent, for LL.
See James' LL table: it is just a bit below a Titan, and it is like two 580s put together, for a fraction of the power. And it is actually first at LL by "JVR" value, which means that, with the electricity price in your area, you will be on the "profit side" in less than a year.

Edit: and as winter comes, are you heating your house with electricity? Then buy a few more... hehe... Oh sorry, you didn't know winter is coming... in that part of the world it is always winter...
I just got a laptop with dual 1080s (System76 Bonobo, i7-8700K). When I get it I'll let you know how it does. Is the x8 PCIe going to limit anything? I don't imagine it will for this type of work. I probably won't continue to run it after testing, because of power bills and heat stress and whatnot.
2017-12-14, 14:54   #28
kriesel
Quote:
Originally Posted by aurashift View Post
I just got a laptop with dual 1080s (System76 Bonobo, i7-8700K). When I get it I'll let you know how it does. Is the x8 PCIe going to limit anything? I don't imagine it will for this type of work. I probably won't continue to run it after testing, because of power bills and heat stress and whatnot.
Wow, a laptop that doubles as a space heater and supercomputer (dual 330W chargers); specs are at the bottom of the page at https://system76.com/laptops/bonobo. The fans must sound like microturbines. If you find the x8 PCIe somewhat limits primality-testing or factoring throughput, try running multiple instances per GPU, so that one instance can be number-crunching on the compute cores while another is waiting on GPU-CPU I/O. The 1080s certainly have ample GPU RAM for exponents of current interest and well above.
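One way to sketch the multiple-instances-per-GPU idea (the directory layout, binary name, and the `-d` device flag are illustrative, so check your mfaktc build's `-h` output; note that each working directory needs its own mfaktc.ini and worktodo.txt):

```python
import os
import subprocess

def launch_mfaktc_instances(n_instances=2, device=0, binary="./mfaktc",
                            dry_run=True):
    """Launch several mfaktc instances against one GPU, each from its own
    working directory (mfaktc reads worktodo.txt and mfaktc.ini from the
    directory it is started in)."""
    procs = []
    for inst in range(n_instances):
        workdir = f"mfaktc-gpu{device}-inst{inst}"
        os.makedirs(workdir, exist_ok=True)
        cmd = [binary, "-d", str(device)]  # -d selects the CUDA device
        if dry_run:
            print(f"{workdir}: {' '.join(cmd)}")
        else:
            procs.append(subprocess.Popen(cmd, cwd=workdir))
    return procs

# Dry run: just show what would be started
launch_mfaktc_instances()
```

With two instances, one can keep the compute cores busy while the other waits on PCIe transfers, which is the overlap described above.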

Last fiddled with by kriesel on 2017-12-14 at 14:58