NVIDIA GeForce GTX 980 Ti

As an anecdote, I arrived at the same conclusion you just explained. My best overclock on my 980Ti was with zero additional voltage. If I upped the volts, I ended up performing worse because of the power limit -- exactly as you described. I'm super impressed with how well my factory-overclocked 980Ti was able to go even further on stock voltage, so I'm really not complaining.
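To see why extra voltage can backfire under a fixed power cap, here's a back-of-the-envelope sketch. It uses the standard CMOS dynamic-power relation (P roughly proportional to V^2 * f); every constant in it is invented for illustration, not measured 980Ti data:

```python
# Back-of-the-envelope: why more voltage can mean LOWER clocks under
# a fixed power limit. Uses the standard CMOS dynamic-power relation
# P ~ k * V^2 * f; every constant here is invented for illustration,
# not measured 980Ti data.

POWER_LIMIT_W = 250.0                    # hypothetical board power cap
K = POWER_LIMIT_W / (1.212**2 * 1400.0)  # scale so stock V/f sits at the cap

def max_clock_mhz(volts):
    """Highest clock the power cap allows at a given core voltage."""
    return POWER_LIMIT_W / (K * volts**2)

for v in (1.212, 1.243, 1.274):          # stock vs. two overvolt steps
    print(f"{v:.3f} V -> cap allows ~{max_clock_mhz(v):.0f} MHz")
# Power grows with V^2, so every overvolt step shrinks the clock the
# cap permits -- boost throttles down instead of going faster.
```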

I guess the takeaway here is that water cooling is really only useful for keeping it silent. The clocks just don't have much more room to grow...

I am intrigued by Maxwell 980Ti voltage scaling... in my own experiments, a 980Ti overclock can work just as well without overvolting... once your GPU core hits its limit, no amount of voltage will help. GM200 is extremely dependent on die quality from the get-go.

A strange first in my overclocking experience... is this a result of the GM200 architecture, whereby a 'conventional' voltage increase no longer works??

Are we even increasing the voltage at all? The thing is, we only have software readers like GPU-Z or Afterburner to 'read' and 'change' 980Ti voltages, which is hardly accurate if Nvidia does some secret sauce with their voltages... Nvidia has to be doing some hidden hard locking imo... any other speculations from the GPU wizards here?
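For what it's worth, you can at least dump what Nvidia's own driver reports instead of trusting the GPU-Z/Afterburner overlay. A quick sketch with the pynvml bindings (pip install pynvml); note that as far as I know, public NVML exposes clocks, power and temperature but no core-voltage query at all, which kind of supports the hidden-locking theory:

```python
# Query what NVIDIA's own driver reports via NVML (pip install pynvml).
# Note: public NVML has queries for clocks, power and temperature, but
# no core-voltage query -- software 'voltage' readouts in GPU-Z or
# Afterburner come from elsewhere (VRM controller / BIOS tables).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

name = pynvml.nvmlDeviceGetName(handle)        # bytes on older pynvml
core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0         # mW -> W
limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"{name}: core {core} MHz, mem {mem} MHz")
print(f"power {power:.1f} W of {limit:.1f} W limit, {temp} C")
pynvml.nvmlShutdown()
```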

As for the much talked about ASIC quality... it really only determines your max turbo boost clocks. Let's say you got a good 75% ASIC 980Ti: it might boost by, say, +4 bins, while a 68% ASIC can do +3 bins.

If you apply a +100MHz offset to your core, your GPU die's max stable clock is 1500MHz, and the 4th bin pushes the core to 1510MHz, then you are out of luck: your 980Ti will CTD in games... ASIC quality can't give you higher clocks than your GPU die can actually sustain. In other words, a lower-ASIC 980Ti can end up overclocking higher than one with a bigger ASIC number, though usually the higher ASIC will go further... and some games like Witcher 3 eat into your max overclock; you have to back the clocks off or you get circle and square artefacts...
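To put rough numbers on the bin arithmetic, here's a toy calculation. It assumes the commonly quoted 13MHz Maxwell boost bins; the base clock, offset and 1500MHz die ceiling are just my example figures, not real readings:

```python
# Toy model: GPU Boost bins vs. the die's real stability ceiling.
# Maxwell boost bins are commonly quoted as 13 MHz steps; base clock,
# offset and die limit below are made-up example numbers.
BIN_MHZ = 13
DIE_LIMIT_MHZ = 1500          # hypothetical max stable clock of this die

def boosted_clock(base_mhz, offset_mhz, asic_bins):
    """Final clock after a manual offset plus ASIC-dependent boost bins."""
    return base_mhz + offset_mhz + asic_bins * BIN_MHZ

for label, bins in (("75% ASIC (+4 bins)", 4), ("68% ASIC (+3 bins)", 3)):
    clock = boosted_clock(1358, 100, bins)   # +100 MHz manual offset
    verdict = "stable" if clock <= DIE_LIMIT_MHZ else "CTD in games"
    print(f"{label}: {clock} MHz -> {verdict}")
# The higher-ASIC card lands at 1510 MHz, past the 1500 MHz ceiling,
# while the lower-ASIC card stops at 1497 MHz and survives -- exactly
# the 'bigger ASIC number is not always better' situation above.
```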

my 2c findings.
 
Nvidia has to be doing some hidden hard locking imo... any other speculations from the GPU wizards here?
Not exactly a wizard, but there are a bunch of gamer-oriented graphics boards with measuring points for voltage probes on them these days. All you need is a multimeter and you can check for yourself whether your software voltage monitor is accurate or not. :)
 