Nvidia BigK GK110 Kepler Speculation Thread

I think Nvidia shot themselves in the foot with the temperature target. They should have set it to 90°C and/or increased the fan speed a bit; that would have provided 10-15% more performance in reviews, and the results would have been much more consistent.

The bigger problem is that, if hardware.fr's findings are correct, it does in fact give you high performance in short benchmarks, and only after a few minutes does it settle at that 80°C, lowering performance at the same time.
If I measure power in Anno after 30s, I get 220W, but the cooling system (including the way it is calibrated) is not able to maintain 80°C with such a power draw (unless the system is in a fridge, of course). After 5 minutes, power drops to 180W. If I add extra cooling around the board, power goes up to 200W.

So you get nice high performance in the quick benchmark run, and soon after, the performance drops.
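For what it's worth, this settling behaviour is easy to catch yourself. Here's a minimal sketch of the kind of logging hardware.fr's numbers imply, assuming an NVIDIA driver with nvidia-smi on the PATH (power.draw is a standard nvidia-smi query field; everything else is just my scaffolding):

```python
# Poll GPU power draw once a second so the 30-second and 5-minute numbers
# can be compared, as in hardware.fr's Anno measurements above.
import subprocess
import time

def read_power_watts():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=power.draw",
        "--format=csv,noheader,nounits",
    ])
    return float(out.decode().strip().splitlines()[0])  # first GPU only

samples, start = [], time.time()
while time.time() - start < 600:          # sample for 10 minutes
    samples.append((time.time() - start, read_power_watts()))
    time.sleep(1.0)

early = [w for t, w in samples if t <= 30]
late  = [w for t, w in samples if t >= 300]
print(f"avg power, first 30s  : {sum(early) / len(early):.0f}W")
print(f"avg power, after 5 min: {sum(late) / len(late):.0f}W")
```

Run a sustained game load alongside it and the first-30-seconds average versus the after-5-minutes average shows exactly the drop being described.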
 
Yes.

To illustrate the point: if you run the free copy (Basic Edition) of the recent 3DMark, you'll get lower scores than if you buy the benchmark, because the paid version allows you to switch off the demo run before the actual benchmark. It's only a small margin (I tested this particular thing on an open testbed) of just shy of 100 points in Fire Strike (8744 instead of 8837, IIRC), but still.
 
They say that, but the way they measure card power alone is just by subtracting the idle power, which is a bit dubious to say the least... especially if you consider that a higher framerate may result in more power consumption from the CPU as well...
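To put toy numbers on that objection (every figure below is invented for illustration, not a measurement):

```python
# Why "card power = load power at the wall minus idle power" overstates
# the card: the rest of the system doesn't stay at its idle draw under load.
idle_system = 100.0   # W at the wall, system idle (invented figure)
load_system = 350.0   # W at the wall, during the benchmark (invented)

naive_card = load_system - idle_system          # 250W attributed to the card

# A faster card pushes more frames, so the CPU also draws more than at idle:
cpu_extra = 30.0      # extra CPU draw under load vs. idle (assumed)
actual_card = load_system - idle_system - cpu_extra   # 220W

print(f"naive estimate : {naive_card:.0f}W")
print(f"with CPU term  : {actual_card:.0f}W "
      f"({naive_card - actual_card:.0f}W overstated)")
```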
 
Yes.

To illustrate the point: if you run the free copy (Basic Edition) of the recent 3DMark, you'll get lower scores than if you buy the benchmark, because the paid version allows you to switch off the demo run before the actual benchmark. It's only a small margin (I tested this particular thing on an open testbed) of just shy of 100 points in Fire Strike (8744 instead of 8837, IIRC), but still.

+1.. I have done the entire run.. you need to do all the tests one after the other, but between each one you get the demo run with sound, and each demo run is longer than the benchmark scene itself.. I have done some runs at 1300MHz, and I can tell you I was a bit worried doing it. (In addition, the combined test is bugged with dual cards, though not every time and not with every setup.)
 
The bigger problem is that, if hardware.fr's findings are correct, it does in fact give you high performance in short benchmarks, and only after a few minutes does it settle at that 80°C, lowering performance at the same time.

You're saying the exact same thing I'm saying...
 
We did. That's why I wrote that it doubles the amount of work for a reviewer when doing it properly:

• Free Boost @open test bench
• OC-Setting (+100 MHz GPU-Offset, 85°C temp-target, 105% power target, unlinked)
• Fixed Boost @876 MHz (nv-semi-guaranteed frequency)
• Individual clock rates per game & resolution, forced after doing real gaming sessions of at least 30 minutes with no loading pauses etc., at a constant air temperature of 28°C in front of the blower fan.

You see, I did not get much sleep since Saturday.
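For the curious, the last bullet boils down to logging the clocks through a long session and seeing where they settle. Roughly like this, assuming nvidia-smi is available (the query fields are standard; the script itself is just a sketch, not our actual tooling):

```python
# Rough sketch: log SM clock and temperature through a long gaming session,
# then report the clock the card settles at versus its initial peak.
import subprocess
import time

def query_gpu(fields):
    out = subprocess.check_output([
        "nvidia-smi", f"--query-gpu={fields}",
        "--format=csv,noheader,nounits",
    ])
    return [float(x) for x in out.decode().strip().splitlines()[0].split(",")]

log, start = [], time.time()
while time.time() - start < 30 * 60:      # a 30-minute session, as above
    clock_mhz, temp_c = query_gpu("clocks.sm,temperature.gpu")
    log.append((time.time() - start, clock_mhz, temp_c))
    time.sleep(2.0)

settled = [c for t, c, _ in log if t >= 20 * 60]   # last 10 minutes
print(f"peak clock   : {max(c for _, c, _ in log):.0f}MHz")
print(f"settled clock: ~{sum(settled) / len(settled):.0f}MHz "
      f"at {log[-1][2]:.0f}°C")
```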

I applaud you for your efforts to get all the data/info you can and sharing it with the rest of us.

I unfortunately saw the beginning of this new version of "optimized benchmarking" aimed at reviewers when GPUs started using boost with the GTX 680. It adds far too many variables and completely undermines the whole reason for reviews: to find out what sort of performance YOU will be getting from the product, not an estimate or a best-case situation.

It seems now that reviews, through no fault of the reviewers, are becoming a one-upping competition between the two companies to see who can get away with duping the readers with little to no backlash.

Edit: For example, someone on XS was talking about how Titan overclocks 39% with a mere 15W increase in power consumption, because the reviewer didn't do his due diligence and note his settings and readouts.
 
A temperature limit doesn't make it easier to put in a larger or a smaller case. Regardless of what case you put it in, the GPU will be kept within safe operating temperature limits.

I guess if by "easier to put in a small case" you mean that it then won't impact performance in a larger case, then OK, I could go with that. But that wasn't how I was interpreting your "easier," as the statement alone of it being easier is nonsense.

So again, upping the thermal limit wouldn't be a good idea anyway. The whole point is keeping the GPU within safe operating temperatures. A power limit alone couldn't do this, as shown by the large-case versus small-case situation. Upping the thermal limit isn't an option, as it puts the GPU into what Nvidia considers unsafe thermal conditions.

Therefore, anything using Boost 2.0 would be a horrible experience for me: high performance during the winter and low performance during the summer. Versus a traditional card, where my performance is the same regardless of the season, with the only difference being that it is louder during the summer than during the winter.


It's easier to put in a small case because the card automatically keeps its temps and power consumption in check without overheating. It's exactly like you said: "Regardless of what case you put it in the GPU will be kept within safe operating temperature limits." That means you can easily put it in a small case.

Edit: Now I think I see where the confusion is... You thought I was saying that the temp limit makes it easier to put into a small case instead of a large case. No, what I said was that the temp limit makes it easier to put into a small case versus a Titan with no temp limit in the same small case.

80°C is not about safe operating temperatures, no matter how many times you say it, nor does upping that limit alone put it in unsafe thermal conditions. They are not saying anything about over 80°C being unsafe, just about the voltages. 80°C is just a point on a curve they chose that offers a good mix of acoustics and performance.

Ramp up Titan's quite conservative fan profile and it works in summer just like your traditional card, or raise the temp limit if you prefer quieter operation.

The effect on power consumption is quite substantial, because when the card hits 80°C, it clocks down, or at the very least is prevented from clocking up. That reduces power.

See HFR's results: http://www.hardware.fr/articles/887-6/consommation-efficacite-energetique.html

Adding a couple of fans increases the power draw of Titan by up to 22W.
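To make the mechanism concrete, here's a toy simulation of that feedback loop. Every constant is invented and this is not Nvidia's actual boost algorithm, only its general shape: the card backs off a boost bin at the temperature target, so better cooling sustains higher clocks and therefore higher power draw:

```python
# Toy model of a Boost-2.0-style temperature target. Invented constants;
# only the shape of the behaviour matters, not the numbers.
TEMP_TARGET = 80.0              # °C
AMBIENT     = 28.0              # °C
BASE, MAX_BOOST = 876, 1006     # MHz
WATTS_PER_MHZ   = 0.25          # made-up power/clock slope
HEAT_IN         = 0.2           # made-up °C per watt per step

def simulate(cooling, steps=600):
    clock, temp, history = MAX_BOOST, AMBIENT, []
    for _ in range(steps):
        power = clock * WATTS_PER_MHZ
        # crude thermal model: heat in from the chip, heat out from the cooler
        temp += power * HEAT_IN - cooling * (temp - AMBIENT)
        if temp >= TEMP_TARGET and clock > BASE:
            clock -= 13                    # back off one boost bin
        elif temp < TEMP_TARGET - 2 and clock < MAX_BOOST:
            clock += 13                    # headroom: boost again
        history.append((clock, power, temp))
    tail = history[-200:]                  # settled behaviour
    return [sum(col) / len(tail) for col in zip(*tail)]

for label, cooling in [("stock case", 0.9), ("extra fans", 1.1)]:
    clock, power, temp = simulate(cooling)
    print(f"{label}: ~{clock:.0f}MHz, ~{power:.0f}W, ~{temp:.0f}°C")
```

With these made-up numbers, the better-cooled card settles roughly 20W higher, which is the same direction as HFR's measurement.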

I was of course talking about a non-throttled vs. non-throttled situation; of course power consumption goes down if you throttle the card. I meant that if you raise the temp ceiling by, let's say, 5°C, you won't see a big increase in power consumption compared to non-throttled operation at 79°C. Stock settings with this card simply don't give proper results in many metrics, and power consumption numbers are also way down if it's being throttled. Nvidia took their acoustics and "boutique" angle a bit too far with their conservative fan curve and temp-throttling protocols. They should have built it so that it won't throttle under any normal situation; unlike SB says, it wasn't about safety, but more about elegance.
 
I don't think they are out there to "cheat", for lack of a better term, with regards to benchmarking.

Reviewers could simply add benchmarks for extended play testing, for instance, to see this power limiting in action and just how much the clocks drop. The PCPer YT review showed the Titan dropping from ~997MHz to 980MHz and then completely stabilising there (~80°C). And it looks like this 80°C temp target can be changed via EVGA Precision software so that the limiting occurs at a higher temp.

But at the end of the day, I'm just thinking that the Titan at stock is optimised for the sweet spot of performance/acoustics (i.e. heat/power consumption). Obviously acoustics can be thrown out of the window by raising the power target to 106% and the temp target to 95°C, fitting a shiny waterblock, unlocking the voltage, and watching it boost to +1150MHz without being limited by anything other than the silicon/software limit..

Here's Guru3D's OC review - http://www.guru3d.com/articles_pages/geforce_gtx_titan_overclock_guide,1.html
 
You should also consider that reviewers were under extreme time pressure to get their reviews done. Hardware is only available to them for a limited amount of time, and they also have to meet review release deadlines.
 


http://www.techpowerup.com/
 
If you can't afford a Titan, then just wait for Nvidia's Maxwell next year: the same or nearly equal performance levels, and far cheaper.

You're still paying close to $1000 for multi-GPU solutions with tons of issues, as PCPer has shown.

http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat

AMD's CrossFire technology shows severe performance degradations when viewed under the Frame Rating microscope that do not show up nearly as dramatically under FRAPS.
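The averages-hide-variance half of that is easy to illustrate with invented numbers. Note that PCPer's Frame Rating goes further than this toy example: it captures at the display output to catch runt and dropped frames that FRAPS, measuring at Present(), never sees:

```python
# Two invented "cards": same total render time for 1000 frames, so the same
# average FPS, but very different frame-to-frame consistency.
smooth  = [20.0] * 1000          # ms per frame, perfectly even pacing
stutter = [10.0, 30.0] * 500     # alternating fast/slow frames

def report(name, frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    print(f"{name}: {avg_fps:.0f} avg FPS, 99th-percentile frame {p99:.0f}ms")

report("smooth ", smooth)        # 50 avg FPS, 20ms frames
report("stutter", stutter)       # 50 avg FPS, 30ms -- same FPS on paper
```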
 
You should also consider that reviewers were under extreme time pressure to get their reviews done. Hardware is only available to them for a limited amount of time, and they also have to meet review release deadlines.

They can always do a follow-up.. It's not like people are rushing out to grab $999 video cards as soon as they are released anyway :devilish:
 
If you can't afford a Titan, then just wait for Nvidia's Maxwell next year: the same or nearly equal performance levels, and far cheaper.

Yep, for a performance part at actual high-end prices :LOL:

You're still paying close to $1000 for multi-GPU solutions with tons of issues, as PCPer has shown.

http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat

An enthusiast wants a single high-end GPU for a reason; otherwise he'd have gone for mGPU the moment those cards were released.
 