NVIDIA Kepler speculation thread

Or is it just very regular forward movement that's being optimized to hell and back?
http://www.pcgameshardware.de/aid,8...Grafikkarte-der-Welt/Grafikkarte/Test/?page=3
For Crysis Warhead, we're seeing the same thing as Computerbase when moving forward in basically a straight line. When dodging and strafing, as is usual in first-person shooters, things start to get nasty: the GTX 690 seems to be able to smooth five to seven frames before one drastic outlier occurs.
 
Hmh, a 300W limit? It has 2x 8-pin, so it's not following that limitation, and the real-life consumption (according to TPU) is about the same as the 6990's. (Sure, it's under 300W, but TDP != consumption.)
 
Another micro-stutter investigation from HT4U.net:

http://ht4u.net/reviews/2012/nvidia_geforce_gtx_690_im_test/index3.php

Good to see sites investigate this.

It seems that the lower the frame rate, the more pronounced the micro-stutter gets (although higher frame rate results are missing). It would be nice to see the results at 90 FPS to get an idea of what's going on (even if you have to reduce IQ and/or resolution to get there). I also have to question how the "frames rendered ahead" setting is being used. What would be the difference between 0 and 1 at 90 FPS, for example?
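To illustrate why the same relative AFR imbalance looks much worse at low frame rates, here's a rough back-of-the-envelope sketch (the 30/70 split and the frame rates are my own assumptions, not HT4U's data):

```python
# Hypothetical AFR micro-stutter: the two GPUs split each frame pair
# unevenly (here assumed 30/70). The relative imbalance stays constant,
# but the absolute gap between consecutive frames shrinks as the average
# frame rate rises, so the stutter becomes harder to perceive.

def frame_gap_ms(avg_fps, split=0.3):
    """Absolute difference (ms) between the short and long frame of an AFR pair."""
    pair_ms = 2 * 1000.0 / avg_fps            # time budget for two frames
    return (1 - split) * pair_ms - split * pair_ms

for fps in (30, 45, 60, 90):
    print(f"{fps:>2} FPS: frame-to-frame gap ~ {frame_gap_ms(fps):.1f} ms")
# 30 FPS: ~26.7 ms, 45 FPS: ~17.8 ms, 60 FPS: ~13.3 ms, 90 FPS: ~8.9 ms
```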
 
Kaotik said:
Hmh, a 300W limit? It has 2x 8-pin, so it's not following that limitation, and the real-life consumption (according to TPU) is about the same as the 6990's. (Sure, it's under 300W, but TDP != consumption.)
So here's a card that, according to Anandtech, consumes 100 to 120W less at load than the GTX 590, yet retains the same power supply capacity. IOW: a large safety margin over what's strictly required.

And somehow you consider that bad? What do you suggest they do instead? Remove those two GND pins to get 8+6-pin connectors and reduce the operating margin? Just to satisfy a footnote in the PCIe specification that isn't even required to pass certification?
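For reference, the connector arithmetic behind that margin (PCIe spec allocations: 75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Power budget per the PCIe spec: 75 W from the slot, 75 W per 6-pin,
# 150 W per 8-pin connector.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

board_tdp = 300  # GTX 690 rated board power in watts
configs = {
    "8-pin + 8-pin": SLOT + 2 * EIGHT_PIN,        # 375 W available
    "8-pin + 6-pin": SLOT + EIGHT_PIN + SIX_PIN,  # 300 W available
}
for name, budget in configs.items():
    print(f"{name}: {budget} W budget, {budget - board_tdp} W of headroom over a {board_tdp} W TDP")
```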
 
I still have no idea why this footnote is even there.
 
Tridam's just published an article about GPU Boost, revisited: http://www.hardware.fr/focus/65/gpu-boost-gtx-680-double-variabilite.html

Basically, his press sample was qualified up to 1110MHz, while retail cards may be limited to 1097, 1084, 1071, or perhaps as low as 1058MHz. So he benched a random retail card against his press sample and measured a 1.5% difference on average, up to 5% in Anno 2070.
Reminds me of when TomsHardware got an X800 XT PE sample with memory clocked at 575MHz instead of 560MHz... :)
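Just to put numbers on that spread (the bins step down in 13MHz increments):

```python
# Worst-case clock deficit of each retail qualification bin versus the
# 1110 MHz press sample Tridam describes.
press_sample = 1110
retail_bins = [1097, 1084, 1071, 1058]

for clk in retail_bins:
    deficit = (press_sample - clk) / press_sample
    print(f"{clk} MHz: {deficit:.1%} below the press sample")
# 1097: 1.2%, 1084: 2.3%, 1071: 3.5%, 1058: 4.7% -- roughly in line with
# the up-to-5% gap he measured in Anno 2070.
```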
 
GTX 670

[attached image: GTX 670 benchmark scores]


 
That Gigabyte OC card is very close to the stock GTX 680. Nice.

If these are graphics scores and not overall system scores, that is.
 