NVIDIA Kepler speculation thread


Wtf!? Why would anybody need to wirelessly overclock their graphics card from their phone? :LOL:

If NVIDIA doesn't have anything like ZeroCore, I would argue that their power advantage under load essentially vanishes, except maybe for people who only turn their computer on for gaming.

AMD's list of competitive advantages seems to be diminishing rather quickly. I really want to see a Pitcairn vs "GK106" showdown.
 
Settings unknown, unfortunately. And it's not 1080p with 4xAA and extreme; it has to be lower, because I barely get 47 fps at 900p.

I bet Tessellation is on "normal" (the default is disabled in 3.0), 1080p, all other settings at default. ... (But seriously, I'm starting to wonder what the difference for Kepler could be between 2.1 vs 2.5 vs 3.0. I always find it suspect when they release a new version of Unigine one week before the launch of an NVIDIA card, with a major difference in scores.)

Wtf!? Why would anybody need to wirelessly overclock their graphics card from their phone? :LOL:

Lol, well this feature has existed for nearly 6-8 months now (if not more). Not much interest unless you like checking temps and voltage on your tablet or phone while you play. (For OC, it's a pure gadget.)
 
So +40% is best case. At least we know now where it came from. I'm guessing it will be all over the place in actual games, from slower to faster depending on settings.
 
So +40% is best case. At least we know now where it came from. I'm guessing it will be all over the place in actual games, from slower to faster depending on settings.


I don't know if it is the best "resolution case", but as the score doesn't match extreme tessellation for a single GTX 580, we can assume this is with normal tessellation, done at the popular resolution of 1080p.
Maybe at a lower resolution the difference is a little bit higher.
 
Seems like a nice chip, but at almost 20% smaller than GF114 and 10% more expensive than GF110... I guess it'll be good for their bottom line. I'm sorry if that doesn't exactly make me happy. I would pay money to hear how NVIDIA's marketing strategy has gone over the last six or so months with regards to the pricing and naming of this chip. One thing is for sure: I won't be upgrading any time soon, but I think a year from now the market will be friendlier.
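For a rough sense of the bottom-line argument, here's a back-of-the-envelope gross dies-per-wafer calculation using the standard approximation. The die areas are ballpark figures I'm assuming, and yield is ignored entirely, so treat this as a sketch:

[CODE]
#include <math.h>
#include <stdio.h>

/* Common gross dies-per-wafer approximation (ignores yield and scribe lines):
 *   dies = pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
 * where d = wafer diameter in mm and A = die area in mm^2. */
static int dies_per_wafer(double wafer_mm, double die_mm2)
{
    double r = wafer_mm / 2.0;
    return (int)(M_PI * r * r / die_mm2 - M_PI * wafer_mm / sqrt(2.0 * die_mm2));
}

int main(void)
{
    /* Assumed approximate die areas, mm^2 */
    printf("GK104 (~294 mm^2): ~%d dies per 300mm wafer\n", dies_per_wafer(300.0, 294.0));
    printf("GF114 (~360 mm^2): ~%d dies per 300mm wafer\n", dies_per_wafer(300.0, 360.0));
    printf("GF110 (~520 mm^2): ~%d dies per 300mm wafer\n", dies_per_wafer(300.0, 520.0));
    return 0;
}
[/CODE]

That works out to roughly 200 candidate GK104 dies per wafer versus ~160 for GF114 and ~100 for GF110, before yield. 28nm wafer pricing is another matter, but you can see why a small die at a big-die price looks good on a spreadsheet.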
 
http://www.techpowerup.com/downloads/2120/TechPowerUp_GPU-Z_v0.6.0.html

GPU-Z 0.6.0 with GTX 680 support released, maybe now we can get a better reading of the clocks.

So, they removed the Shader frequency area and put Boost frequency instead. Ok. :p
What about AMD cards with that UGLY empty Shader area? :???:

Compare:
http://www.techpowerup.com/reviews/AMD/HD_7850_HD_7870/29.html

to
http://forum.beyond3d.com/showpost.php?p=1629840&postcount=2981

:???:
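If you'd rather read the clocks yourself instead of trusting a GPU-Z screenshot, NVIDIA's NVML library can query them programmatically. A minimal sketch, assuming your driver actually exposes NVML (GeForce support has been patchy, and whether it would report the base or the boosted clock on Kepler remains to be seen):

[CODE]
#include <stdio.h>
#include <nvml.h>   /* link with -lnvidia-ml */

int main(void)
{
    nvmlDevice_t dev;
    unsigned int core_mhz = 0, mem_mhz = 0, temp_c = 0;

    if (nvmlInit() != NVML_SUCCESS) return 1;                           /* attach to the driver */
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;  /* first GPU */

    /* Current clocks in MHz and core temperature in degrees C */
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &core_mhz);
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &mem_mhz);
    nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp_c);

    printf("core: %u MHz, mem: %u MHz, temp: %u C\n", core_mhz, mem_mhz, temp_c);
    nvmlShutdown();
    return 0;
}
[/CODE]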

So +40% is best case. At least we know now where it came from. I'm guessing it will be all over the place in actual games, from slower to faster depending on settings.

I wouldn't be so sure. Might be an advertisement deal with Unigine developers. ;)
 
I don't know if it is the best "resolution case", but as the score doesn't match extreme tessellation for a single GTX 580, we can assume this is with normal tessellation, done at the popular resolution of 1080p.
Maybe at a lower resolution the difference is a little bit higher.

By best case I meant you won't be seeing any actual games that show the same kind of boost.
 
:LOL:
That's what you say before seeing the big daddy. :mrgreen:

Speaking of which....where is the GK100 anyway? Do we have any presumed launch/availability dates?

There's really no point upgrading now only to find your GTX 6XX part being replaced by a GTX 7XX part a few months later! :S

Although the way this is going, NVIDIA is going to ask around $800 for the 780 anyway.
 
http://us.ncix.com/products/index.php?sku=69800

EVGA GeForce GTX 680 Superclocked TBD 2GB TBD GDDR5 2xDVI DisplayPort HDMI PCI-E 3.0 Video Card
Reg. Price: $578.20 USD

Availability: This Product is available in 5 to 10 Days

Yeah, I saw those earlier... they also have this:

http://us.ncix.com/products/index.php?sku=69802

MSI GeForce GTX 680 OC 1056MHZ 2GB 6GHZ GDDR5 2xDVI DisplayPort HDMI PCI-E 3.0 Video Card
N680GTX-PM2D 2GD5/OC

Pricey... I really am looking forward to seeing what these things can do, but still... the 22nd can't come soon enough.
 
If you stuck 6 Gbps memory on the Pitcairn PCB, you probably still wouldn't be able to raise the clock, because the MC is likely not sized for those speeds...

So you add area to fix that and... you'll end up exactly where a scaled down GK104 or stripped 7970 would end up?

Many review sites actually achieved memory speeds >1500 MHz when overclocking the HD 7870... so the memory controller is certainly up to the task.

If GK104 ends up being 10% faster than an HD 7970 and more efficient with power, then it would probably rival Pitcairn's efficiency, or at least come close enough to be indistinguishable.
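For what it's worth, the bandwidth arithmetic behind the 6 Gbps argument is simple: GDDR5 transfers four bits per pin per memory clock. A quick sketch, assuming the HD 7870's stock 1200 MHz memory and 256-bit bus:

[CODE]
#include <stdio.h>

/* GDDR5 moves 4 bits per pin per memory clock (DDR on a doubled data
 * clock), so: bandwidth GB/s = clock_MHz * 4 * bus_bits / 8 / 1000 */
static double bandwidth_gbs(double clock_mhz, int bus_bits)
{
    return clock_mhz * 4.0 * bus_bits / 8.0 / 1000.0;
}

int main(void)
{
    /* Pitcairn's 256-bit bus: stock vs. the overclocks reviewers reached */
    printf("1200 MHz (4.8 Gbps): %.1f GB/s\n", bandwidth_gbs(1200.0, 256));
    printf("1500 MHz (6.0 Gbps): %.1f GB/s\n", bandwidth_gbs(1500.0, 256));
    return 0;
}
[/CODE]

So the jump from 4.8 to 6 Gbps takes Pitcairn from 153.6 to 192 GB/s, i.e. 25% more bandwidth from the same bus width.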
 