Nvidia BigK GK110 Kepler Speculation Thread

Hopefully this will end the stupid 10,000-cards-only rumor. Like I said, not every GK110 will have working ECC; those can be perfectly fine as GeForce and sold as gaming cards. Though I'm wary of AIBs' cheap plastic shrouds on custom coolers; they look cheap and don't even compare to Titan's superior aluminium and magnesium cooler.

Yeah, I've been trying to correct people who've been parroting the whole "limited production run", "only 10k cards", "no custom cards" BS for the last week or two, to no avail.

FYI: Titan's shroud is cast aluminium, not the mix like the GTX 690's.
 
I for one fail to see who this product is targeted at.

If I was in the market for $1000 graphics cards Titan would probably rank higher than SLI 680's or crossfire 7970's. No amount of pretty bar graphs can compensate for the rage I would feel if half my graphics hardware went to waste due to software issues.

In the end some people just buy stuff cause they can, especially if it's a novelty like Titan. Whether they regret it later on sort of depends on how long it takes for a competitor to emerge.
 
This product is for hardcore Nvidia fans with a lot of money and a lack of vision to spend it on something better. Especially when you're going to buy two of them.
That's the target.
 
It's a bit like driving a Porsche or a Ferrari - no one would need (or use, for that matter) one to drive their kids to school. But it's very nice to have to go on a ride on a curvy mountain road or on the german autobahn. :)
 
It's a bit like driving a Porsche or a Ferrari - no one would need (or use, for that matter) one to drive their kids to school. But it's very nice to have to go on a ride on a curvy mountain road or on the german autobahn. :)
With the difference that many people will admire a $300K marvel of technology, but they whine about 'lack of vision' (I thought I'd heard it all, but no) when it's $1K. ;)
 
It's a bit like driving a Porsche or a Ferrari - no one would need (or use, for that matter) one to drive their kids to school. But it's very nice to have to go on a ride on a curvy mountain road or on the german autobahn. :)

To be honest, it costs less than the units I use to cool my hardware, whether a phase-change unit or my current H2O setup. The Porsche or Ferrari comparison is a bit strange... if only they cost just twice the price of everyone else's car.
 
I actually don't have an issue with $1000 cards. I would like to see $1000 worth of performance however, or something close.

Based on the AT article it's about 35%-40% faster than the 680. Anywhere else that makes it about 20% faster than the 7970 GHz. I'm sure you can buy better aftermarket cooling for $100-$150.
 
Full speed FP64 only runs at base clock

The difference wouldn't be that large, Boost is less than 5%.

Wow I actually guessed right :D

The penalty for enabling full speed FP64 mode is that NVIDIA has to reduce clockspeeds to keep everything within spec. For our sample card this manifests itself as GPU Boost being disabled, forcing our card to run at 837MHz (or lower) at all times. This is why NVIDIA’s official compute performance figures are 4.5 TFLOPS for FP32, but only 1.3 TFLOPS for FP64. The former assumes that boost is enabled, while the latter is calculated around GPU Boost being disabled. The actual execution rate is still 1/3.
http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1/4
 
I actually don't have an issue with $1000 cards. I would like to see $1000 worth of performance however, or something close.

Based on the AT article it's about 35%-40% faster than the 680. Anywhere else that makes it about 20% faster than the 7970 GHz. I'm sure you can buy better aftermarket cooling for $100-$150.

Take a Lightning or DCII, or even "lower" models with a better cooling system. (And there are still Arctic Cooling solutions: cheap and silent.)
 
If I was in the market for $1000 graphics cards Titan would probably rank higher than SLI 680's or crossfire 7970's. No amount of pretty bar graphs can compensate for the rage I would feel if half my graphics hardware went to waste due to software issues.

Since quantities are not super limited, I would have preferred to see NVIDIA launch Geforce Titan with an MSRP of ~ $849 USD. NVIDIA would have avoided much of the negative press and ridicule from people who point out that 7970 GHz Ed. Crossfire and GTX 680 SLI are both cheaper and faster in comparison (albeit at significantly higher power consumption and higher noise), and they would have avoided pricing Geforce Titan at the same price point as the faster GTX 690. In my opinion, adding an extra $150 USD to the MSRP is not worth all the negativity associated with a $999 USD MSRP for a single GPU product. I realize that NVIDIA is trying to offer much more double precision performance per dollar vs. the Tesla K20 variants, but the feature set/reliability/support is not the same as Tesla either, and I feel it would have been better to position Geforce Titan as a very high end gaming card at $849 USD, with full double precision performance de-emphasized and reserved for higher margin Tesla variants.

By the way, here is the approximate performance increase for Geforce Titan vs. GTX 680 based on a chart that was briefly shown (but now taken down) at Geforce.com (2560x1600, 4xMSAA, 16xAF, Maximum Game Settings):

Metro 2033: +65%
Crysis 3: +44%
The Witcher 2: +33%
Max Payne 3: +44%
Crysis 2: +52%
Shogun 2: +46%
Assassin's Creed III: +38%
Borderlands 2: +33%
Lost Planet 2: +31%
Sleeping Dogs: +49%
Battlefield 3: +46%
Deus Ex: Human Revolution: +52%
Dirt 3: +42%
The Elder Scrolls V: Skyrim: +44%
Call of Duty: Black Ops II: +41%
Batman: Arkham City: +36%
StarCraft II: +58%

Average performance increase: +44%
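A quick sanity check of the quoted average (numbers transcribed from the list above):

```python
# Per-game Titan-vs-680 gains from the chart quoted above, in percent.
gains = [65, 44, 33, 44, 52, 46, 38, 33, 31, 49, 46, 52,
         42, 44, 41, 36, 58]
avg = sum(gains) / len(gains)
print(f"{avg:.1f}%")  # → 44.4%, consistent with the quoted +44%
```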
 
Take a Lightning or DCII, or even "lower" models with a better cooling system. (And there are still Arctic Cooling solutions: cheap and silent.)

Yep, I have a DCII 680 with a decent 1228/6500 overclock that's more than enough for now. Using Anand's 992MHz Titan boost result, it has ~40% more tex/shader performance and ~40% more bandwidth than my current card. That's a decent bump but not necessary for the vast majority of games out there.
 
Since quantities are not super limited, I would have preferred to see NVIDIA launch Geforce Titan with an MSRP of ~ $849 USD. NVIDIA would have avoided much of the negative press and ridicule from people who point out that 7970 GHz Ed. Crossfire and GTX 680 SLI are both cheaper and faster in comparison (albeit at significantly higher power consumption and higher noise), and they would have avoided pricing Geforce Titan at the same price point as the faster GTX 690. In my opinion, adding an extra $150 USD to the MSRP is not worth all the negativity associated with a $999 USD MSRP for a single GPU product. I realize that NVIDIA is trying to offer much more double precision performance per dollar vs. the Tesla K20 variants, but the feature set/reliability/support is not the same as Tesla either, and I feel it would have been better to position Geforce Titan as a very high end gaming card at $849 USD, with full double precision performance de-emphasized and reserved for higher margin Tesla variants.

Agreed completely.
 
[Chart: GTX Titan vs GTX 680 performance]


BTW, IIRC the 680 is hardly faster than the 580 in the Metro bench, so it's no surprise Titan performs well there.
 
Honestly I think that is really poor, and there's a reason why we aren't seeing it compared to the 7970 GHz edition. There simply can't be any justification for this as a gaming card at this price.
 
Here is the approximate performance increase for Geforce Titan vs. GTX 680 based on the chart above (2560x1600, 4xMSAA, 16xAF, Maximum Game Settings):

Metro 2033: +65%
Crysis 3: +44%
The Witcher 2: +33%
Max Payne 3: +44%
Crysis 2: +52%
Shogun 2: +46%
Assassin's Creed III: +38%
Borderlands 2: +33%
Lost Planet 2: +31%
Sleeping Dogs: +49%
Battlefield 3: +46%
Deus Ex: Human Revolution: +52%
Dirt 3: +42%
The Elder Scrolls V: Skyrim: +44%
Call of Duty: Black Ops II: +41%
Batman: Arkham City: +36%
StarCraft II: +58%

Average performance increase: +44%

At the risk of quoting myself, and assuming that the NVIDIA-supplied data above is accurate and not subject to change, here is the approximate performance increase for Geforce 690 vs. GTX Titan (using GTX 690 data from http://www.geforce.com/Active/en_US/shared/images/embed/690v680.png , 2560x1600, 4xMSAA, 16xAF, Maximum Game Settings):

Battlefield 3: +19%
Deus Ex: Human Revolution: +18%
Dirt 3: +18%
The Elder Scrolls V: Skyrim: +26%
Starcraft II: +22%
The Witcher 2: +31%

Average performance increase: +22%

So the performance of Geforce Titan is actually closer to GTX 690 than it is to GTX 680 or 7970 GHz Ed, and my proposed $849 MSRP would have been justifiable :) Oh well, it is what it is, no turning back now...
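Normalizing everything to a GTX 680 baseline makes the "closer to the GTX 690" claim easy to check (a rough sketch using the +44% and +22% averages above; per-game gaps will of course vary):

```python
# Treat a GTX 680 as 1.0 and stack the quoted average gains.
gtx680 = 1.00
titan = gtx680 * 1.44        # Titan ≈ +44% over a 680
gtx690 = titan * 1.22        # 690 ≈ +22% over Titan, i.e. ~1.76x a 680

gap_to_680 = titan - gtx680  # 0.44
gap_to_690 = gtx690 - titan  # ≈ 0.32

# Titan sits nearer the 690 than the 680
print(gap_to_690 < gap_to_680)  # → True
```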
 
As pointed out in this thread, there's a mistake somewhere. Mathematically, it doesn't work out, Boost or not.

It's because that is the theoretical peak of the K20X (i.e. at 732MHz), which is listed in the slide.
[Slide: Tesla K20/K20X efficiency figures]

So either someone copy/pasted the info from the K20X and forgot that Titan has different clocks, or Nvidia decided that a cheaper card shouldn't have higher DP throughput and limits it to K20X speeds.
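A back-of-the-envelope check supports the copy/paste theory: peak FLOPS is units × 2 (one FMA counts as two FLOPs) × clock, and 1.3 TFLOPS falls out of the K20X's 732MHz, not Titan's 837MHz base. The core/unit counts below are the commonly quoted GK110 figures (2688 FP32 cores, 896 FP64 units at the 1/3 rate), assumed here for illustration:

```python
# Peak throughput in TFLOPS: units * 2 FLOPs/FMA * clock (MHz) / 1e6
def peak_tflops(units, clock_mhz):
    return units * 2 * clock_mhz / 1e6

# FP64 with 896 units at the two candidate clocks:
print(f"{peak_tflops(896, 732):.2f}")  # → 1.31, matching Nvidia's 1.3 figure (K20X clock)
print(f"{peak_tflops(896, 837):.2f}")  # → 1.50, what Titan's 837MHz base clock would imply
```

So the official 1.3 TFLOPS figure lines up with K20X clocks, not with Titan's, which is exactly the inconsistency noted above.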
 
At the risk of quoting myself, and assuming that the NVIDIA-supplied data above is accurate and not subject to change, here is the approximate performance increase for Geforce 690 vs. GTX Titan (using GTX 690 data from http://www.geforce.com/Active/en_US/shared/images/embed/690v680.png , 2560x1600, 4xMSAA, 16xAF, Maximum Game Settings):

Battlefield 3: +19%
Deus Ex: Human Revolution: +18%
Dirt 3: +18%
The Elder Scrolls V: Skyrim: +26%
Starcraft II: +22%
The Witcher 2: +31%

Average performance increase: +22%

So the performance of Geforce Titan is actually closer to GTX 690 than it is to GTX 680 or 7970 GHz Ed, and my proposed $849 MSRP would have been justifiable :) Oh well, it is what it is, no turning back now...

The 7970 GHz is ~20% faster than the 680 at 1600p, so it looks likely to be near the middle of the two.
 