Nvidia BigK GK110 Kepler Speculation Thread

The 7970 GHz is ~20% faster than the 680 at 1600p, so it looks likely to be near the middle of the two.

With the latest Geforce drivers, in the same games quoted above? I doubt that. I would expect GTX 680 performance with the latest drivers to be within ~ 10% of 7970 GHz Ed. performance in these games.
 

http://www.xbitlabs.com/articles/graphics/display/his-iceq-x2-7970-7950-7850_11.html#sect5

The 7970 GHz is 17.5% faster at 1600p + 4xAA in this recent review. Granted, it's quite an AMD-favourable lineup of games, but that is vs. the Asus DirectCU II TOP, which has one of the highest factory overclocks of any 680, I believe.

A previous article vs a reference 680 had the 7970 GHz some 27% faster.

HardOCP did a relatively recent comparison with new drivers, and the GHz edition was over 15% faster - http://hardocp.com/article/2012/11/12/fall_2012_gpu_driver_comparison_roundup/7#.USPbM2c1O70

I think the GHz edition actually does better at 1080p here for some reason.

There are more. 10% is the low end, 20% the high end.
 
The only way to know for sure would be to use the latest and greatest drivers for each card and to use the same games that I listed above.
 
The only way to know for sure would be to use the latest and greatest drivers for each card and to use the same games that I listed above.

Well, you will see a big difference in some games and a smaller one in others. SC2 is a good example: it doesn't run as well on the 7970 as it does on the 680 (mostly due to the code used), but since the game runs at high fps anyway it's not a problem (and it's a really hard game to test in any case). The same applies if you want to compare the COD series (old graphics engine).

Drawing a line will be difficult; you will need to base your judgement on different reviews and game lineups, as always.
 
Yeah, you'll have to use a bit of common sense. Skyrim is a weird one; I've seen 20% leads either way on that, even with new drivers. BF3 can heavily favour the 7970 or still be a win for Nvidia depending on the website. BL2 is doing Nvidia a lot of favours as well at the moment, and it often contributes a large overall percentage swing towards the 680 even when it loses in all the other tested games.
 
This isn't rocket science here folks. The only performance data we have on Geforce Titan right now is from the Geforce.com slide comparing GTX Titan to GTX 680. We have a similar slide on Geforce.com comparing GTX 690 to GTX 680. Since we do not have a slide comparing GTX 680 to 7970 GHz Ed. using the very latest drivers, and using the exact same settings and same games as above, then we are just throwing around numbers.
 
The fact that there isn't a slide comparing it to the 7970 GHz is the reason why we're looking at it with some suspicion. When was the last time Nvidia released a new performance winning card and didn't compare it to the competition?
 
Since quantities are not super limited, I would have preferred to see NVIDIA launch Geforce Titan with an MSRP of ~ $849 USD. NVIDIA would have avoided much of the negative press and ridicule from people who point out that 7970 GHz Ed. Crossfire and GTX 680 SLI are both cheaper and faster in comparison (albeit at significantly higher power consumption and higher noise), and they would have avoided pricing Geforce Titan at the same price point as the faster GTX 690. In my opinion, adding an extra $150 USD to the MSRP is not worth all the negativity associated with a $999 USD MSRP for a single GPU product. I realize that NVIDIA is trying to offer much more double precision performance per dollar vs. the Tesla K20 variants, but the feature set/reliability/support is not the same as Tesla either, and I feel it would have been better to position Geforce Titan as a very high end gaming card at $849 USD, with full double precision performance de-emphasized and reserved for higher margin Tesla variants.

By the way, here is the approximate performance increase for Geforce Titan vs. GTX 680 based on a chart that was briefly shown (but now taken down) at Geforce.com (2560x1600, 4xMSAA, 16xAF, Maximum Game Settings):

Metro 2033: +65%
Crysis 3: +44%
The Witcher 2: +33%
Max Payne 3: +44%
Crysis 2: +52%
Shogun 2: +46%
Assassin's Creed III: +38%
Borderlands 2: +33%
Lost Planet 2: +31%
Sleeping Dogs: +49%
Battlefield 3: +46%
Deus Ex: Human Revolution: +52%
Dirt 3: +42%
The Elder Scrolls V: Skyrim: +44%
Call of Duty: Black Ops II: +41%
Batman: Arkham City: +36%
StarCraft II: +58%

Average performance increase: +44%
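
For anyone who wants to sanity-check that figure: it's a straight arithmetic mean of the per-game numbers above. A trivial snippet (plain C++ host code with the per-game numbers copied from the chart; compiles with nvcc or any C++ compiler):

Code:
#include <cstdio>

int main() {
    // Per-game Titan vs. GTX 680 uplifts from the Geforce.com chart (percent)
    const double uplift[] = {65, 44, 33, 44, 52, 46, 38, 33, 31,
                             49, 46, 52, 42, 44, 41, 36, 58};
    const int n = sizeof(uplift) / sizeof(uplift[0]);
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += uplift[i];
    std::printf("Average uplift over %d games: +%.1f%%\n", n, sum / n);
    // Prints: Average uplift over 17 games: +44.4%
    return 0;
}

A geometric mean should come out slightly lower, but with the uplifts clustered this tightly the difference is well under a point.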

I found a review with many of those games comparing the 7970 GHz and 680 -

http://www.legionhardware.com/artic...hz_edition_7950_iceq_xsup2_boost_clock,1.html

The 7970 GHz edition is 15% ahead in the same 9 games (Witcher 2, Max Payne 3, Crysis 2, BL2 (loss), Sleeping Dogs, BF3, Deus Ex, Dirt 3, Skyrim) while Titan's lead over the 680 in those same 9 games is 39%. Based on those 9 games Titan would be 21% faster than the 7970 GHz.
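
In case the 21% looks odd at first glance: leads measured against the same baseline divide rather than subtract. A quick check in the same vein as above:

Code:
#include <cstdio>

int main() {
    const double titan_vs_680 = 1.39;  // Titan's lead over the 680 in those 9 games
    const double ghz_vs_680   = 1.15;  // 7970 GHz lead over the 680, same 9 games
    std::printf("Titan vs. 7970 GHz: +%.0f%%\n",
                (titan_vs_680 / ghz_vs_680 - 1.0) * 100.0);
    // Prints: Titan vs. 7970 GHz: +21%
    return 0;
}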
 
The fact that there isn't a slide comparing it to the 7970 GHz is the reason why we're looking at it with some suspicion. When was the last time Nvidia released a new performance winning card and didn't compare it to the competition?

Why do they need to validate the competition by including it on their slides? It's not like they are going to be able to hide these comparisons when they appear on Thursday.

I've gotta say, your constant bias and negativity towards anything NV is really tiring.

Titan is by far the most powerful single gaming GPU available today or can be made into the most powerful multi GPU gaming rig. It's completely without competition in that regard and is unlikely to get any from AMD for at least 9 months, so why can't NV charge a bomb for it? They don't have the supply to sell buckets, and they have no competition in the single GPU space, so the only reasonable thing to do is sell it for as much as you can while still being able to sell all of what you produce. It's basic supply and demand. It's not as if Titan is technically lacking in any way. It offers more performance than anything else at competitive performance/watt levels and with extremely impressive acoustics. Additionally, its GPGPU abilities will likely be insane.
 
I'm still waiting for someone to explain why Titan should be priced lower. It'll probably sell out at $1000. Pricing it any lower won't magically increase the supply of cards.
 
I've gotta say, your constant bias and negativity towards anything NV is really tiring.

There is an ignore function and I suggest you use it.

Titan is by far the most powerful single gaming GPU available today or can be made into the most powerful multi GPU gaming rig. It's completely without competition in that regard and is unlikely to get any from AMD for at least 9 months, so why can't NV charge a bomb for it?
I'm taking issue with the "by far" assumption. At various points during the past few years we've seen AMD hold a bigger lead, and yet somehow they didn't need to charge $1K for their GPU.

At best Titan will be 40% faster for 9 months, more likely 25-30% faster. That is not even close to being worth $1K, and if the tech press falls over itself to laud it as such, they'll just be harming themselves in the long run.

Take a look at AT's article and read the comments below - http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1

We know the performance range and we know the price and it's clearly getting a very large thumbs down from the consumer.
 
There is an ignore function and I suggest you use it.

A bigger problem is that you've started having me consider the ignore function in your regard. My ignore function is a bit peculiar, as it auto-magically helps everyone else engage theirs. There is a line between partisan and mad raving fan - I suggest you consider it. Also, this is not how one behaves in polite company: "I can fling poop around your house; if you don't like it, just ignore it". If needed, take a break, chillax, and consider how to evolve your posts beyond "ATI is the best" noise.
 
A bigger problem is that you've started having me consider the ignore function in your regard. My ignore function is a bit peculiar, as it auto-magically helps everyone else engage theirs. There is a line between partisan and mad raving fan - I suggest you consider it. Also, this is not how one behaves in polite company: "I can fling poop around your house; if you don't like it, just ignore it". If needed, take a break, chillax, and consider how to evolve your posts beyond "ATI is the best" noise.

Where have I said that? Just because I'm not buying into this nonsense does not make me a raving ATI fan. In fact, you will find many posts of mine stating flatly that they make far too many mistakes and had only themselves to blame for losing to the 680, etc.

The problem is there are a bunch of people who flatly refuse to accept that this is stupidly overpriced for the performance. Can you imagine what the consensus of opinion would be if AMD were about to release this? Do you actually believe I'd be defending them?
 
The problem is there are a bunch of people who flatly refuse to accept that this is stupidly overpriced for the performance. Can you imagine what the consensus of opinion would be if AMD were about to release this? Do you actually believe I'd be defending them?

I think the problem is the assumption that price for performance is the same for everyone. We all have different criteria for what we consider acceptable, though. For instance, I do a lot of CUDA programming. More recently, I've been trying to explore the new features in compute 3.5. It is important that I do so for my career. With previous versions of CUDA, I have always been able to buy an x80 gaming card and use it as a home test bench. That allowed me to get up to speed before the hardware I am expected to code for arrives. If you notice, that is not currently an option.
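
To be concrete, the compute 3.5 features I mean are things like dynamic parallelism and Hyper-Q, which require compute capability 3.5 hardware. The first thing I'd run on any card before touching them is a capability check along these lines (standard CUDA runtime API; just a sketch, not code from my actual test bench):

Code:
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA device found.\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        std::printf("Device %d: %s (compute %d.%d)\n",
                    dev, prop.name, prop.major, prop.minor);
        // Dynamic parallelism and Hyper-Q need compute capability >= 3.5,
        // so a GK104 card (3.0) fails this check while GK110 passes.
        if (prop.major > 3 || (prop.major == 3 && prop.minor >= 5))
            std::printf("  compute 3.5 feature set available\n");
        else
            std::printf("  compute 3.5 feature set NOT available\n");
    }
    return 0;
}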

I actually considered buying a K20x for my home machine. That runs between $4k and $5k - and is really not multi-purpose. In other words, I can't play my games on it. While I would prefer a lower price, $1k seems like a pretty good deal for me to get a card that I can use as a gaming card and still get the compute value. In addition, the card has a performance point high enough that it will probably last until the next version of compute comes out.

I recognize that I am an edge case. However, this card seems aimed at edge cases like me. Possibly it would be better to ignore the argument surrounding value. In this case, the market will decide if there are enough people like me to support this card. If there aren't, then I'm sure a price drop will happen in the near future.
 