NVIDIA Kepler speculation thread

Yes it does but the only time I see a graphics card anymore is the five minutes it takes to take it out of the box and put it in my case. I hope anybody who buys one of these plans to keep it on display.


It takes a bit more time for me: I strip it completely (mmm), just long enough to put my waterblocks on it.

Anyway, Nvidia should really tone down the marketing. What is this comparison of 4xMSAA vs. FXAA in Portal 2? You can use 16xMSAA (full 2x8 AA, only available in CrossFire), 24x MSAA Edge-detect, or full-screen SSAA without any problem in this game. Why would you care about 4xMSAA?
 
Well, obviously in that particular game it's irrelevant, but it serves as a decent enough screenshot for those more demanding games where FXAA may be more important.
 
http://www.geforce.com/whats-new/articles/article-keynote/
Wonder if this feature also extends to dual SLI GTX 680 cards?

They've had a similar thing out there forever (at least since G80, I think): they increase input lag to decrease microstuttering.
Maybe they've changed it since, but as people only really started talking about it after the 500 series launched, I suspect this is just nVidia PR seizing the moment and advertising an old feature as new.
 
The 660 Ti looks good, if indeed real. Guessing it will exceed the 7850 handily at $249... thereby pushing the 7850 down to $200 in the process?

This launching early May too? Somehow I'm doubting, but hopeful.

Interesting, though, that going against Pitcairn it will be Nvidia in the position of having the big/expensive die again... so I guess there is nothing planned but GK104?
 

Someone somewhere not so long ago mentioned that NV should be laughing their arses off. I mean, they designed GK104, which should essentially and logically fight against Pitcairn, but because Kepler is doing so well and Tahiti is so underwhelming, you actually have the fully-enabled GK104 beating Tahiti, with salvage parts readily available to combat Pitcairn.

GK104 is not a big and expensive die (the obvious manufacturing difficulties NV has producing it notwithstanding). It's exactly in the sweet spot. ;)
 
"Big" and "expensive" are relative terms. GK104 is 38% larger than Pitcairn. If nvidia is yielding that many salvage parts, then it seems that they would lose money since the manufacturing cost of GK104 is the same whether it's fully functional or not.

-FUDie
 
"Big" and "expensive" are relative terms. GK104 is 38% larger than Pitcairn. If nvidia is yielding that many salvage parts, then it seems that they would lose money since the manufacturing cost of GK104 is the same whether it's fully functional or not.

-FUDie

Costs are the same, but effective yields also increase when salvage parts are included. Do we have data saying fully enabled Pitcairn yields are better than salvage GK104 yields? Bottom line: nVidia is probably better positioned now than they ever dreamed. Fighting Pitcairn with salvage 104s is certainly better than needing fully functional parts to do so.
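To put rough numbers on the yield question, here's a toy model. The die areas are close to commonly published figures, but the defect density and the yield model itself are pure assumptions for illustration:

```python
import math

# Die areas are roughly the published figures; the defect density is an
# assumption chosen only to illustrate how area affects yield.
WAFER_DIAMETER_MM = 300
DEFECT_DENSITY = 0.4  # defects per cm^2 (assumed, for a young 28 nm process)
DIE_AREAS_MM2 = {"Tahiti": 365, "GK104": 294, "Pitcairn": 212}

def dies_per_wafer(area_mm2, wafer_d=WAFER_DIAMETER_MM):
    """Gross die count with a standard edge-loss correction."""
    r = wafer_d / 2
    return int(math.pi * r**2 / area_mm2
               - math.pi * wafer_d / math.sqrt(2 * area_mm2))

def poisson_yield(area_mm2, d0=DEFECT_DENSITY):
    """Fraction of dies with zero defects (simple Poisson yield model)."""
    return math.exp(-d0 * area_mm2 / 100)  # /100 converts mm^2 to cm^2

for name, area in DIE_AREAS_MM2.items():
    gross = dies_per_wafer(area)
    print(f"{name}: {gross} gross dies/wafer, "
          f"{poisson_yield(area):.0%} fully functional")
```

The point being: a bigger die loses twice, in candidates per wafer and in the fraction that come out fully working, which is exactly why salvage parts matter so much for GK104.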
 
GK104 is 38% larger than Pitcairn.

It's not a problem for them to compete with bigger chips, GF114 is bigger than Barts by almost the same margin. ;)

If nvidia is yielding that many salvage parts, then it seems that they would lose money since the manufacturing cost of GK104 is the same whether it's fully functional or not.

They lose only imaginary money relative to the best case. They do this all the time, since they can't sell those chips at, say, double the price. Not to mention that the fully-enabled GK104 carries enough of a price premium over the salvage part to offset its costs.



Where is the GK106?
 
Before 28 nm Nvidia paid per good chip; now they are paying per wafer, if all the rumors are true. So the ability to compete in the same market with bigger chips may be affected somewhat. The same is true for the 7970 vs. the 680, but the limited availability of the latter could have helped sales of the AMD part.
 
Before 28 nm Nvidia paid per good chip; now they are paying per wafer, if all the rumors are true. So the ability to compete in the same market with bigger chips may be affected somewhat.
That depends of course on the exact pricing model in their current contract and the yields, both probably a function of time. The new contract could offer lower cost/functional die in the future or it may be a higher one compared to the old contract. If the 28nm process matures relatively fast, nV will be better off with some likelihood. If the process turns out to be problematic in the long run, they won't. We simply don't know what will happen.
 
That's of course correct, but this will be true for AMD, too. In the past, when Nvidia paid per functional die, having bigger dies and thus lower yields (as die size and yield are related, even on a mature process), especially during the troubled 40 nm development, helped the company keep costs low. They had a sort of "insurance". Now, if they have bad yields (now or in the future), the cost of a good GPU will skyrocket.
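The difference between the two rumored contract models can be made concrete with a toy calculation. Every number below (wafer price, per-die price, yield) is invented for illustration:

```python
# Toy comparison of the two rumored contract models. All figures assumed.
WAFER_PRICE = 5000        # assumed cost per 28 nm wafer
PER_GOOD_DIE_PRICE = 25   # assumed old-style price per functional die
GROSS_DIES = 200          # candidate dies per wafer (assumed)

def cost_per_good_die_per_wafer(yield_rate):
    """Under per-wafer pricing, yield risk falls on the customer."""
    return WAFER_PRICE / (GROSS_DIES * yield_rate)

def cost_per_good_die_per_die(yield_rate):
    """Under per-good-die pricing, cost is flat regardless of yield."""
    return PER_GOOD_DIE_PRICE

for y in (0.3, 0.5, 0.8):
    print(f"yield {y:.0%}: per-wafer contract = "
          f"${cost_per_good_die_per_wafer(y):.2f}, "
          f"per-die contract = ${cost_per_good_die_per_die(y):.2f}")
```

With these made-up numbers, per-wafer pricing only beats the old contract once yields climb past 80% or so; below that, every defect comes straight out of Nvidia's margin, which is the "insurance" the old contract provided.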
 
It's not a problem for them to compete with bigger chips, GF114 is bigger than Barts by almost the same margin. ;)
And you really think this is free for nvidia?
UniversalTruth said:
They lose only imaginary money relative to the best case. They do this all the time, since they can't sell those chips at, say, double the price. Not to mention that the fully-enabled GK104 carries enough of a price premium over the salvage part to offset its costs.
How do you know it offsets costs at all? Say GK104 yields are terrible and you get a lot of salvage parts. Then selling a few for a premium does little to offset the costs. Or, say GK104 yields are great so that you get few salvage parts. Then how do you meet demand for the salvage parts?

Remember the ATI 9500 Pro? They were R300 chips on a board with 128-bit memory bus. Demand was so high, ATI was putting fully functional chips on the boards and, effectively, losing money. Seems like nvidia is in a similar pickle here.

-FUDie
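The supply/demand mismatch described above can be sketched in a few lines; every quantity here is made up purely to illustrate the mechanism:

```python
# Toy model of the salvage-supply problem: if demand for the cut-down part
# outruns the natural supply of defective dies, fully good dies must be
# fused down and sold at the lower price (the 9500 Pro situation).
# Every number here is assumed for illustration.
GROSS_DIES = 200          # candidate dies per wafer (assumed)
FULL_RATE = 0.6           # fraction that come out fully functional (assumed)
demand_full, demand_salvage = 80, 100   # demand per wafer of output (assumed)

full = GROSS_DIES * FULL_RATE
salvage = GROSS_DIES - full
fused_down = max(0, demand_salvage - salvage)
full_left = full - fused_down
print(f"natural salvage supply: {salvage:.0f}, "
      f"good dies fused down: {fused_down:.0f}, "
      f"good dies left for the full SKU: {full_left:.0f} (demand {demand_full})")
```

In this scenario 20 perfectly good dies per wafer get sold at the salvage price, which is exactly the money-losing corner the 9500 Pro anecdote describes.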
 
It's not a problem for them to compete with bigger chips, GF114 is bigger than Barts by almost the same margin. ;)
GF114 vs. Barts was a completely different story.

GF114 was a chip that actually seemed to yield in good quantity and the corresponding cards were profitable because Nvidia sold a LOT of them.

By contrast, GK104 seems to be rather hard to make even a month after launch.

NVidia's main advantage this round is that their high-midrange chip can actually compete with AMD's high-end chip. NVidia's main problem is that they can't make even remotely enough of them - and they still don't have a high-volume chip to make some real money with quantity.
 
GK104's ability to compete well with Tahiti doesn't necessarily make it a good choice against Pitcairn; the volume requirements of a $300 part are going to be much higher than those of a $500 part. So either Nvidia has a lot of partially functional GK104 parts (meaning yields might be an issue), or they will be putting sabotaged (laser-cut) GK104s in the 660, which means selling those parts for ~50% of their potential. It seems unlikely to me that they are holding back 680 production to meet future 660 and 670 demand (because that would be stupid), and the supply of 680s isn't near meeting demand yet.

So is Nvidia getting a lot of partially functional GK104s, or is demand way higher than it has ever been for a $500 card, or are yields just plain bad right now, and maybe we'll see a revised chip ship? Or maybe Charlie was right and they shut down TSMC... OK, maybe not that.
 
GK104 is quite small, especially compared to GF104 and the 6870 series.

If you look at the die sizes, it's:

Tahiti > GK104 > Pitcairn > GK106 - I hardly see any disadvantage for NV.
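For what it's worth, plugging in the approximate published die areas reproduces the ordering above (GK106 omitted, since it's unreleased and its size is speculation) as well as the roughly 38% GK104-over-Pitcairn figure quoted earlier:

```python
# Approximate published die areas in mm^2; treat these as estimates.
areas = {"Tahiti": 365, "GK104": 294, "Pitcairn": 212}
ordered = sorted(areas, key=areas.get, reverse=True)
print(" > ".join(ordered))  # Tahiti > GK104 > Pitcairn
ratio = areas["GK104"] / areas["Pitcairn"] - 1
print(f"GK104 is {ratio:.0%} larger than Pitcairn")
```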
 
From SemiAccurate: "Why can’t Nvidia supply Kepler/GK104/GTX680? TSMC is blameless, Kepler is a self-inflicted wound."

Yet the 680 somehow managed to outsell all Pitcairn parts (according to Steam). What magic is this!? :oops:

NVIDIA GeForce GT 540M +0.28%
Intel HD Graphics 3000 +0.23%
Intel HD Graphics +0.21%
NVIDIA GeForce GTX 680 +0.18%
Mobile Intel 4 Series Express +0.18%
ATI Mobility Radeon HD 5470 +0.16%
ATI Radeon HD 5450 +0.14%
ATI Radeon HD 6310 +0.14%
NVIDIA GeForce GT 520M +0.14%
ATI Radeon HD 7800 Series +0.10%
 