Nvidia BigK GK110 Kepler Speculation Thread

True, but smart phones usually aren't 1080p either. And desktop monitors are generally a little bit on the low side as far as pixel viewing density goes.

Edit: The iPhone 5 is 1136x640, while the Nexus 4 is 1280x768. I don't think it's very likely that 4" smart phones will go much higher in resolution.

Well, the Sony Xperia Z has a 5" 1080x1920 display, with a staggering PPI of ~441. :oops:
http://www.gsmarena.com/sony_xperia_z-5204.php

Meanwhile, the HTC One has a 4.7" 1080x1920 display, with an unthinkable PPI of ~469. :oops:
http://www.gsmarena.com/htc_one-5313.php
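Those PPI figures fall straight out of the quoted resolutions and diagonals; here is a minimal sketch of the arithmetic (the `ppi` helper is just mine, the sizes and resolutions are the ones listed above):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1080, 1920, 5.0)))  # Xperia Z -> ~441
print(round(ppi(1080, 1920, 4.7)))  # HTC One  -> ~469
```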
 
So how about that Titan? It looks less expensive to build than G80 was, and that was <$500. GPU is about the same size and it uses the same bus width. It doesn't have a separate IO chip like G80 needed. The way card prices have slid upward is interesting.
 
It is quite a risky purchase given that the battery is non-removable. :LOL:
Most smartphones have around a two-year use cycle anyway, and any battery will last that long. As long as it's good for one day of use and a night on the charger, it's good for me; on longer trips I can always charge in the car (my current One S has a non-removable battery too).
 
So how about that Titan? It looks less expensive to build than G80 was, and that was <$500. GPU is about the same size and it uses the same bus width. It doesn't have a separate IO chip like G80 needed. The way card prices have slid upward is interesting.

In the past, NVIDIA was paying per die and not per wafer. Manufacturing costs have gone up with each new fabrication process too. And at the time of G80, the Tesla business did not exist, and ATI was still building large single-die GPUs. These are all factors in explaining the relatively high price of the GeForce Titan relative to the 8800 GTX (although there was an 8800 Ultra released at the time that retailed for $839 USD).
 
In the past, NVIDIA was paying per die and not per wafer. Manufacturing costs have gone up with each new fabrication process too. And at the time of G80, the Tesla business did not exist, and ATI was still building large single-die GPUs. These are all factors in explaining the relatively high price of the GeForce Titan relative to the 8800 GTX (although there was an 8800 Ultra released at the time that retailed for $839 USD).



Titan is priced high because of the timeline: it seems neither Nvidia nor AMD was going to release any card of their next series within a year of the 7970 and 680, so Titan stands alone in many areas. Today it looks like an exceptional card (tomorrow, I'm afraid, it will quickly be forgotten).
 
So how about that Titan? It looks less expensive to build than G80 was, and that was <$500. GPU is about the same size and it uses the same bus width. It doesn't have a separate IO chip like G80 needed. The way card prices have slid upward is interesting.

I'm not exactly sure about the cooler though. And to bring up another point: how did wafer prices, and especially R&D costs, evolve since G80? If I take one SMX out of GK110, plus one memory controller and one ROP partition, I might already be over G80's total transistor count.
 
Yeah, G80 didn't push the bleeding edge manufacturing tech. It was on a very proven 90nm process. They just took 90nm to a new level with it.
 
In the past, NVIDIA was paying per die and not per wafer. Manufacturing costs have gone up with each new fabrication process too. And at the time of G80, the Tesla business did not exist, and ATI was still building large single-die GPUs. These are all factors in explaining the relatively high price of the GeForce Titan relative to the 8800 GTX (although there was an 8800 Ultra released at the time that retailed for $839 USD).

They could also pay according to which way the wind blows for all I care. None of the above actually explains the high price of Titan. A good starting point would be why the 2GB GTX 680, with just a 294mm2 die, costs a healthy margin more than the 3GB HD 7970 with its 350mm2 die. I severely doubt that AMD sells its GPUs at a loss, and even more so the 7950 SKUs.

How much of a difference is that mythical "per wafer" thing actually going to make in the end? Someone from NV mentioned it once, and ever since everyone has been parroting it, yet no one is actually able to say what the difference would be. And no, it isn't "self explanatory" either.

Assuming each 28HP wafer costs around $6k under that "per wafer" agreement and the IHV gets an average wafer yield of around 50%, what exactly will that have changed compared to past agreements?

As if contracts didn't have tons of clauses to protect both sides anyway; at the end of the day you may call it anything you want.

If memory serves well, rumor had it that the 575mm2 GT200 on 65nm had an average yield of ~52% at the start of production, on a mature process, and cost them about $120 per chip to manufacture.
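To put some very rough numbers on that, here is a back-of-the-envelope sketch. The wafer price, yields and die sizes are just the figures floated in this thread (~$6k per 28HP wafer, ~50% yield, a ~550mm2 GK110-class die vs. a 294mm2 GK104-class die, and the 575mm2/52%/$120 GT200 rumor); the gross-dies-per-wafer formula is the standard textbook approximation, so treat the outputs as ballpark only:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300mm wafers

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=WAFER_DIAMETER_MM):
    """Textbook approximation: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd, die_area_mm2, yield_fraction):
    """Per-wafer pricing: the IHV pays for the whole wafer and keeps only the good dies."""
    good_dies = gross_dies_per_wafer(die_area_mm2) * yield_fraction
    return wafer_cost_usd / good_dies

# ~$6k 28HP wafer at ~50% yield (the figures assumed above):
print(cost_per_good_die(6000, 550, 0.50))  # GK110-class die -> roughly $120 per good die
print(cost_per_good_die(6000, 294, 0.50))  # GK104-class die -> roughly  $60 per good die

# The GT200 rumor (575mm2, ~52% yield, ~$120/chip), if read as per-wafer pricing,
# would imply a 65nm wafer price in the ballpark of:
print(120 * gross_dies_per_wafer(575) * 0.52)  # ~ $5.9k per wafer
```

The arithmetic doesn't settle whether per-wafer pricing is actually worse than a per-die deal would have been; it only shows that a big die on a mediocre yield gets expensive fast under either model.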
 
Hey, remember when GT200 vs RV770 was the hot topic and people were sure NVIDIA was going to implode in the price war against the 4870? ;) That was around $300, I think. AMD was desperately clawing back market share, and NV wasn't enough faster to escape the price war entirely.
 
They could also pay according to which way the wind blows for all I care. None of the above actually explains the high price of Titan. A good starting point would be why the 2GB GTX 680, with just a 294mm2 die, costs a healthy margin more than the 3GB HD 7970 with its 350mm2 die. I severely doubt that AMD sells its GPUs at a loss, and even more so the 7950 SKUs.

Of course it explains the high price of Titan. There was no large single-die GPU challenger from AMD, and not even an official dual-die challenger. There was relatively high demand for the much higher-margin, higher-priced Tesla GK110 from Oak Ridge National Labs and others. The number of usable dies per wafer surely cannot be anywhere near as good for a > 500 mm^2, 7.1 billion transistor die as for a < 300 mm^2 die with half the transistors, which means per-wafer pricing should carry a cost penalty for very large dies versus the previous pricing model. And the fabrication cost at smaller and smaller nodes keeps going up. As for the supposedly skewed pricing of GTX 680 vs. HD 7970, you forget that the HD 7970 was originally priced at $549 USD while the GTX 680 was originally priced at $499 USD. To claim that NVIDIA's competitive situation, its higher-margin Tesla business, and its new pricing model have nothing to do with it is nonsense.
 
Hey, remember when GT200 vs RV770 was the hot topic and people were sure NVIDIA was going to implode in the price war against the 4870? ;) That was around $300, I think. AMD was desperately clawing back market share, and NV wasn't enough faster to escape the price war entirely.

At the same time, the 7900 series has never sold as well as it has since the start of this year. I see people coming in with new cards non-stop; it looks like Nvidia is good at doing AMD's marketing for it. When you read a fresh Titan review but you're not about to put $1000 into a card, what do you see? That the 7970 sits halfway between Titan and the 680. I've stopped counting the posts I've seen along the lines of: "ooh, but the 7970 is faster than the 680?"
 
Of course it explains the high price of Titan. There was no large single-die GPU challenger from AMD, and not even an official dual-die challenger. There was relatively high demand for the much higher-margin, higher-priced Tesla GK110 from Oak Ridge National Labs and others. The number of usable dies per wafer surely cannot be anywhere near as good for a > 500 mm^2, 7.1 billion transistor die as for a < 300 mm^2 die with half the transistors, which means per-wafer pricing should carry a cost penalty for very large dies versus the previous pricing model. And the fabrication cost at smaller and smaller nodes keeps going up. As for the supposedly skewed pricing of GTX 680 vs. HD 7970, you forget that the HD 7970 was originally priced at $549 USD while the GTX 680 was originally priced at $499 USD. To claim that NVIDIA's competitive situation, its higher-margin Tesla business, and its new pricing model have nothing to do with it is nonsense.

A nice summary of the expected excuses to justify high profit margins; when volumes shrink but revenues increase, I'm sure it's all a matter of higher costs. But since you're so keen to dismiss everything as "nonsense", I'd suggest you find some solid information on how the situation really looks instead of just trying to adjust reality to your own agenda.
 
I think the end result is even worse, because they project a bad image to customers and they also pass up the lower-margin route, where volume goes up and revenue still holds up because of it...

But why shift your entire product line pricing down when there is, supposedly, no real reason to? It seems pretty clear that their products are selling well and being able to position their products where they have is obviously quite an advantage.

I think we might see a slight shift in Nvidia's pricing due to AMD having the sales momentum at the moment, but that will probably come from more aggressive pricing of AIBs' custom cards and not necessarily from Nvidia's MSRP.
 
Not only that but they released it the day after the PS4 reveal. That either points to terrible planning or an attempt to hide the bad news.
 
But why shift your entire product line pricing down when there is, supposedly, no real reason to? It seems pretty clear that their products are selling well and being able to position their products where they have is obviously quite an advantage.

I think we might see a slight shift in Nvidia's pricing due to AMD having the sales momentum at the moment, but that will probably come from more aggressive pricing of AIBs' custom cards and not necessarily from Nvidia's MSRP.

I think Nvidia took good advantage of some chaos at AMD over the past 6 months but that advantage has gone and then some. Most of their cards are now simply unbuyable at current prices vs the Never Settle Reloaded bundle.
 