Nvidia GT300 core: Speculation

(figures taken out of the air for comparison like most figures in these threads - guesstimates)
I know that high end products make a lot of money for IHVs. That's not a "guesstimate" and it's not "taken out of the air".

I suppose the same goes for nVidia, which explains why GT200 isn't really cutting into their business results overall. I guess the 9600 and 9800-variations are the bread-and-butter.
I repeat: GT200-based cards are selling for less than $200 right now. That's not ultra high end, that's not even middle class per se.
 
I repeat: GT200-based cards are selling for less than $200 right now. That's not ultra high end, that's not even middle class per se.

That's not the point.
The point is which products have the largest sales volume and/or bring in the most money.

I mean, Intel's IGPs only cost about $4 each, certainly not high-end. But it gets Intel the biggest marketshare in the graphics market.
nVidia probably sells a lot more IGPs or very low-end videocards than the GT200-based cards.
Despite them 'only' being $200, most people simply don't need one, and besides, most OEMs don't sell systems with GT200-based cards preinstalled, except for super high-end systems. (Heck, I have a Dell Precision T3400 desktop at work, which is a business workstation series, and I had to custom-order a 9800GTX+ card for it because Dell only offered Radeons or cheap low-end Quadro cards. They don't offer GeForces at all.)
 
That's not the point.
The point is which products have the largest sales volume and/or bring in the most money.
The point is that it's not smart to think that high end is just for the image and isn't bringing any serious money to the IHV. NV had ~20% of revenue from workstation/server products last quarter, which sell in far smaller quantities than any mainstream videocard -- even high end. It's most likely the same for the high end mainstream segment -- the sales are certainly lower than in the middle and low end sectors, but the profit on each card sold is so much higher that even while selling substantially less they still bring an important amount of money to the company.
I agree that middle end is important but that doesn't mean that high end is not.
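To put some shape on that argument, here's a quick back-of-the-envelope sketch in Python. Every figure in it is invented purely for illustration (the unit volumes, average selling prices and margins are guesses, not actual NV numbers), but it shows how a low-volume, high-margin segment can out-earn a high-volume, low-margin one:

```python
# Hypothetical segment figures: (units sold, avg selling price in $, gross margin).
# All numbers are invented for illustration only.
segments = {
    "low end / IGP": (10_000_000, 30, 0.20),
    "mainstream":    (3_000_000, 120, 0.25),
    "high end":      (500_000, 300, 0.35),
    "workstation":   (100_000, 1500, 0.80),
}

for name, (units, asp, margin) in segments.items():
    revenue = units * asp
    profit = revenue * margin
    print(f"{name:15s} revenue ${revenue / 1e6:6.0f}M   gross profit ${profit / 1e6:5.0f}M")
```

With these made-up numbers the workstation segment contributes the largest gross profit of the four despite having by far the smallest unit volume -- which is exactly the shape of the argument above.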
 
The point is that it's not smart to think that high end is just for the image and isn't bringing any serious money to the IHV. NV had ~20% of revenue from workstation/server products last quarter, which sell in far smaller quantities than any mainstream videocard -- even high end.

Yea, but did you look at the prices they charge?
A Quadro or Tesla card, which is technically virtually the same hardware as a GTX260 or GTX285, goes for thousands of dollars. The manufacturing cost is an insignificant part of the total sales price, and as such the profit margin is huge.
Obviously those products generate a lot of revenue even at low sales volumes.

The same doesn't hold for the GeForce GTX series. The price is close to the manufacturing cost, and the profit margin is small.

It's most likely the same for the high end mainstream segment -- the sales are certainly lower than in the middle and low end sectors, but the profit on each card sold is so much higher that even while selling substantially less they still bring an important amount of money to the company.

That wasn't the point though.
The point was a comparison with ATi.
Now ATi obviously has smaller GPUs, and slightly simpler PCBs. So ATi will have lower production cost.
Since they sell at about the same price, or nVidia even slightly more expensive, the conclusion has to be that nVidia isn't making as much profit as ATi is in that segment.

*But*, since the overall business results of nVidia compared to ATi don't seem to reflect that, apparently nVidia can 'afford' these small price margins. In fact, it seems that ATi is struggling more, on the whole. So apparently nVidia is doing well enough in the other segments, their bread-and-butter.
 
The same doesn't hold for the GeForce GTX series. The price is close to the manufacturing cost, and the profit margin is small.
Do you know the manufacturing cost of any GeForce GTX card?

That wasn't the point though.
The point was a comparison with ATi.
Now ATi obviously has smaller GPUs, and slightly simpler PCBs.
So the 4870X2 PCB is simpler than the GTX285 PCB?

So ATi will have lower production cost.
So 2 GB of GDDR5 is lower in cost than 1 GB of GDDR3?

Since they sell at about the same price, or nVidia even slightly more expensive, the conclusion has to be that nVidia isn't making as much profit as ATi is in that segment.
This might be true for the GTX260 vs 4870 and GTX275 vs 4890 scenarios (and the reason for those is mostly the crappiness of GT200), but we're talking about the mGPU vs sGPU situation.

*But*, since the overall business results of nVidia compared to ATi don't seem to reflect that, apparently nVidia can 'afford' these small price margins. In fact, it seems that ATi is struggling more, on the whole. So apparently nVidia is doing well enough in the other segments, their bread-and-butter.
That's what puzzles me. If GT200 is that bad wrt production costs and G92 is old and bad in comparison to the competition, then why isn't NV suffering and ATI making loads of money? Something with this "general assumption" about the costs of ATI GPUs seems to be off.
 
Since when was the 4870X2 competing with the 285? Its performance is clearly head and shoulders above it. Obviously it costs more to make, but it also sells for more.

That's what puzzles me. If GT200 is that bad wrt production costs and G92 is old and bad in comparison to the competition, then why isn't NV suffering and ATI making loads of money? Something with this "general assumption" about the costs of ATI GPUs seems to be off.

It is really puzzling. I think about this daily and I'm not really sure what the answer is, but it seems that price-performance/efficiency leadership doesn't have a big impact on the bottom line. NV's brand recognition seems a lot better than ATI's (this is just pure speculation on my part, but it seems that if people know of an IHV, they know of NV and not ATI), and their devrel dept is better funded. NV is really good at marketing and promoting their products.
 
Btw, GT200 is twice as big as RV770, so the die is obviously going to be more expensive, board costs are going to be higher as well due to the 512-bit memory bus necessitating more PCB layers, and GDDR3 and GDDR5 memory are near price parity at the moment. Given this, it's pretty easy to see that NV's margins on GT200-based products are lower than ATI's.
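For a rough feel of what "twice as big" means per wafer, here's a crude gross-dies-per-wafer estimate. The ~576 mm^2 and ~256 mm^2 figures are the commonly cited die sizes for GT200 and RV770; the formula is the usual gross-die approximation and ignores yield and reticle constraints:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Usual gross-die approximation: wafer area / die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("GT200 (~576 mm^2)", 576), ("RV770 (~256 mm^2)", 256)]:
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per 300 mm wafer")
```

That works out to roughly 94 vs 234 candidates per 300 mm wafer, so at equal wafer prices the bigger die costs upwards of 2.5x as much before yield even enters the picture.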
 
WRT to Theo's article earlier:
"GT300 is the first truly new architecture since SIMD [Single-Instruction Multiple Data] units first appeared in graphical processors."

Wow - first "truly new" since when? TNT, or was it GeForce 256? ;)
 
Since when was the 4870X2 competing with the 285? It's performance is clearly head and shoulders above it. Obviously it costs more to make but it also sells for more.
I have both a 4870X2 and a GTX280, and tbh I prefer the GTX to the 4870X2 because I've grown tired of all those AFR problems - it works, then it doesn't; it lags, then it doesn't; it's fast, then it's not... The GTX280 is much more stable in its overall performance, though certainly slower in most games. And the GTX285 is better than the GTX280 -- it's less noisy. So I wouldn't say that the 4870X2 is just plain better than the GTX285. If I had to go and buy one right now, I think I'd buy a GTX285.
But we're talking about mGPU vs sGPU cards in general, not the isolated GT200 vs RV770x2 case (which, as I've already said, probably shouldn't be considered the standard, because of the greatness of one RV770 and the relative suckiness of GT200).

It is really puzzling. I think about this daily and I'm not really sure what the answer is, but it seems that price-performance/efficiency leadership doesn't have a big impact on the bottom line. NV's brand recognition seems a lot better than ATI's (this is just pure speculation on my part, but it seems that if people know of an IHV, they know of NV and not ATI), and their devrel dept is better funded. NV is really good at marketing and promoting their products.
How are brand recognition and devrel helping make more money if you sell your cards without any profit? I think that the "general assumption" about the comparative production costs of current NV and AMD videocards is wrong. AMD isn't making loads of money off RV770, and NV isn't totally screwed selling GT200 cards at the $200 price point.
 
How are brand recognition and devrel helping make more money if you sell your cards without any profit? I think that the "general assumption" about the comparative production costs of current NV and AMD videocards is wrong. AMD isn't making loads of money off RV770, and NV isn't totally screwed selling GT200 cards at the $200 price point.

I don't think the assumption is wrong at all. The HD 4890 makes more money for ATI at $250 than the GTX 275 makes for NV at the same price. That's pretty much a given IMO. RV770 is a smaller die and uses a narrower bus, so both board and die costs are going to be lower for ATI. NV isn't totally screwed selling GT200-based parts for $200, but the margin is most certainly not what they'd like it to be.

NV is making their cash off of G9X-based parts. Also, in the desktop professional segment ATI is nearly non-existent, and in the notebook arena NV thoroughly dominates everything above the mid range. I recently looked at a huge collection of laptop SKUs, and ATI's SKU share (note this is not market share) is pegged around 33%, but in the high end and professional segments it's ~10-20%. Professional alone is more like 5% SKU share. Low end is about 40% or so. This is where NV is raking in the cash; NV does ~$800 million in the professional segment per year, and this is pretty much entirely margin. Devrel and marketing are why they are dominating these segments, because any objective comparison shows that ATI's parts are very competitive.
 
Do you know the manufacturing cost of any GeForce GTX card?

What does it matter?
A high-end Quadro or Tesla is virtually identical to a GTX, so even without knowing the actual manufacturing cost, we can deduce that the manufacturing cost of all three should be nearly identical. Hence, a difference in sales price of over $1000 can't be related to manufacturing cost. Since the GTX is by far the cheapest of the three, it's by far the closest to manufacturing cost, and as such has by far the lowest profit margin.

So the 4870X2 PCB is simpler than the GTX285 PCB?

So 2 GB of GDDR5 is lower in cost than 1 GB of GDDR3?

I was comparing single-card solutions mainly, because the dual-GPU cards are even less interesting in terms of sales volume.

This might be true for the GTX260 vs 4870 and GTX275 vs 4890 scenarios (and the reason for those is mostly the crappiness of GT200), but we're talking about the mGPU vs sGPU situation.

Well, I wasn't.

That's what puzzles me. If GT200 is that bad wrt production costs and G92 is old and bad in comparison to the competition, then why isn't NV suffering and ATI making loads of money? Something with this "general assumption" about the costs of ATI GPUs seems to be off.

Well, as I was trying to say in my previous posts, it's probably the relation between the profit margin and sales volume... so the 'significance' of the specific product lines in the overall business results of nVidia and ATi.
Another theory might be that ATi doesn't get very high yields on their GPUs while nVidia does, so the actual cost isn't quite what the die size difference implies (looking at how much the power consumption went up from 4870 to 4890, it seems that ATi does bin its GPUs rather aggressively).
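That yield theory is easy to sketch with the classic Poisson yield model, Y = exp(-A * D0). The defect densities and the wafer cost below are pure guesses picked to show the shape of the effect, not real fab data (the gross die counts reuse the per-wafer estimate from earlier in the thread):

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-A * D0), with A in cm^2."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

WAFER_COST = 5000  # $ per 300 mm wafer -- an arbitrary placeholder

# (name, die area mm^2, gross dies per wafer, assumed defect density per cm^2)
cases = [
    ("GT200, better yield assumed", 576, 94, 0.3),
    ("RV770, worse yield assumed",  256, 234, 0.9),
]

for name, area, gross, d0 in cases:
    y = poisson_yield(area, d0)
    print(f"{name}: yield {y:.0%}, ~${WAFER_COST / (gross * y):.0f} per good die")
```

With these invented numbers the ~2.5x gross-die advantage shrinks to about 1.4x in cost per good die, so lower yields (plus aggressive binning) really could eat a good chunk of the small die's cost advantage.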
 
NV's brand recognition seems a lot better than ATI's (this is just pure speculation on my part but it seems that if people know of an IHV they know of NV and not ATI), and their devrel dept is better funded. NV is really good at marketing and promoting their products.

The funny thing is that ATi has been around much longer. The first time I saw an ATi card was in a Tulip 286 machine. I recall that the ATi VGA card had a mouse port, which was quite unusual.
Somehow ATi could never build up a good reputation that really 'stuck', except in photography.

nVidia came out of nowhere not too long ago, and they managed to establish themselves as THE brand for 3d acceleration, through a combination of great products and marketing them to the masses (and developers).
They reap the benefits of that reputation when things aren't going as smoothly (e.g. the GeForce FX era).
 
Devrel and marketing is why they are dominating these segments because any objective comparison shows that ATI's parts are very competitive.

My company also has a policy to support only nVidia hardware. As far as I know, there have never been any nVidia representatives around who may have influenced that decision.
The thing is just that we can't afford to validate our software for more than one vendor. So we just pick the vendor that has the biggest marketshare and the best reputation in terms of driver compatibility.
As such, all our systems are bought specifically with nVidia hardware onboard.

I would be surprised if we are the only company in the world with a policy like this.
 
I wish there was some data to support my assertion that NV's brand recognition is better than ATI's... I guess the whole FX era is indication enough, though. Those parts were utter crap and yet they still sold boatloads of shitty FX5200s. If we go back far enough we've also seen this sort of anomaly happen with ATI, though. The Rage Pro sold boatloads and was crap compared to the competition. Back then it was a combination of good 2D + shit 3D + AGP (first part with AGP, actually) that made the difference. At least that was a tangible, measurable difference w.r.t. the competition... now NV leads based purely on this image they've built for themselves as the de facto provider of 3D.
 
Perhaps people who work with professional OpenGL graphics software can comment a little more on the differences between the two vendors.

Anecdotal evidence I've come across seems to point to one vendor's OpenGL implementation being more complete and its support more helpful, and the revenues so far appear to support the supposition.
 
My company also has a policy to support only nVidia hardware. As far as I know, there have never been any nVidia representatives around who may have influenced that decision.
The thing is just that we can't afford to validate our software for more than one vendor. So we just pick the vendor that has the biggest marketshare and the best reputation in terms of driver compatibility.
As such, all our systems are bought specifically with nVidia hardware onboard.

I would be surprised if we are the only company in the world with a policy like this.

You're not the only company; there are many companies just like yours that specify NV as a requirement for the same reasons. What ATI needs to do is start eroding this soft leadership NV holds over the market. It would obviously take plenty of time, but NV's reputation hasn't been built overnight either; they've built it up since 1999 with the release of the GeForce. Near-constant performance leadership and market domination for 10 years (with a few slips here and there, obviously) is what has built this reputation.
 
Perhaps people who work with professional OpenGL graphics software can comment a little more on the differences between the two vendors.

Anecdotal evidence I've come across seems to point to one vendor's OpenGL implementation being more complete and its support more helpful, and the revenues so far appear to support the supposition.

I don't work with pro OGL graphics, but I'll say that there are minor differences; for example, it took ATI a long time to support genlock on the FireGL products (genlock locks the output across several adapters so that it's in sync on the output device(s)). And I wouldn't be surprised if ATI's OGL support was lagging NV's in some minor ways. But the differences are surely tiny, and most probably wouldn't affect 90%+ of CAD users. I think the culprit here is companies specifying NV as a requirement.
 
I think the culprit here is companies specifying NV as a requirement.

I think you're oversimplifying it a lot. There's the matter of expertise and comfort with Nvidia's tools and support. It's not like us where we just stick the card in a slot and install a driver. Many companies probably have valuable relationships with Nvidia that can't be replaced simply by sticking an ATi card into the motherboard.
 
Seems like Theo is just cobbling together bits and pieces that we've heard elsewhere...
Which is not so bad; speculation is fun...

Personally I think it's hard to reconcile Dally's statement about Larrabee not being radical enough with just continuing the present type of GPU architecture... in the end, Larrabee is not so different apart from the caches. The parallel execution of shaders through SIMD and vectorized loads/stores is very similar. I think a very low branch granularity GPU is possible and would provide some very big advantages as everything moves to (sub-)pixel-sized features.
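For what it's worth, the branch-granularity point can be illustrated with a toy simulation of predicated SIMD execution. Nothing here is GT300-specific: the 32-lane width, the 50/50 random branch, and the group sizes are all just assumptions for illustration. When the lanes within a group disagree on a branch, the group has to execute both sides and the minority lanes do wasted work; smaller groups waste less:

```python
import random

def wasted_lane_fraction(num_lanes, group_size, trials=10_000):
    """Each lane (pixel) randomly takes side A or B of a branch. A group of
    `group_size` lanes that disagrees must run both sides, so its minority
    lanes are wasted. Returns the average wasted fraction of lane-cycles."""
    wasted = 0
    for _ in range(trials):
        sides = [random.random() < 0.5 for _ in range(num_lanes)]
        for g in range(0, num_lanes, group_size):
            group = sides[g:g + group_size]
            a = sum(group)                       # lanes taking side A
            if 0 < a < len(group):               # divergent group
                wasted += min(a, len(group) - a)
    return wasted / (trials * num_lanes)

for granularity in (32, 16, 8, 4, 1):
    frac = wasted_lane_fraction(32, granularity)
    print(f"branch granularity {granularity:2d}: ~{frac:.0%} of lane-cycles wasted")
```

In this toy model the waste falls monotonically as the granularity shrinks, from roughly 43% of lane-cycles at 32-wide down to zero at scalar granularity, which is why very low branch granularity looks so attractive once geometry shrinks toward (sub-)pixel-sized features.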
 