Roy Taylor: "Only Brits care about ATI"

Explain? I haven't followed graphics tech so closely lately, but everything I've heard has Nvidia winning on the high end and at least tying at lower price points.
My point is simply that NVIDIA thought RV670 would be much less impressive in every single way pre-launch, so they got a bit surprised there and their optimism was unwarranted. I'm hardly the first person to be claiming this either... (CJ's posts come to mind, among others)
 
You know Nvidia's scared when they start a misinformation campaign and then tell everyone they are not worried about the people they are deriding. If that were true, why spend all that time and money on a negative campaign against competitors that are not a threat?

Given what they've recently said about Intel, I think Nvidia can see it coming now - that day when they are caught between the Intel rock and the AMD hard place when both companies can supply everything themselves instead of getting it from Nvidia.


It goes both ways; Intel is actually the one that started the FUD campaign. It's more marketing than anything else, with the whole GPU vs. CPU debate over which one is more important. Of course Intel will say CPUs and nV will say GPUs, since those are the components they make money on. Larrabee is still a CPU, right?

As for nV being arrogant, well, let's say the next round of the graphics wars isn't that close, at least from a single-chip perspective.
 
NV should stop being so arrogant about their upcoming product launches, or they risk looking just as dumb as with G92 vs RV670 where they massively underestimated RV670 from every single point of view internally. I'm not saying they will underdeliver and AMD will overdeliver this time around, but it seems pretty dumb to risk making the same mistake again.
Couldn't have put it any better. Mirrors almost exactly my thoughts on the matter.

Just stumbled over a transcript from a conversation Jen-Hsun had with Brooke Crothers from CNet, which might interest some of you guys:
http://www.cnet.com/8301-13924_1-9939430-64.html

Judging solely from my personal perspective, it tells me how dissatisfied Jen-Hsun seems to be with the whole situation: NV tries very hard, but can't really cope with Intel in the long term, because they are bound by certain limits internally (execution) and financially (fab-wise, R&D-wise). It also looks like he is frustrated, because he knows this is the one battle that is not completely controllable by himself and his company. What he can't control makes him nervous, especially the future outlook and positioning of his company, so he's getting a little emotional (not bad per se) about the whole matter. Since Intel is the master of playing dirty tricks (I don't even want to know what really goes on behind the curtain) and is not exactly starving for money, all he seems to want is to tell the world how he feels.

I think Jen-Hsun surely still has a lot of insight into what he's talking about, since it's his daily business. And while it's understandable that he is commenting on Larrabee (it got quite a lot of attention in the media), I don't really see the need for it to sound so seriously arrogant. If it's really the kind of vaporware Jen-Hsun seems to think it is, a shorter comment like "Larrabee, that's Intel, right?" would not only have said the same thing he said in three sentences, it would also have been a more laid-back approach, especially when talking to the media. (I still remember his words in the CC before G80 got released, when he answered with "G80, what's that?" and everybody just had a great time.)

Finally, since AMD is still on the playing field and has recently been putting out products that compete pretty well, at least within the performance segment, NV will also have a hard time keeping up with their pace and margins. It's the volume in that very segment that NV needs for very healthy margins, and if AMD continues to grab a good portion of that segment (excellent performance per $), NV has to replace it with something that has real value over their competitors, like the recently started broader CUDA approach or high-quality video transcoding/encoding, because that really would set them apart, even from Intel CPUs (yes, there's a small Jen-Hsun in me, too).

Their other efforts, as Arun already mentioned, will become more important than ever, since there's not only the opportunity but also the demand for, e.g., mobile stuff with good graphics capabilities.
 
Finally, since AMD is still on the playing field and has recently been putting out products that compete pretty well, at least within the performance segment

Which products are competing so well?

NV will also have a hard time keeping up with their pace and margins. It's the volume in that very segment that NV needs for very healthy margins, and if AMD continues to grab a good portion of that segment (excellent performance per $), NV has to replace it with something that has real value over their competitors, like the recently started broader CUDA approach or high-quality video transcoding/encoding

I don't see what CUDA has to do with any volume segment, and quality video encoding is also a niche that few people know or care about.
 
I think you shouldn't read too much into it. You just need to look at NVIDIA's stock graph in the 6 months before Analyst Day and the various analyst comments in the same time period. There's really nothing else to it; analysts were panicking all over the place, and Jen-Hsun decided to try making clear that it would be just as appropriate to be panicking about Intel. Both are risky investments and it'd be naive to think one is orders of magnitude more likely to win a 'war' than the other, especially given on whose terms this war will be waged.

Honestly, much of the media perception from NV's Analyst Day was shaped by people who didn't even bother listening to one full hour of it, and those who did likely hadn't often listened to NVIDIA CCs in the past. I'm at a complete loss as to how these people could accurately judge what was going on there - and if I disagree with them, it's not because I'm biased, it's because I have possibly orders of magnitude more data about the situation. The one thing I don't like in the current situation is that it seems too many levels of NV management underestimate ATI. Jen-Hsun still seems to kinda care about them, but sometimes I wonder if anyone else even does? Heh.

Once again, proper investment in CUDA R&D for consumer applications could very easily achieve a 100:1 return on investment. I've talked about this so many times I grow tired of it - honestly, the only people it'd still be worth discussing it with are Jen-Hsun or Andy Keane; there's only so many times I can repeat something otherwise without growing extremely annoyed at my own repetition.
 
Which products are competing so well?
Practically every single product that is based on the RV670 ASIC. More to come.

I don't see what CUDA has to do with any volume segment, and quality video encoding is also a niche that few people know or care about.
Too short-sighted, if I may. CUDA, extended video capabilities and accelerated photo-effects processing (think Photoshop, which is used a lot) are only examples of a direction that would give them a clear advantage, and these are among the strong points where NV could excel. Practically everything you can think of that can save serious time in your workflow can also be directly translated into value, which then generates demand for GPUs. That's what they do best. But only if there's software that releases that unused potential.
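To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of per-pixel photo effect that maps naturally onto CUDA; the kernel and parameter names here are assumptions for the example, not any shipping product's code:

#include <cuda_runtime.h>

// Hypothetical brightness/contrast adjustment - the sort of Photoshop-style,
// embarrassingly parallel filter a GPU could accelerate.
__global__ void adjustBrightnessContrast(const unsigned char* in, unsigned char* out,
                                         int numValues, float gain, float bias)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;          // one thread per value
    if (i < numValues) {
        float v = in[i] * gain + bias;                      // scale and offset the channel
        out[i] = (unsigned char)fminf(fmaxf(v, 0.0f), 255.0f);  // clamp to [0, 255]
    }
}

// Host-side launch (device buffers d_in/d_out assumed already allocated and filled;
// error handling omitted):
//   int threads = 256;
//   int blocks  = (numValues + threads - 1) / threads;
//   adjustBrightnessContrast<<<blocks, threads>>>(d_in, d_out, numValues, 1.1f, 10.0f);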

Not even Intel could talk that away, or they would make themselves look like complete idiots (whether they would really care is another matter).

NV pays for very expensive silicon that already sits idle most of the time (when it's not being used by games). Games on the PC were, and obviously still are, the driving force, but if that trend continues and discrete GPUs are only used for games while consoles get better and better every day, this will change. This is strictly about the PC market; I realize that NV can also continue to sell their IP / ASICs to Sony, MS etc., but the PC market will be the bottleneck, because that's where Intel plays best.
 
Practically every single product that is based on the RV670 ASIC. More to come.

I just don't see it. The 3870 is sandwiched between the 9600GT and 8800GT, both of which are doing extremely well, and the 3850, though certainly offering very good value for money, doesn't really play in the performance segment.

Too short-sighted, if I may. CUDA, extended video capabilities and accelerated photo-effects processing (think Photoshop, which is used a lot) are only examples of a direction that would give them a clear advantage, and these are among the strong points where NV could excel. Practically everything you can think of that can save serious time in your workflow can also be directly translated into value, which then generates demand for GPUs. That's what they do best. But only if there's software that releases that unused potential.

That's a big if. At this point in time CUDA offers no tangible benefit for consumers. And historically the market has never been quick to take up wide support for additional instruction sets like MMX/SSE, let alone those that don't have Intel's seal of approval.
 
NV should stop being so arrogant about their upcoming product launches, or they risk looking just as dumb as with G92 vs RV670 where they massively underestimated RV670 from every single point of view internally. I'm not saying they will underdeliver and AMD will overdeliver this time around, but it seems pretty dumb to risk making the same mistake again.

Hmm, how is G92 a result of Nvidia underestimating RV670? G94 is competing well with it in both its 3870 and 3850 incarnations. G92 also holds up well against the 3870 X2 in a lot of cases - seems like Nvidia estimated RV670's performance just fine.

To me G92 is just a cynical attempt to sell people the same chip for two years, price drops notwithstanding. And what they got for their trouble was poor yields on 65nm.
 
I just don't see it. The 3870 is sandwiched between the 9600GT and 8800GT...
You need to rewind a little. This is already months after RV670 got released and exactly what I'm talking about.

When RV670 was reviewed and already available, NV took a big hit to their margins, because they had to deal with it. G92 was cutting into their margins right from the start. G94 was released months later and is still well above 200mm², while RV670 is only ~190mm². Now, if NV had released G94 when ATI released RV670, this wouldn't have been much of a problem. However, NV only had G92 ready by that time and they needed to bring it to market as fast as they could. ~190mm² vs. ~330mm² isn't exactly what NV had in mind, and they didn't realize ATI was ready until it caught them by surprise.

Business-wise, what matters is what you get in return. So, all in all, ATI gave them a hard time, but only because NV vastly underestimated ATI. This should not happen twice. This is competition, and ATI did a good job this time.

Florin said:
That's a big if. At this point in time CUDA offers no tangible benefit for consumers. And historically the market has never been quick to take up wide support for additional instruction sets like MMX/SSE, let alone those that don't have Intel's seal of approval.
I'm not talking about the current point in time. I'm looking more towards the future (of NV). And since I'm obviously trying to give examples of what NV needs to do: Intel can just sit on their asses for another 10 years and still survive, because they simply have the money for it, but that's not exactly the topic I was talking about.

NV doesn't need Intel's seal of approval for software that works on their own GPUs. If they offer higher performance in an application that is time-consuming and can be accelerated by the GPU, while their market penetration is still very high, NV needs to release it and profit from it. Intel cannot just take that back overnight.

What Intel does or doesn't do shouldn't be NV's concern. Not if they let Intel eat into their business the way they allowed ATI to. There's no other way around it. They cannot run. If Jen-Hsun wants to be bigger than Intel (famous quote), he has to show us that he really has good ideas, and the timing in place to execute them.
 
You need to rewind a little. This is already months after RV670 got released and exactly what I'm talking about.

When RV670 was reviewed and already available, NV took a big hit to their margins, because they had to deal with it. G92 was cutting into their margins right from the start.
Let's not exaggerate either; their GPU margins remain noticeably superior to ATI's AFAIK, and I haven't heard Jen-Hsun complain about pricing pressures in the low end, yet I did hear AMD complain about it at the last quarterly CC... The only possible explanation here is that die size isn't the only thing to consider. It's by far the most important one, but there are other factors.

Clearly G92 is less cost efficient than RV670, I'm not going to contest that, and clearly margins aren't where they'd like them to be. However, a possibly more important part of the margin problem in Q1 was excess G80 inventory (and thus lower pricing to get rid of them). And why did that happen? Because RV670 overdelivered on performance.

You said in the NV Q1 CC thread that you thought Jen-Hsun was not telling the whole truth when it comes to G92 being both higher performance and lower cost. Perhaps, but the main thing he wasn't disclosing isn't that G92 margins are too low; it's that having to release a faster-than-expected G92 SKU at launch prevented them from selling their G80 inventory like they had expected to.

The original rumours were that the first G92 SKU would have 6 clusters and be named GeForce 8700. I think it's pretty clear that would have been much less competitive with a GeForce 8800 GTS 320/640MiB, and would have made the GTX and Ultra less redundant. So I do believe that was indeed their original plan, and having to compete with RV670 completely screwed their inventory management strategy.

So it's not that 'faster and cheaper' transitions never happen. They certainly do, and when handled properly they're not a problem at all. But in this case, they didn't think it would be one of those until it was too late. And we both know what happened afterwards...
 
Let's not exaggerate either; their GPU margins remain noticeably superior to ATI's AFAIK, and I haven't heard Jen-Hsun complain about pricing pressures in the low end, yet I did hear AMD complain about it at the last quarterly CC... The only possible explanation here is that die size isn't the only thing to consider. It's by far the most important one, but there are other factors.
Sure, they are still very healthy, no denying that. However, they were on a pretty healthy upswing and then they made some mistakes that I wouldn't expect from NV. If that remains their only misstep for a while, though, it will be a thing of the past pretty soon. But it was certainly less than optimal, and I think we can both agree on that.

Clearly G92 is less cost efficient than RV670, I'm not going to contest that, and clearly margins aren't where they'd like them to be. However, a possibly more important part of the margin problem in Q1 was excess G80 inventory (and thus lower pricing to get rid of them). And why did that happen? Because RV670 overdelivered on performance.
Exactly.

You said in the NV Q1 CC thread that you thought Jen-Hsun was not telling the whole truth when it comes to G92 being both higher performance and lower cost. Perhaps, but the main thing he wasn't disclosing isn't that G92 margins are too low; it's that having to release a faster-than-expected G92 SKU at launch prevented them from selling their G80 inventory like they had expected to.
Not specifically G92. He said that "When you introduce a new product, it's either higher performance or lower cost...", to which I commented that he wasn't telling the whole truth when it comes to producing an ASIC that both has higher performance and is cheaper to produce.

It just sounds like he's highlighting something that seems very, very admirable, when in fact it's done in every business. This is nothing terribly special. It would've been special if things had gone such that they didn't have to sell G80 cores at a loss and G92 had yielded well from the beginning, but judging from the current and past CC comments, they failed to do so.

So it's kind of strange that his conclusion "G92 is both faster and cheaper" gets added, as if he wants to congratulate himself on that.

So it's not that 'faster and cheaper' transitions never happen. They certainly do, and when handled properly they're not a problem at all. But in this case, they didn't think it would be one of those until it was too late. And we both know what happened afterwards...
Yep.

The original rumours were that the first G92 SKU would have 6 clusters and be named GeForce 8700. I think it's pretty clear that would have been much less competitive with a GeForce 8800 GTS 320/640MiB, and would have made the GTX and Ultra less redundant. So I do believe that was indeed their original plan, and having to compete with RV670 completely screwed their inventory management strategy.
Judging from the leaks that happened some weeks before RV670 hit the streets, that is the same theory I have and what I think messed it all up. That's where I started from.
 
The original rumours were that the first G92 SKU would have 6 clusters and be named GeForce 8700. I think it's pretty clear that would have been much less competitive with a GeForce 8800 GTS 320/640MiB, and would have made the GTX and Ultra less redundant. So I do believe that was indeed their original plan, and having to compete with RV670 completely screwed their inventory management strategy.
Well, perhaps. Rumors aren't really evidence though. 3870 is thoroughly beaten by even the partially disabled G92 in 8800GT. 8800GT came first and RV670 couldn't beat it. Even the newer 825MHz RV670s can't beat 8800GT. Obviously, ATI's aggressive pricing has been the best part of RV670 for consumers. Still, I'm not sure RV670 impacted NV pricing that much other than redefining the lower end of the market. It became the cheaper and slower option, sort of like choosing an Athlon 64 X2 over a Core 2.

I suppose if RV670 had turned out to be a real stinker instead of basically an R600, things could've been different. It's hard to imagine ATI delivering an even worse new GPU, though. Yikes. NV would own almost all of the market right now.

Who's making more money on their cards? Well, NVIDIA sells its products in each segment for more than the ATI equivalent and people buy them in droves. NV also has a number of cards selling in segments above what ATI can cover, and most of those are just fully-enabled G92 chips or G94 chips that must be cheaper than RV670. So they get to enjoy higher selling prices on the same GPUs. As long as they aren't yielding dramatically worse because of not having any redundancy, it seems like a nice situation to me. G92 in the 8800 GTS and 9800 GTX doesn't have any redundancy, does it?
 
Well, perhaps. Rumors aren't really evidence though.
No, certainly not. Arun (and I) can't really prove something that didn't happen.

But there's strong evidence if you followed some of the CCs and Jen-Hsun's comments about the 8800GT. If you're already dominant in the high end and also have great yields (G80), it just doesn't make sense to release a cost-cutting G92 that eats into your own margins well before you can sell out the majority of your G80 cores. In some past CC (couldn't find the transcript) Jen-Hsun also said that rather than getting competition from a competitor, he preferred to have the competition take place within his own company. That only makes sense if they actually have a competitor that is going to release, or already has released, a product that is approximately as fast as their own products in that price range, at a much lower price.

Now, if they had released a 6-cluster G92 and RV670 had turned out slower and later than it actually did, NV would've had a perfectly balanced line-up where they could've sold the majority of their ASICs, because they'd have had the additional redundancy of 2 clusters (which would've improved their yields substantially). Instead they needed a higher-performing part, and adios to the original 6-cluster G92. The 8800GT then got more and more attention in the media (rightly so) and practically got bought faster than NV could supply G92 cores for it - and still RV670 was considered a great product from ATI. It was a move that was necessary, but I'm very certain it was not planned.
 
Well, perhaps. Rumors aren't really evidence though. 3870 is thoroughly beaten by even the partially disabled G92 in 8800GT. 8800GT came first and RV670 couldn't beat it. Even the newer 825MHz RV670s can't beat 8800GT.
It's not a matter of performance; it is one of time to market. 7-cluster and 8-cluster G92 SKUs would have arrived eventually anyway. But the important thing to understand is that G80 was on allocation and supply-limited for several months, and NV placed substantial orders at TSMC to rectify that; the time between those chips being back from the fab and G92 being back was minimal, so that decision only made sense if the first G92 SKU would have been lower-end than the lowest-end G80 solution.

NV likely expected RV670 to come later and at lower performance (i.e. slower than a 6C SKU), which meant they didn't think it would be a problem at all to keep selling G80s for a few more months. But as it turns out, they realized they wouldn't really have major performance leadership with less than 7C at their target clock rates, so they had to change their plans to remain competitive, negatively impacting ASPs on the remaining G80 inventory in the process...
 
NV likely expected RV670 to come later and at lower performance (i.e. slower than a 6C SKU), which meant they didn't think it would be a problem at all to keep selling G80s for a few more months. But as it turns out, they realized they wouldn't really have major performance leadership with less than 7C at their target clock rates, so they had to change their plans to remain competitive, negatively impacting ASPs on the remaining G80 inventory in the process...

That's probably very true. Except that this move by nVIDIA resulted in screwing up their lineup (and the transition to the 9 series), and left a very messy naming scheme.
 
RV670 and G94 were supposed to launch simultaneously in January (and G92 was supposed to launch at the same time or later - I'm betting later).

G92 was never supposed to compete with RV670. It was supposed to be a replacement for G80.

NVidia faced a choice over which chip to rush upon discovering that RV670 was a couple of months early (and not knowing its performance). They played it safe: in order to ensure that RV670 couldn't take any crown in the performance sector, they rushed G92 and cut corners to get it out there - that's why we heard so much about it running hot and about companies being asked to send in cases/systems to check the heat. And why some forums were full of people complaining about early G92s failing due to heat, because the early coolers were under-specified.

G92, if it was on its original schedule, would have been the same chip as we see today with the same SKUs. Just the marketing names would have been different and the prices would have been higher and G80 would have had time to exit stage left gracefully, selling out over Christmas.

G94 came out when it was supposed to. But unluckily for NVidia, ATI had already forced prices lower through RV670's relative lack of performance and its couple of months on the market. If RV670 had launched in January, you can bet G94 would have commanded higher prices.

Instead G94 was forced to under-cut G92 prices...

The end result appears to be that NVidia still completely dominated the Christmas period. If RV670 hadn't released so early then NVidia would still have had control with G80 versus R600.

Jawed
 
G92, if it was on its original schedule, would have been the same chip as we see today with the same SKUs. Just the marketing names would have been different and the prices would have been higher and G80 would have had time to exit stage left gracefully, selling out over Christmas.

Given how cold a reception the 9800GTX received at its current pricing it's highly unlikely that Nvidia could have planned to sell it for more. Remember the 8800GTX still has more memory, fillrate and bandwidth so I don't see how Nvidia could have made the case for a more expensive 9800GTX.
 
Given how cold a reception the 9800GTX received at its current pricing it's highly unlikely that Nvidia could have planned to sell it for more. Remember the 8800GTX still has more memory, fillrate and bandwidth so I don't see how Nvidia could have made the case for a more expensive 9800GTX.

Well it certainly doesn't help that you can get the exact same thing as the 9800GTX for under $200 in the form of the 8800 GTS 512.
 
Given how cold a reception the 9800GTX received at its current pricing it's highly unlikely that Nvidia could have planned to sell it for more. Remember the 8800GTX still has more memory, fillrate and bandwidth so I don't see how Nvidia could have made the case for a more expensive 9800GTX.
You're forgetting that the chain of events forced prices downwards.

In the alternative history:
  • G80 retains enthusiast class from October to February, with price holding up since there is no competition
  • G94 launches in January at ~$250 alongside RV670 at the same or a bit less
  • G92 launches in February/March at $400/350 in its initial 9800GTX/GT guises. The last few G80's are still available...
  • The prices of G94/92 hold up because G92 never had to compete against a $250 or less RV670. There would even be space for a 9800GS between 9600GT and 9800GT ;)
Jawed
 