RussSchultz said:
Maybe you ought to look at who's market share is growing (Intel), and which market both NVIDIA and ATI are targetting (motherboard chipsets w/integrated video) before you go spouting off that NVIDIA has gone crazy.
As of my last look, Intel is not a player in the 3d-gaming chip & reference design market. The company is a total zero in that market space, which is currently dominated by ATi and nVidia, and of late mostly by ATi. The last time Intel was a player in this particular segment was with its i74x/5x 3d chip and reference designs, and Intel had its tail handed to it on a platter: 3dfx and nVidia, along with their board OEMs, plus ATi and even Matrox at the time, were selling competing 3d-gaming products that ran rings around anything built on Intel's i7xx reference designs. I actually owned an i75x-based product at the time, and have clear memories of how disappointingly slow it was compared to the competing products then available. As such, I was not in the least surprised when Intel exited that market space after being whipped so conclusively (and it has not ventured back into it since). Again, Intel does not play in what have traditionally been nVidia's strongest markets by far, the markets which have made nVidia what it is today (varying opinions on "what nVidia is today," of course).
That's what makes these comments so bizarre, because the idea that "we haven't lost share to ATi, but only to Intel" is simply not connected to reality. What is actually the case is that in the *discrete market segments* in which nVidia competes, among which are the retail and system OEM 3d-gaming product markets and the low-end 2d value integrated segment (requiring only minimal 3d hardware API support), nVidia has lost share to both Intel and ATi over the last year. In the first segment Intel isn't even present, so there ATi is without a doubt nVidia's strongest competitor. It's only in the second segment that nVidia has lost market share to Intel, and that's an entirely different segment from the one in which nVidia has lost share to ATi in the past year.
The problem is that the "graphics chip market" does not exist as a single pie. It is often characterized as one, but doing so presents only a gross distortion of the trends and events actually occurring in the fully discrete market segments within the market as a whole. By way of example, a pie representing companies by the total number of chips they manufacture, regardless of market, would be just as misleading and uninformative.
The only credible, informative way to view the "graphics chip market" is with multiple pies, each denoting a specific market segment which differs from the others in terms of products, purpose, cost, profit, and volume. To spell it out, a GF5600 would not be competitive in the integrated gpu market where Intel plays, to the degree that it simply could not be sold there, and an Intel integrated graphics chip designed for the corporate market would not be competitive in the GF5600's market, to the degree that it could not be sold there either. Differing product lines are targeted at differing markets which are completely distinct from one another. So against nVidia's GFFX reference card product line, Intel simply does not compete at all, and is not even in the picture.
Secondly, moving to the IGP segment of the market, ATi's IGP chipsets and products use the P4 bus license as of now, and nVidia's IGP products do not--unless something major has happened and I've completely missed it...
Interestingly enough, I recently read a statement attributed to JHH in which he said he was staying away from Intel chipsets not only because of the $5-a-pop licensing fee, which he felt gave Intel an inherent cost advantage that would be difficult to overcome, but also because he feels the P4 chipset market is already far too competitive for nVidia to enter with much expectation of success. So what JHH was saying *then* was that in that market segment nVidia did not see itself as able to compete not only with Intel, but also with an already numerous field of other companies making P4 chipsets.
But there is *one way* in which all the discrete graphics-chip markets are indeed interconnected for a company like nVidia or ATi. It is something nVidia has known and understood relative to its own success for quite awhile: in order to capture the low end, you must first capture the high end. When you capture the high end--in this case the performance 3d-gaming API chip & reference design markets for OEM and retail sales--it becomes an order of magnitude easier to drive your product mix down into the value market segments, and even eventually into the integrated graphics chip markets which reside at the very bottom, where Intel currently feeds ("bottom" here in terms of 3d API hardware functionality, performance, and production cost).
This has been the key for nVidia's overall success in the last couple of years. In fact, it has also positively affected nVidia's ability to drive its products into entirely different markets, such as the Athlon core-logic chipset market, for instance.
So what's happening is that because nVidia has *lost* its position over the last year in its top and most fundamental market--the performance 3d-gaming API chip & reference design markets for OEM and retail sales--it has lost market share not only there (to ATi, of course), but also in the integrated value segment where it competes with Intel, and in which Intel is its largest competitor. This is why, IMO, nVidia does not wish to talk publicly about what ATi has done to it in its traditionally strong market over the last year, but would prefer to discuss only what is going on versus Intel in the low-end, integrated gpu markets. Privately, though, I'm sure nVidia understands exactly what is happening in *both* segments. When you lose the top of your markets, losing your position in your bottom markets is sure to follow swiftly.
Which brings me, in closing, to what I consider the most interesting item in the conference as reported by Gamespot: that really strange proclamation about driver release frequency. First, I thought it very strange to even mention driver release frequency in the venue of this conference, let alone to talk about drastically reducing it to such an improbably, impossibly low figure as "one driver a year." Since some of nVidia's top brass were present for the conference, and for these remarks in particular, according to the Gamespot report, there's simply no way this can be construed as some kind of error or misstatement. But if nVidia wants to continue to compete in the 3d-gaming market segments, then this statement *has to* be simply erroneous.
Something like *driver release frequency* is fundamental to a company's *practical* support of its products sold into the 3d-gaming chip markets. So if nVidia fully intends to *lower* its frequency of driver releases from its already low quarterly pace (in comparison with ATi's release frequency for products sold into the same market), and is even contemplating something as non-supportive as a single driver release per year...then I would have to conclude it is seriously considering withdrawing from this market segment entirely. A single annual driver release is not sufficient to sustain product competition in the 3d-gaming gpu & reference design market it now shares with ATi. It's not even sufficient to sustain product *viability* in that market, IMO.
Conversely, I would think that in a renewed effort to regain its former position and standing in what has traditionally been its strongest market and its bread & butter, nVidia would, if anything, have announced an *increase* in the frequency of its official driver releases, since compared with its competition over the last year, its current frequency is already insufficient.
Consider also the people likely to hear and think about such comments aside from you and me--board OEMs and retail customers currently considering buying an nVidia-based product for 3d games. The negative ramifications of such a statement for sales of GFFX reference-design products could be profound, which is why I don't think such a statement would ever be made in this kind of venue by mistake, or without a full appreciation of the consequences it could well produce. It's difficult to imagine it being an off-the-cuff example of mouth-engaged-before-brain PR babble, simply because the stated figure of "one driver a year" is far too specific to have resulted from a generalization. But it is not impossible that for some reason the statement is in fact completely erroneous, so I'll be watching with interest to see what happens here...
Reading between the lines, what I see here, if it's true, is that JHH is going to drastically cut back the funds the company allocates to driver development. If you think of all the man-hours and money the company has burned in the last year producing cheats that were exposed, optimizations that reduced image quality, and so on, I could certainly understand such a decision from him from a strictly money-spent-versus-results-produced point of view. However, it seems to me that such a decision would also render the company unable to compete in the 3d-gaming gpu market at the same time.