And make no mistake: this is a disaster of monumental proportions. Understand that I am trying very hard to be as objective as I can be here, and I'm not even going to mention my pet nVidia peeve, PR. Also, remember that these are my views, right or wrong.
First, a history lesson, as I see it. After the fall of 3DFX, nVidia was left in a totally dominant position. They maintained their industry dominance through a business model that, until recently, served them well: by using two leading-edge technologies to build on previous generations – higher-speed memory and die shrinks – they were able to keep their lead without having to change their design philosophy. When asked about other ways of increasing performance, they steadfastly refused to admit there were any. The line "pride goes before a fall" comes to mind here.
With the advent of DX9, nVidia realized it did need to create a new core from the ground up. While many may say that the moves from the original TNT through the GeForce 4 series produced many "new cores", I believe it's very easy to see the relationship of every nVidia GPU to its predecessor. This is not to say that nVidia took any shortcuts; I believe that whole line will go down in history as the most successful line of GPUs so far, if not ever.
So, armed with the past as a blueprint, nVidia proceeded to design what they felt was a worthy successor to the GeForce line of GPUs. First, they looked at the two newest leading-edge technologies, as they always have, and decided they would be well served by a die shrink to .13 micron and the newest memory, DDR II. This way, they reasoned, they could increase the transistor count and avoid the cost of going to a 256-bit memory bus.
By shrinking the die, they could push the core speed to 400MHz, much faster than any previous product, and, coupled with the new DDR II memory in the 400-500MHz range, they would have a product that would be a worthy successor to the GeForce line of cards. And worthy it would have been... EXCEPT... nVidia never saw the R300 coming.
And just why should they have seen it coming? In the past, every time ATI introduced a new product, nVidia was ready with a better one. When the original Radeon was introduced, the already available GTS just smoked it in nearly every way. And when the 8500 was introduced, the already available GeForce 3, with its brand-new drivers delivering a 20% increase in speed, smoked it, too. And let's not even get into ATI's terrible drivers. Again, with the blinders of looking to the past as a blueprint and the pride of a conquering champion, nVidia couldn't have ever seen it coming...
ATI had learned its lessons well. In order to compete with nVidia, they would need to beat nVidia at its own game. Fast, quality drivers were the first step, and excellent image quality was always an ATI strength. So they took a page from nVidia and designed a GPU whose job was to blow away everything that came before it. ATI knew it could build a GPU with proven technology and use the brute force of a 256-bit memory bus to feed it. Yes, it would be costly, but with the savings of using tried-and-true technology it could be done, and done in a timely fashion. ATI had taken a page not only from nVidia's book, but from 3DFX's, too: I think it can be argued that ATI used 3DFX's model for the design of the R300 by passing on leading-edge technology and relying instead on tried-and-true – and inexpensive – technology.
Let's move to this past summer, when ATI first previewed the 9700 Pro. You have to believe nVidia never saw it coming and quickly realized its new product was in big trouble. Bottom line: it couldn't compete, period. So, in order to save face, nVidia had only one recourse – STALL. While stalling the public with whatever it took, they tried to get their new product, the NV30, up to speed so it would be competitive. The only way to do that without a complete redesign – which was totally impossible – was to clock the NV30 at whatever speed it took to make it competitive. That speed worked out to 500MHz. But along with the speed came the heat, and in order to deal with the heat came the FX Flow. I have to believe that no one in their right mind would have designed such a product IF they didn't have to. But nVidia was desperate; they had no choice. Top that off with the memory at 500MHz too, which brought its own problems: heat and the need for a very complex PCB. So now, 7+ months too late (when they will finally be available), they will introduce a product that has been, and will be, badly received, to say the least. Competitive? Yes, but at what cost? And only until the next ATI product hits the shelves, probably within a month of the retail introduction of the GFFX. Even the most ardent supporters of nVidia are disappointed, some even heartbroken, reacting like spurned lovers. Can it get any worse?
Well, yes it can, and it will. Remember that the GFFX is only the first in a series of new products off the same design – a design born of a flawed view of the world, a world that nVidia ruled without competition. We have already seen that a GFFX downclocked to the speed of a 9500 Pro is barely competitive with it, and just what does that say about the other products to be derived from the NV30? It may be years before nVidia can catch up to ATI. Is this the end of nVidia? Probably not – let's hope not! We need nVidia to be competitive with ATI, as competition only helps us all. But it is the dawn of a new world in the graphics market, one that nVidia no longer rules with impunity.