My take on the disaster that is the GFFX. (long post)

martrox

And make no mistake; this is a disaster of monumental proportions. Understand that I am trying very hard to be as objective as I can be here. And I'm not going to even mention my pet nVidia peeve, PR. Also, remember these are my views, right or wrong.

First, a history - as I see it - lesson. After the fall of 3DFX, nVidia was left in a totally dominant position. They have maintained their industry dominance through a business model that, until recently, has served them well. By using 2 leading edge technologies to build on previous generations, they were able to maintain their lead without having to change their design philosophy. These are higher speed memory & die shrinks. When asked about any other ways of increasing performance they have steadfastly refused to admit there was any other way. The line "pride goes before a fall" comes to mind here.

With the advent of DX9, nVidia realized it did need to create a new core from the ground up. While many may say that the moves from the original TNT through the GeForce 4 series resulted in many "new cores", I believe it's very easy to see the relationship of every nVidia GPU to its predecessor. This is not to say that nVidia took any shortcuts, as I believe that whole line will go down in history as the most successful line of GPUs so far, if not ever.

So, armed with the past as a blueprint, nVidia proceeded to design what they felt was a worthy successor to the GeForce line of GPUs. First, they looked at the 2 new leading edge technologies, as they always have, & decided that they would be well served going with a new die shrink to 0.13 micron & the newest memory, DDR-II. This way, they reasoned, they could increase the transistor count & avoid the cost of going to a 256-bit memory bus.
By shrinking the die, they could increase the speed to 400MHz, much faster than any previous product, and, when coupled to the new DDR-II memory in the 400-500MHz range, they would have a product that would be a worthy successor to the GeForce line of cards. And worthy it would have been ……EXCEPT…… nVidia never saw the R300 coming.

And just why should they have seen it coming? In the past, every time ATI introduced a new product, nVidia was ready with a better product. When the original Radeon was introduced, the already available GTS just smoked it in most every way. And when the 8500 was introduced, the already available GeForce 3, with its brand new drivers giving a 20% increase in speed, just smoked it, too. And let's not even get into ATI's terrible drivers. Again with the blinders of looking to the past as a blueprint and with the pride of a conquering champion, nVidia couldn't have ever seen it coming…

ATI had learned its lessons well. In order to compete with nVidia, they would need to beat nVidia at its own game. Speed and quality drivers were the first step. Excellent IQ was always an ATI strength. So they took a page from nVidia, and designed a GPU whose job was to blow away everything that came before it. With their knowledge, ATI knew it could design a GPU with proven technology and could use the brute force of a 256-bit memory bus to feed that GPU. Yes, it would be costly, but with the savings of using tried & true technology, it could be done, and done in a timely fashion. ATI had taken a page not only from nVidia's book, but from 3DFX's, too. I think it can be argued that ATI used 3DFX's model for the design of the R300 by not using leading edge technology, instead relying on tried & true - and inexpensive - technology.

Let's move to this past summer, when ATI first previewed the 9700Pro. You have to believe nVidia never saw it coming, and quickly realized its new product was in big trouble. Bottom line was it couldn't compete, period. So, in order to save face, nVidia had only one recourse - STALL. So while stalling the public with whatever it took, they proceeded to try to get their new product, NV30, up to speed so it would be competitive. The only way they could do this without a complete redesign - which was totally impossible - was to clock the NV30 to whatever speed it took to make it competitive. And that speed worked out to 500MHz. But, along with the speed came the heat - so in order to deal with the heat came the FX Flow. I have to believe that no one in their right mind would have designed such a product IF they didn't have to. But nVidia was desperate; they had no choice. And, to top that off, the memory speed is 500MHz too, which comes with its own problems - heat & the need for a very complex PCB. So now, 7+ months too late (when they will be available), they will introduce a product that has been, and will be, badly received, to say the least. Competitive? Yes, but at what cost? And only till the next ATI product hits the shelves, probably within a month of the retail introduction of the GFFX. Even the most ardent supporters of nVidia are disappointed, some even heartbroken - and reacting like spurned lovers. Can it get any worse?

Well, yes it can and will. Remember that the GFFX was the first in a series of new products off the same design. A design which was born of a flawed view of the world, a world that nVidia ruled without competition. We have already seen that a GFFX downclocked to the speed of a 9500Pro is barely competitive with it, and just what does that say about the other products that are going to be derived from the NV30? It may be years before nVidia can catch up to ATI. Is this the end of nVidia? Probably not - let's hope not! We need nVidia to be competitive with ATI, as competition only helps us all. But it is the dawn of a new world in the graphics market, one that nVidia no longer rules with impunity.
 
Sorry, I wasn't listening, could you repeat that please? :LOL:

My 0.02 rupees worth.

I think most people are disappointed because this wasn't just another product launch from just another company. It was to be "the biggest contribution to the 3D graphics industry" from the biggest company in the industry. People just expected more.

I think we will all realize just how much of a stuff up the FX is when the R350 (assuming it hits the market within 2-3 months) arrives. This will drive home the point that Nv really are a product cycle behind.
 
I don't think things are quite as bad as you make it out to be martrox. For one, people are probably overestimating R350's effects on NV30. Although they are behind ATI now, NV35 will bring them back in line. Additionally, this architecture appears to be transitory in that NV40 will be a pretty big departure from the NV30 design.
 
huh, I don't even really care anymore.. They just make graphics cards; all I care about is what kind of cards are available. Today there are two pretty good ones, except I would never buy one of them because it's too noisy. If that gets fixed I will consider it.
 
CMKRNL said:
I don't think things are quite as bad as you make it out to be martrox. For one, people are probably overestimating R350's effects on NV30. Although they are behind ATI now, NV35 will bring them back in line. Additionally, this architecture appears to be transitory in that NV40 will be a pretty big departure from the NV30 design.

Well, I have to give you the benefit of the doubt, CMKRNL, as you know far more than I do.... and I really hope you are right.
 
CMKRNL said:
Although they are behind ATI now, NV35 will bring them back in line.

I believe so too. Talk of NV35 not coming out till the end of the year is BS - NV30 problems won't have set it back too much and it should be ready a long time before then since it involves only evolutionary changes. Some NV30 functionality is plain "broken" - this has had repercussions for NV31 development which is still not going great, AFAIK. NV35 is basically a fixed/optimised NV30 ASIC. If there are featureset changes (fab process and controller tech aside) they will be only minor.

The only thing that worries me now slightly is the board design (which will be a nightmare if they want to keep it at 12 layers) and the fact that an R300 overclocked to 400/800MHz appears to be slightly faster than what I know of NV35 performance estimates. :?

They really need developer support at this stage. I am surprised how little talk there has been of DOOM3, which could really help sell the NV30. Not that 9700 Pro will have any problems with the game; I just expect NV30 to be quite a bit faster.

MuFu.
 
It's been mentioned to me that DIII is now looking more like it's in the latter half of 2003, which is probably why nobody has made much of it.
 
I guess you are probably right about that - a March release looks distinctly unlikely at this time. Still... it will no doubt have a considerable impact on graphics card sales later in the year. If it hits the market before ATi get R400 out of the gates, I'm sure NV3x popularity will benefit greatly.

MuFu.
 
DaveBaumann said:
It's been mentioned to me that DIII is now looking more like it's in the latter half of 2003, which is probably why nobody has made much of it.

This would be good news for ATI if true. Their R400 architecture will be much better equipped for Doom 3 than R300, and if past ATI product cycles are anything to judge by, we should at least get a peek of it in the summer.

If a test/demo is released first, like what happened with Q3, then that may be NV30's major selling point. NV30 is matched very well to it.
 
Hi there,
DaveBaumann said:
It's been mentioned to me that DIII is now looking more like it's in the latter half of 2003, which is probably why nobody has made much of it.
Heard the same--more precisely, a possible September/October release. I wouldn't be surprised, though, if Carmack posted another .plan with his impressions of the GFFX in the not-so-distant future, similar to his Radeon 8500 and GF3 remarks earlier.

*hires a monkey to check /. for him* ;)

Regarding the thread topic: I pretty much agree with MuFu. I had a discussion about the near NV future just some hours ago, and we pretty much agreed on MuFu's estimate (guesses on our side, mind): NV35 coming pretty soon, mainly featuring NV30 fixes and a slight speed bump, but no fundamental changes. I'm pretty confident this might work out well for NV, but nevertheless, "disastrous" describes the current NV30 launch situation pretty well.

ta,
-Sascha.rb
 
MuFu said:
The only thing that worries me now slightly is the board design (which will be a nightmare if they want to keep it at 12 layers) and the fact that an R300 overclocked to 400/800MHz appears to be slightly faster than what I know of NV35 performance estimates. :?

I am not sure what you know or who your sources are, but I find it hard to believe "NV35 performance estimates" are equal to about an R300 at 400/800.

If CMKRNL is to be believed, and I have no reason not to, and the NV35 is to bring Nvidia back on top, then I would expect performance to be way higher than an R300 at 400/800.
 
Regarding DoomIII, I have the leaked Alpha and I've tested it on a Radeon 8500 (ran OK at medium settings, 45-50 fps) and on a Radeon 9500 Non Pro (2X FSAA and 8X AF) and got the same frames as the 8500 with FSAA on.
The 9500 Non Pro is a 4-pipe card on a 128-bit bus with Hi-Z disabled, running a buggy Alpha build with few optimizations.

I read the interviews from ID regarding DOOM III performance, and anyone that has the leak realizes it's not going to require 100 FPS; in fact ID says they will probably cap the frames at 60...which any of the current high end cards could handle.
 
nggalai said:
but nevertheless, "disastrous" describes the current NV30 launch situation pretty well.

I'll have to disagree with that statement. If we want to talk 'disastrous', Parhelia was 'disastrous' because it essentially was the last hope for that company, and came out much slower than the competition--essentially a non-starter.

While NV30 is certainly not as astounding as everybody was expecting, it would have been a solid release 6 months ago, whereas now it looks like it'll get leapfrogged within several months rather than coming of age along with the R300.

But a disaster? No. NVIDIA will weather the storm and it will take at least one or two more serious missteps before they're in danger of having their business go under.
 
I don't agree or disagree with the original post, or really care about what GFFX is or isn't. (Other than a display adapter. Hehe.) Just felt like adding sumthing...

martrox said:
With the advent of DX9, nVidia realized it did need to create a new core from the ground up. While many may say that the moves from the original TNT through the GeForce 4 series resulted in many "new cores", I believe it's very easy to see the relationship of every nVidia GPU to its predecessor.

Yes? Could they have had (and still have) a unified, single-binary driver if there had been really radical changes? If the evolutionary model has nevertheless allowed them to add important new features (HW T&L and VS/PS), why want revolutionary? -- I'm not criticising your post, I'm just puzzled at the need (to fix what ain't too broken).

So, armed with the past as a blueprint, nVidia proceeded to design what they felt was a worthy successor to the GeForce line of GPUs.

What do you mean, successor to the line? "GeForce FX" sounds very much like "GeForce" to me :LOL:

Okay, I understand what you mean. I'm just not sure that under the hood, NV30 is such a radical departure from the previous, after all... Essentially DX9 adds to DX8 functionality, instead of replacing.
 
Well, ATI aren't exactly out of the woods themselves. The latest Mercury research numbers for December indicate that NVIDIA still managed to increase their desktop market share despite all of the 9x00 series being on the market.
 
The only disastrous aspect is the negative PR from the cooling noise; that was something they could have avoided by not being so aggressive with the clock rates. Other than that, it's a next-gen developer's card. NVIDIA's PR (and ego) has always been out to lunch; no card could ever match the expectations that they generate with their blarney, and they have been doing this for years. Going from a card people want at any price to a card people will avoid because of the noise is a disaster. I'm sure most vendors selling NVIDIA products will sell most cards without the hoover and clock it to where it's supposed to be. That will be a rocking card that will be able to overclock nicely with the silent cooling solution of your choice. When next-gen games using heavy shaders come along the card will probably do quite well; I'd bet it will really show up the 9700 then, as it stands to reason that bandwidth will be less of a factor if the core is busy crunching for longer periods.
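To picture that last point, here's a toy back-of-the-envelope model (the numbers are invented purely for illustration, not real NV30/R300 figures): per-pixel cost is roughly the larger of the shader-math time and the memory-traffic time, so once shaders get long enough the core, not the bus, sets the frame rate.

```python
# Toy model: on an idealised GPU a pixel costs roughly max(ALU time, memory time).
# All numbers below are invented for illustration only.

def pixel_time_ns(alu_ops, alu_ops_per_ns, bytes_touched, bytes_per_ns):
    """Per-pixel cost: the slower of the arithmetic and the memory traffic."""
    alu_time = alu_ops / alu_ops_per_ns
    mem_time = bytes_touched / bytes_per_ns
    return max(alu_time, mem_time)

# Short DX7/DX8-style shader: a few ops, plus texture/framebuffer traffic.
short_shader = pixel_time_ns(alu_ops=4,  alu_ops_per_ns=4, bytes_touched=32, bytes_per_ns=16)
# Long DX9-style shader: dozens of ops on the same amount of traffic.
long_shader  = pixel_time_ns(alu_ops=64, alu_ops_per_ns=4, bytes_touched=32, bytes_per_ns=16)

print(short_shader)  # 2.0 ns  -> memory bandwidth is the bottleneck
print(long_shader)   # 16.0 ns -> the shader core is the bottleneck; bandwidth barely matters
```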

The first GeForce cards kinda sucked too: initial hardware issues, and they didn't do much in the games of the day. I'm sure they will get most of this stuff taken care of, as well as the drivers, over the next year. I'm surprised that anybody thinks this is so unusual for NVIDIA; this is just part of their cyclic pattern with a new architecture. My advice: ignore everything NVIDIA says, look at the reviews when something arrives, and make up your own mind.
 
Russ,
It's a disaster because nothing on this scale has happened since the pre-RIVA 128 era. From NVIDIA's internal point of view, this is the most uncomfortable launch yet. Parhelia is more than a disaster, it's a terminal error.
 
Gunhead said:
I'm just not sure that under the hood, NV30 is such a radical departure from the previous, after all... Essentially DX9 adds to DX8 functionality, instead of replacing.

Hardware-wise, DX9 is much different than DX8, and unless the DX8 chip was designed with a lot of foresight, the shaders had to be totally redesigned for DX9. I think people underestimate the work that is involved.
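To give a sense of the scale of that jump, here's a quick side-by-side of the two pixel shader models (figures as I remember them from the DX8/DX9 specs, so treat them as approximate):

```python
# Rough comparison of DX8 (ps_1_1) vs DX9 (ps_2_0) pixel shader requirements.
# Figures quoted from memory of the specs - treat them as approximate.
ps_1_1 = {
    "arithmetic instructions": 8,
    "texture instructions":    4,
    "temp registers":          2,
    "constant registers":      8,
    "data format":             "low-precision fixed point",
}
ps_2_0 = {
    "arithmetic instructions": 64,
    "texture instructions":    32,
    "temp registers":          12,
    "constant registers":      32,
    "data format":             "floating point (FP16/FP24 or better)",
}

for key in ps_1_1:
    print(f"{key:24} {str(ps_1_1[key]):>28}  ->  {ps_2_0[key]}")
```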

However, to some the architecture will never be a "radical departure" as long as it remains an immediate mode renderer.
 
I am really happy with the way this thread is shaping up. A lot of constructive talk going on here, and a lot of ideas coming together..... sure beats the abusive posts of the last few days.....
 
Fuz said:
MuFu said:
The only thing that worries me now slightly is the board design (which will be a nightmare if they want to keep it at 12 layers) and the fact that an R300 overclocked to 400/800MHz appears to be slightly faster than what I know of NV35 performance estimates. :?

I am not sure what you know or who your sources are, but I find it hard to believe "NV35 performance estimates" are equal to about an R300 at 400/800.

Well, say you are quietly handed a figure of "25% faster" than NV30. What would you make of that? Raw clockspeed bump? Realworld performance approximation?

Taking a move to a low-K fab process into consideration, the former looks likely. Such a change would also make "25%" seem like rather a conservative figure for realworld performance gains, considering tweaks, a mild clockspeed bump and a 256-bit memory bus, right?

If low-K is not to be used then it is most likely they will stick with the same clockspeed and try to get away without the dustbuster. That, paired with the wider memory bus, could yield a ~25% realworld performance gain, mostly due to bandwidth improvements. That does not seem too impressive to me in light of some overclocked R300 results, hence the worry... No doubt R350 will be even faster.
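To put some numbers on the bandwidth side of that (the NV30 and R300 clocks are the widely reported ones; the 256-bit NV35 line is just the hypothetical part being discussed here):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gb_per_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_per_s(128, 1000))  # NV30 / GFFX Ultra, 128-bit @ 500MHz DDR-II -> 16.0
print(bandwidth_gb_per_s(256,  620))  # R300 / 9700 Pro,   256-bit @ 310MHz DDR    -> ~19.8
print(bandwidth_gb_per_s(256,  800))  # the overclocked R300 @ 400/800 above       -> 25.6
print(bandwidth_gb_per_s(256, 1000))  # hypothetical 256-bit NV35 @ 500MHz DDR     -> 32.0
```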

I am much more inclined to believe that they *will* use low-K (CMKRNL?), in which case we could see some considerable gains compared to NV30. Despite the fact that they could probably then get rid of the dustbuster and still clock the card at ~500-550MHz, I have a feeling that they will attempt to qualify the card at 600-650MHz with a similar device. It really *has* to be considerably faster than R350 and I'm sure it will be - in fact, if nVidia has its wicked way with developers then I can see a nightmare scenario developing in which nVidia-centric games really steal ATi's thunder.

MuFu.
 