Tech-Report blasts GeForce FX

well, semi-blasts it. no swooning over PR claims of film quality rendering here. no anandtech bias.

http://www.tech-report.com/etc/2002q4/geforce-fx/index.x?pg=1


good read. nails GeForce FX on its weak/boring points.


However, the details of GeForce FX's chip architecture are surprisingly tame. We knew ATI had beaten NVIDIA to the punch, but most of us expected NVIDIA's counterpunch to be a little more potent. Now that the GeForce FX specs have hit the street, it's safe to say that ATI produced the exact same class of graphics technology over six months before NVIDIA. At the time I wrote my comparative preview of the Radeon 9700 and NV30-cum-GeForce FX, NVIDIA was being cagey about the NV30's exact specifications. They were claiming (under NDA, of course) that the NV30 would have 48GB/s memory bandwidth, but we now know the part has 16GB/s of memory bandwidth, plus a color compression engine that's most effective when used with antialiasing (where it might achieve a peak of 4:1 compression, but will probably deliver something less—hence the 48GB/s number). The Radeon 9700 Pro has 19.4GB/s of memory bandwidth, thanks to old-fashioned DDR memory and a double-wide, 256-bit memory bus.

NVIDIA was also fuzzy, back then, about the exact number of texture units per pixel pipe in NV30. We now know the GeForce FX has eight pipes with one texture unit each, just like the Radeon 9700. So don't expect any massive performance advantages for the GeForce FX in current games. Only the higher clock speeds, afforded partly by the Black and Decker appendage, will give the GFFX a nominal fill rate higher than the Radeon 9700.

Yawn.
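For anyone who wants to sanity-check those numbers, here's a quick back-of-envelope script. The bus widths come straight from the article; the memory and core clocks in it are my assumptions based on the commonly cited specs for both cards.

```python
# Rough bandwidth and fill-rate check. Bus widths are from the article;
# the clock figures are assumptions based on commonly cited specs.

def ddr_bandwidth_gbps(bus_bits: int, mem_clock_mhz: float) -> float:
    """Raw bandwidth in GB/s: bytes per transfer * effective (DDR) clock."""
    return (bus_bits / 8) * (mem_clock_mhz * 2) * 1e6 / 1e9

nv30_bw = ddr_bandwidth_gbps(128, 500)  # GeForce FX: 128-bit @ 500 MHz DDR-II -> 16.0
r300_bw = ddr_bandwidth_gbps(256, 310)  # 9700 Pro: 256-bit @ ~310 MHz DDR -> ~19.8

# Peak single-texture fill rate is just pipes * core clock (8 pipes each):
nv30_fill = 8 * 500  # 4000 Mpixels/s at the FX's 500 MHz core
r300_fill = 8 * 325  # 2600 Mpixels/s at the 9700 Pro's ~325 MHz core (assumed)

print(f"NV30: {nv30_bw:.1f} GB/s raw, {nv30_fill} Mpixels/s peak")
print(f"R300: {r300_bw:.1f} GB/s raw, {r300_fill} Mpixels/s peak")
```

The 48GB/s figure only appears once you multiply the raw 16GB/s by an assumed ~3:1 effective compression ratio, which is exactly the article's point.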
 
another good paragraph:


"That success, when it comes, will be much needed. Only now are the true effects of NVIDIA's missed product cycle with NV30 coming into focus. NVIDIA is no longer the graphics technology leader, in title or, soon, in sales. NVIDIA has held on to its market share over the past few quarters, even with the Radeon 9700 on the scene, because its mainstream and low-end products were still very competitive. That won't be the case for much longer, as ATI pushes its R300 and R200 technology generations down into the mainstream and value segments, respectively. Already, the Radeon 9000 Pro is the best choice for under $100, and soon, the Radeon 9500 and 9500 Pro will fill store shelves, ready to bring floating-point pixels to all the good little Christmas shoppers. All NVIDIA has to counter with are warmed-over versions of the GeForce and GeForce3 graphics cores mated to AGP 8X interfaces. That is to say, NVIDIA is a full technology generation behind in the value and mainstream market segments. Word has it that the GeForce FX-derived NV31 and NV34 chips are just now entering tape-out at TSMC, and in all likelihood, those chips won't hit the market until a month or more after the first NV30-based cards arrive"
 
I wouldn't so much say that the article nails it; they just don't falsely worship NV30, and thus they call a spade a spade.

--|BRiT|
 
Yeah, I read that one just a few hours ago. They certainly didn't paint an exaggerated picture of the GeForce FX while comparing it to the Radeon 9700 Pro.
 
Well, I think most people are expecting some uber-1337 performance figures for the NV30 over the 9700. I think they should be looking more at the programmability and features, which seem to put a bigger burden on the 9700 (that is, if developers actually take advantage of them).

Comparisons seem rather worthless and a waste of breath at the moment. When NV30s are actually tested for real-world performance and IQ against the 9700, and it turns out to be a close race, then the complaints will seem a bit more justified.
 
Yes, a good read that only a rabid Nvidia fan would say doesn't have any merit.

What is really sad is that of all the articles/previews I've read, this is the first that really even stated what seems fairly obvious. That doesn't even bring up the downright silly marketing spin and FUD that NVidia is dishing out, as if the 9700's lowly 96-bit color (etc., etc.) just isn't good enough.

P.S.: My current board is a Leadtek Ti4400 and I love it. I have virtually no desire to upgrade to the GFFX or the 9700. What's the point? Maybe late next year, when DX9 games are (still only) on the horizon, I'll consider an upgrade.
 
The thing is, GeForce FX isn't lacking in performance. The thing is clocked at 500 MHz! What it's missing is FEATURES: a new method of AA, a wider bus, etc. True, it's got extended shaders, but those will be all but useless in games. It doesn't have anything really revolutionary. Not that the R300 did, but the R300 brought most of the new DX9 stuff first.



That doesn't even bring up the downright silly marketing spin and FUD that NVidia is dishing out, as if the 9700's lowly 96-bit color (etc., etc.) just isn't good enough.

yup.
 
performance-wise it could be said to be lacking (given the timeframe of when it will be released).. if all it takes is a moderate overclock of a 9700 to bridge a lot of the small (suggested) gap in performance, the late arrival and premium price of the nv30 doesn't seem entirely thrilling.. no doubt the driver optimizations and eventual/steady price drop of the 9700 will make it a very easy job on ATi's part to maintain a competitive edge..
 
megadrive .... how do you know if it's faster? 500MHz doesn't mean squat. Look at the Athlon and the P4.
 
Actually, I found this quite surprising. This is taken from Anand's:
So there you have it; the elusive NV30 has surfaced in the form of GeForce FX. ATI has won the first round with the Radeon 9700 Pro, what will be most interesting will be what ATI has up their sleeves when the GeForce FX hits the shelves in February.
Normally Anand ends his ATI product reviews with a comment about what's coming from Nvidia soon, in an effort to stop readers from buying ATI products (jmho). Now, it seems the tides are changing.
 
That was a refreshing read. I like NVidia, but I expect them to work really hard for my $$$. I am keenly awaiting a detailed analysis, so critiques like this are a good thing to see.

As NV30 and R300 usher in DX9, I wonder if it will be the second or third generation of this technology that really gets things right, with no corners cut.

Sigh, I wish I could just lease 3d cards for 6-9 months at a time, things move so fast.
 
This article is also the only one to mention that the 9700 has colour compression too (and judging by the "Update", it seems like even they almost forgot). Both Tom's Hardware and Anandtech emphasized how the colour compression of NV30 will really help it perform better than the 9700. Anand emphasizes his point in a whole extra paragraph:
The compression engine is completely invisible to the rest of the architecture and the software running on the GeForce FX, which is key to its success. It is this technology that truly sets the GeForce FX apart from the Radeon 9700 Pro.

I guess NV30 came out so late that they completely forgot how good R300 really was. ;)

Kudos to Tech-Report. These guys really seem to be giving a rational look at NV30. I, for one, was expecting a lot more out of NV30, and as it stands, the only advantage it seems to have is faster shader execution, which won't be useful for quite a while yet.
 
T2k said:
megadrive0088 said:
The thing is, GeForce FX isn't lacking in performance. The thing is clocked at 500 MHz!

And?

MHz != performance

I'm surprised that NVidia is so desperate to get the clock speed to 500MHz. If the mem and core are both at 500MHz, you only get 32 bits of memory access max per pixel pipe per clock!

You need a Z-read, Z-write, texture read, colour write, and sometimes a colour read for every pixel reaching the screen. Assuming 4:1 Z-compression (which is quite ideal AFAIK, and will be lower in reality), low-res textures, and no alpha, that's over 50 bits per pixel.
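To spell the arithmetic out, here's a minimal sketch. The per-access sizes in the comments are my assumptions, not published specs:

```python
# Bandwidth available per pipe per core clock on NV30 (assumed specs):
MEM_BANDWIDTH = 16e9   # bytes/s: 128-bit bus @ 500 MHz DDR-II
CORE_CLOCK = 500e6     # Hz
PIPES = 8

avail = MEM_BANDWIDTH * 8 / (CORE_CLOCK * PIPES)
print(avail)  # 32.0 bits available per pipe per clock

# Ideal-case cost of one visible pixel, no alpha blending (assumed sizes):
z_read  = 32 / 4  # 8 bits, if the 4:1 Z-compression actually holds
z_write = 32 / 4  # 8 bits
col_w   = 32      # 32-bit colour write
tex     = 4       # a few bits amortized, assuming low-res textures hit the cache

print(z_read + z_write + col_w + tex)  # 52.0 bits needed vs. 32 available
```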

In other words, I don't see how NVidia's clock speed advantage will be of much use except when executing shaders, in which case they could have built half as many pipes, each twice as fast shader-wise, and saved die space.

We'll see the effect of this clearly in 3DMark2001's single-texture fill rate test. My guess is that NV30 will score below 2000 Mpixels/sec, since the pixels drawn there are alpha-blended.
 
I'm going to reserve judgement until I see the card actually running.

However, no one can deny that they effectively missed a product cycle, and that hurts a once pretty stellar reputation for delivering new parts with decent performance boosts.

Still, the card is feature-rich, and that moves the industry ahead as programmers take advantage of the new floating-point processors.

For that people should be grateful, just as the GeForce1 moved things forward less through its relative performance over the V3 and more through how it affected things down the line.

As far as Nvidia's status as king of the 3D graphics realm goes: well, I think they are effectively tied with ATI now. The next cycle will either confirm or deny ATI's advance.

One thing bothers me, though. ATI and Nvidia seem to have very, very similar parts, from the multisampling, to the anisotropic filtering algorithms (which seem to be converging), to following the DX9 spec almost too religiously.

Where's the innovation, like 3dfx's T-buffer, S3's texture compression, etc.?

I was somewhat hoping for something 'new' and exciting.
 
Hmm..

128-bit floating point, color compression, DX9+ support, a 0.13-micron design... Cg support. You call this a lack of features or innovation?

Come on, give me a break. All I hear from this board and Tech Report is anti-NVIDIA and pro-ATI, just because ATI has stayed with the safe course instead of taking more ambitious ones.
 
sc1 said:
Hmm..

128-bit floating point, color compression, DX9+ support, a 0.13-micron design... Cg support. You call this a lack of features or innovation?

Come on, give me a break. All I hear from this board and Tech Report is anti-NVIDIA and pro-ATI, just because ATI has stayed with the safe course instead of taking more ambitious ones.

Kinda like ATI did when they put out a DX 8.1 chip?
 
sc1 said:
Hmm..

128-bit floating point, color compression, DX9+ support, a 0.13-micron design... Cg support. You call this a lack of features or innovation?

Come on, give me a break. All I hear from this board and Tech Report is anti-NVIDIA and pro-ATI, just because ATI has stayed with the safe course instead of taking more ambitious ones.

Fanbois inevitably feel sites that call it down the middle are biased to the other side...
 
128-bit floating point, color compression, DX9+ support, a 0.13-micron design... Cg support. You call this a lack of features or innovation?

Do you not know that ATi's R300 is capable of 96-bit floating point (not quite 128-bit, but definitely not anything to squawk at)? Or ATi's color compression, or ATi's DX9 (somewhat +)? Yeah, Cg can work on a Radeon 9700 too. Is it innovative if someone has already done it? :LOL: :LOL:
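(For reference, those totals are just bits per channel times four colour channels; FP24 for R300 and FP32 for NV30 are the commonly reported per-channel precisions. A trivial check:)

```python
# Pixel pipeline precision = bits per channel * RGBA channels.
CHANNELS = 4  # R, G, B, A
print(24 * CHANNELS)  # 96-bit:  R300's FP24 per channel
print(32 * CHANNELS)  # 128-bit: NV30's FP32 per channel
```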
 
Personally, all I see in that article (though I will admit that I didn't read through it carefully...) is a guy blasting the GeForce FX based upon specs alone.

Haven't people yet learned that specs don't mean squat? It's the final performance that means everything. It will be very, very interesting if nVidia's current performance claims pan out (most particularly if they pan out on cards released without that exotic cooling...though it is a very good cooling system, from what I can see...).

One of the things I really didn't like was the implication that nVidia was lying about 48GB/sec of bandwidth.

1. How was nVidia lying about rumored specs?
2. Those among us who had a modicum of intelligence realized that that had to be an effective number.
 