New FX Extremetech Review

Well, I'm disappointed. I thought the GFFX would pack more punch, especially in vertex shaders. Got to give ATI some credit for creating a good card half a year before the GFFX, and with older technology. About the only good thing the GFFX has going for it is the 2.0+ shaders, but I'm not sure how fast they would run, though the flexibility is nice to have. I don't think $400 is well spent on this card unless you detest ATI's drivers, which by the next release will fix most things people have trouble with, so even that advantage of nVidia's drivers is getting smaller.
 
I don't know about anybody else...but is there a glimmer of hope that at least _one_ review might take a gander at image quality? I mean, to at least try and contrast the FX vs. R300 in AA?

I'm betting the farm that R300 will end up with a superior looking result...but I really hope that we get to see just a couple of screenshots! Then again, maybe they figure what's the point? It's not as if nVidia developed a massively new/better/etc. AA implementation.
 
Typedef Enum said:
I don't know about anybody else...but is there a glimmer of hope that at least _one_ review might take a gander at image quality? I mean, to at least try and contrast the FX vs. R300 in AA?

Yeah, but not tomorrow. Maybe a week or so, but most likely that is the kind of thing I'd expect after shipping hardware/drivers are available. Love to be wrong, of course. Isn't that the pattern you'd expect from past experience?
 
I don't know about anybody else...but is there a glimmer of hope that at least _one_ review might take a gander at image quality?

Judging by what was said in another thread, websites didn't receive their GFFXs until last Thursday or Friday. So it's no surprise that the couple of reviews out so far wouldn't have had time to do image quality comparisons, or generally anything interesting besides running their usual gamut of benches. Presumably more in-depth reviews will be out in the coming days. And who knows when B3D will finish a review, but I'm sure it will be great when they do.

Point is, I think it's great that a couple of sites rush to put out benchmark comparisons while others take a few days longer to write a proper review. Better to get what we can now than to wait any longer just to see 20 sites with the same AA comparison screenshots. We'll get better reviews when they're ready.
 
Come to think of it, I didn't notice any pre-fed pablum (vendor supplied screenies and diagrams) in either of the reviews we've seen so far. Wonder why?
 
Too bad Beyond3D is not getting a card in the first round...at least here the screenshots are 'full screen'. Leave the blown-up FSAA images to the reviewer, or show both.
 
Re: Nvidia is in ATI's shoes

LeStoffer said:
rwolf said:
This is the first major architecture change for nvidia since the Original Geforce.

If you scrape all the hype away, I think you're very wrong on this one. I would maintain that the NV20 (GeForce 3) was a bigger architectural jump over the NV10 than the NV30 is over the NV20.

Why? With the NV20, nVidia introduced LMA, MSAA, and shaders (register combiners), amongst other things. And it took some time to get performance up with the drivers. So what is really new in the NV30? Much more advanced shaders, yes, but it seems they keep the register combiners in there, and the vertex shaders seem to be a bigger step than the pixel shaders anyway. Has FSAA changed a lot? No. Has LMA? You add color compression, but it's nothing radical.

I'm not trying to downplay the NV30, but as it is, the architecture doesn't seem to be a total rewrite but more an NV25 with much more advanced shaders. I guess that's why I feel disappointed.

Yes, there were some additions to the original GeForce architecture, but 80-90% of the hardware was the same. With a new architecture you must code all the hardware interactions from scratch.

The new GeForceFX is a huge change in hardware design. 128-bit floating-point color vs. 32-bit color is a massive change. Moving from fixed-function T&L to programmable T&L is a massive change. The pixel shaders are significantly more complex. Getting legacy programs to run, and run well, on a completely different architecture is a huge undertaking.
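To put a rough number on the precision point, here's a minimal Python sketch (my own made-up example, not anything from nVidia or ATI) of what happens when you accumulate many small lighting contributions in 8-bit-per-channel color vs. floating-point color:

```python
# Made-up illustration: accumulating many tiny lighting contributions.
# 8-bit-per-channel color quantizes to 1/255 steps after every pass;
# floating-point color keeps the small contributions.

def accumulate_int8(steps, contribution):
    # Value stored back as an 8-bit integer [0..255] after each pass.
    value = 0
    for _ in range(steps):
        value = min(255, int(round(value + contribution * 255)))
    return value / 255.0

def accumulate_float(steps, contribution):
    # Value kept in floating point the whole way.
    value = 0.0
    for _ in range(steps):
        value += contribution
    return value

# 1000 passes of a tiny contribution (0.0004 of full brightness each):
print(accumulate_int8(1000, 0.0004))   # -> 0.0 (every pass rounds away)
print(accumulate_float(1000, 0.0004))  # -> ~0.4
```

Obviously real hardware blending is more involved than this, but it shows why multipass effects band or black out at low precision.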

ATI has only recently produced a driver capable of doing AA under 16-bit color. Coding the driver in C and then testing for each operating system, motherboard, and software combination (DirectX 1-9, OpenGL) is a monstrous task. It is amazing that nVidia and ATI are able to get hardware working as fast as they do.

Even though 85% of nVidia's drivers are common code, the 15% that must be coded for the new architecture is an enormous undertaking.

When you start seeing games produced using natural lighting and 128-bit color precision you will see what directx 9 cards bring to the table....
 
When you start seeing games produced using natural lighting and 128-bit color precision you will see what directx 9 cards bring to the table....


E.T.A...2005

Your 128-bit argument is not an argument...the 8500 supported higher precision vs. the GeForce 3 and it really didn't affect much.
 
Can DX8 render that image :?: You are linking to a DX9 real-time rendered image..not Serious Sam

I know what 96-bit is...I'm saying it's not a factor in running game benchmarks.

Edit: DX7 and DX8 benchmarks
 
Typedef Enum said:
I don't know about anybody else...but is there a glimmer of hope that at least _one_ review might take a gander at image quality? I mean, to at least try and contrast the FX vs. R300 in AA?
Check [H]'s review. It also shows the GFFX in a better light, IMO (particularly at lower resolutions like 1280, even with 4x AA and 8x AF).

OTOH, [H] notes that GFFX's AA isn't as purty/effective as the 9700's. They also note some rendering errors, so obviously nV has some driver work ahead.

Hard|OCP said:
One of the things we first saw with the 9700 series was a better quality of usable AA in games we were playing. The above pictures show that ATi still has better AA image quality than even this new GeforceFX, especially at 2XAA levels that are more likely to be used by more people.

One interesting thing to note in this game is that the fog doesn’t seem to appear in it on the GeForceFX. Compare the 9700 Pro pictures to the GeForceFX, see how on the 9700 Pro there is a gray fog in the background, but on the GeForceFX it isn’t present at all. This could simply be a driver issue to be worked out though and not a hardware issue.
 
I can only say "ouch"....

I had hoped for an IL-2 Sturmovik monster gfx card.

But no... I now have to buy the cheap ATI card.
(ExtremeTech benched exactly the settings i usually play: 16x12 :) )

Should i now be happy about this?
 
Pete said:
OTOH, [H] notes that GFFX's AA isn't as purty/effective as the 9700's. They also note some rendering errors, so obviously nV has some driver work ahead.

Hard|OCP said:
One of the things we first saw with the 9700 series was a better quality of usable AA in games we were playing. The above pictures show that ATi still has better AA image quality than even this new GeforceFX, especially at 2XAA levels that are more likely to be used by more people.

One interesting thing to note in this game is that the fog doesn’t seem to appear in it on the GeForceFX. Compare the 9700 Pro pictures to the GeForceFX, see how on the 9700 Pro there is a gray fog in the background, but on the GeForceFX it isn’t present at all. This could simply be a driver issue to be worked out though and not a hardware issue.

While AnandTech's review (on a P4-based platform) shows the R9700 Pro winning most real-world benchies, and confirms the AA/AF IQ observations by [H].

Edit: Anand also experienced rendering errors. I find this very surprising given the extra time the driver team would have had thanks to the delays.

LW.
 
While AnandTech's review (on a P4-based platform) shows the R9700 Pro winning most real-world benchies, and confirms the AA/AF IQ observations by [H].

That's only if you agree with Anand that the fairest comparison to the GFFX's 8x Balanced AF is the 9700's 8x Performance AF. Anand seems not to realize that the difference between Performance and Quality AF on the R300 is just bilinear vs. trilinear filtering. He correctly sees no IQ difference in the segment he zooms in on, but doesn't realize that what he should really be looking for are visible mip-map boundaries.
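For anyone unsure what that Performance/Quality difference amounts to, here's a rough Python sketch (the mip values and LODs are made up for illustration) of why bilinear's nearest-mip selection produces visible boundaries while trilinear's blend doesn't:

```python
# Assumed, simplified mechanics: bilinear AF samples only the nearest
# mip level, so detail jumps when the LOD crosses a level boundary;
# trilinear blends the two surrounding levels, hiding the seam.

def mip_value(level):
    # Stand-in for "texture detail" at an integer mip level.
    return 1.0 / (2 ** level)

def bilinear_sample(lod):
    # Nearest mip only -> discontinuity where lod crosses x.5
    return mip_value(round(lod))

def trilinear_sample(lod):
    # Linear blend of the two surrounding mip levels -> continuous
    lo = int(lod)
    frac = lod - lo
    return (1 - frac) * mip_value(lo) + frac * mip_value(lo + 1)

# Sampling just either side of the level-0/level-1 boundary:
print(bilinear_sample(0.49), bilinear_sample(0.51))    # 1.0 vs 0.5: visible seam
print(trilinear_sample(0.49), trilinear_sample(0.51))  # ~0.755 vs ~0.745: smooth
```

That abrupt 1.0-to-0.5 jump is exactly the mip-map line you see crawling across the floor in bilinear mode.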

In the absence of any indication to the contrary, we should assume that the GFFX's Balanced AF uses trilinear filtering. Thus, IMO, the most apples-to-apples comparison is vs. the R300's Quality AF...and in this comparison (along with 4x MSAA), the GFFX wins most of Anand's benches.

However, the IQ of the R300's 4x MSAA is clearly better than that of the NV30's, and IMO the IQ of the R300's 8x AF is slightly better as well (as far as I can tell from Anand's single comparison). But that doesn't negate the fact that "the same settings" as Balanced AF on the NV30 means Quality AF on the R300.

So I'd see it as: the R300 performs slightly worse but looks slightly better at the same settings with AA/AF on. But maybe it's more valid to say "one OGMS sampling pattern is worth one visible mip-map boundary" and compare to the R300's Performance AF as Anand does. :?
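On the OGMS point, a quick sketch (the sample positions here are assumed, idealized patterns, not the actual chips' layouts) of why a rotated/sparse 4x grid like the R300's resolves near-horizontal edges in more steps than an ordered grid:

```python
# Assumed, idealized 4x sample patterns (offsets within a pixel).
# An ordered grid's samples share only two distinct y offsets, so a
# near-horizontal edge sweeping through the pixel hits just two
# intermediate coverage levels; a rotated/sparse grid has four
# distinct y offsets, so coverage ramps in four finer steps.

ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def edge_steps(pattern):
    # Distinct coverage levels a horizontal edge can produce:
    # one per distinct y offset among the samples.
    return len({y for _, y in pattern})

print(edge_steps(ordered))  # -> 2
print(edge_steps(rotated))  # -> 4
```

So at the same 4x cost, the sparse pattern gives smoother gradients on the near-horizontal and near-vertical edges that dominate most scenes.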

So of course there's no perfect answer; I just wanted to bring up another way to read these results.
 
That's only if you agree with Anand that the fairest comparison to the GFFX's 8x Balanced AF is the 9700's 8x Performance AF. Anand seems not to realize that the difference between Performance and Quality AF on the R300 is just bilinear vs. trilinear filtering. He correctly sees no IQ difference in the segment he zooms in on, but doesn't realize that what he should really be looking for are visible mip-map boundaries.

After living with the 8500's AF, I can say that in some games the mip-map lines stick out like a sore thumb, but in other games you really have to look for them. So it really depends.

Nonetheless, for a card that comes out almost 6 months later and has an almost 200 MHz speed advantage, I am not impressed. Maybe when DX9 titles show up.
 
Chalnoth said:
Well, one thing to note is that those UT2k3 benches were in botmatch mode, and are therefore both CPU-limited and unreliable.

It's pathetic how you try to conjure up excuses. You are losing credibility. This type of scenario is precisely what I would call a real-world situation, and that doesn't count?
 