Excellent FX Review

http://www.3dvelocity.com/reviews/gffx5800u/gffx.htm

I highly recommend this review as it really covers a lot of ground.

I think the conclusion is spot-on...

In the end I find it hard to get excited about the FX. While future games like DoomIII may exploit its potential power and colour fidelity that hardly makes for a compelling reason to buy it today. At the very best it can match the soon to be replaced Radeon 9700 Pro and if we pay attention to overall image quality and run both cards at similar settings then on balance I'd say the FX is slower. But a word of caution, don't treat this as a definitive conclusion. The complex nature and the supreme flexibility of the FX GPU makes it almost impossible to judge what it might be able to pull off with a little jiggery pokery and there's plenty more testing and driver revisions needed before anyone can write it off as a poor second to the 9700 Pro. Unfortunately at this stage I have to accept what I have in my hand as being the finished product and in that respect the FX is noisy, hot, bulky and not particularly fast at the moment. Sad but true!
 
some interesting IQ tests.

What kind of AA is between application and 2x? The pic shows there is one in between.
 
No, it's
- Application (let the app choose)
- No AA (never allow AA)
- 2x
...
 
Yeah, I've just read that review. I thought it was very well done as well.

Very interesting IQ findings....
 
Indeed... perhaps the most informative and relevant FX review to date. In fact, I dare say it was more informative in many ways than the B3D review, precisely because of the performance / image quality comparisons.

I'll have to re-read some of the sections on the texture detail sliders though... it wasn't always initially clear what aniso / texture settings things were done at. This is the first I've heard of a "Blend" setting for FX filtering. Perhaps because no one paid much attention to it?

There was a lack of information, though, on the specific impact of "real" trilinear, bilinear, and "fake trilinear" (GeForce Balanced/Aggressive) in the applications. Presumably because he didn't notice any?

I particularly respect this statement:
For this reason, and because getting an exact match in IQ between the Radeon 9700 Pro and the FX varied from game to game, I pretty much gave up on all hopes of doing a like for like comparison between the two cards. That didn't stop me drawing a conclusion on the matter though but that's for later.

He obviously tried to get as apples to apples as possible, ADMITS that it can't really be done, but goes on to state his overall opinion on the matter the best he can, backed with screenshots and performance numbers.

Bravo!
 
It's just unfortunate they felt it necessary to dumb down the overall texture quality with some of the most dreadfully severe driver settings I've seen in a long time.

If that doesn't remind me of Gary Tarolli's email to Rev about companies winning the benchmark wars by fiddling with default settings. . . .

And I wonder how many reviewers will even notice.
 
The complex nature and the supreme flexibility of the FX GPU makes it
This seems to be the mantra that gets repeated whenever the FX is talked about. The problem is, it's complete PR-driven techno nonsense. It's like this little halo around the FX that says "Forget the raw data, buddy, this card's got Angels in the Architecture".

Supreme Flexibility....

Will someone please define that??? It sure as heck does not seem very "flexible" to me. This is nothing more than some lame PR slogan that people have picked up on, one used by Nvidia to cloud the fact that it's just a 4x2 design.
 
BTW,

Yes, that was one really nice, complete, and honest review. I really appreciated the work the guy put into it. Very nice job, and it really laid down an honest representation of the GFFX.
 
Brent said:
Is this right?

The FX has no Pixel Shader 1.4 functionality and so uses a fall-back 1.1 shader routine.

http://www.3dvelocity.com/reviews/gffx5800u/gffx_12.htm


Because the GFFX adheres to the DX9 spec, doesn't it support PS 1.4?

It is supposed to. I really think that when nvidia revamped their drivers for 3DMark2003, they decided they would have to use PS1.4 to get the performance boost they needed to beat the Radeon 9700. I asked this question some time ago with regard to these drivers: could the GeForceFX get these scores without using PS1.4?
 
Brent said:
Is this right?

The FX has no Pixel Shader 1.4 functionality and so uses a fall-back 1.1 shader routine.

If an application is using DX8 interfaces then I believe that they could legally export only PS1.3 caps back to the application so they could then use their older integer paths in this case (3DMark2001SE naturally uses only DX8.1 interface definitions).

Really, this solution should only ever be faster (on any given piece of hardware) if that hardware's support for PS1.4 shaders is pretty poor, since you will have to multipass the advanced shader test using 1.1->1.3. I don't see why any hardware designed to be quick at PS2.0 should be too slow at PS1.4 to gain an advantage from using it over multipassing earlier shader models.

If an application uses DX9 interfaces (note that you can use the interfaces without necessarily using the features - it's just the generation of DirectX runtime that you develop under) then by definition any driver that exports PS2.0 caps is also exposing PS1.4.
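That caps logic can be sketched as a toy model (hypothetical function names, not the real Direct3D API): a driver may legally report a lower pixel shader version to an application built against DX8 interfaces, but once it exports PS2.0 caps under DX9 interfaces, PS1.4 is by definition exposed as well.

```python
# Toy model of pixel shader caps reporting (hypothetical names, not the
# actual D3D API). Shader versions are encoded as (major, minor) tuples.

def exported_ps_version(hw_max_ps, runtime_generation):
    """Return the pixel shader version a driver might export.

    hw_max_ps: highest PS version the hardware supports, e.g. (2, 0)
    runtime_generation: 8 for DX8/DX8.1 interfaces, 9 for DX9 interfaces
    """
    if runtime_generation == 8:
        # Under DX8 interfaces a driver could legally cap its report at
        # PS1.3, steering DX8 apps (like 3DMark2001SE) to older paths.
        return min(hw_max_ps, (1, 3))
    # Under DX9 interfaces, exporting PS2.0 caps means every lower
    # shader model, including PS1.4, must also be available.
    return hw_max_ps

def supports(exported, wanted):
    return exported >= wanted

dx8_caps = exported_ps_version((2, 0), 8)
dx9_caps = exported_ps_version((2, 0), 9)
print(supports(dx8_caps, (1, 4)))  # False: a DX8 app sees only PS1.3
print(supports(dx9_caps, (1, 4)))  # True: PS2.0 caps imply PS1.4
```

This is just a model of the reporting rule described above, not of how any actual driver is implemented.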
 
Brent, can I ask you the same question I asked Dave ? Did you encounter any down clocking of the FXultra when you tested it. Let me quote the Firingsquad ...

Complicating matters is the heat output of the card and noise from the FX Flow cooling unit, both of these issues are turnoffs to many gamers. In addition, the scalable clock frequency "feature" can sometimes underclock your GeForce FX 5800 Ultra card right in the middle of gaming. We had to repeat multiple runs of Serious Sam and Quake 3 running with 4xAA/8xAniso enabled to get our final numbers, in some cases the margin between the scores was as high as 30%!

What was your experience ?

If this is common with the shipping version of the card, I think a lot of people are going to be really mad. While not a hardcore gamer myself, when I do play it is usually for at least 1 or 2 hours, and I would like to think that if I paid that much for a card it would perform the same 24/7.
 
Sabastian said:
Brent said:
Because the GFFX adheres to the DX9 spec, doesn't it support PS 1.4?

It is supposed to. I really think that when nvidia revamped their drivers for 3DMark2003, they decided they would have to use PS1.4 to get the performance boost they needed to beat the Radeon 9700. I asked this question some time ago with regard to these drivers: could the GeForceFX get these scores without using PS1.4?

If 3DMark03 was written correctly it checks for capabilities only once on startup. That means nVidia cannot force PS1.1 and still run Nature.

I think they just analyzed the shader and optimized it for their architecture.
The only reason the per-pixel lighting PS1.4 shaders in GT2 and GT3 are not possible to do in PS1.1 is that they require more than 4 texture lookups - a limitation the FX doesn't have.
 
I think the GFFX emulates PS1.4 with PS2.0+ shaders. The reason being the two extra texture stages and two extra samplers PS1.4 has over PS1.1. I'm not sure how you could break the six-stage setup down into 4 stages and have the maths work out right, including dest alpha and masking of RGBA writes.
 
nelg said:
In addition, the scalable clock frequency "feature" can sometimes underclock your GeForce FX 5800 Ultra card right in the middle of gaming. We had to repeat multiple runs of Serious Sam and Quake 3 running with 4xAA/8xAniso enabled to get our final numbers, in some cases the margin between the scores was as high as 30%!

If this is present in the final shipping version of the GFFX, it would be a MASSIVE turn-off. There's nothing worse than the video card suddenly overheating and throttling down in the stressful situations where it is needed the most.

After all, what really matters for playability isn't so much the average framerate as the framerate during large battles.

Also, could this clock scaling problem be the reason why the GeForceFX 5800 Ultra occasionally performs worse than the standard GeForceFX 5800?
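A crude way to see why throttling makes benchmark numbers so unstable (a toy simulation with made-up figures, not measured FX behaviour): if the clock drops partway through a timed run, and framerate scales roughly with clock, the run's average falls in proportion to how long the card spent at the reduced clock.

```python
def run_average_fps(base_fps, throttled_fraction, clock_ratio):
    """Average FPS over a benchmark run where `throttled_fraction` of the
    run renders at `clock_ratio` of the full clock. Assumes framerate
    scales linearly with clock - a deliberate simplification."""
    full = base_fps * (1.0 - throttled_fraction)
    slow = base_fps * clock_ratio * throttled_fraction
    return full + slow

cool_run = run_average_fps(100.0, 0.0, 0.6)    # card never throttles
hot_run = run_average_fps(100.0, 0.75, 0.6)    # throttled for 3/4 of the run
print(cool_run, hot_run)  # 100.0 vs 70.0 - a 30% gap between identical runs
```

Under these made-up numbers the same benchmark at the same settings can swing by the 30% FiringSquad reported, purely from when the thermal throttle kicks in.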
 
Yep, finally a decent review

Well, I can't believe I'm saying this, but finally a genuinely useful consumer review. Sure, it may be a little inaccurate in places (and I'm not saying it is), but overall the reviewer explains his reasoning and follows through with good, unbiased conclusions about both IHVs. I must add, mind, that I hope I'm not being blinded by the favourable R300 views, but my initial feeling is that it's a well-balanced gaming/consumer review.
Wow, what a breath of fresh air. The site has certainly won my respect, and I think it can be used as a model for all. Now if Dave can combine this style of review with B3D's, there may be hope for web review sites after all?

:)
 