Finally some 3DMark benchmarks: GeForce FX Ultra vs ATI 9700 Pro

Nagorak said:
Althornin said:
Millennium said:
I want a review from Beyond3D and HardOCP (not made by Kyle though)

Hear, hear.
I don't 100% trust the numbers that are out now.

I want a complete, full, comprehensive review, one that covers IQ as well, from a site I trust and respect.
*stares at B3D*

Why would the image quality be vastly different from the R9700's?

Gamma correct FSAA?
 
I think it's common knowledge from screenshots that ATI also runs a more aggressive default LOD bias... and what lowering the LOD does for speed... see the baseline IQ threads we had here for a while.
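Rough sketch of what that driver-level LOD bias knob does to mip selection (the function and numbers are made up for illustration, not ATI's actual logic): a positive bias pushes you onto smaller, blurrier mips, which saves texture bandwidth; a negative bias picks sharper mips at the cost of shimmering.

```python
import math

# Illustrative only: how a LOD bias term shifts mipmap selection.
def select_mip_level(texel_footprint: float, lod_bias: float, max_level: int) -> int:
    # lod ~ log2 of the screen-space texel footprint, plus the bias knob
    lod = math.log2(max(texel_footprint, 1.0)) + lod_bias
    return max(0, min(max_level, math.floor(lod + 0.5)))

# ~4 texels per pixel normally lands on mip 2; a +0.5 bias nudges it to
# mip 3 (a quarter of the texels: faster, but blurrier).
print(select_mip_level(4.0, 0.0, 10))  # 2
print(select_mip_level(4.0, 0.5, 10))  # 3
```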

** Waits for Chalnoth to enter any time now with an aliasing argument ** :p
 
IIRC, the GF FX has gamma-corrected AA, though I haven't seen it mentioned recently; I believe the interview here touched on it. The difference is that its sampling patterns are presumed to be inferior.

To the original question... we have no idea what aniso looks like on the new card (unless it is the same thing as the driver-level "adaptive aniso" on previous cards, in which case GF card owners who've tried it might know).

EDIT: Actually, I can't find any mention of gamma-corrected AA in the nVidia whitepapers now, nor in the interview; I only see gamma-correct pixel shading. Maybe someone else can find it.
 
The FX has gamma-corrected FSAA. I think the biggest difference in image quality would be the ordered-grid multisampling (OGMS) on the FX vs. the nice 4x and 6x sampling patterns for MSAA on the 9700.
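For anyone wondering what "gamma-correct" actually buys you at the resolve stage, here's a toy example (gamma 2.2 as a stand-in for the display response; not either vendor's actual resolve hardware): averaging gamma-encoded samples directly makes edge pixels come out too dark, while averaging in linear light doesn't.

```python
GAMMA = 2.2  # stand-in display response, illustrative

def naive_resolve(samples):
    # average the gamma-encoded framebuffer values directly
    return sum(samples) / len(samples)

def gamma_correct_resolve(samples):
    # decode to linear light, average, re-encode
    linear = [s ** GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1.0 / GAMMA)

# 4x-sampled edge pixel: two samples on a white poly, two on black.
edge = [1.0, 1.0, 0.0, 0.0]
print(naive_resolve(edge))          # 0.5   -> displays darker than half-bright
print(gamma_correct_resolve(edge))  # ~0.73 -> displays at half intensity
```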
 
CPU-limited on a 3 GHz P4? It wasn't using RDRAM, though; I think it was using Granite Bay.

You can't buy a faster CPU, so I consider the point moot for now. Things might change in a few months, when nV gets better yields than their reported 20%, DDR2 becomes more commonplace, they figure out how to make the chip work on a cheaper card (a PCB with fewer than 12 layers, a HSF less than two stories tall), and faster CPUs are not only available but commonplace.
 
It's really not going well for the FX Ultra.

HardOCP
"One interesting thing to note in this game is that the fog doesn’t seem to appear in it on the GeForceFX. Compare the 9700 Pro pictures to the GeForceFX, see how on the 9700 Pro there is a gray fog in the background, but on the GeForceFX it isn’t present at all. This could simply be a driver issue to be worked out though and not a hardware issue"

Before I upgraded from a Ti4200, Nvidia released some drivers (40.something) that allowed disabling bilinear and trilinear filtering; they also produced popping, general texture corruption / missing textures, and loss of fog in certain games, but you did get better 3DMark and fps scores. I think it was called point sampling.
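For reference, the difference in a nutshell (toy code, not what the driver actually does): point sampling reads one texel per tap instead of blending the four nearest, so it's cheaper, but the chosen texel snaps around as you move, which is exactly the popping/shimmering described above.

```python
def point_sample(tex, u, v):
    # one texel, no blending (truncating coords for brevity)
    return tex[int(v)][int(u)]

def bilinear_sample(tex, u, v):
    # blend the four nearest texels: 4 reads + 3 lerps per tap
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x0 + 1] * fx
    bot = tex[y0 + 1][x0] * (1 - fx) + tex[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]                     # tiny 2x2 luminance texture
print(point_sample(tex, 0.4, 0.4))     # 0.0  (snaps to one texel)
print(bilinear_sample(tex, 0.4, 0.4))  # 0.48 (smooth blend)
```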

HardOCP
"6xS and 8xS worked fine and did clear up the aliasing. However, if you look at the Radeon 9700 Pro AA quality tests in comparison 6XAA on the 9700 Pro looks much better then any of the aliasing techniques of the GeForceFX. You can notice it most on the rails that are in the background, look closely at how aliased they look on the GeForceFX in every AA mode, then look at 6XAA on the 9700 Pro. They are simply much smoother"

And those AA/AF image comparisons on Anandtech really say everything.
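That matches what you'd expect from the sampling patterns. Toy illustration (coordinates made up, not the actual NV30/R300 patterns): on a near-vertical edge like those rails, the usable gradient steps come from the number of distinct x offsets in the pattern, and an ordered grid wastes samples on repeated offsets where a sparse pattern doesn't.

```python
# Illustrative 4x patterns, NOT the real hardware sample positions.
ordered_grid_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
sparse_4x       = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def distinct_x_offsets(pattern):
    # distinct x offsets = coverage steps available on a vertical edge
    return len({x for x, _ in pattern})

print(distinct_x_offsets(ordered_grid_4x))  # 2 -> only 2 shades between "in" and "out"
print(distinct_x_offsets(sparse_4x))        # 4 -> 4 shades, visibly smoother edge
```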

Anandtech
"An interesting phenomenon we discovered while attempting to overclock the GeForce FX was that when the card got too hot (courtesy of our overclock), it automatically throttled itself down to its 2D speed (300/600MHz) and reduced the fan speed accordingly in the middle of a game. We're not exactly big fans of this method of protection, since it would make more sense to reduce the clock speed to its 3D default setting and not the significantly slower 2D clocks"

Can you imagine... you're in a match playing something like UT2003 and suddenly your fps drops through the floor!!!
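To spell out what Anandtech is complaining about (the 2D clocks are from their article; the 3D default, threshold, and logic here are my guesses, not nVidia's actual thermal management):

```python
CLOCK_2D = (300, 600)           # core/mem MHz, per the Anandtech quote
CLOCK_3D_DEFAULT = (500, 1000)  # assumed Ultra stock 3D clocks
TEMP_LIMIT_C = 95               # hypothetical trip point

def throttle_reported(temp_c, clocks):
    # what they observed: straight to 2D clocks, mid-game
    return CLOCK_2D if temp_c > TEMP_LIMIT_C else clocks

def throttle_suggested(temp_c, clocks):
    # what they argue for: fall back to the stock 3D clocks first
    if temp_c <= TEMP_LIMIT_C:
        return clocks
    return CLOCK_3D_DEFAULT if clocks > CLOCK_3D_DEFAULT else CLOCK_2D

overclocked = (550, 1100)
print(throttle_reported(100, overclocked))   # (300, 600)  -> fps falls off a cliff
print(throttle_suggested(100, overclocked))  # (500, 1000) -> still playable
```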

"And as for that fan "Double the noise" Holy ****- and whats worse is, this is noise OUT OF THE CASE!!! so not much can be done to quieten it."

Not to mention it's drawing 21 watts more power to achieve this!


I've got to say I'm truly amazed: AF that doesn't even compare well to ATI's performance AF; AA that doesn't seem to start doing anything until 4x, and at 8x still looks rough compared to ATI's 6x; possibly drivers with point sampling as the default; heat, noise, and the chance that the fps will suddenly drop while you're gaming.

Is it April 1st or something??
 
Considering all the negativity the FX is generating, I wonder if Nvidia might not have been better off releasing nothing. Up until now, they were winning the battle on practically nothing but mind share. No one believed they could screw up, but now that the FX has been released, it looks like that has been proven wrong. I'm really starting to think the whole "super OC fan" on the FX is more or less akin to ATi's Rage Fury Maxx: the part can't compete on its own merits, so hey, just slap on a huge heatsink (or another chip).
 
Agreed matey, agreed... and it keeps getting worse the deeper you delve:

For instance, NV's new "Adaptive Texture Filtering":
http://nvidia.com/docs/lo/2415/SUPP/TB-00651-001_v01_Intellisample_110402.pdf

ChrisW pointed this out:
Read the section "Adaptive Texture Filtering". The card changes the texture filtering while playing games just to improve performance (and reduce image quality). This would help benchmarks.

i.e. when you are playing a game or benchmarking, it does crappy texture filtering, but when you stop to take a nice snapshot, it switches back to the best-quality texture filtering.
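In pseudo-logic, the behavior being alleged looks something like this (entirely illustrative; the whitepaper doesn't document the actual heuristic):

```python
def pick_filter_mode(frames_since_camera_moved: int) -> str:
    # hypothetical ~half a second of stillness at 60 fps
    STILL_THRESHOLD = 30
    if frames_since_camera_moved < STILL_THRESHOLD:
        return "reduced filtering (fast, what the benchmark measures)"
    return "full trilinear (slow, what the screenshot shows)"

print(pick_filter_mode(1))    # mid-benchmark / mid-game
print(pick_filter_mode(120))  # paused to grab a screenshot
```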
 