A look at IQ at Firingsquad...

StealthHawk said:
First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.

The difference is that this has been known about since day one from the 9700 reviews (or at least, the good ones), and the public have gone in with their eyes wide open when making their purchasing decision on this matter - the same can't be said for NVIDIA's optimisations until sites like this unearthed them.
 
StealthHawk said:
First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.
What double standard?
Everyone I know admits that ATI could have better aniso filtering - even DoomTrooper!
Everyone I know admits that in some cases, the degenerate angles make ATI's aniso look worse than nVidia's.

There would only be a double standard IF full non-angle-dependent aniso was displayed by the aniso tester, but the angle dependency was turned on in games for extra performance - regardless of what the user wanted. As it is, there is no deceit - what you see is what you get!
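
To illustrate what "angle dependency" means here, a toy sketch (purely illustrative, not ATI's actual hardware logic) of an AF degree that stays at full strength near 0/45/90 degree surfaces but falls toward 2x at the in-between angles:

Code:
def effective_af_degree(requested_degree, surface_angle_deg):
    """Toy model only: full anisotropy near 0/45/90 degree surfaces,
    falling toward 2x at the in-between 'degenerate' angles (~22.5)."""
    # Distance (in degrees) from the nearest multiple of 45 degrees
    off_axis = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    # Blend linearly from the requested degree at 0 deg down to 2x at 22.5 deg
    falloff = 1.0 - off_axis / 22.5
    return max(2, round(requested_degree * falloff))

# With 16x requested: 0 deg -> 16, 10 deg -> 9, 22.5 deg -> 2, 45 deg -> 16
for angle in (0, 10, 22.5, 30, 45):
    print(angle, "->", effective_af_degree(16, angle))

That is the crux of the complaint: what you actually get depends on the angle of the surface, even though the control panel says 16x everywhere.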
 
StealthHawk said:
First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.

There is no flawed logic, I prefer usable AF at high filtering levels, even if it drops down to 4X on rotation. Could it be better? YES. Is it as bad as some people make it out to be in the majority of games? NO.

What else do you want? Email ATI and ask them to improve it... this is getting old, as it is not a 'driver' limitation.

I also find the double standards quote hilarious, Stealth, considering this review:

http://www.3dvelocity.com/reviews/gffx5800u/gffx_5.htm

:rolleyes: :LOL:
 
Chris Angelini wrote:

"Thanks for the info on UII - I think that helps show that full trilinear isn't being used in order to augment scores."


Just to make sure I understood this clearly... is Chris saying the quasi-trilinear is only being used to augment UT2003 scores, since UT2003 is a "benchmark" used by many web sites and PC OEMs?
And that this quasi-trilinear trick is not a great image quality/performance trade-off that Nvidia is providing for their end users?

If this quasi-trilinear thing is so good, why doesn't Nvidia enable it by default? Why do they have to app-detect it, instead of helping the many other games that use trilinear by default too?
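
For anyone unclear on what "quasi-trilinear" refers to, a rough sketch below (my own illustration, not Nvidia's actual formula) of shrinking the blend between mip levels to a narrow band around the transition, so that most texels end up effectively bilinear:

Code:
def trilinear_blend(lod_fraction):
    """Full trilinear: blend the two nearest mip levels across the whole range."""
    return lod_fraction

def quasi_trilinear_blend(lod_fraction, band=0.25):
    """Toy 'brilinear' model: only blend inside a narrow band around the mip
    transition; elsewhere, sample a single mip level like plain bilinear.
    Fewer two-mip lookups means less work, hence higher benchmark scores."""
    lo, hi = 0.5 - band / 2.0, 0.5 + band / 2.0
    if lod_fraction <= lo:
        return 0.0
    if lod_fraction >= hi:
        return 1.0
    return (lod_fraction - lo) / band

The visible cost is a sharper transition between mip levels than full trilinear would give.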

It appears to me that in order for the gamer to get the best experience possible in his favourite games, his favourite games must be "benchmarks" too... if not, your app gets no "special" preference from the video card drivers.

For all those enthusiasts who argue that app-detect and direct shader-detect optimizations are OK... think again.
 
Doomtrooper said:
Not according to Dave's animated Anisotropic Tester image. :?:

Eh, yes.

2X and 16X compared:

[image: 2x_qual.gif]

[image: 16x_qual.gif]
 
deflate said:
StealthHawk said:
First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.

The difference is that this has been known about since day one from the 9700 reviews (or at least, the good ones), and the public have gone in with their eyes wide open when making their purchasing decision on this matter - the same can't be said for NVIDIA's optimisations until sites like this unearthed them.

Exactly, it is supposedly well known. Which makes it all the worse when people start saying there is no difference in real games, that it doesn't drop to 2x AF at certain angles, etc.

Just to make clear what I'm getting at, since people seem to misunderstand what I am saying: I am not saying ATI is deceiving people like NVIDIA is. I am not saying ATI is culpable for providing AF the way they do, either. What I am saying is that the line of apologetics from defenders of NVIDIA and proponents of ATI seems very similar.

I would have thought someone as vocal as Doomtrooper would have his ATI facts straight, but look at these quotes:

Doomtrooper said:
CosmoKramer said:
Well, 16X AF does not in any way improve upon the weakness of ATI's
AF, so don't expect miracles in the follow up.

How do you figure that? The filtering is pushed back much further at 16X AF, which gives better depth perception - something AF is supposed to do.

16x absolutely does not help ATI's AF at the off angles compared to 8x, yet DT says otherwise.

Doomtrooper said:
Angle dependent, yes, but the filtering applied to the view plane at 16X does give better depth perception.

Looks fine to me where I drive :LOL:

He then posts screenshots claiming that there are no off-angle problems when playing games, when his own shots illustrate the problem.

Doomtrooper said:
I see it ;)

Doesn't bother me, as I get much better filtering in front of me. In fact I compared a Ti4400 to a 9700 on this exact game, and I could read the labelling on the walls and on the cars much further on a 9700.

Something I prefer.

I don't think people quite understand what I'm saying here. Let's take BF1942 and show the frame counter.

Take a 9800 and a 5900 at 1600x1200 with 4X AA and 8X AF: right away the end user must lower the AF on the FX card to make the game playable, while the 9800 will be able to run 16X.
Now compare screenshots at playable frame rates (12 fps is not playable) - let's see the filtering comparisons now. Real world.

Backpedaling? Overgeneralizations about NVIDIA parts? Just because AF performance tanks in one game, does that mean AF on NVIDIA cards is automatically "unplayable" in all situations? Please.

One more edit.
Doomtrooper said:
There is no flawed logic, I prefer usable AF at high filtering levels, even if it drops down to 4X on rotation.

False.
 
Doomtrooper said:
StealthHawk said:
First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.

There is no flawed logic, I prefer usable AF at high filtering levels, even if it drops down to 4X on rotation. Could it be better? YES. Is it as bad as some people make it out to be in the majority of games? NO.

What else do you want? Email ATI and ask them to improve it... this is getting old, as it is not a 'driver' limitation.

I also find the double standards quote hilarious, Stealth, considering this review:

http://www.3dvelocity.com/reviews/gffx5800u/gffx_5.htm

:rolleyes: :LOL:

:rolleyes: You keep linking to old reviews from when the IQ provided sucked. No one is disputing that IQ with old drivers is horrible. Things change; it's time to move on.

Your blanket statements about NVIDIA's AF being unusable are just as ridiculous as the statements from people who say ATI isn't doing real AF because its algorithms are angle dependent.
 
Exactly how do you explain the 'new improved' IQ? :rolleyes:

There is a company intentionally lowering their AF filtering method by detecting UT2003; once you figure out why... come back. :!:
 
Doomtrooper said:
Exactly how do you explain the 'new improved' IQ? :rolleyes:

Exactly as reviewers have explained it. They wanted driver settings from NVIDIA that matched what ATI was doing, and guess what, they got it. Notice how Quality AF now does trilinear and Performance AF now does bilinear, just like ATI's control panel options.

I don't see what you're getting at with your eye rolls and your skepticism. It is proven that NVIDIA's hacked filtering is isolated to UT2003. All other modes have better quality than they did with old drivers, so where's the problem? I interpret your comment as meaning that NVIDIA is not, in fact, providing better IQ in 99% of situations compared to older drivers. Please correct me if I'm wrong.

There is a company intentionally lowering their AF filtering method by detecting UT2003; once you figure out why... come back. :!:

No kidding. This is what I see.
1) ATI scores higher, so NVIDIA optimizes until their scores are at the same level.
2) ATI's global AF optimization degrades quality and increases scores in UT2003. NVIDIA follows suit.
 
Then obviously we see things differently. NVIDIA's AF is superior in IQ when it is allowed to run at 8X trilinear, but is too SLOW; ATI's is flawed at certain angles but delivers good speed.

NVIDIA's option requires constant tweaking, and 80% of the time will require the end user to drop to 4X; ATI's allows you to stay at 16X all of the time...

I stand by my opinion that I would rather have usable AF, all of the time...

There were 70 applications being detected in the Detonator drivers, StealthHawk... you are naive if you think UT2003 is the only 'optimization'.

A complaint I see all the time about 8X AF:

http://www.nvnews.net/vbulletin/sho...d0aaf9a59&threadid=16729&pagenumber=1
 
Doomtrooper said:
Then obviously we see things differently. NVIDIA's AF is superior in IQ when it is allowed to run at 8X trilinear, but is too SLOW; ATI's is flawed at certain angles but delivers good speed.

NVIDIA's option requires constant tweaking, and 80% of the time will require the end user to drop to 4X; ATI's allows you to stay at 16X all of the time...

Where is this 80% number coming from? What data are you drawing your conclusion from? I see you talk a lot about the horrid performance in one of BF1942's expansion packs. How you extrapolate that to be something you see "all the time" makes no sense, as it is a new complaint and exhibits itself in one expansion pack for one game.

I stand by my opinion that I would rather have usable AF, all of the time...

There's nothing wrong with that opinion. I never said there was. There is a problem with you trying to "cover up" the deficiencies of ATI's AF. Maybe it was just an honest mistake.

There were 70 applications being detected in the Detonator drivers, StealthHawk... you are naive if you think UT2003 is the only 'optimization'.

Prove it. I have not seen a single report or claim that NVIDIA is hacking up their AF in other games. You would think that someone on the internet would notice the degraded quality, wouldn't you?
 
StealthHawk said:
Where is this 80% number coming from? What data are you drawing your conclusion from? I see you talk a lot about the horrid performance in one of BF1942's expansion packs. How you extrapolate that to be something you see "all the time" makes no sense, as it is a new complaint and exhibits itself in one expansion pack for one game.

Hmmm, there are about 60 people on the UT2003 server I play on... some use FX cards, and I hear their 'pain'... I also have a friend with a 5600 Ultra, and I've played with it ;)
Umm, I play the expansion pack fine with 16X AF on my 9700, and so did AnteP in that thread with a 9800... go figure, eh :devilish:

REAL WORLD ;)

Prove it. I have not seen a single report or claim that NVIDIA is hacking up their AF in other games. You would think that someone on the internet would notice the degraded quality, wouldn't you?

Ummm, I'll let the evidence speak for itself. :D
 
A new version of the AF Tester for D3D is available:

http://demirug.bei.t-online.de/D3DAFTester.zip

This version has a new function: it can count how many filtered texels are used.

My R9700 Pro is using 4395179 texels at 16x tri-AF.

And here are some GFFX numbers:

Code:
tri
          SUM  MAX          P     TAB
1xAF  3281571    8    296.779 G   6.837
2xAF  3839185   15    556.249 G   7.998
4xAF  4585321   26    841.022 G   9.553
8xAF  5390039   44   1056.263 G  11.229

Look at the numbers at 4X AF and above. This means that Nvidia's 4X Tri-AF is superior to ATi's 16x Tri-AF. :)
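
Putting the quoted sums side by side (just the numbers already posted above, nothing new):

Code:
# Texel sums quoted above from the D3DAFTester output
r9700_16x_tri = 4395179   # R9700 Pro at 16x tri-AF
gffx_4x_tri = 4585321     # GFFX at 4x tri-AF
gffx_8x_tri = 5390039     # GFFX at 8x tri-AF

# ~1.04 and ~1.23: the GFFX filters about 4% / 23% more texels in this scene
print(f"GFFX 4x / R9700 16x: {gffx_4x_tri / r9700_16x_tri:.2f}")
print(f"GFFX 8x / R9700 16x: {gffx_8x_tri / r9700_16x_tri:.2f}")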
 
Exxtreme said:
Look at the numbers at 4X AF and above. This means that Nvidia's 4X Tri-AF is superior to ATi's 16x Tri-AF. :)
Doesn't that just mean that nVidia's 4x does more work than ATi's 16x? I don't see how the numbers really reflect quality. :|
 
digitalwanderer said:
Exxtreme said:
Look at the numbers at 4X AF and above. This means that Nvidia's 4X Tri-AF is superior to ATi's 16x Tri-AF. :)
Doesn't that just mean that nVidia's 4x does more work than ATi's 16x? I don't see how the numbers really reflect quality. :|
This number is the sum of all filtered texels in this one 3D scene. A higher number means more texels are used... more work and higher image quality.
 
Doomtrooper said:
And much slower performance ;)
True.

But Nvidia's 4x AF and ATi's 16x AF are about equal when you look at these numbers. Of course it can vary from game to game. In a simple 3D shooter like Serious Sam, ATi's AF implementation should be better; in an RPG like Gothic, Nvidia's implementation should produce better results.

A switch in the control panel, which allows you to switch between the different implementations, would be the best solution for everyone... IMHO.
 