ATi is Cheating in Filtering

Dave, you said this happens on the 9600 Pro too. Does it happen when you have the texture quality bar set to max?
 
Quasar said:
Well then let's hope the next driver release does fix this issue with no performance drop.

Have you actually shown where any of this gives a detrimental output in games?
 
jvd said:
Dave, you said this happens on the 9600 Pro too. Does it happen when you have the texture quality bar set to max?

As my earlier shots show - this has been applicable for over a year now, in RV350, RV360 and now R420. In that time we appear to be lacking any complaints about actual IQ - wonder why...?
 
Is it detrimental to IQ or is it not?

Speed-wise, the issue is a lot more complicated than just switching the feature off, rendering correctly (full trilinear) and getting lower FPS.
 
DaveBaumann said:
Quasar said:
Well then let's hope the next driver release does fix this issue with no performance drop.

Have you actually shown where any of this gives a detrimental output in games?
If I said that both I and my co-author noticed IQ differences in the published UT2003 shots after an 18-hour day of hard labour, late at night, would you believe me?
Without magnification or contrast-changing tools, that is.

I guess not.

As far as "detrimental" (I had to look up that word) is concerned - I am not sure I'd call myself the authority to call upon in such cases.
I think everyone should decide for him- or herself.
 
DaveBaumann said:
jvd said:
Dave, you said this happens on the 9600 Pro too. Does it happen when you have the texture quality bar set to max?

As my earlier shots show - this has been applicable for over a year now, in RV350, RV360 and now R420. In that time we appear to be lacking any complaints about actual IQ - wonder why...?

Well, I haven't noticed it on my little sister's 9600 XT. I will check it out and see if I can see a difference between that and my Pro.
 
Quasar said:
If I said that both I and my co-author noticed IQ differences in the published UT2003 shots after an 18-hour day of hard labour, late at night, would you believe me?
Without magnification or contrast-changing tools, that is.

I guess not.

Sorry? Is that a yes, or a no? Are you going to provide comparisons of R420/RV3x0 for people to see the difference?

http://www.beyond3d.com//previews/nvidia/nv35/index.php?p=11
 
I'm sorry, I do not seem to be following you right now...

What exactly are we talking about? I thought it was about the R420's way of interpolating between mipmap levels when no one's looking.

Then there was the remark about this being similar to the R8500 "Quack" issue, which was fixed one driver release later with no performance drop (and called a glitch (?) by ATi).
My remark was referring to that in turn.

The UT2003 shots that were published to illustrate this issue were chosen more or less at random - we did not even try to look for a situation where this gives really bad IQ (if such a setup even exists).

In these shots there are differences in texture filtering between R360 and R420 - visible ones.

And, as I said, I am in no position to decide for anyone to what degree a given image has to be reduced in quality before it can be called detrimental.
Some of my buddies cannot even tell whether AA is switched on or not - to them it surely would not be very harmful.
 
What Dave wants to know is: does the X800 actually look worse than the 9800?

I myself can see differences in the screenshots, but I really cannot say whether it actually looks worse or not.
 
Have you actually shown where any of this gives a detrimental output in games?

Some reviewers have found NV's AF algorithm to be slightly clearer/sharper than ATI's AF algorithm, using in-game screenshots.

Also, there really is no clear definition of "detrimental" output in games, especially since gamers typically do not have a reference output to compare to.

NVIDIA was able to "optimize" in some ways for 3DMark03 without noticeably changing the visible output (i.e., the output as seen in the demo and on screen).
 
Quasar said:
I'm sorry, I do not seem to be following you right now...

What I'm getting at is that you have demonstrated there are differences between two images using bit-comparison tools, however that may only be telling part of the story. Whether or not I'm just missing it, as yet I've not seen evidence from you that this is noticeable in screenshots - for instance, with the two UT shots you put in the image, if I just look at those (unless I'm just being blind) I can see no appreciable difference between them, which is unlike the solution NVIDIA initially provided, as demonstrated by the shots in the link above.

So far you've called for changes, but have you effectively demonstrated that there need to be changes? Bit comparisons are one thing, but what the eye does isn't necessarily the same - what's in the texture, and even the frequency of information in it, can alter how it needs to be handled to be perceived adequately. I might be off base in what I believe is happening, but I don't necessarily believe it to be something as simple as "we've seen from the comparison shots that something is going wrong, they need to fix it in their drivers" - if I am wrong then, yeah, absolutely they do need to do something, but without actual straight images displaying differences that show "Brilinear" is enabled on the R420/RV3x0 in comparison to the 9800, I'm not sure the job is done.

[ Unfortunately I’ve been a bit of a plank and reinstalled my test rig wiping all my test UT maps that I had when I noticed the issues with the FX boards. ]
 
Dave, shouldn't this trilinear vs brilinear issue be more visible in motion? I.e., do you see mipmap transitions when the game's in motion?

Admittedly, I haven't heard of any 9600 owners seeing any problems with mipmap transitions...
 
dan2097 said:
Dave, shouldn't this trilinear vs brilinear issue be more visible in motion? I.e., do you see mipmap transitions when the game's in motion?

Well, the point of "brilinear" is to effectively 'fudge' the fact that you are not seeing full trilinear in the first place; however, depending on the implementation/aggressiveness, sometimes you'll notice it more than others. Again, the images I posted in the 5900 preview are fairly evident just as still shots.
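[As a rough sketch of the trilinear/"brilinear" distinction being discussed: full trilinear always blends the two nearest mip levels, while "brilinear" only blends inside a narrow band around each mip transition and falls back to single-level (bilinear) sampling elsewhere. This is a speculative model for illustration only - the actual band width and ramp shape used by any driver are unknown, and `band` is a made-up parameter:]

```python
def trilinear_weight(lod: float) -> float:
    """Blend factor between mip level N and N+1 under full trilinear
    filtering: simply the fractional part of the LOD (positive LODs)."""
    return lod - int(lod)

def brilinear_weight(lod: float, band: float = 0.5) -> float:
    """'Brilinear' approximation: blend only inside a narrow band around
    the mip transition; outside it, sample a single mip level.
    band=1.0 reduces to full trilinear, band=0.0 to pure bilinear."""
    f = lod - int(lod)
    lo = 0.5 - band / 2
    hi = 0.5 + band / 2
    if f <= lo:
        return 0.0                   # pure bilinear from mip N
    if f >= hi:
        return 1.0                   # pure bilinear from mip N+1
    return (f - lo) / (hi - lo)      # steeper ramp across the narrow band
```

[Wherever the weight is exactly 0 or 1 the hardware only needs texels from one mip level instead of two, which is where the speed-up comes from; the remaining ramp is what hides the transition line well enough that it is hard to spot in ordinary textures.]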
 
Zooming in 300% on the darker tiles in the UT images I can see slight differences. If anything, the X800 appears slightly sharper.
 
DaveBaumann said:
...what's in the texture, and even the frequency of information in it, can alter how it needs to be handled to be perceived adequately.

Agreed, and it's good to see a thread with some balanced debate :)

The question remains that, while I would accept that driver intelligence which bumps up speed at the expense of imperceptibly worse IQ is fine, it seems a happy side effect that the IQ tools using coloured mipmaps give an 'absolute best case' result, which in turn isn't at all likely in-game.

Would this not invalidate any comparisons made with other hardware/drivers? Apples to apples and all that?
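[The "best case" effect described above could arise if a driver recognised the flat-coloured test textures that IQ tools upload and quietly switched to full trilinear for them. Purely to illustrate how cheap such a heuristic would be - this is speculation, not a documented ATI mechanism, and `mip_levels` is a made-up representation of a texture's mip chain as lists of pixel colours:]

```python
def looks_like_filtering_test(mip_levels) -> bool:
    """Guess whether a texture is a coloured-mipmap filtering test.

    Such test textures typically make every mip level one flat colour,
    each level a different colour, so mip transitions show up on screen
    as sharp coloured bands.
    """
    # Every level consists of a single uniform colour...
    uniform = all(len(set(level)) == 1 for level in mip_levels)
    # ...and each level's colour differs from all the others.
    distinct = len({level[0] for level in mip_levels}) == len(mip_levels)
    return uniform and distinct
```

[A driver passing such a check could apply textbook trilinear only when it knows it is being watched, which is exactly why coloured-mipmap screenshots cannot settle an apples-to-apples comparison on their own.]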

I feel sorry for the driver teams at ATI and NVIDIA. They have to be so careful not to attract accusations of *cheating* while trying to produce the best 'consumer experience' for the game-playing public. On top of that, we have both DX8 and DX9 games which don't always 'behave' in a perfect way, and things have to be done 'under the covers', which only raises more accusations of dodgy drivers.
 
[Attachment: a.JPG]


If you compare the R9800 and RX800 at this spot, you will see that the R9800 does a better job with the joints between the plates. Even the plates themselves look smoother.
 