Filtering verdict

[H], for one, said they couldn't see a difference in their extensive UT2K3 testing.

Edit: jvd, I wasn't saying [H] was right or wrong. I was just providing an example of a "whole site" that said it was a non-issue at the time, IQ-wise.
 
Pete said:
[H], for one, said they couldn't see a difference in their extensive UT2K3 testing.


Didn't they later come out and complain about it? [H], that is; I remember an article calling nVidia out about the brilinear thing in the driver that was supposed to be fixed but wasn't.
 
Pete said:
[H], for one, said they couldn't see a difference in their extensive UT2K3 testing.

Edit: jvd, I wasn't saying [H] was right or wrong. I was just providing an example of a "whole site" that said it was a non-issue at the time, IQ-wise.
I know what you were doing. I was providing another one that said they did see an IQ issue :) One that I would believe over HardOCP any day of the week :)
 
CyanBlues said:
Didn't they later come out and complain about it? [H], that is; I remember an article calling nVidia out about the brilinear thing in the driver that was supposed to be fixed but wasn't.
Yup. Someone from nVidia personally promised Kyle that either the optimization would be gone or they'd include an on/off switch for it; when they did NOT, Kyle got pissed and jumped onto the "nVidia is LYING!" bandwagon! ;)
 
caboosemoose said:
OK kids, I know you've been chucking this hot potato back and forth plenty recently, but I've been a terribly busy bee of late and haven't been able to join in on the festivities. So... what's the bottom line regarding the latest filtering 'scandal'? Does it boil down to ATI not actually doing anything worse than NV's filtering optimisations, but suffering from the fact that they (ATI) have been playing holier-than-thou regarding optimisation?


EDIT:
No, we don't need another of these threads, but one or two well-informed posts might be useful for those who don't have time to read through 26,378 posts.


UT2004 - one of the original articles about this: screens were taken and there was no quality difference between any of them, which is funny considering the article was about exactly that, but at least they didn't fake anything or Photoshop the images

Call of Duty - it was concluded that the person taking screenshots had the settings wrong, since COD resets them when you change cards; invalid test

Far Cry - the effect shown is the way the engine handles an issue right now with regular bump mapping and gloss maps on any card... it can actually be seen in both videos if you scale them down to be basically identical... invalid test

Various Photoshop manipulations that probably don't even make technical sense - invalid. Please, unless you're a graphics programmer or engineer, don't pretend you can get meaningful results by coming up with your own Photoshop filterorama and telling us it means something


What's missing - actual screenshots that show an irrefutable quality difference. I haven't seen anything remotely like that yet.
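
For what it's worth, a straight per-pixel difference is a far more defensible way to compare two screenshots than an ad-hoc Photoshop filter. Here is a minimal sketch using Pillow; the file names are placeholders, and it assumes both shots were captured at the same resolution with the same in-game settings:

Code:
# Per-pixel comparison of two screenshots captured at identical settings.
# Assumes both images have the same resolution; file names are placeholders.
from PIL import Image, ImageChops, ImageStat

shot_a = Image.open("x800_shot.png").convert("RGB")
shot_b = Image.open("6800_shot.png").convert("RGB")

diff = ImageChops.difference(shot_a, shot_b)   # absolute per-channel difference
stats = ImageStat.Stat(diff)

print("mean per-channel difference:", stats.mean)          # ~0 means visually identical
print("max per-channel difference:", [hi for _, hi in stats.extrema])

# Amplify the difference so faint filtering artifacts become visible,
# instead of relying on hand-tuned Photoshop levels/curves.
diff.point(lambda p: min(255, p * 8)).save("diff_x8.png")

If the mean difference is essentially zero, the shots are identical for practical purposes; if it isn't, the amplified diff image shows exactly where the filtering diverges.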
 
I have to admit, it's looking more and more like an issue with FarCry... though ATI have themselves mentioned something to do with Max Payne (2?) which they are looking into. Anyone know what the issue is there?
 
Mr. Travis said:
Far Cry - the effect shown is the way the engine handles an issue right now with regular bump mapping and gloss maps on any card... it can actually be seen in both videos if you scale them down to be basically identical... invalid test
While I agree that both videos show shimmering, if to an obviously different degree, I still fail to see how these two videos could have been created differently. What do you mean by "scale them down to be identical"? They have the same resolution, after all...

I have *seen* the difference on my monitor. This is why I decided to create these videos in the first place. This difference is for *real*.

I agree that the textures in this spot are very susceptible to shimmering, but the X800 *does* amplify the problem *in this case*. Nothing you or ATI say can change that.
 
jimmyjames123 said:
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?

Not necessarily. In some of the synthetic tests done, the NV40 was actually faster in some cases using PS 2.0 vs using PS 1.1. The NV3x cards share FP16/FP32 precision with the NV4x, but architecturally the differences are eye-opening (NV4x has a superscalar architecture, different pipeline structure, different AA/AF algorithms, FP16 blending, SM 3.0, etc.). Also keep in mind that the forced R300 path apparently has some coding optimizations for the R3xx cards, hardly what one would call "optimal" for the NV4x. SM 3.0 is, of course, designed to help make certain effects more efficient vs PS 2.0. Still, "optimal" in this case would refer to a combination of image quality and performance.

Interesting... Could you provide some documentation for these specific R300 optimizations? Very interesting that a TWIMTBP game would have specific optimizations for an R300 card.

I also find it humorous that you quote synthetics, given that when the NV30 came out and the R300 crushed it in synthetics, Nvidiots clamored that they weren't representative of real-world gaming... But 3DMark has shown that to be false. Regardless of all that, I would like to know what these R300 optimizations are, because AFAIK the R300 path is straight-up DX9.
 
I believe AnandTech (of all places) had an article out that compared brilinear and trilinear (with animated images, no less) when the brilinear controversy was at its height. Tom's might have had something too.

Nite_Hawk
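
Since those brilinear/trilinear comparisons keep coming up, here is a rough sketch of what those animated images were visualising. This is an illustration under assumed numbers, not how any particular driver implements it; the blend-band width of 0.25 is made up for the example:

Code:
# Rough illustration of the "brilinear" controversy.
# Trilinear filtering blends the two nearest mip levels across the whole
# fractional LOD range; "brilinear" collapses most of that range to plain
# bilinear and only blends in a narrow band around the mip transition.
# The band width (0.25) is an assumption for illustration only.

def trilinear_weight(lod_frac: float) -> float:
    """Blend weight toward the next-smaller mip level, full trilinear."""
    return lod_frac

def brilinear_weight(lod_frac: float, band: float = 0.25) -> float:
    """Blend weight with the trilinear band narrowed around frac = 0.5."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0                      # pure bilinear on the larger mip
    if lod_frac >= hi:
        return 1.0                      # pure bilinear on the smaller mip
    return (lod_frac - lo) / band       # short blend across the transition

if __name__ == "__main__":
    for f in [0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0]:
        print(f"lod_frac={f:.1f}  trilinear={trilinear_weight(f):.2f}  "
              f"brilinear={brilinear_weight(f):.2f}")

Full trilinear blends the two nearest mip levels across the entire fractional LOD range; narrowing that blend band saves texture bandwidth but makes the mip transition more visible, which is the banding and shimmering those animated comparisons were showing.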
 