WaltC said:
Heh-Heh....Always nice to encounter folks more long winded than me, Dave H...
I know the feeling. That's why I love having demalion around sometimes...
I think you're for some reason missing the obvious...
It was because people noticed a difference that they first started looking into whether or not the Dets were doing Trilinear in UT2K3. Your position seems to effectively be that they are doing trilinear filtering without doing trilinear filtering. They aren't, and the differences are noticeable, which is why this topic has come up. If there were no visible differences, the topic would most likely never have been raised in the first place, right?
Actually I think it was noticed because the new UT2003 build offered the option to display colored mipmaps, and when Wavey or Rev (looking at the
original thread, I think Rev) tried it out, they immediately noticed the lack of trilinear filtering. I don't know this for a fact--perhaps they noticed something suspicious in normal mode and subsequently decided to turn on the colored mipmaps to check it out. But that's the impression I got.
Perhaps Rev (or Wavey, or whoever it was that first discovered this) would like to comment?
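(For anyone unclear on why the colored-mipmap mode is such a dead giveaway, here's a toy sketch in Python--my own illustration, nothing from UT2003's actual renderer. Tint each mip level a distinct color: bilinear snaps between pure tints at each LOD boundary, while trilinear fades through them.)

[code]
# Toy illustration (NOT UT2003's code) of why colored mipmaps expose
# the filtering mode: give each mip level a distinct debug tint, then
# compare nearest-mip (bilinear) sampling to a lerp between mips
# (trilinear). The tint values here are made up.

MIP_TINTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

def lerp(a, b, t):
    return tuple(ac + (bc - ac) * t for ac, bc in zip(a, b))

def sample_tint(lod, trilinear):
    """Return the debug tint seen at a fractional level-of-detail."""
    lo = min(int(lod), len(MIP_TINTS) - 1)
    hi = min(lo + 1, len(MIP_TINTS) - 1)
    if trilinear:
        return lerp(MIP_TINTS[lo], MIP_TINTS[hi], lod - lo)  # smooth fade
    return MIP_TINTS[lo]  # hard color band: the giveaway

# Walking away from a wall sweeps lod upward; bilinear jumps between
# pure tints at each integer lod, trilinear blends through them.
for lod in (0.0, 0.5, 0.9, 1.0, 1.5):
    print(lod, sample_tint(lod, False), sample_tint(lod, True))
[/code]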
You're reading way too much into [H]'s pronouncements that they couldn't tell much difference. As well,
not even [H] says they can't tell any difference.
...
Again you keep reaching an erroneous conclusion that the "output isn't subjectively noticeable"--of course
that's simply not so. Were it so, no one would ever have been able to notice the difference, hence none of us would be talking about it right now.
...
The reason you are "curious" is because nVidia, in this game, has substituted this performance bi/trilinear mode for full trilinear, and [H] editorializes that the substitution is A-OK with them because
they "can't tell much difference" in the resulting IQ, although as I said
even they don't deny differences exist.
...
You seem to have a problem with "almost as good" and "as good"....there is a distinct difference between these two states. Nobody,
not even [H], claims that it's "as good." The entire [H] piece revolves around
their subjective opinion that it's "almost as good" as far as they can see and where it isn't "as good" they flatly state they don't care. That's what subjective opinions do for you...
...
Nope, that is not what the [H] article states at all. They've said that what differences they can see, and they do see them, are so "minor" in their opinions (because "all you do in the game is run around fragging people and so don't have time to look at the image quality" if I got that right) that they just don't care whether nVidia provides full trilinear like the Cats do or not. That's very, very different from your characterization of what they said.
...
So.....what
[H] views as "minor" IQ degradation because of the lack of full trilinear support in this game, someone else might view as a "major" IQ difference.
You can repeat it as often as you like, but that's not what the article, or Kyle and Brent's forum comments, said. Unfortunately the article is down, but some quotes by Brent from the forum discussion:
"as we shown in the Non-AF screenshots there aren't any mip-map boundary difference"
"Actually the conclusion is that we can't tell any in-game mip-map boundary difference between NVIDIA's "Quality" setting and ATI's Application Preference setting in a Non-AF situation. The 5900u seems to have a SLIGHT sharper texture in this situation as well.
With 8XAF there is also no in-game mip-map boundary differences and the 9800 Pro has a slight sharper texture."
"But currently we are NOT seeing it differ from the quality of ATI's mip-map transitions in a regular game view."
"My stance is purely observation, you see what i saw with the screenshots. The same is also said with movement as we indicated.
I played this game, first in all, yes ALL, the maps with NO bots, running around, backing into corners, looking in open spaces, looking at the floor, the walls, the ceilings, slopes, slants, gradiants, moving back and forth sometimes in one spot or one path looking hard for mip-map boundaries."
"We are just saying that NVIDIA's current filtering has no IQ difference in a regular game view compared to ATI's. The only place it shows a difference is with the mip-map levels highlighted, and i don't know anybody that plays the game in that mode."
"remember, the mip-map boundaries are what was in question and is what we are saying are not noticeable between the two"
There's more; they can repeat themselves almost as much as you can, albeit not all in the same post. Meanwhile, even though the original article is down, the uncompressed pics are still
available for download. Since you obviously do care about those minor differences that Kyle and Brent can see but choose to ignore (no doubt helped by Nvidia slush money), why don't you point them out for us in those screenshots?
As I said, I have no problem with nVidia offering a performance trilinear which is a mixture of tri and bilinear....none whatsoever. I understand what that is and am not curious about it (what's there to be curious about?)
I'm curious how well it is actually working in UT2003. (Kyle/Brent and Wavey seem to disagree about this, and neither has shown enough evidence for me to have a real sense of the actual IQ costs.) I'm curious whether there are specific features of UT2003 that make it particularly applicable for this optimization, or whether this optimization would be a good idea for more or all games. I'm curious why this sort of thing has been rejected in the past. I'm curious whether it could be improved in either the IQ or performance directions.
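For concreteness, here's my guess at roughly what such a scheme does--a minimal sketch under my own assumptions (the BLEND_BAND knob is hypothetical; nVidia hasn't documented the actual behavior):

[code]
# A minimal sketch, purely my guess at the idea, of a "performance
# trilinear" that only blends near mip transitions. BLEND_BAND is a
# hypothetical knob, not anything Nvidia has published.

BLEND_BAND = 0.25  # fraction of each lod interval that actually blends

def blend_weight(lod_fraction, full_trilinear):
    """Weight given to the next-smaller mip at a fractional lod."""
    if full_trilinear:
        return lod_fraction                    # blend across the whole interval
    edge = 1.0 - BLEND_BAND
    if lod_fraction < edge:
        return 0.0                             # pure bilinear: one mip, cheaper
    return (lod_fraction - edge) / BLEND_BAND  # narrow blend band at the seam

# Most texels fetch from a single mip (bilinear cost); only the band
# around each boundary pays for two mips, which hides the worst banding.
for f in (0.0, 0.5, 0.8, 0.9, 0.99):
    print(f, blend_weight(f, True), round(blend_weight(f, False), 2))
[/code]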
My problem is with the fact that they *substitute* it for full trilinear, even when the application asks for full trilinear. Very simple.
What the game asks for is irrelevant. Nvidia doesn't have an "application preference" mode in their drivers. The issue is that Nvidia's "quality" mode doesn't do the same thing in all games, and that they led reviewers to believe that it always did full trilinear.
[H] at no point denies that nVidia is skipping full trilinear filtering in UT2K3--in fact, their entire article confirms it...
Their only contribution otherwise is to state that they don't care, for one reason or another.
I think only for one reason--it's not visible in the game. And going by the criteria on which they conduct their reviews, that's the only reason that matters.
Missing the obvious here again, Dave H... First of all, there are no similar driver hacks in the current Catalysts, are there? All one need do is use the UT2K3.ini to turn on trilinear--and presto, the game is fully trilineared.
Gee how come you can't just turn on "quality" in the drivers? Doesn't that mean trilinear?? Why should you have to muck around in the .ini???
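(For anyone playing along at home, the .ini change WaltC means is, if memory serves, the renderer's trilinear flag--I haven't double-checked the exact section name, so verify it in your own install:)

[code]
; UT2003.ini -- from memory; confirm the section name yourself
[D3DDrv.D3DRenderDevice]
UseTrilinear=True
[/code]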
There are many, many obvious differences between the IHVs aside from the products they produce.
Don't disagree with you there. So why not focus on the Nvidia scandals that actually do hurt the end-user?
[Q: then why doesn't Unwinder's anti-detect impact ATI performance on more games? A: we have no proof it is detecting all game-specific optimizations in either ATI's or Nvidia's drivers]
Did you write the Anti-Detector software, DaveH? The guy who wrote the software claims it does the same thing for the Catalysts and the Dets. Argue with him if you like...
Completely false. But thanks for being a sarcastic jerk.
[url=http://www.beyond3d.com/forum/viewtopic.php?p=132527#132527]the guy who wrote the software[/url] said:
ATIAntiDetector scripts is a bit more complicated. ATI use different ways of application detections so it's much more difficult to collect and block _all_ the application detection related pieces of code. At this time I was able to identify and block at least 8 application detection routines in the driver, but I cannot give you any warranties that there are no more detections left (this applies to NVAntiDetector too).
So actually, it doesn't do the same thing for the Cats as for the Dets. And he doesn't claim it "is detecting all game-specific optimizations in either ATI's or Nvidia's drivers", like I said. But you can
argue with him if you like.
Has it ever struck you that their article is so subjective it's worthless? Listen, opinions abound about IQ. Whereas I run with 2x FSAA enabled, 16x AF in my games by default--some people state they prefer 0x FSAA/8xAF for their own reasons. Which of us is "right?" Correct answer is "neither" because it's a matter of subjective preference only.
If some people posted 81MB of screenshots comparing 2xFSAA to no FSAA I think it would be easy to tell the difference.
Look. IQ is subjective. I know you want to make it objective by replacing IQ with a big checklist of rendering features, but that doesn't serve any purpose. The goal of 3d rendering is to appear as realistic as possible to most people, and thus success can only be judged by a person commenting on how well he/she thinks this has been done. But "subjective" doesn't mean "meaningless". You are well aware of this, but as your only chance of winning this argument is to muddy the waters, you choose to ignore it.
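(And if anyone wants to move beyond eyeballing: a throwaway Python sketch, assuming Pillow is installed and with made-up filenames standing in for [H]'s shots, that flags exactly where two screenshots differ.)

[code]
# Make "can you see a difference?" a little less subjective: diff two
# screenshots pixel-by-pixel. Requires Pillow; the filenames here are
# hypothetical stand-ins for the uncompressed [H] screenshots.

from PIL import Image, ImageChops

a = Image.open("nv_quality.png").convert("RGB")
b = Image.open("ati_app_pref.png").convert("RGB")

diff = ImageChops.difference(a, b)
print("differing region:", diff.getbbox())  # None means pixel-identical
diff.save("diff.png")                       # bright pixels mark the differences
[/code]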
The important point about this whole affair is this: nVidia has removed the option of full trilinear support from its drivers for UT2K3. Nothing else matters--at all. Had they not done this, there would be no issue whatever as no one would care what lesser IQ modes nVidia built into its driver support. There is no issue apart from this one in my view and as such [H]'s entire attempt at apology is a waste of epaper.
This is like the pot calling the tablecloth black.
They can "call on" nVidia all they like, but until nVidia *does something* relative to the issue such statements are pompous and mean nothing, right?
Something like...
[url=http://www.hardforum.com/showthread.php?s=d6fbedd5ab52349ab5d713b7698ba078&threadid=644163]Kyle[/url] said:
We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well a some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..
NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release.
Is that what you wanted?
Heh...would have been nice if [H] had ONCE "called on" nVidia to stop cheating in its drivers relative to benchmarks...! What did [H] do instead? Tell everyone to dump their benchmarks, that's what [H] did... That's pretty funny, Dave H....
I'm assuming you remember how strongly I argued against [H]'s position on 3dMark03? Or is there some other reason for bringing up this totally irrelevant subject??
Sigh--what would satisfy me is simply [H] stopping its infantile behavior of apologizing for nVidia and plainly stating that you can't compare nVidia's faux-trilinear to the Catalysts' full trilinear in terms of performance, because it's not an apples-apples IQ comparison.
If it looks the same then it's by definition an apples-apples IQ comparison. It's not an apples-apples workload comparison, but that's not what [H] is interested in.
A subjective opinion that something is "almost as good" for reasons I've already stated doesn't suffice, no.
Their opinion is that it is as good. Is better, actually, with no AF. And their screenshots agree with them, subjectively speaking. Incidentally, you haven't stated any reasons why it doesn't suffice, except that Nvidia doesn't also offer a full trilinear mode, which is neither here nor there when all that's required is "an apples-apples IQ comparison".
Excuse me--I don't want to personally verify the old saying that "a fool and his money are easily parted"....
Presumably because you'd be broke awful quick.
And by the way, in the future it might be nice if you used "quotation marks" to enclose words that people "actually used" instead of "misleading paraphrases".