Help me understand the AF/[H] controversy

Dang it I tried to post earlier and it didn't go through.

Here's the situation as I understand it thus far:
1. nVidia released some drivers that are not doing true AF for UT2K3.
2. It has shown obviously noticeable IQ degradation in the HallOfAnubis add-on map.
3. It has not shown obviously noticeable IQ degradation for other maps.
4. It does show up in mip map highlights.
5. nVidia has promised to resolve the issue by providing the end-user control over AF.
6. Some people are VERY mad at [H] about this.

Now, for some people, this is difficult, but I personally haven't chosen sides in some sort of for/against [H] or B3D "war". It seems like many people have done so.

I just look at what I have been able to find, and make conclusions and ask questions from that. I guess I'm setting myself up to be attacked by BOTH sides, lol.

Two main things I'm curious about:
1. Has there been any map besides this add-on HallOfAnubis that shows decreased image quality from actual in game screenies? I have looked and haven't seen anyone posting them.
2. Why is everyone mad at [H]? It seems [H] ran their tests, showed that the actual in-game IQ is not affected noticeably, AND STILL went to nVidia expressing concerns over the lack of true AF. [H] reports that nVidia intends to resolve this.

It seems everyone (even nVidia themselves) agrees that nVidia should allow the end-user full control over graphics features/settings.

So what exactly is causing the big rift between B3D and [H] or at least their audiences?

If B3D, [H], and nVidia all agree that nVidia needs to change their drivers, what's left to argue about?

I just don't get it...
 
The thing is that [H] doesn't care about the IQ issue and just says that it's not noticeable when playing. But back in 2001 they raised a big woowoo about the Quack thing. Double standards.
 
And it's on every map in UT2003. Nvidia applies the hacked filtering based on the executable name.
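To make clear what I mean by detection based on the executable name, here's a rough, generic sketch. This is NOT Nvidia's actual driver code; the function, the lookup table, and the names in it are made-up assumptions purely for illustration:

```cpp
// Generic illustration of executable-name detection -- NOT actual driver
// code; the lookup table and names are hypothetical.
#include <algorithm>
#include <cctype>
#include <cstdio>
#include <string>
#include <vector>

// A driver could, in principle, compare the host process name against a
// table and switch on app-specific behavior when it matches.
bool isDetectedApp(std::string exeName) {
    static const std::vector<std::string> detected = {"ut2003.exe"};  // hypothetical table
    std::transform(exeName.begin(), exeName.end(), exeName.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
    return std::find(detected.begin(), detected.end(), exeName) != detected.end();
}

int main() {
    std::printf("%d\n", isDetectedApp("UT2003.exe"));  // 1 -> app-specific filtering path
    std::printf("%d\n", isDetectedApp("quake3.exe"));  // 0 -> default path
}
```

This kind of name matching is also why renaming the executable (as was done in the old Quake3/"Quack" test) defeats the detection.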
 
You really have to go back to last year when Nvidia promised all kinds of great things with NV30 (which kept getting delayed and delayed before finally being killed off) at the same time as ATI suddenly brought out a very competitive card.

You need to search for old threads on 3DMark2003 and Nvidia cheating on it to get higher scores, badmouthing 3DMark2003 after leaving Futuremark membership just before the new version got released, the subsequent response from Futuremark about the Nvidia cheats, and what various industry websites said (or didn't say) about it.

There's loads of info, but it will be a lot of reading.
 
breez said:
The thing is that [H] doesn't care about the IQ issue and just says that it's not noticeable when playing. But back in 2001 they raised a big woowoo about the Quack thing. Double standards.

First off--is there any IQ issue? Is [H] wrong when they say it's not noticeable when playing? This is why I asked if there is any map besides the HallOfAnubis (which is an add-on map) that shows IQ issues? I would think if there were, screenies would be plastered all over the place. I've looked for the screenies, and can't find them. So I'm beginning to think that either it doesn't affect in game IQ (HallOfAnubis being an exception, and being a 3rd party map, there may be other issues involved), or that the places it does present itself are too limited to notice.

Second... If [H] doesn't care about IQ, why would they do such a lengthy article at ATI's prompting? Also, why then would they ask nVidia to implement proper AF controls in the driver, DESPITE not seeing a noticeable visual difference?

Are there double standards on the Quack issue? I've seen both sides point to this to both bolster and refute their stances. On the one hand, we have the ATI people saying this is hypocritical treatment from [H]. On the other hand, we have nVidia people saying "ATI did it, so can we". Then we have people on the ATI side saying, "This is totally different, and not comparable because the ATI issue was a bug that was fixed and the IQ was repaired with no performance penalty." Yet we have people on the nVidia side saying, "This is totally different, nVidia is not denying the optimization and they aren't affecting image quality".

I can see some BASIC similarities: both companies did in fact optimize drivers for a specific application.

But from what I've seen, the ATI scenario did affect IQ much more drastically. Meanwhile, ATI denied the situation, despite being presented with proof. So I think that [H] has treated the situation differently because it IS DIFFERENT. They raised a "big woowoo" I think because ATI was lying about the optimization while hurting IQ. nVidia has handled it differently, and thus so have [H]. Is that not appropriate?

ATI did fix the "problem" and nVidia has reportedly promised to fix their "problem" as well.

I can see, though, where people come off saying [H] is hypocritical here, b/c part of their point with ATI was that Q3 was widely used as a benchmark. Now the same is true of UT2K3, and I think [H] hasn't tried to drive that same point home. However, it's somewhat different this time because nVidia is promising to fix the situation which will make the benchmarking "fair" again.

It's my opinion that benchmarking should be done in two modes, though. I would like to see straight benches that compare technique to technique (i.e. 4XAA/8XAF from both cards) as well as a comparison of benches at relatively equal IQ levels (which is much more subjective). For instance, from what I've seen, ATI AA is superior to nVidia AA at this point. Some say 2X ATI == 4X nVidia. I'd like to see benches and screenies that demonstrate this. The AF issue may fall into a similar category. We can compare nVidia optimized IQ to ATI (optimized) IQ when the IQ is comparable. I think this would have a lot of value to potential consumers when evaluating cards.
 
breez said:
And it's on every map in UT2003. Nvidia applies the hacked filtering based on the executable name.

Where are the screenies that show it in game? Looking at [H]'s mip map highlights, it's obviously affecting every map, but when you compare screenshot to screenshot, the difference is negligible.

The only map I've seen show a distinctive and obvious difference is the HallOfAnubis.
 
Park the Shark said:
Dang it I tried to post earlier and it didn't go through.

Here's the situation as I understand it thus far:
1. nVidia released some drivers that are not doing true AF for UT2K3.
2. It has shown obviously noticeable IQ degradation in the HallOfAnubis add-on map.
3. It has not shown obviously noticeable IQ degradation for other maps.
4. It does show up in mip map highlights.
5. nVidia has promised to resolve the issue by providing the end-user control over AF.
6. Some people are VERY mad at [H] about this.

<snip>

You are confusing two different issues that currently reside with UT2003 and NVIDIA 44.03 drivers.

1.) The issue of the 44.03 drivers not allowing the application, i.e., UT2003, to use full trilinear, but rather applying their own mix of filtering that currently falls somewhere between bilinear and trilinear, while the ATI cards have an App Preference option that lets the app decide the filtering. Lesser trilinear filtering can result in reduced blending at the transitions between mip-maps (see the rough sketch at the end of my post).

2.) The other issue is regarding Anisotropic filtering being enabled with the 44.03 drivers on the GFFX 5600. So far testing indicates it is restricted to that specific card, and to one specific map, CTF-HallOfAnubis, which is not included with the game but can be downloaded. This so far appears to be a bug.
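To illustrate issue 1, here's a rough sketch of the difference between the weight a full trilinear filter gives the coarser mip level and a hypothetical reduced "bi/tri mix" blend. This is only an illustration under assumptions, not NVIDIA's actual driver code; the blend band width and the exact weight curve are made up for the example:

```cpp
// Illustration only -- not NVIDIA driver code. It contrasts the weight a
// full trilinear filter gives the coarser mip level with a hypothetical
// reduced ("bi/tri mix") blend; the band width and shape are assumptions.
#include <cmath>
#include <cstdio>

// Full trilinear: always blend the two nearest mip levels by the
// fractional LOD, so every transition is smoothed.
double trilinearWeight(double lod) {
    return lod - std::floor(lod);
}

// Hypothetical reduced blend: behave like bilinear (stick to one mip
// level) for most of the range and only interpolate inside a narrow band
// around the transition. 'band' is an assumed tuning value.
double reducedBlendWeight(double lod, double band = 0.25) {
    double f = lod - std::floor(lod);
    if (f < 0.5 - band / 2) return 0.0;    // pure finer mip
    if (f > 0.5 + band / 2) return 1.0;    // pure coarser mip
    return (f - (0.5 - band / 2)) / band;  // short blend region
}

int main() {
    for (double lod = 0.0; lod <= 1.0; lod += 0.125)
        std::printf("frac lod %.3f  trilinear %.3f  reduced %.3f\n",
                    lod, trilinearWeight(lod), reducedBlendWeight(lod));
}
```

The point is that outside the narrow blend band the filtering behaves like bilinear, which is why transitions between mip-maps can become more apparent than with full trilinear.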
 
Park the Shark said:
Are there double standards on the Quack issue?

Most definitely.

Park the Shark said:
I can see some BASIC similarities: both companies did in fact optimize drivers for a specific application.

Yes.

Park the Shark said:
However, it's somewhat different this time because nVidia is promising to fix the situation which will make the benchmarking "fair" again.

It's been proven that [H] didn't even ask ATi about the thing before posting the woowoo.


The quack thing didn't affect all textures. Is there a noticeable IQ difference when playing?
 
Bouncing Zabaglione Bros. said:
You really have to go back to last year when Nvidia promised all kinds of great things with NV30 (which kept getting delayed and delayed before finally being killed off) at the same time as ATI suddenly brought out a very competitive card.

You need to search for old threads on 3DMark2003 and Nvidia cheating on it to get higher scores, badmouthing 3DMark2003 after leaving Futuremark membership just before the new version got released, the subsequent response from Futuremark about the Nvidia cheats, and what various industry websites said (or didn't say) about it.

There's loads of info, but it will be a lot of reading.

What do nv30 delays have to do with being mad at [H]? When they previewed the nv30, they had this to say:

As we’ve seen, the GeForce FX is no slouch in the 3D accelerator world, however it is not the "9700-killer" many have expected. It is, at best, mildly faster in most games, and the same or slightly worse in a few. Had it arrived when most of us thought it should, there is no doubt it would be much better received.

However, relative to the 9700 Pro, its best showing was in 3DMark2001, a synthetic benchmark, and its worst was in UT2003, an actual game.

There was also commentary about how hot and loud it was. I don't think they were sugar coating the nv30.

RE: [H] on 3DMark2K3... They wrote an article about how worthless it was. It had nothing to do with "cheating" or optimizations. It had to do with 3DMark2K3 using mostly DX7 and DX8 style game tests while calling itself a DX9 benchmark. Particularly, this page:
http://www.hardocp.com/article.html?art=NDI4LDM=

I've never seen anyone disagree with the accuracy of the characterization of the DX tests there. In that regard, I can agree with [H] that the overall 3DMark score is worthless. I use it for stress testing, to test fillrate, and to make sure a card is performing comparably to others. For gaming purposes, it simply doesn't reflect much in the real world. Now, if [H] was truly nVidia biased, wouldn't they have been singing the virtues of 3DMark, because nVidia's nv30 was the leader in 3DMark when [H] was blasting it?

Who honestly thinks 3DMark's overall score is a good measure of video card performance in real world applications or even future DX9 titles? Even Extremetech, who found nVidia was cheating, made comments about how 3DMark's game tests were done in strange ways that would not likely ever be seen in a real game title.

blah blah blah... What does this have to do with the AF issue? I just don't see where you're going with this.

Now I know I'm starting to sound like an [H] defender, but I just don't think 3DMark is a good tool, and I don't think [H] was EVER singing the praises of the nv30. I honestly never cared if nVidia cheated at it or not, I didn't trust the scores ANYWAY! lol
 
Brent said:
You are confusing two different issues that currently reside with UT2003 and NVIDIA 44.03 drivers.

1.) The issue of the 44.03 drivers not allowing the application, i.e., UT2003, to use full trilinear, but rather applying their own mix of filtering that currently falls somewhere between bilinear and trilinear, while the ATI cards have an App Preference option that lets the app decide the filtering. Lesser trilinear filtering can result in reduced blending at the transitions between mip-maps (see the rough sketch at the end of my post).

2.) The other issue is regarding Anisotropic filtering being enabled with the 44.03 drivers on the GFFX 5600. So far testing indicates it is restricted to that specific card, and to one specific map, CTF-HallOfAnubis, which is not included with the game but can be downloaded. This so far appears to be a bug.

Thanks for clearing that up. So there is no AF controversy, that's just a bug? The bi/tri controversy remains, though...

The big question for me, then, is has ANYONE produced any screenshots showing the bi/tri issue as actually degrading in game image quality? (obviously, mip map highlights DO show a difference).
 
Park the Shark said:
Now I know I'm starting to sound like an [H] defender, but I just don't think 3DMark is a good tool, and I don't think [H] was EVER singing the praises of the nv30. I honestly never cared if nVidia cheated at it or not, I didn't trust the scores ANYWAY! lol
That's the thing, a lot of people did and do think that nVidia cheating on 3dm2k3 is still wrong even if you don't happen to like the benchmark and that it was newsworthy enough that [H] should have covered it rather than just say, "Well, we don't like that benchmark so it doesn't matter". :rolleyes:

A whole lot of people don't agree with that attitude, and Kyle has never directly addressed the issue of cheating. :(
 
Someone said in the [H] forums that Kyle was doing Nvidia a disservice by not bringing their "optimizations" to the forefront. He did it with ATI and ATI then improved dramatically (it's amazing what a little bad press can do). By not putting pressure on Nvidia he is giving them a green light to continually go downhill when it comes to their drivers.

I do give Kyle credit for at least having an opinion on the matter even though the vast consensus disagrees. At least it's making people aware of the issues. There are far too many other sites that are so silent on the matter you could hear a pin drop. For instance, if you go to the Anandtech forums, 99% know nothing about the latest UT2k3 saga. I don't believe Anandtech has made an official statement on anything regarding nVidia drivers; the silence is deafening.
 
Park the Shark said:
What do nv30 delays have to do with being mad at [H]?

Because that is when the PR war that Nvidia has been waging started. That's when websites chose sides instead of being objective. That's when Nvidia decided to trash 3DMark2003 because they couldn't compete, and recruited websites to voice their PR documents. That's when cheating on benchmarks became a viable alternative to having the fastest part out there.

It's the direct source for why Nvidia is touting a huge increase in speed on one of the most heavily benchmarked programs (UT2K3) by cheating on the filtering, and why sites like [H] are supporting them.
 
It's been proven that [H] didn't even ask ATi about the thing before posting the woowoo.

The quack thing didn't affect all textures. Is there a noticeable IQ difference when playing?

So [H] was flat out lying when they said:
ATi engineers were asked last week if ATi drivers used any game specific instructions and we were told "No."

I don't know how many textures were affected by "Quack", but I have seen screenshots that show obvious differences. I would definitely say there is a noticeable IQ difference.
 
Park the Shark said:
The big question for me, then, is has ANYONE produced any screenshots showing the bi/tri issue as actually degrading in game image quality? (obviously, mip map highlights DO show a difference).

That's only part of the issue. Image quality is indeed subjective, and what one person might notice, others may not. Though you have to ask, why would nVidia make this a UT specific setting, and not a global setting, if it really doesn't "impact" image quality?

The other part of the issue, is that you CAN get a similar quality level on ATI cards by setting (I believe) the texture slider lower.

So why isn't that done on ATI cards when comparing performance to nVidia ones?
 
Joe DeFuria said:
The other part of the issue, is that you CAN get a similar quality level on ATI cards by setting (I believe) the texture slider lower.

So why isn't that done on ATI cards when comparing performance to nVidia ones?
"Psssssst, it's because then the 9800 would absolutely KILL the 5900 in all the reviews!", whispers the Dig quietly to Mr.DeFuria

;)
 
Park the Shark said:
Dang it I tried to post earlier and it didn't go through.

Here's the situation as I understand it thus far:
1. nVidia released some drivers that are not doing true AF for UT2K3.

Actually, it's "not true trilinear filtering".

2. It has shown obviously noticeable IQ degradation in the HallOfAnubis add-on map.

Well, perhaps that is one place where it is evident even in selected still screenshots? In any case, trilinear filtering being absent is most evident in motion, not screenshots. Also, it doesn't show as readily on dark, darkly detailed, or low contrast surfaces (the darker a surface and its details, the fewer brightness values are available for contrast) and irregular surfaces (i.e., not flat, or "floor-like").

There are plenty of maps with "flat" surfaces in UT2k3 (I played the "ball" maps a lot, and I know the transitions are highly evident there), but there are also plenty of spots in almost every map that are irregular, or where the surfaces are dark/low contrast. Picking screenshots can make it look like a map is completely affected by the issue when only the one spot picked for the screenshot shows it, just as it can make a map look completely unaffected when the one spot you picked happens to be where the issue isn't evident.

Also, with increasingly detailed texture usage (as in UT2k3), using a still screenshot as "proof" that trilinear isn't needed is a fallacy. That's backwards...seeing mip map transitions with the naked eye in a still screenshot is proof that they would be evident in (most types of) motion; all their absence proves is that they haven't risen above a certain threshold of image degradation.
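As a toy model of why this is most evident in motion: the mip level a floor texel lands in depends on its distance from the camera, so the mip boundaries sit at roughly fixed distances in front of the viewer and sweep across the floor as you move. A minimal sketch, with the simplified LOD formula and the reference distance purely assumed for illustration:

```cpp
// Toy model only: assumes LOD ~ log2(distance / d0) on a receding floor.
// Not engine or driver code; the formula and numbers are illustrative.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Simplified LOD: coarser mip levels as distance from the camera grows.
// d0 is an assumed reference distance where mip 0 ends.
double lodAt(double distanceFromCamera, double d0 = 4.0) {
    return std::max(0.0, std::log2(distanceFromCamera / d0));
}

int main() {
    const double worldPos[] = {8.0, 16.0, 32.0};  // fixed spots on the floor
    for (double camera = 0.0; camera <= 6.0; camera += 3.0) {
        std::printf("camera at %.0f:", camera);
        for (double p : worldPos)
            std::printf("  spot %.0f -> mip %d", p, (int)lodAt(p - camera));
        std::printf("\n");
    }
    // Each fixed spot changes mip level as the camera approaches, i.e. the
    // mip boundaries sweep across the floor -- that movement is what makes
    // missing trilinear blending obvious in motion.
}
```

In a still, a boundary is just one line on the floor that may or may not fall somewhere you notice it; in motion it travels across the textures, which is much harder to miss without trilinear blending to soften it.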

3. It has not shown obviously noticeable IQ degradation for other maps.

No, I don't think that's right. More accurate to say that "[H] has not shown obviously noticeable still-screenshot IQ degradation for the maps selected", nor has anyone else whose posted screenshots you may have looked at.

I've used bilinear/16xAF, and with boosted LOD (makes mip map transitions worse), and I can certainly like it when playing UT2k3, but that's different than saying that one card can arbitrarily switch trilinear off if the IHV wants to boost fps because the selected still screenshots don't show it. It's not like the nVidia cards can't do trilinear...nVidia decided to disable it for fps boosting.

4. It does show up in mip map highlights.

Yes, this is an analysis tool. It highlights the issue regardless of the significance for the textures used, as that's its purpose. Without being able to feasibly deliver pixel accurate (no encoding artifact) video of in game motion, this allows the issue to be made evident in still screenshots.
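For illustration, the idea behind mip map highlighting is roughly this (a sketch only; the colors and weight choices are assumptions, not the actual tool or driver behavior):

```cpp
// Sketch of the idea behind mip map highlighting -- the colors and weights
// here are assumptions for illustration, not the actual tool or driver.
#include <cmath>
#include <cstdio>

struct RGB { double r, g, b; };

// Hypothetical flat reference color assigned to each mip level.
RGB mipColor(int level) {
    static const RGB colors[] = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}, {1, 1, 0}};
    return colors[level % 4];
}

// Blend the two nearest mip colors by whatever weight the filter produced.
RGB highlight(double lod, double blendWeight) {
    RGB a = mipColor(static_cast<int>(std::floor(lod)));
    RGB b = mipColor(static_cast<int>(std::floor(lod)) + 1);
    return {a.r + (b.r - a.r) * blendWeight,
            a.g + (b.g - a.g) * blendWeight,
            a.b + (b.b - a.b) * blendWeight};
}

int main() {
    for (double lod = 0.0; lod < 2.0; lod += 0.25) {
        double f = lod - std::floor(lod);
        RGB tri = highlight(lod, f);                // trilinear: smooth ramp
        RGB bi  = highlight(lod, f < 0.5 ? 0 : 1);  // bilinear-like: hard bands
        std::printf("lod %.2f  trilinear (%.2f %.2f %.2f)  bilinear (%.0f %.0f %.0f)\n",
                    lod, tri.r, tri.g, tri.b, bi.r, bi.g, bi.b);
    }
}
```

With each mip level given a flat reference color, whatever blend the filtering actually produces shows up directly: smooth color ramps where trilinear is applied, hard-edged colored bands where it isn't.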

5. nVidia has promised to resolve the issue by providing the end-user control over AF.

Well, [H] indicates that they have; I haven't seen a specific statement from nVidia...again, I'm correcting your "AF" to mean "trilinear filtering". Note that nVidia's review guidelines already propose that the end-user has control over trilinear filtering ("Quality" mode), though, as the application detection in UT2k3 shows, that is not the whole truth. The whole truth about the change remains to be seen.

6. Some people are VERY mad at [H] about this.

Well, no, ATI is "concerned" with [H]'s review that this article addresses. I'd presume they are still "concerned", because the central issue of unfair fps comparison, as determined by nVidia's decisions for application detection alone, does not seem successfully addressed by the article.

Why "some people are VERY mad at [H]" are for a long list of reasons leading up to this article, all having the pattern of giving nVidia and their interests special treatment. Actions associated with this include dismissing ExtremeTech, Beyond3D, and other sites involved in the nVidia/3dmark 03 benchmarking image quality degradation issues as "police" or lackeys, including saying that the issue's exposure was "payback" for not having access to Doom 3 granted by nVidia (he's only expressed "regret" concerning ExtremeTech, which happens to be the particular site which is part of a large media group); attacks on Futuremark for nVidia's decisions to cheat, apparently based on a "technical article" that seems exclusively based on a 'technical' PR document nVidia circulated to several websites (though this was not indicated in the article, IIRC); and having thoroughly exposed the Quack issue, for it to be later found out that nVidia provided the info and tools (again, this was not indicated in the article...the source for them was listed as "friends").

This seems a stark contrast in IHV treatment, and "some people" find it cause for disappointment. Also, there is the issue of post deletions and banning of posters over this issue and others, such as when faulty benchmark results were posted and then changed, people at the site denied they had been changed, and posters in the forums insisted otherwise and showed screenshots for comparison against the changed results.

Now, for some people, this is difficult, but I personally haven't chosen sides in some sort of for/against [H] or B3D "war". It seems like many people have done so.

Hmm? I don't get this B3D/[H] "war" thing. I have problems with specific actions of specific people at [H]...there are lots of forum posters at [H] I haven't met, so why would I want to "war" with them? B3D is just a place where people who are aware of these issues can discuss them, because the [H] forum's policy of banning, deletions, and restricted registration prevents that from being possible there. That policy amounts to Kyle deciding what counts as tolerable disagreement, with the listed objective standards of conduct, supposedly required to justify such actions, being arbitrarily dismissed when it suits him. Deleting posts serves to obscure how often this happens.

I just look at what I have been able to find, and make conclusions and ask questions from that. I guess I'm setting myself up to be attacked by BOTH sides, lol.

There's a lot to look at.

Two main things I'm curious about:
1. Has there been any map besides this add-on HallOfAnubis that shows decreased image quality from actual in game screenies? I have looked and haven't seen anyone posting them.

See above.

2. Why is everyone mad at [H]? It seems [H] ran their tests, showed that the actual in-game IQ is not affected noticeably, AND STILL went to nVidia expressing concerns over the lack of true AF.

They also left a review standing that showed an nVidia card leading an ATI card in a mode that is represented as being equivalent. nVidia cards can do trilinear filtering; they do trilinear filtering in other apps, including those they recommend for showing whether trilinear is being done or not; yet they do not do trilinear filtering in UT2k3, which happens to be an application they reported as getting a performance boost in this driver set, and which happens to be often used for fps comparisons. In response to ATI apparently indicating they had a problem with this, this image quality article concluded that "it didn't matter anyways", and let the fps comparisons stand.

You see no problem in that sequence of events?
[H] reports that nVidia intends to resolve this.

Some people have issues with considering [H] and nVidia trustworthy in their statements, for reasons partially listed above. Once the next batch of "fixes" (there have been many, including the one that introduced this UT2k3-specific behavior) is released and can be evaluated, we can check for ourselves.

It seems everyone (even nVidia themselves) agrees that nVidia should allow the end-user full control over graphics features/settings.

Yes, but it has seemed this way before too.

So what exactly is causing the big rift between B3D and [H] or at least their audiences?

Is there a rift between the audiences? :-? There might be a rift between those who think [H] has serious problems and those who don't, but I'm pretty sure there are members of both groups in at least B3D's audience. The apparent rift between [H] and B3D is, IMO, simply a rift between what can be posted without deletion/bans, with those who think [H] has some serious problems therefore disproportionately represented in one of those places.

If B3D, [H], and nVidia all agree that nVidia needs to change their drivers, what's left to argue about?
I just don't get it...

Hmm...you seem to be missing a few months and a few issues in your consideration.
 
digitalwanderer said:
That's the thing, a lot of people did and do think that nVidia cheating on 3dm2k3 is still wrong even if you don't happen to like the benchmark and that it was newsworthy enough that [H] should have covered it rather than just say, "Well, we don't like that benchmark so it doesn't matter". :rolleyes:

A whole lot of people don't agree with that attitude, and Kyle has never directly addressed the issue of cheating. :(

Well, you nailed me there... I'm pretty much in the boat of "that benchmark sucks, who cares" lol...

I can understand and appreciate the point, though, of disliking nVidia cheating on a benchmark, period. Obviously, I do read [H], and sometimes I disagree with Kyle, but as far as 3DMark03 goes, he posted numerous links and stories regarding the cheating issue. I did think it was pretty odd the way he linked to ExtremeTech's article and at the same time roasted them a bit, questioning their motives. It's not like he was in some conspiracy to deny nVidia was cheating.

What you're saying, though, is essentially that people are mad at [H] because they didn't proclaim at the top of their lungs that nVidia is the devil? What do you expect them to add? They posted the links to other works that clearly demonstrated what was happening, as well as concluded it was done intentionally to "cheat". Put another way, are you not saying that [H] has a duty/responsibility to not just link to other people's findings, but to independently verify everything they link to and add their own commentary?

If someone has already done the work, what would be the point in wasting resources only to say, "In conclusion, ExtremeTech was right!"

Finally, what bearing does this really have on the bi/tri issue?
 