Help me understand the AF/[H] controversy

Park the Shark said:
It's been proven that [H] didn't even ask ATi about the thing before posting the woowoo.

The quack thing didn't affect all textures. Is there a noticeable IQ difference when playing?

So [H] was flat out lying when they said:
ATi engineers were asked last week if ATi drivers used any game specific instructions and we were told "No."

I don't know how many textures were affected by "Quack", but I have seen screenshots that show obvious differences. I would definitely say there is a noticeable IQ difference.

http://www.beyond3d.com/forum/viewtopic.php?t=7073

There is some info on the "Quack" issue there. It's not even clear whether [H] asked ATi themselves. The wording indicates that it may have been someone else (possibly Nvidia) who asked ATi, and that [H] just took someone else's (possibly Nvidia's) word for it ;)

It affected a total of 5 textures in the game.
 
Park the Shark said:
What you're saying, though, is essentially that people are mad at [H] because they didn't proclaim at the top of their lungs that nVidia is the devil?

No, people are upset with [H] because he does not treat IHVs equally, nor does he have a consistent approach to benchmarks.

What do you expect them to add?

We expect fairness and consistency.

Finally, what bearing does this really have on the bi/tri issue?

Because benchmarking:

1) a card that CAN do trilinear, but doesn't, against
2) a card DOING trilinear, even though it CAN do bi/tri...

...is in no way an attempt to fairly compare products.
 
Some UT2003 Screenshots with 44.03 drivers:

http://www.beyond3d.com/forum/viewtopic.php?p=145076#145076

The mip maps are visible - the transitions are blended, but I can make out a difference between one mip level and the next (look from the bottom of the image upwards - there is a difference between the detail texture and the next mip level down). With full trilinear, all mip levels should be blended, so you wouldn't be able to notice one mip from the next.
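As a concrete illustration of that last point, here is a minimal sketch (hypothetical, not any vendor's driver code; the bilinear_sample stub just returns the level number so the blend is easy to see) of why full trilinear hides mip transitions: the result is a continuous blend of the two nearest mip levels, so nothing visible happens where floor(lod) changes, whereas pure bilinear returns only the nearer level and leaves a hard line:

```cpp
#include <cmath>
#include <cstdio>

// Stand-in for a real bilinear fetch: returns a constant per mip level so
// the blending behaviour is obvious. (Hypothetical illustration only.)
static float bilinear_sample(int level) {
    return static_cast<float>(level);
}

// Full trilinear = linear blend of the two nearest mip levels, weighted by
// the fractional LOD. The weight varies continuously, so there is no band
// where floor(lod) changes; pure bilinear would return bilinear_sample(level0)
// alone and produce a hard transition line.
static float trilinear_sample(float lod) {
    int   level0 = static_cast<int>(std::floor(lod)); // nearer, more detailed mip
    float frac   = lod - static_cast<float>(level0);  // blend weight toward next mip
    float t0 = bilinear_sample(level0);
    float t1 = bilinear_sample(level0 + 1);
    return t0 + frac * (t1 - t0);
}

int main() {
    // Sweep the LOD range: the output changes smoothly across mip boundaries.
    for (float lod = 0.0f; lod <= 2.0f; lod += 0.25f)
        std::printf("lod %.2f -> %.2f\n", lod, trilinear_sample(lod));
}
```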
 
Dean said:
Someone said in the [H] forums that Kyle was doing Nvidia a disservice by not bringing their "optimizations" to the forefront. He did it with ATI, and ATI then improved dramatically (it's amazing what a little bad press can do). By not putting pressure on Nvidia, he is giving them a green light to continually go downhill when it comes to their drivers.

What more should he do? He showed the mip map highlights, they published an entire article dedicated to the issue, and he directly contacted nVidia, and nVidia has promised to fix it. Either he's lying and you have some proof regarding that, or he doesn't have to "pressure" nVidia b/c nVidia is already working on the resolution. What am I missing there?
 
Joe DeFuria said:
Park the Shark said:
What you're saying, though, is essentially that people are mad at [H] because they didn't proclaim at the top of their lungs that nVidia is the devil?

No, people are upset with [H] because he does not treat IHVs equally, nor does he have a consistent approach to benchmarks.

What do you expect them to add?

We expect fairness and consistency.

Finally, what bearing does this really have on the bi/tri issue?

Because benchmarking:

1) a card that CAN do trilinear, but doesn't, against
2) a card DOING trilinear, even though it CAN do bi/tri...

...is in no way an attempt to fairly compare products.
Yup. ("Thanks Joe!" :) )
 
Park the Shark said:
Dean said:
Someone said in the [H] forums that Kyle was doing Nvidia a disservice by not bringing their "optimizations" to the forefront. He did it with ATI, and ATI then improved dramatically (it's amazing what a little bad press can do). By not putting pressure on Nvidia, he is giving them a green light to continually go downhill when it comes to their drivers.

What more should he do? He showed the mip map highlights, they published an entire article dedicated to the issue, and he directly contacted nVidia, and nVidia has promised to fix it. Either he's lying and you have some proof regarding that, or he doesn't have to "pressure" nVidia b/c nVidia is already working on the resolution. What am I missing there?
"What more should he do?"?!? :oops:

His article on it was just a PR justification piece for nVidia. What he SHOULD have done was find some drivers that use trilinear in UT2k3, or use a bilinear setting on the ATi card, or both, and post those results as well. The way he presented it was basically a condescending dismissal of all the logically presented evidence with a big f-ing "Meh." :(

He also shouldn't just be posting this stuff buried in his forums; it deserves to be posted openly on his front page, in the same place his PR BS was.

ATi is still not really pleased with [H] and did NOT consider that follow up article vindication, but I can't comment any further on that right now.
 
Park the Shark said:
Thanks for clearing that up. So there is no AF controversy, that's just a bug? The bi/tri controversy remains, though...

There is a separate issue with AF in that the selected AF level is applied only to texture stage 0; any other stages receive a maximum of 2x AF - and this overrides any application settings.

(If you are looking at comparisons with ATI, there is an issue in that ATI applied only bilinear AF to texture stages beyond stage 0 - although this can be overridden by the application. link)
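For readers unfamiliar with texture stages, here is a minimal Direct3D 9 sketch (illustrative only; the device pointer, the four-stage loop, and the function name are assumptions, not taken from UT2003's actual code) of the kind of per-stage filtering request an application makes. The complaint above is that the 44.03 driver honours the requested anisotropy only on stage 0 and behaves as if later stages were clamped to 2x:

```cpp
#include <d3d9.h>

// Illustrative only: request the same AF level on every texture stage.
// "device" is assumed to be an already-initialised IDirect3DDevice9*.
void RequestAnisoAllStages(IDirect3DDevice9 *device, DWORD maxAniso)
{
    for (DWORD stage = 0; stage < 4; ++stage)  // 4 stages is an arbitrary example
    {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // The application asks for, say, 8x on every stage; per the reports
        // above, stages 1+ behave as if this were silently clamped to 2x.
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
    }
}
```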
 
Joe DeFuria said:
Though you have to ask, why would nVidia make this a UT specific setting, and not a global setting, if it really doesn't "impact" image quality?

Why did ATI release a driver update specifically to improve NeverWinter Nights play? Popular games get special driver treatment through specific optimizations and fixes.

I don't think the "crime" here is that nVidia optimized... the "crime" is that they overrode a user-selectable setting. No one seems to like that. They all seem to agree. That's why I don't understand the ill will.

The other part of the issue, is that you CAN get a similar quality level on ATI cards by setting (I believe) the texture slider lower.

So why isn't that done on ATI cards when comparing performance to nVidia ones?

That's what I was talking about previously when I mentioned IQ-equivalent benchmarking, in addition to technically equivalent benchmarking. This leads to a very valid point, though, and I posted the following message directly to Kyle:

1. From what you've written, I think you're of the opinion that nVidia should not be overriding user selected Trilinear filtering with this Bi/Tri mix. That would be the reason you consulted nVidia about changing that aspect of the driver, correct?
2. In light of point #1, do you think it's fair to compare benchmarking numbers from ATI's full tri filtering to nVidia's bi/tri mix? Has [H] published any reviews using the so-called "cheat" drivers, making such a direct comparison, and if so, will those numbers be pulled or updated when nVidia delivers the drivers that enable full tri filtering in UT2K3?
 
Bouncing Zabaglione Bros. said:
Park the Shark said:
What does nv30 delays have to do with being mad at [H]?
That's when Nvidia decided to trash 3DMark03 because they couldn't compete, and recruited websites to voice their PR documents. That's when cheating on benchmarks became a viable alternative to having the fastest part out there.

It's the direct source for why Nvidia is touting a huge increase in speed in one of the most frequently benchmarked programs (UT2K3) by cheating on the filtering, and why sites like [H] are supporting them.

In fact, though, the nv30 performed BETTER in 3dMark03 upon release. So that kind of shoots a hole in that theory....

How is [H] supporting cheating in UT2K3? They've openly stated that nVidia needs to provide a driver that allows the user to select FULL tri filtering.
 
DaveBaumann said:
There is a separate issue with AF in that the selected AF level is applied only to texture stage 0; any other stages receive a maximum of 2x AF - and this overrides any application settings.

Is that what the issue raised on AMDMB.com turned out to be? And where were the specifics of this AF problem first talked about? I'd love to read from the source.
 
DaveBaumann said:
Some UT2003 Screenshots with 44.03 drivers:

http://www.beyond3d.com/forum/viewtopic.php?p=145076#145076

The mip maps are visible - the transitions are blended, but I can make out a difference between one mip level and the next (look from the bottom of the image upwards - there is a difference between the detail texture and the next mip level down). With full trilinear, all mip levels should be blended, so you wouldn't be able to notice one mip from the next.
Dave, are you saying that, with trilinear filtering set in the 44.03 control panel, mip transitions shouldn't be evident?
Due to an accident, I recently had to go from an R300 to an NV25, and the mip banding is killing me. I'm using the 44.03s. If I set the mipmap detail level to 'best image quality' in the control panel, my framerate is nearly halved. I never saw banding like this on my R200 or R300 at any driver setting. If it weren't for this problem, I would be pleased with this particular card.
 
demalion said:
Actions associated with this include dismissing ExtremeTech, Beyond3D, and other sites involved in the nVidia/3DMark03 benchmark image quality degradation issues as "police" or lackeys, including saying that the issue's exposure was "payback" for not being granted access to Doom 3 by nVidia (he's only expressed "regret" concerning ExtremeTech, which happens to be the one site that is part of a large media group).

Actually, to clarify this point. Kyle only said that he was sorry for making it public. He is not sorry for making it.
 
Park the Shark said:
Well, you nailed me there... I'm pretty much in the boat of "that benchmark sucks, who cares" lol...


But *why* does it "suck"? Do you understand that it is meant to be a synthetic loading test that hammers every card as hard as possible to find its breaking point? That makes it quite a different test from most others.

If the issue is cheatability or "it's not a game", those same issues also "invalidate" any other test you care to mention.

Park the Shark said:
I can understand and appreciate the point, though, of disliking nVidia cheating on a benchmark, period. Obviously, I do read [H], and sometimes I disagree with Kyle, but as far as 3DMark03 goes, he posted numerous links and stories regarding the cheating issue. I did think it was pretty odd the way he linked to ExtremeTech's article and at the same time roasted them a bit, questioning their motives. It's not like he was in some conspiracy to deny nVidia was cheating.

That's just a standard technique for disparaging your opponent: instead of ignoring or rebutting the points raised, you point to the information and vaguely disparage it and its source with veiled comments. [H] rarely linked to any of the "cheating" stories except with a bit of name-calling and vague accusations of ulterior motives to explain the articles away as anything other than factual reporting.

Park the Shark said:
What you're saying, though, is essentially that people are mad at [H] because they didn't proclaim at the top of their lungs that nVidia is the devil? What do you expect them to add? They posted the links to other works that clearly demonstrated what was happening, as well as concluded it was done intentionally to "cheat". Put another way, are you not saying that [H] has a duty/responsibility to not just link to other people's findings, but independently verify everything they link to and add their own commentary?

I expected them to tell the truth, not to say things were okay when they were not. They still don't admit to Nvidia doing any cheating, and still say that even if they did cheat, it doesn't matter because "you can't see it" or because they don't like that particular benchmark.

It's nonsensical. Can you imagine a racing magazine saying it's okay to cheat during a car race because they (and the one particular manufacturer that they are close friends with) don't like that track?

Park the Shark said:
If someone has already done the work, what would be the point in wasting resources only to say, "In conclusion, ExtremeTech was right!"

Where did they say that? AFAIK, [H] has always said ET was wrong, claiming ET had some kind of grudge because they did not get one of the exclusive Doom3 Nvidia PR machines to test.

Park the Shark said:
Finally, what bearing does this really have on the bi/tri issue?

It shows a pattern of behaviour. You need to see the bigger picture to understand the context of the UT2K3 filtering issue.
 
Even if you and Kyle find 3DMark useless, the fact remains that it is used in almost every review, and several OEMs use it to determine what graphics chips to put in their PCs. Nvidia flat-out cheated to make their cards do better so that more people and companies would buy their chips. That's consumer fraud in the simplest terms. The ironic part is that it was Nvidia that pushed 3DMark in our faces when they were doing well: they were using 3DMark scores in PR statements and advertising, and linking scores on Nvidia's official site. It was Nvidia that created the idea that 3DMark was an important all-purpose benchmark, and now that their cards are doing badly they try to destroy it.

Now with Kyle, what is happening is that Nvidia is using him as an extension of their marketing department, and for a site that claims to be a fair and unbiased hardware review site, that is a big no-no. Kyle is now seen as a co-conspirator in Nvidia's attempt to steal money from people by making their crappy new cards seem better than they really are.

All of these side issues - UT2K3, the ShaderMark cheats, Unwinder's anti-cheat script, and now the encryption of their drivers - are just side effects that show just how badly Nvidia was cheating, and it seems to have been present in 3DMark2001 as well.

I mean, please, people: every week new things are learned about how extensively Nvidia has been cheating in their drivers. Aren't we beyond the point of trying to defend them, and people like Kyle who are simply corporate extensions of Nvidia?
 
Let's try to keep the thread on the topic of the title, please.

Park - did you see the images referenced earlier?
 
Park the Shark said:
In fact, though, the nv30 performed BETTER in 3dMark03 upon release. So that kind of shoots a hole in that theory....


But much, much worse than the R300. It only got vaguely competitive when it was overclocked (hence the noisy heatsink), and it still had to cheat badly in the drivers.

Park the Shark said:
How is [H] supporting cheating in UT2K3? They've openly stated that nVidia needs to provide a driver that allows the user to select FULL tri filtering.

Where? Not in the review where they crown the NV35 as faster than the R350, even though the NV35 is secretly dropping IQ (whenever the user or application requests trilinear) in order to do less work, and thus get higher benchmark scores. If that's not misleading and worth pointing out in a competent review, I don't know what is.

The unwillingness to change faulty reviews, even after it has been pointed out to Kyle and Brent, can be nothing but complicity in the deception.
 
Park the Shark said:
In fact, though, the nv30 performed BETTER in 3dMark03 upon release. So that kind of shoots a hole in that theory....

Which driver set for the NV30 are you comparing with which driver set for the 9700? And are you sure that the drivers you're referring to were devoid of invalid optimizations such as shader replacement, forced precision below the DX9 spec, etc.? URL, please?

Park the Shark said:
How is [H] supporting cheating in UT2K3? They've openly stated that nVidia needs to provide a driver that allows the user to select FULL tri filtering.

By allowing an improper comparison to stand, they are supporting Nvidia's efforts to deceive consumers about the performance of their video cards at equivalent image quality settings vs. ATI. It wasn't too long ago that [H] was saying the only valid comparison to ATI's Quality AF was NV's Application setting, which did full trilinear ("Apples to Apples at last", I believe they called it). Their next setting down (Balanced) performed a bi/tri blend that is curiously similar to the pseudo-trilinear we're seeing now. Yet [H] is now saying that ATI's Quality AF and Nvidia's pseudo-trilinear CAN be compared to each other.

One might argue that since Nvidia's drivers don't allow full trilinear filtering, a valid comparison is impossible, so this is the best that can be done (even if that's the case, it should be clearly noted in any review that makes use of the flawed comparison). This is crap. ATI's scores should not be penalized because they can go up to the proverbial "eleven". And, to mix another metaphor, "if Mohammed will not go to the mountain, the mountain must go to Mohammed": if you can't raise Nvidia's image quality up to what ATI is capable of, bring ATI's down to Nvidia's level and then run the numbers. Then at least you can say "with Nvidia you get $SCORE, and with ATI doing the same thing you get $OTHERSCORE". I could be wrong, but I believe it was Doomtrooper who stated that, using the Texture Quality slider, it is possible to get the Catalysts to do the same pseudo-trilinear filtering. For a site that made such a big deal about bucking the trend and finding proper settings for a valid comparison between the NV3x and R3x0 not too long ago, they sure don't seem to place a very high priority on it anymore.
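As a rough sketch of what "pseudo-trilinear" means in this thread (an assumption based on the descriptions above, not a measured reconstruction of NVIDIA's driver), compare a full trilinear blend weight to a reduced-band one that stays bilinear for most of the LOD range and only ramps near each mip boundary; the 0.25 band width is purely illustrative:

```cpp
#include <cmath>

// Full trilinear: the blend weight toward the next mip level varies
// continuously over the whole LOD range, so transitions are invisible.
float trilinear_weight(float lod)
{
    return lod - std::floor(lod);
}

// Hypothetical "pseudo-trilinear": weight 0 (plain bilinear) for most of the
// range, with a short compressed ramp only inside a narrow band before each
// mip boundary. Less blending work per sample, but banding can reappear
// where the ramp begins.
float pseudo_trilinear_weight(float lod, float band /* e.g. 0.25f */)
{
    float frac = lod - std::floor(lod);
    if (frac < 1.0f - band)
        return 0.0f;                        // bilinear region: nearer mip only
    return (frac - (1.0f - band)) / band;   // compressed blend near the boundary
}
```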
 
Nazgul (or anyone else who cares :)),

The anisotropic filtering issue we/I raised with the article at AMDMB is a different issue than the bi/tri issue.

The AF issue looks more to be a bug. I've done more tests and will publish an update that will be released after my next article is published (which will hopefully be tonight). The AF problem is inconsistent. It appears in only one of the maps I use, but is fine on other maps.

The timing of the AF issue, the [H] article, B3D's tri/bi threads, and NVIDIA's per-application optimizations all came at pretty much the same time. It therefore seemed that all of this was the same problem. However, that's not the case.

What we talked about at AMDMB appears to be a bug in v44.03 drivers. Nothing more.

The Bi/Tri and application-detection issue still stands.

- Jonathan.
 
Point-o-clarification please?

Jerky said:
Nazgul (or anyone else who cares :)),

The anisotropic filtering issue we/I raised with the article at AMDMB is a different issue than the bi/tri issue.

The AF issue looks more to be a bug. I've done more tests and will publish an update that will be released after my next article is published (which will hopefully be tonight). The AF problem is inconsistent. It appears in only one of the maps I use, but is fine on other maps.

The timing of the AF issue, the [H] article, B3D's tri/bi threads, and NVIDIA's per-application optimizations all came at pretty much the same time. It therefore seemed that all of this was the same problem. However, that's not the case.

What we talked about at AMDMB appears to be a bug in v44.03 drivers. Nothing more.

The Bi/Tri and application-detection issue still stands.

- Jonathan.
So has nVidia agreed to fix the tri/bi issue, the AF bug, or both?
 
As far as I know, NVIDIA determined that the AF issue was a bug before I concluded as such last weekend. In their IRC chat, they referred to the AF issue as a "bug". So I would assume that NVIDIA has done their own testing in isolation, because they never contacted me about it.

If NVIDIA has already identified it as a bug, it may not necessarily mean it will be fixed (lots of bugs go unfixed).

As for the bi/tri issue, I don't know what's happening with that (the guys here have a better idea than I do). The only official statements I know of are in that chat last week. As we all know, their replies were very general, so who knows if it will ever get fixed before NV36/NV38.

- Jonathan.
 