Help me understand the AF/[H] controversy

demalion said:
Actually, it's "not true trilinear filtering".

Yes, Brent resolved that for me above...


2. It has shown obviously noticeable IQ degradation in the HallOfAnubis add-on map.

Well, perhaps that is one place where it is evident even in selected still screenshots?

Brent also points out that the HallOfAnubis issue is AF-related, not trilinear-filtering related. It apparently is limited to this one map. For the purpose of discussing the bi/tri filtering issue, it would seem the AMDMB article is irrelevant. This is what confused me as well: I saw the AMDMB article and thought all the "woowoo" was about AF. The real issue is not AF, but tri vs. bi/tri.

Clear as mud, right?

You make a point about "cherry picking" screen shots, motion involved with tri filtering, etc. You do realize that you're essentially saying there's no way to prove that nVidia is damaging IQ, short of posting raw video captures. No one is going to do that for bandwidth reasons.

More accurate to say that "[H] has not shown obviously noticeable still-screenshot IQ degradation for the maps selected", and neither has anyone else whose posted screenshots you may have looked at.

Well, that's what I'm asking... Has ANYONE posted a screenshot showing decreased image quality? I have already said that I haven't seen any. I've looked. Haven't found them yet. Got a link for me?

It's not like the nVidia cards can't do trilinear...nVidia decided to disable it for fps boosting.

I don't think anyone has disagreed with THAT point.

5. nVidia has promised to resolve the issue by providing the end-user control over AF.

Note that nVidia's review guidelines already propose that the end-user has control over trilinear filtering ("Quality" mode), though, as evidenced by the application detection in UT2k3, that is not the whole truth. The whole truth about the change remains to be seen.

So is the bottom line that people want [H] to put on their front page something like "nVidia lied, saying they would give us true tri filtering, but gave us a bi/tri mix"?

Actually, I think I'm seeing your point... Essentially, many people are feeling that [H] is supporting the "cheating" because despite KNOWING that nVidia is not doing full tri filtering, they are maintaining that it is fair to compare directly to ATI who IS performing full tri filtering. I do think that is a valid point. Furthermore, since ATI does have a very comparable setting, the two should be compared at that setting. Certainly it is NOT fair to set both to full tri-filtering when we know the nVidia card is going to override that.

I'm going to post this directly over at [H] and see if Kyle will respond.

You had some other points on why, in general, many do not like Kyle. I'm not into all that; hell, just this bi/tri issue is complicated enough. I'll just let that go by saying that I think there must be some happy medium between allowing folks to go off flaming and deleting something just because you don't like it. I personally have no knowledge of the situation and don't care to get involved with it--unless he starts deleting MY posts. :) Seriously, though, I have noted he sometimes does have an emotional chip on his shoulder.

They also left a review standing that showed an nVidia card leading an ATI card in a mode that is represented as being equivalent. nVidia cards can do trilinear filtering; they do trilinear filtering in other apps, including the ones nVidia recommends for showing whether trilinear is being done or not; yet they do not do trilinear filtering in UT2k3, which happens to be an application nVidia reported as having a performance boost in that driver set, and which happens to be often used for fps comparisons. In response to ATI apparently indicating they had a problem with this, the image quality article concluded that "it didn't matter anyways" and let the fps comparisons stand.

Yes, I can see a problem there. I think you articulated it better than anyone previously. It seems there is so much other fluff thrown into the mix by so many different people, it's been difficult to find the crux of the matter.
 
Joe DeFuria said:
Finally, what bearing does this really have on the bi/tri issue?

Because benchmarking:

1) A card that CAN do trilinear, and doesn't, against
2) A card DOING trilinear, even though it CAN do bi/tri...

Is in no way an attempt to fairly compare products.

Your points are good... but what you're responding to was in relation to the 3DMark issue that DW brought up... I just don't see what 3DMark03 has to do with this bi/tri filtering business.
 
Park the Shark said:
Where did they say that? AFAIK, [H] has always said ET was wrong, stating ET had some kind of grudge because they did not get one of the exclusive Doom3 Nvidia PR machines to test.

In his forum, under the video card section... There's a lengthy posting there on the issue... Their forum is down right now, but I think this is the link to the thread:
http://www.hardforum.com/showthread.php?s=&postid=1025103810#post1025103810

This has actually been discussed in another thread: http://www.beyond3d.com/forum/viewtopic.php?t=7164&highlight=

Many people (myself included) felt that the (grudging) half apology should have been in a front page editorial, where the original unfounded accusations took place. Making a forum post is akin to a newspaper tucking a retraction away on the bottom of page 19. Note that several people posted their thoughts here because they have been banned at [H] for trying to discuss similar matters there.

There's certainly nothing in there that even admits to Nvidia cheating, let alone claims it as a "bad thing".
 
DaveBaumann said:
Can you see the different mipmap levels?

It looks odd to me... like it gets clear then blurry, then clear... But I still find it hard to make any judgement without a comparison.

Either a comparison to the screenshots taken with a different driver, or a different card, with the settings *set* the same.

Still, I've finally grasped the point here. People are mostly upset because they feel benchmark comparisons between the two are NOT valid, REGARDLESS of resultant IQ. I can certainly appreciate that standpoint.

I'll say again what I would like to see is two sets of benchmarks in all video card comparisons--technique to technique, and optimization to optimization with comparable IQ.
 
Park the Shark said:
Why did ATI release a driver update specifically to improve Neverwinter Nights play? Popular games get special driver treatment through specific optimizations and fixes.

Right, but bi/tri optimizations can be applied globally. It actually takes MORE effort to limit such optimizations to one application than to just force them on all the time. So why limit it to just UT?
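In purely hypothetical pseudocode terms (a sketch of my own, not actual driver code; the function and executable names are invented for illustration), the difference in effort looks something like this:

```c
/*
 * Purely hypothetical illustration, NOT actual driver code.
 * A global bi/tri optimization is just an unconditional flag;
 * limiting it to a single title means the driver must ALSO detect
 * which application is running, e.g. by executable name. That
 * detection step is the extra work.
 */
#include <string.h>

/* Returns nonzero if the driver should substitute the bi/tri mix. */
int use_bi_tri_mix(const char *exe_name, int user_chose_trilinear)
{
    /* Global version: one unconditional line, no detection needed. */
    /* return 1; */

    /* Per-application version: extra detection logic added on top. */
    if (strcmp(exe_name, "ut2003.exe") == 0)
        return 1;                   /* override the user's selection */
    return !user_chose_trilinear;   /* elsewhere, honor the setting  */
}
```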

Did ATI release a driver update to improve NWN play, by REDUCING image quality in that single game?

I don't think the "crime" here is that nVidia optimized... the "crime" is that they overrode a user-selectable setting. No one seems to like that. They all seem to agree. That's why I don't understand the ill will.

There are two crimes.

1) That nVidia overrode an application setting.
2) That [H] does not compensate for this by treating the competition equally with respect to performance benchmarks.

1. From what you've written, I think you're of the opinion that nVidia should not be overriding user-selected trilinear filtering with this bi/tri mix. That would be the reason you consulted nVidia about changing that aspect of the driver, correct?
2. In light of point #1, do you think it's fair to compare benchmarking numbers from ATI's full tri filtering to nVidia's bi/tri mix? Has [H] published any reviews using the so-called "cheat" drivers, making such a direct comparison, and if so, will those numbers be pulled or updated when nVidia delivers the drivers that enable full tri filtering in UT2K3?

So, what was Kyle's response to that?
 
Joe DeFuria said:
Park the Shark said:
1. From what you've written, I think you're of the opinion that nVidia should not be overriding user-selected trilinear filtering with this bi/tri mix. That would be the reason you consulted nVidia about changing that aspect of the driver, correct?
2. In light of point #1, do you think it's fair to compare benchmarking numbers from ATI's full tri filtering to nVidia's bi/tri mix? Has [H] published any reviews using the so-called "cheat" drivers, making such a direct comparison, and if so, will those numbers be pulled or updated when nVidia delivers the drivers that enable full tri filtering in UT2K3?

So, what was Kyle's response to that?

He banned me from their forums.


lol j/k!! Their forums are down right now... or are they? :oops:
 
Park the Shark said:
Still, I've finally grasped the point here. People are mostly upset because they feel benchmark comparisons between the two are NOT valid, REGARDLESS of resultant IQ. I can certainly appreciate that standpoint.
No, I'd go a step further than that. The clear/fuzzy bits you saw were mipmap lines, and they're even more noticeable in motion.

I haven't seen a 5900 in action, but I'd be willing to bet dollars to donuts that you CAN tell they are there while playing...and that Kyle, as a guy "who can tell the difference between 200fps and 250fps in Quake", should be able to see a visual difference between them.

I think the man is lying and playing around with words to avoid saying so.
 
http://www.hardforum.com/showthread...fab7adeee1de&postid=1025110743#post1025110743

This is what I ended up posting:
1. From what you've written, I think you're of the opinion that nVidia should not be overriding user-selected trilinear filtering with this bi/tri mix. That would be the reason you consulted nVidia about changing that aspect of the driver, correct?
2. In light of point #1, do you think it's fair to compare benchmarking numbers from ATI's full tri filtering to nVidia's bi/tri mix? Has [H] published any reviews using the so-called "cheat" drivers, making such a direct comparison, and if so, will those numbers be pulled or updated when nVidia delivers the drivers that enable full tri filtering in UT2K3?
3. Knowing that nVidia is not doing full trilinear filtering (despite being capable of it and choosing to override the setting), do you think a direct comparison to ATI while doing full trilinear is a valid performance comparison?
4. It has come to light that ATI has a very similar bi/tri filtering mix to what nVidia is doing. Do you think that it would be fair to compare that bi/tri mix to nVidia's bi/tri mix, rather than comparing the bi/tri mix of nVidia to the full tri of ATI?

5. IOW, as one person put it to me very succinctly, he did not feel a direct comparison was valid between:
A. a card capable of full trilinear, but using a bi/tri mix
and
B. a card capable of a bi/tri mix, but using full trilinear.
What is your comment on this?
 
Park the Shark said:
...
You make a point about "cherry picking" screen shots, motion involved with tri filtering, etc. You do realize that you're essentially saying there's no way to prove that nVidia is damaging IQ, short of posting raw video captures.

No, that is wrong. I said you can't prove they are not damaging image quality short of video captures (or tools like colored mip levels), but I said you can prove that they are damaging image quality. I also explained why: in short, seeing it in still screen shots means you will see it in motion, but seeing it in motion does not mean you will see it in still screen shots...and games are played in motion.

Example of the logical issue here:

You have two metal junction boxes, each with several bare wires. In one box, you see sparks. In the other, you do not. The sparks in the one box are proof that it is dangerous. The absence of sparks in the other box is not proof that the box is safe. Colored mip map levels are like checking whether the wires are bare (or whether there are wires at all) in the first place...whether there will be a problem when you stick your hands in the junction box depends on other factors in the situation.
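For reference, here is a rough sketch of how the colored-mip-level tool works (assuming an existing OpenGL context; the function name, texture size, and colors are my own choices): give each mip level of a texture its own solid color, then look at the rendered scene. Hard color bands mean per-level (bilinear) filtering; smooth gradients between the colors mean the levels are being blended (trilinear).

```c
/*
 * Rough sketch, assuming an existing OpenGL context. Every mip level
 * of a 256x256 texture gets its own solid color, so the filtering
 * behavior becomes visible in the rendered scene.
 */
#include <GL/gl.h>
#include <string.h>

static void upload_colored_mipmaps(void)
{
    /* One RGB color per mip level (256x256 has 9 levels, 256..1). */
    static const unsigned char colors[9][3] = {
        {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
        {255, 255, 0}, {255, 0, 255}, {0, 255, 255},
        {255, 128, 0}, {128, 0, 255}, {255, 255, 255}
    };
    static unsigned char buf[256 * 256 * 3];
    int level, size, i;

    for (level = 0, size = 256; level < 9; level++, size /= 2) {
        for (i = 0; i < size * size; i++)
            memcpy(&buf[i * 3], colors[level], 3);
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf);
    }
    /* Request trilinear; whether the driver honors it is the question. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
}
```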

No one is going to do that for bandwidth reasons.

I think Wavey (DaveBaumann) has been pointing to example shots where he maintains the mip map transition is evident even in still screenshots. Please note, if you are still misreading my comments, that this does not contradict what I actually said. Please direct some discussion towards those examples when raising your questions about being able to see the issue.

More accurate to say that "[H] has not shown obviously noticeable still-screenshot IQ degradation for the maps selected", and neither has anyone else whose posted screenshots you may have looked at.

Well, that's what I'm asking... Has ANYONE posted a screenshot showing decreased image quality?
I believe Wavey has been tackling this...?

I have already said that I haven't seen any. I've looked. Haven't found them yet. Got a link for me?

I've seen such pictures, and noticed the transitions in them, and I believe Wavey provided them at the time. I presume he is linking to such pictures in his discussion with you above. Does that serve?

Also, does an accurate reading of my statements about still screenshots help concerning understanding this issue?

It's not like the nVidia cards can't do trilinear...nVidia decided to disable it for fps boosting.

I don't think anyone has disagreed with THAT point.

Well, there was a failure to represent the issue to readers of at least one of the site's reviews.

5. nVidia has promised to resolve the issue by providing the end-user control over AF.

Note that nVidia's review guidelines already propose that the end-user has control over trilinear filtering ("Quality" mode), though, as evidenced by the application detection in UT2k3, that is not the whole truth. The whole truth about the change remains to be seen.

So is the bottom line that people want [H] to put on their front page something like "nVidia lied, saying they would give us true tri filtering, but gave us a bi/tri mix"?

Actually, I think I'm seeing your point... Essentially, many people are feeling that [H] is supporting the "cheating" because despite KNOWING that nVidia is not doing full tri filtering, they are maintaining that it is fair to compare directly to ATI who IS performing full tri filtering. I do think that is a valid point.

That is the issue with regard to the recent UT2k3 review and the UT2k3 bi/tri article. The controversy involves quite a bit more besides just that issue, but the rest relates to it, as I outlined.

Furthermore, since ATI does have a very comparable setting, the two should be compared at that setting. Certainly it is NOT fair to set both to full tri-filtering when we know the nVidia card is going to override that.

I'm going to post this directly over at [H] and see if Kyle will respond.

Well, FYI, similar things have been posted there. One polarizing incident, specifically, is that Wavey posted such at [H], in a post that seems by any measure not to have broken the forums' rules of conduct, but was pointed in the factuality and details it presented (I'd choose the term "inconvenient" for how it illustrated issues with Kyle's "take" on trilinear filtering). What is polarizing about this instance is that Kyle banned Wavey as a result of it.

This is (one) incident that adds to the other factors I already mentioned.

You had some other points on why, in general, many do not like Kyle. I'm not into all that; hell, just this bi/tri issue is complicated enough.

Well, you did ask. You might not be interested in all the details, but all of the details relate to the "controversy"; perhaps you can at least understand the specific flaws in just this issue and see why it is related to the broader controversy.

I'll just let that go by saying that I think there must be some happy medium between allowing folks to go off flaming and deleting something just because you don't like it.

Well, the thing is Kyle allows himself to go off "flaming" and deletes things just because he doesn't like them. :-?

I personally have no knowledge of the situation and don't care to get involved with it--unless he starts deleting MY posts. :) Seriously, though, I have noted he sometimes does have an emotional chip on his shoulder.

Well, you are free to have no knowledge of it, or not to get involved in it, but you can't change the fact that it is part of the answer to the question you asked about the "controversy" with [H]. However, I don't recommend trying to conduct a discussion by dictating that everyone else must view things in similar isolation.
 
Geez, here he pretty much condones what Nvidia has done as being responsible to their stockholders. He has a very twisted and warped mind. Imagine what's gonna happen once he figures out Nvidia is using him. Reposted here in case it's nuked on the hardforum:



"I think NVIDIA does everything they can do to beat the competition on every level that they can. ATI should be doing the same thing. If neither is, then they are not being responsible to their stockholders and that is simply the fact of the matter."

In the current context of what Nvidia has done to keep up with the competition, this statement is pretty awful. It is one step away from you actually condoning what Nvidia has done lately.
They have cheated in 3DMark, cheated in ShaderMark, we've seen common timedemos they've cheated in, and now there's the whole bi/tri issue.

Being competitive is one thing, but outright trying to fool your customers and OEMs by artificially bumping up numbers you know are going to be used in reviews is simply fraud.

No matter what anyone thinks of 3DMark, the reality is that it is used a lot in benchmarks and by OEMs to determine the outcome of reviews and which graphics chip a company uses in its PCs.

To try and play it off like these driver cheats were just Nvidia trying to help their stockholders is condoning what would be considered a federal crime in other industries. I'm quite saddened that you have chosen to condone Nvidia's recent actions as merely those of a company trying to stay competitive; that is a serious flaw in the ethics of a company offering a product to its customers.
 
I've got to agree with Swanlee 100%: this is clearly consumer fraud at work, and [H] is supporting it.
 
Bouncing Zabaglione Bros. said:
They still don't admit to Nvidia doing any cheating, and still say that even if they did cheat, it does not matter because "you can't see it" or they don't like that particular benchmark.
FrgMstr said:
I think that the optimization techniques that we have recently seen used by NVIDIA in 3DMark03, in principle violate the very fabric that our community and industry is held together with.
http://www.hardforum.com/showthread.php?s=&threadid=647147&perpage=15&pagenumber=3
;)
 
Park the Shark said:
DaveBaumann said:
Can you see the different mipmap levels?

It looks odd to me... like it gets clear then blurry, then clear... But I still find it hard to make any judgement without a comparison.

That is because it is not doing trilinear. If it was, you wouldn't be seeing the different mipmap levels (i.e., where it goes from blurry to clear to blurry, etc.).
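To make that concrete, here is a minimal sketch (my own illustration, not any driver's actual code; sample_bilinear is an assumed helper) of the difference: bilinear reads only the single nearest mip level, so the screen snaps between levels at a visible boundary, while trilinear blends the two nearest levels and hides the seam.

```c
/*
 * Minimal sketch of trilinear filtering, my own illustration.
 * Bilinear-only filtering samples just the nearest mip level;
 * trilinear linearly interpolates between the two nearest levels
 * using the fractional level-of-detail, so no hard transition shows.
 */
typedef struct { float r, g, b; } Color;

Color sample_bilinear(int level, float u, float v); /* assumed helper */

Color sample_trilinear(float u, float v, float lod)
{
    int   lo = (int)lod;        /* nearer (sharper) mip level */
    float t  = lod - (float)lo; /* blend weight toward the next level */
    Color a  = sample_bilinear(lo,     u, v);
    Color b  = sample_bilinear(lo + 1, u, v);
    Color c  = { a.r + t * (b.r - a.r),
                 a.g + t * (b.g - a.g),
                 a.b + t * (b.b - a.b) };
    return c; /* bilinear-only would simply return a, ignoring t */
}
```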

Take a look at this:
http://www.3dcenter.de/artikel/ati_nvidia_treiberoptimierungen/screenshot05+06.php Top shot is bi/tri, bottom shot is full trilinear. Can you see a difference? You can see a mouseover version here http://www.3dcenter.de/artikel/ati_nvidia_treiberoptimierungen/index3.php

This is one of the big problems with [H]'s article. They did not compare NVIDIA's tri/bi optimization to NVIDIA's normal full trilinear. They compared ATI's tri/bi to NVIDIA's tri/bi, and somehow came to the conclusion that NVIDIA's tri/bi did not decrease image quality.

Or perhaps what they wanted to compare was ATI's trilinear AF to NVIDIA's tri/bi AF. Which AFAIK they did not do. Now, why did they not do this? Because although trilinear was set in the game, AF was set via the control panel, which AFAIK overrides the game setting. So ATI was not using trilinear AF.

Regardless, obviously the best comparison would be NVIDIA to NVIDIA, not NVIDIA to ATI. Interestingly enough, Brent concludes that NVIDIA tri/bi 0xAF looks better than ATI trilinear 0xAF.
 
Park the Shark said:
Now, if [H] was truly nVidia-biased, wouldn't they have been singing the virtues of 3DMark, since nVidia's NV30 was the leader in 3DMark when [H] was blasting it?

No, because NVIDIA wrote up a nice presentation on why 3DMark03 was useless and unfair. If [H] is NVIDIA-biased, as many believe, then they will agree with NVIDIA, not disagree.

Also, NVIDIA only won with cheating drivers. You can see NV30's real 3dmark03 performance in articles that benched it when the 330 patch first came out, which eliminated cheats. Or, you can do it yourself by downloading RivaTuner and running the anti-detection script.
 
micron said:
Bouncing Zabaglione Bros. said:
They still don't admit to Nvidia doing any cheating, and still say that even if they did cheat, it does not matter because "you can't see it" or they don't like that particular benchmark.
FrgMstr said:
I think that the optimization techniques that we have recently seen used by NVIDIA in 3DMark03, in principle violate the very fabric that our community and industry is held together with.
http://www.hardforum.com/showthread.php?s=&threadid=647147&perpage=15&pagenumber=3
;)

Micron, I got a cavity just from reading that candy-coated remark. ;) My wife is a teacher. There have been times when she was forced to write things like "we will continue to help Bobby with his peer-to-peer interactions" when she wanted to say "Bobby really needs to stop hitting his classmates." Why is it so difficult for Kyle to tell his readers, in plain and simple language, that Nvidia cheated?
 