Help me understand the AF/[H] controversy

nelg said:
Why is it so difficult for Kyle to tell his readers, in plain and simple language, that Nvidia cheated?
A. Because he's afraid Brian and Derek wouldn't be his friends anymore.

B. He has a severe truth impediment.

C. He has actually started to believe the BS he's been spouting, so anything that is contrary to it is automatically wrong or doesn't exist in his mind.

D. He pretty much bet his reputation on his ability to help nVidia pull this one off and it would be counter-productive to that agenda to say anything like that.

Pick the one you like best. :)
 
All I can say is I came here trying to see what people were upset about, and I can clearly see it at this point... To my direct questions, Kyle responded:
You have asked some extremely broad questions here that do not make specific references so I will have to make some assumptions to answer....

1. That is not my opinion and I don't think I have ever stated that. I think there should be an option in the driver to let the more experienced user allow the game to set the different levels of AA/AF/Tri/Bi if that is what he wants to do. That is part of the reasoning behind asking for the change. Another part of that is to serve our own selfish needs to facilitate apples-to-apples benchmarking. Still, the user should be the main reason the change was made IMO.

2. I will have to assume this question is in direct reference to UT2K3. This article will directly answer your question and also inform you of why it was written.

3. That would need to be evaluated on a case-by-case basis. I don't think there is a black and white answer to this question.

4. I think IQ would need to be evaluated to make that determination.

5. I really don't have any comment on that person's opinion. Everyone is certainly entitled to their own.

My response was:
I'm going to start with point 5, since that was just incredible.

5. IOW, as one person put it to me very succinctly, he did not feel a direct comparison was valid between:
A. a card capable of full trilinear, but using a bi/tri mix
and
B. a card capable of a bi/tri mix, but using full trilinear.
What is your comment on this?



FrgMstr said:
You have asked some extremely broad questions here that do not make specific references so I will have to make some assumptions to answer....

5. I really don't have any comment on that person's opinion. Everyone is certainly entitled to their own.



I don't think the questions could be much more specific, unless you are intentionally LOOKING to interpret them as vaguely as possible. I would certainly say that the questions were more specific than the answers.

To this direct and specific point:
It is invalid to compare benchmarks for:
A. a card capable of full trilinear, but using a bi/tri mix
and
B. a card capable of a bi/tri mix, but using full trilinear.

Your answer is "no comment"....

Now, this is getting plain odd... On point 1... I said "I think you're of the opinion that nVidia should not be overriding user-selected Trilinear filtering with this Bi/Tri mix." and you say "That is not my opinion and I don't think I have ever stated that. I think there should be an option in the driver to let the more experienced user allow the game to set the different levels of AA/AF/Tri/Bi if that is what he wants to do."

WTF is the difference here? If you think the user should be allowed to set the damn settings, then you obviously don't think nVidia should be overriding the user selected settings.

On point 2... In response to "do you think it's fair to compare benchmarking numbers from ATI's full tri filtering to nVidia's bi/tri mix?" you do not find that specific enough to say simply yes or no? Further on point 2, in response to "will those numbers be pulled or updated when nVidia delivers the drivers that enable full Tri filtering in UT2K3?" you do not find that specific enough to say simply yes or no?

On points 3 and 4... I think what would solve this issue is simply to complete your homework, so to speak. If you are going to let nVidia have a CHANCE to run a bi/tri mix, why is ATI not given that same chance?

It's obvious that Kyle does not intend to change his mind. Since he is not going to admit the comparison is invalid, we are left only to speculate as to his reasoning. I'm often one to give a person the benefit of the doubt...

I think he's probably more stubborn than willingly biased towards nVidia. You see what you want to see, right?
 
Thanks for coming here Park the Shark. I think this thread has been very positive, largely due to your willingness to see all points of view. I hope you continue to hang out here. :D
 
micron said:
Bouncing Zabaglione Bros. said:
They still don't admit to Nvidia doing any cheating, and still say that even if they did cheat, it does not matter because "you can't see it" or they don't like that particular benchmark.
FrgMstr said:
I think that the optimization techniques that we have recently seen used by NVIDIA in 3DMark03, in principle violate the very fabric that our community and industry is held together with.
http://www.hardforum.com/showthread.php?s=&threadid=647147&perpage=15&pagenumber=3
;)

And straight afterwards, the kicker is:

Frgmstr said:
We have suggested that reviewers stop using 3DMark03 as it obviously represents nothing pertaining to real world gaming.

Why? Because Nvidia "optimise" for it. Nvidia also "optimise" for UT2K, but I don't see Kyle advocating that people stop using UT2K as a benchmark. Kyle either does not understand what 3Dmark2003 does, and why it is different from gaming benchmarks, or is entrenched in his position after swallowing and parroting the Nvidia anti-3Dmark document at the beginning of the year.

To lambast 3Dmark2003 as useless whilst letting Nvidia continue to "violate the very fabric that our community and industry is held together with" is nonsense. A couple of recent, qualified, grudging statements tucked away in the forums do not counterbalance the damage done by his implicit and explicit support of Nvidia's actions over a long period in the main articles, reviews and news features of [H].


That grudging concession, where he can't even bring himself to use the word "cheat", continuing to sugar coat it with the word "optimisation", is brought to us by the man who says:

Frgmstr said:
I can't really say that that optimization is "wrong" or "right" and I don't think that is the way to go about characterizing it. We are in this huge gray area and I don't think you will see any lines drawn that are specific. Game specific optimizations are here to stay, that is something I believe. It is our job as a review site to make sure that a certain level of image quality is maintained. I think we fully demonstrated this earlier this year with our 5200/5600 article. I think this is something we do with every review we write. I think from reading the linked article you will see that we thought the IQ was sliding a bit too far towards what you might say is "wrong". We made huge efforts to get that fixed and overall the problem was solved to some extent, although this UT2K3 mipmap issue has clouded the air.

I look at it this way and have hinted at this above. NVIDIA and ATI have the right to do whatever they want to with their products and drivers. If you do not like the IQ that NVIDIA or ATI is producing, then you should not buy their card. This is a free market and unless any specific contracts are signed between game developers and IHVs regarding IQ they are free to do what they wish, whether we like it or not.

We feel that IQ and Performance are the two foremost issues that we should be representing to our readers for them to make an informed decision when buying a video card.

It's very little, very late, and the words still do not represent the *actions* that [H] take. For instance, look at the last sentence above, then compare it with the lowering of IQ in both synthetic and game tests (along with other cheats) that Nvidia have been doing, and how these have been ignored, or even championed in recent [H] articles.
 
Park the Shark said:
5. IOW, as one person put it to me very succinctly, he did not feel a direct comparison was valid between:
A. a card capable of full trilinear, but using a bi/tri mix
and
B. a card capable of a bi/tri mix, but using full trilinear.
What is your comment on this?

Ok, once again I will stubbornly bring up this point about [H]'s UT2003 filtering comparison.

Based on the information laid out in this thread,
http://www.beyond3d.com/forum/viewtopic.php?t=6719, trilinear AF was not enabled on the R9800. Based on Brent's own test settings, this is what he did:
The ATI driver control panel was set to "Application Preference" on both AA and AF so that we could test with AF disabled. Then we set the AF slider to Quality AF when AF was tested.

I have seen this brought up once, and no one confirmed or denied whether the testing was done correctly. When I have asked, I have gotten no answer. I don't want to spread false information. Based on what I have read about ATI's control panel and application settings, Brent did not set up the test correctly if he in fact wanted to use trilinear AF. Then again, while perusing the article, I did not actually find anywhere that he explicitly states ATI was using trilinear AF, although it is certainly implied, and I have seen people all over different forums saying that the [H] article "proves there is no difference between trilinear AF and tri/bi AF because of the comparison between ATI and NVIDIA performed in the article."

Now, I am reasonably sure I am correct and [H] did not test trilinear AF in UT2003. I am also astonished that none of ATI's staunch defenders have brought up this point. This makes me believe I may be in error.

Here are the facts about NVIDIA:
1) there is no way to get full trilinear filtering in UT2003 via the control panel or by changing the game settings. This optimization is triggered by the detection of UT2003.exe.
2) Full trilinear can be enabled by using RivaTuner and the anti-detection script.

Here are the facts about ATI:
1) By selecting 'Quality AF' in the control panel, trilinear is performed on texture stage 0 and bilinear on the other stages (see the sketch after this list). Normally this is fine and provides full IQ, but not in UT2003.
2) With AF off, full trilinear is performed, unlike with NVIDIA cards.
3) Full trilinear is attainable by enabling AF in UT2003 via the ini file and by setting AF to 'application' in the control panel <- Brent did not do this.
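
To make fact 1 above a bit more concrete, here is a rough D3D9-style sketch of what "trilinear on stage 0, bilinear on the other stages" amounts to in API terms. It is purely illustrative -- the real behaviour lives inside ATI's driver, and the function name, device pointer and stage count here are placeholders, not anything taken from the [H] article:

#include <d3d9.h>

// Illustrative only: approximates the filtering mix reportedly produced by
// ATI's "Quality AF" control panel setting -- trilinear (linear mip filter)
// on texture stage 0, bilinear (point mip filter) on the remaining stages.
void SetQualityAfStyleFiltering(IDirect3DDevice9* device, DWORD maxAnisotropy)
{
    const DWORD kNumStages = 8; // placeholder stage count

    for (DWORD stage = 0; stage < kNumStages; ++stage)
    {
        // Anisotropic minification with linear magnification on every stage.
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAnisotropy);

        // The mip filter is what separates trilinear from bilinear: LINEAR
        // blends between adjacent mip levels (trilinear), while POINT snaps
        // to the nearest mip level (bilinear), which is where the visible
        // mip banding comes from.
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER,
                                stage == 0 ? D3DTEXF_LINEAR : D3DTEXF_POINT);
    }
}

Full trilinear AF would simply be D3DTEXF_LINEAR on every stage's mip filter, which is what the ATI side of the comparison would need to be using for the benchmark numbers to mean what people think they mean.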

There are a few possibilities here.
1) Brent mistakenly listed the test setup and trilinear AF was used for ATI.
2) Brent knowingly did not test trilinear AF for ATI.
3) Brent unknowingly did not test trilinear AF for ATI.
 
The FiringSquad Asus FX 5900 Ultra review has some words and screenshots about the tri/bi issue

Link

Dave Baumann of Beyond3D discovered that GeForce FX cards render a form of quasi trilinear filtering, even with the driver running in quality mode. This is important because by using a mix of bilinear and trilinear, performance is enhanced. When you factor in the significance of Unreal Tournament 2003 not just as a game, but as a performance benchmark used by numerous publications, quite a few people became concerned about the legitimacy of testing results taken on this website as well as countless others.
;)
 
Frgmstr/Kyle said:
We have suggested that reviewers stop using 3DMark03 as it obviously represents nothing pertaining to real world gaming.
He has either missed the point about 3DMark03, or he believes/accepts FM's "The Gamers' Benchmark" slogan about 3DMark03 (which is not quite understandable given that I assume he is a mature guy), or he is just being anal-retentive (i.e. "FM says their 3DMark03 is a Gamers' Benchmark when we all know it is not").

3DMark03 is nothing more than a suite of technology demos. I, as part of a website that is FM's beta partner, have said publicly that I do not agree with FM's "The Gamers' Benchmark" slogan (which I have voiced to FM in their private/NDA'ed solicitation of their beta members for the next 3DMark). We all know that it is not "a gamers' benchmark"... dwelling on this shows that a person is just ignoring the obvious while also wanting to knock FM for their obviously-incorrect slogan.

If FM is not allowed, or not supposed, to make a benchmark that attempts to show the benefits of the latest 3D technologies -- technologies advertised prominently by FM and the IHVs themselves, and reported just as prominently by 99% of the websites out there in their p/reviews of the latest video cards -- then pray tell, what is the purpose of websites featuring those same technologies so prominently in their previews of the latest video cards?

Because the latest technologies are exciting (surely this can't be argued... why else would websites talk about such technologies in their previews of the latest video cards?). Because the only way to show them off is via technology demos (ditto), not games. Because FM happens to show them off probably the earliest. Because FM does not care whether a benchmark that utilizes the latest 3D technology makes as much money as games do, nor does FM need to care as much about support/compatibility as game developers do with their games -- FM makes it a point that either you have the latest video card or you don't, and both the score and the performance will be affected accordingly.

Kyle/the public is right to knock FM about 3DMark not being representative of games, because FM itself never believed it to be so, regardless of their marketing slogan for 3DMark. But when I questioned Kyle about his opinion that "3DMark03 is useless" -- pointing out that he had also told me via email that he would not object to using synthetic benchmarks in [H]'s articles/reviews, that 3DMark03 really is nothing more than a suite of synthetic benchmarks, and that I was therefore a little puzzled that he calls 3DMark03 useless while agreeing that synthetic benchmarks have their place at [H]... so why not use 3DMark03?... he did not reply.

Kyle is bent on, and publicizing the fact, that he/[H] will not use 3DMark03... because he has said so and is not willing to retract it... but at the same time he acts very strangely about the whole matter. It is strange because he knows 3DMark03 is nothing more than a suite of synthetic demos (yet he will knock FM/3DMark03 for its "Gamers' Benchmark" slogan) while he has acknowledged that synthetic benchmarks have their place. Huh? Someone say "an agenda... or a refusal to acknowledge/admit mistakes".

Kyle will never admit he is wrong wrt FM/3DMark03. Because NVIDIA will never admit the same.
 
Re: Point-o-clarification please?

digitalwanderer said:
So has nVidia agreed to fix the tri/bi issue or the AF bug or both?

As I understand it from Kyle, what nVidia have promised to do is reintroduce the 'Application' setting in their driver control panel, so that an application can force its own aniso settings.

However, what hasn't been promised is that using this 'Application' setting will override nVidia's application specific optimisation in UT2003. Kyle seems to be guessing that it will (which is the reason why he says that the tri/bi problem was 'fixed before it was started'), but at no point have nVidia promised to remove this specific optimisation or make it user-controllable (at least not publicly).
 
Reverend said:
Frgmstr/Kyle said:
We have suggested that reviewers stop using 3DMark03 as it obviously represents nothing pertaining to real world gaming.
He has either missed the point about 3DMark03, or he believes/accepts FM's "The Gamers' Benchmark" slogan about 3DMark03 (which is not quite understandable given that I assume he is a mature guy), or he is just being anal-retentive (i.e. "FM says their 3DMark03 is a Gamers' Benchmark when we all know it is not").
Well, I don't know about Kyle, but shouldn't FM remove that slogan? I would call it false advertising... I don't mind, I'm not using 3DMark, but it is billed as a "gamers' benchmark" with "game tests" when in fact it isn't.
 
Reverend said:
Kyle is bent on, and publicizing the fact, that he/[H] will not use 3DMark03... because he has said so and is not willing to retract it... but at the same time he acts very strangely about the whole matter. It is strange because he knows 3DMark03 is nothing more than a suite of synthetic demos (yet he will knock FM/3DMark03 for its "Gamers' Benchmark" slogan) while he has acknowledged that synthetic benchmarks have their place. Huh? Someone say "an agenda... or a refusal to acknowledge/admit mistakes".

Kyle will never admit he is wrong wrt FM/3DMark03. Because NVIDIA will never admit the same.
Yeah, I'm afraid you're right on that one.

I never knew Kyle wanted to be a cop or a politician when he grew up; it explains a LOT! The man is pretty much incapable of giving a straight answer on this issue since the truth would be embarrassing.

I'm just wondering whether yesterday he had an actual epiphany that he was wrong, whether he was drunk, or whether someone slipped some sodium pentothal into his drink... either that or he thought he could wordsmith his way out of it by redefining the word "is". :rolleyes:

I don't get what the point of his thread was if it was just gonna be another song-n-dance. :( ("Pssssssst! Dig, it was damage control!")
 
On 3DMark03, maybe he's wrong, but he listed:
http://www.hardocp.com/article.html?art=NDI4LDM=

If that information is in fact correct, I agree that 3DMark is not very useful... You have DX7 and 8 tests, done in apparently weird ways:

ExtremeTech said:
The sky in Game Test 4 uses Pixel Shaders 2.0, and so having to first draw the entire sky, only to have scene objects occlude it later is an expensive way to render, almost like a back-to-front Painter's Algorithm. Using clip planes to ignore drawing non-visible parts of the sky can save considerable processing time and bandwidth -- although we don't know exactly how much performance benefit nVidia receives. Oddly enough, nVidia hasn't published a registry setting bit to enable/disable this "feature." And while one could take FutureMark to task for drawing the scene in this manner, it is nonetheless what the application does, and we think it should be executed correctly in the hardware.
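
For anyone unsure what "using clip planes" means in practice, here is a hypothetical D3D9 sketch of a static user clip plane. The plane values are made up, and nobody outside NVIDIA knows exactly how or where their driver inserted the planes -- this only shows the general mechanism ExtremeTech is describing:

#include <d3d9.h>

// Hypothetical illustration of the technique ExtremeTech describes: along a
// fixed benchmark camera path, a driver (or application) can enable a static
// user clip plane so that geometry the camera never sees -- most of the sky,
// for instance -- is clipped before it is shaded, saving fill rate and
// bandwidth. The plane coefficients below are placeholders.
void EnableStaticClipPlane(IDirect3DDevice9* device)
{
    // General plane equation a*x + b*y + c*z + d = 0; vertices on the
    // negative side of the plane are clipped.
    const float plane[4] = { 0.0f, -1.0f, 0.0f, 100.0f };

    device->SetClipPlane(0, plane);                           // user plane 0
    device->SetRenderState(D3DRS_CLIPPLANEENABLE, D3DCLIPPLANE0);
}

Because the camera never leaves its rails, a carefully placed plane like this never cuts anything the viewer actually sees, which is why it only becomes visible once the camera is moved off the scripted path.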

There are a few DX9 pieces (and DX8 as well) thrown in only on the Mother Nature "game" (and apparently the four game tests are all that matter for the overall score).

So I think the reason 3DMark was concluded to be useless is that if you want to test DX7 and 8, you should do it with real games... And if you want to test DX9... Don't you need more than a few seconds of a DX9 test, and isn't it muddling to make that only ~30% of the overall score?

The problem is not necessarily that 3DMark is synthetic--it just doesn't have much basis in reality--particularly the overall score.

I think it's a good utility for measuring fillrate. I could understand using the Mother Nature score as a forward-looking comparison... But there are simply much better tools for DX7 and 8 testing out there.

I happen to agree with Kyle that synthetics have their place. For me personally, that place is to test things like fillrate for video cards, or memory bandwidth on a motherboard, or sustained transfer rate on a hard drive. Those are all examples of trying to find the limit of actual throughput to compare against a manufacturer's listed specification.
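
To give a concrete (and entirely made-up) example of what I mean by comparing measured throughput against a listed spec, here is a trivial sketch; the clock, pipeline and measured numbers are hypothetical, not any particular card's:

#include <cstdio>

int main()
{
    // Hypothetical board specs -- placeholders, not any real product's numbers.
    const double coreClockMHz   = 400.0;  // core clock in MHz
    const double pixelPipelines = 8.0;    // pixels written per clock

    // Theoretical single-texturing fill rate = clock * pipelines.
    const double theoreticalMPix = coreClockMHz * pixelPipelines;  // Mpixels/s

    // Pretend this number came out of a synthetic fill rate test.
    const double measuredMPix = 2700.0;

    std::printf("Theoretical fill rate: %.0f Mpixels/s\n", theoreticalMPix);
    std::printf("Measured fill rate:    %.0f Mpixels/s (%.0f%% of theoretical)\n",
                measuredMPix, 100.0 * measuredMPix / theoreticalMPix);
    return 0;
}

That gap between theoretical and measured is exactly the kind of thing a synthetic test is good at exposing, and it is a comparison a game benchmark can't give you.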

The 3DMark overall score is pretty pointless, then... I still like the fillrate testing, though.
 
Park the Shark said:
If that information is in fact correct, I agree that 3DMark is not very useful... You have DX7 and 8 tests, done in apparently weird ways:

3Dmark2003 is different from other tests in that it is a loading test. It is like measuring the breaking strain of a steel cable. In the real world, you don't use a cable to the point where it breaks straight away, but the breaking strain is a useful metric if what you want to know is the strength of the thing (and thus its SWL, or safe working load).

To claim it is useless is like saying the top speed of a car is a useless metric because "people don't actually drive around town at 150 mph". That doesn't make knowing the top speed any less useful a metric for comparing cars if you understand the context of the results. Can you imagine a car magazine telling its readers, "well, we don't believe that top speed or 0-60 times are worth telling you, because no one drives like that"?

Now you may claim that what 3DMark2003 measures is pointless to you (which is a different discussion) but that doesn't invalidate it as a tool, any more than testing a car's top speed is invalidated by the fact that you won't ever be driving that fast.

3DMark2003 is designed to measure certain things in a certain way so that you can get a comparison between cards. To complain that it doesn't measure *other* things in *different* ways is a pointless exercise. It does what it is designed to do, and tests things which give us different types of information from running UT2K or Quake.

Don't forget that 3Dmark2003 was written this way because of input from many manufacturers, *including* Nvidia. They wanted a heavy, loading based test that was divorced from whether there was a fast/slow CPU in the PC. It's just that Nvidia has managed to muddy the waters by reeling in people like Kyle to give out a confusing message to users so that they can hide the poor performance of the GFFX cards behind this "it's not a real test, it's not a real game" smokescreen.

No it's not a real game - that's why it was developed the way it is.
 
Park the Shark said:
The 3DMark overall score is pretty pointless, then... I still like the fillrate testing, though.

Actually, if a benchmark is done right and no IHVs are actually cheating, I believe that an overall score is not useless but potentially valuable. It could show you, relatively speaking, that Card A is faster than Card B on Machine C. I also believe that 3DMark 99/Max, and to a certain degree 3DMark 2001, were successful in doing that. You could look at the scores in those tests and actually say that a GeForce3 was faster than a Radeon 7500 in almost every application. Unfortunately, the major changes to 3DMark03, like no longer using the Max Payne engine and more GPU-intensive tests, have clouded the use of the overall score in some people's eyes. It also didn't help that the performance of the R3xx and NV3x series is probably too close to call, and thus the need to look at subjective tests like image quality. Chalk it up to bad timing. However, can you still use the 3DMark03 overall score for simple comparisons? I think so, if you understand how the benchmark works, are using it under the right conditions, and provided drivers are no longer optimizing for the tests.

Tommy McClain
 
What is most ironic here is that it was Nvidia themselves who greatly helped 3DMark become the standard in 3D benchmarks. They are the ones that used its scores in every PR release and linked to them off their main website.
When their cards were doing well, Nvidia promoted the heck out of 3DMark, and they are mostly responsible for how widely used 3DMark is.

Now, all of a sudden, their cards look like crap in 3DMark because they are crap and do not follow standard APIs, and they need cheats to do well. Nvidia got ticked off that Futuremark actually made a good GPU bench that strictly followed the DX API, while Nvidia's new cards need custom paths to perform well, because Nvidia is trying to control the API much like 3dfx did with Glide.

Nvidia is pretty much following 3dfx's blueprint to disaster to a T, except now they are taking it a step further and purposely misleading people.
 
AzBat said:
Park the Shark said:
The 3DMark overall score is pretty pointless, then... I still like the fillrate testing, though.

Actually, if a benchmark is done right and no IHVs are actually cheating, I believe that an overall score is not useless but potentially valuable. It could show you, relatively speaking, that Card A is faster than Card B on Machine C. I also believe that 3DMark 99/Max, and to a certain degree 3DMark 2001, were successful in doing that. You could look at the scores in those tests and actually say that a GeForce3 was faster than a Radeon 7500 in almost every application. Unfortunately, the major changes to 3DMark03, like no longer using the Max Payne engine and more GPU-intensive tests, have clouded the use of the overall score in some people's eyes. It also didn't help that the performance of the R3xx and NV3x series is probably too close to call, and thus the need to look at subjective tests like image quality. Chalk it up to bad timing. However, can you still use the 3DMark03 overall score for simple comparisons? I think so, if you understand how the benchmark works, are using it under the right conditions, and provided drivers are no longer optimizing for the tests.

Tommy McClain

The "funny" part for me is that one of my long-standing objections a couple of years ago to using 3DMk01 was that it was based on the Max Payne engine, which was only one game engine out of the myriad game engines 3D games are built around. IE, running Max Payne wasn't going to help me determine how Q3 or UT were going to run on my hardware. So I feel that if anything, coming up with a synthetic vpu-stressing 3D benchmark makes all the sense in the world, especially considering the fact that vpus are continuing to distance themselves from cpus in terms of 3D functionality and capability. To my eyes being vpu-dependent serves to legitimize 3DMk03 as a real "3D" benchmark for the present and future, which is just what I think FM intended it to be.

I mean, if you want to get a measure of current in-game frame rates, then run the games you want to know about--i.e., don't run Max Payne if you are interested in how your system runs Q3--run Q3. Or, if you are interested in looking at the performance of other games, run them and use either their built-in utilities to measure performance, or run something like Fraps. But if you are interested in investigating your hardware's features which are unsupported or only partially supported in your current games--just because you are interested--then something like '03 fits the bill perfectly, I think. This is the general purpose for which all synthetic benchmarks exist. Are all synthetics "bad" because they don't describe Q3 performance? I can't see how or why, since you are using the software for different purposes. Certainly, it would be exactly the same mistake as someone running Q3 in a given hardware environment and declaring that the Q3 results extrapolate to all other 3D games running on the same hardware--not so...

That's why I think the whole "anti-3dMk03" crusade is dishonest and misrepresentative on its face.
 
WaltC said:
That's why I think the whole "anti-3dMk03" crusade is dishonest and misrepresentative on its face.
I agree, but I don't disagree with Kyle's take on benching entirely either. I think that pushing for in-game benching is a GOOD THING(tm); I just don't see why you can't use 3dm2k3 in addition to real-world testing... the way it was meant to be used. :(
 
Evildeus said:
Well i don't know about kyle, but shouldn't FM remove that slogan?
Like I said, I have suggested/told FM to drop this for the next 3DMark. Of course, it remains to be seen how successful I will have been in this respect :)
 