OMG HARDOCP REVIEW OF UT2003 AND FILTERING


DAVE H:
You don't find it misleading that the "quality" option shows 100% trilinear filtering in our aniso test apps? (The tunnel test).
Or that Quality apparently means full-level aniso in Morrowind and Freelancer? (according to the AMDMB look at the issue)

You don't think THAT might be enough to tell people that "quality" means trilinear????

You don't find that to be dishonest? To change what a setting means for just one game? (and a heavily benchmarked one at that!)
By itself, this breaks one of nVidia's optimization rules - it's only good for a benchmark. If it was good enough for all games, then there wouldn't be any trickery...
 
RaolinDarksbane said:
Heh, it's kinda amusing that Kyle closed a 21-page thread (the UT2k3 filtering article) citing that it is too long and should be continued in another thread. He then closed that second thread when it barely hit its second page.

Kyle is really working overtime on damage control, the way he bans people, replies to posts with mostly semantics, closes threads, and deletes countless posts.

I don't feel sorry for the poor bastard, from what I heard of him and read from a few of his replies in those threads. He really had it coming. Kinda ironic that his site is getting so many hits for his FUD.

I just checked the site to pull some quotes and apparently he's taken the whole article down now, citing technical difficulties with his page display software. Heh...;) I guess the technical problems were specific to that one particular article. Wouldn't it be ironic if nVidia asked him to pull it, citing "trade secrets" yet again.... ? Well, I guess I'll have to wait until it's back up...

Kyle on the [H] front page said:
I have currently taken down the UT2K3 Texture Filtering article. We did some major hardware installs this weekend and have been having some page issues ever since. We are trying to locate the source of our issues and this is simply a test. The article will be put into rotation again soon.
 
Dave H said:
Understood. I didn't realize quite how explicitly the claim had been made that 'quality' meant trilinear. I understand why you'd be pissed about this.

But to me this is still different than if the setting was labelled "trilinear filtering" in the drivers themselves. As it is, Nvidia lied (or perhaps, if you could examine an exact transcript of the meeting you'd find in hindsight that they'd merely carefully misled, as they did with the whole 8x1/4x2 mess) to reviewers, and thus facilitated a round of incomplete if not partially misleading reviews. That's quite bad. But at least it's been discovered and disseminated, and any future review failing to discuss the issue will have only the reviewer's incompetence to blame. Only that fraction of consumers who pay attention to the review sites were ever affected, and presumably most are now aware that the issue exists (although for some too late to affect their purchase decision). Those who don't read the review sites were never under any impression that 'quality' necessarily meant trilinear in the first place, only that it offers sufficiently higher quality than 'performance', which it does.

That's different than if Nvidia had actively misled their users by misrepresenting what the slider does in the drivers. I suppose it's a subtle difference, but it's worth something IMO.

I'm bringing this bit out into a discussion here:

http://www.beyond3d.com/forum/viewtopic.php?t=7025
 
WaltC said:
RaolinDarksbane said:
Heh, it's kinda amusing that Kyle closed a 21-page thread (the UT2k3 filtering article) citing that it is too long and should be continued in another thread. He then closed that second thread when it barely hit its second page.

Kyle is really working overtime on damage control, the way he bans people, replies to posts with mostly semantics, closes threads, and deletes countless posts.

I don't feel sorry for the poor bastard, from what I heard of him and read from a few of his replies in those threads. He really had it coming. Kinda ironic that his site is getting so many hits for his FUD.

I just checked the site to pull some quotes and apparently he's taken the whole article down now, citing technical difficulties with his page display software. Heh...;) I guess the technical problems were specific to that one particular article. Wouldn't it be ironic if nVidia asked him to pull it, citing "trade secrets" yet again.... ? Well, I guess I'll have to wait until it's back up...

Kyle on the [H] front page said:
I have currently taken down the UT2K3 Texture Filtering article. We did some major hardware installs this weekend and have been having some page issues ever since. We are trying to locate the source of our issues and this is simply a test. The article will be put into rotation again soon.

Don't worry. He compensated by putting "[H]arder Than Trilinear Filtering on GFFX" as the latest [H] description on the front page earlier today. No kidding. :p
 
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.
 
The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers.

If this is actually the case, is sitting on this information and not mentioning it supposed to be a good thing for everyone? And why get so annoyed by the "B3DPolice" for doing so?
 
Scorched said:
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.

Clearly Kyle was aware of the problem. So he posts his review and covers up the driver issues.

Why exactly is he trying to fix the problem with nVidia instead of reporting about it? Does he work for nVidia or is he a journalist? Did he try to fix the Quack problems with ATI or was he just interested in giving them a black eye? Clearly there is a double standard!
 
I wonder how Nvidia justified to Kyle the fact that this driver behavior was unique to UT 2003 and managed to convince him that this was worth covering up and not worth mentioning even though he used UT 2003 in his reviews. Clear lack of integrity on his part.

And I still think it is misleading to say that Nvidia is "adding" an option to do full trilinear as requested by the application. That option has been there all along; it is the "quality" setting in the slider. All Nvidia is doing is removing a behind-our-backs application detection trick after they were caught.

Nvidia is leading Kyle down the garden path by making him feel important and influential with their "serious face-to-face talks".
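To put the "behind-our-backs application detection" point in concrete terms: the trick amounts to keying the driver's behaviour off the name of the running executable rather than off anything the application actually requests. A toy sketch of the idea (hypothetical names and list contents; Nvidia's real detection code is not public and certainly isn't Python):

Code:
import os
import sys

# Hypothetical detection list -- illustration only.
PARTIAL_TRILINEAR_APPS = {"ut2003.exe"}

def effective_filtering(requested):
    """Filtering the driver actually applies for the current process."""
    exe = os.path.basename(sys.argv[0]).lower()
    if requested == "trilinear" and exe in PARTIAL_TRILINEAR_APPS:
        return "partial trilinear"  # silently substituted
    return requested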
 
Nvidia is leading Kyle down the garden path by making him feel important and influential with their "serious face-to-face talks".

Which they are doing with most of the US editors at the moment, and I'll be doing here soon as well.
 
WaltC said:
Heh-Heh....Always nice to encounter folks more long winded than me, Dave H...;)

I know the feeling. That's why I love having demalion around sometimes... :p

I think you're for some reason missing the obvious...;) It was because people noticed a difference that they first started looking into whether or not the Dets were doing Trilinear in UT2K3. Your position seems to effectively be that they are doing trilinear filtering without doing trilinear filtering. They aren't, and the differences are noticeable, which is why this topic has come up. If there were no visible differences, the topic would most likely never have been raised in the first place, right?

Actually I think it was noticed because the new UT2003 build offered the option to display colored mipmaps, and when Wavey or Rev (looking at the original thread, I think Rev) tried it out, they immediately noticed the lack of trilinear filtering. I don't know this for a fact--perhaps they noticed something suspicious in normal mode and subsequently decided to turn on the colored mipmaps to check it out. But that's the impression I got.

Perhaps Rev (or Wavey, or whoever it was that first discovered this) would like to comment?
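For anyone who hasn't seen the test: coloring each mip level differently makes the filtering mode obvious, because full trilinear blends adjacent levels into smooth gradients while plain bilinear leaves hard color bands. A rough sketch of why, under a simplified LOD model (illustrative only, not UT2003's actual renderer):

Code:
MIP_COLORS = ["red", "green", "blue", "yellow"]  # one flat color per mip level

def sampled_color(lod, trilinear=True):
    """Color seen at a fractional level-of-detail when mips are color-coded."""
    if not trilinear:
        nearest = min(round(lod), len(MIP_COLORS) - 1)
        return MIP_COLORS[nearest]  # single mip level: hard bands in the scene
    lower = min(int(lod), len(MIP_COLORS) - 1)
    upper = min(lower + 1, len(MIP_COLORS) - 1)
    frac = lod - int(lod)
    # full trilinear: a visible gradient between the two colors
    return f"{1 - frac:.2f}*{MIP_COLORS[lower]} + {frac:.2f}*{MIP_COLORS[upper]}"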

You're reading way too much into [H]'s pronouncements that they couldn't tell much difference. As well, not even [H] says they can't tell any difference.

...

Again you keep reaching an erroneous conclusion that the "output isn't subjectively noticeable"--of course that's simply not so. Were it so, no one would ever have been able to notice the difference, hence none of us would be talking about it right now.

...

The reason you are "curious" is because nVidia, in this game, has substituted this performance bi/trilinear mode for full trilinear, and [H] editorializes that the substitution is A-OK with them because they "can't tell much difference" in the resulting IQ, although as I said even they don't deny differences exist.

...

You seem to have a problem with "almost as good" and "as good"....there is a distinct difference between these two states. Nobody, not even [H], claims that it's "as good." The entire [H] piece revolves around their subjective opinion that it's "almost as good" as far as they can see and where it isn't "as good" they flatly state they don't care. That's what subjective opinions do for you...;)

...

Nope, that is not what the [H] article states at all. They've said that what differences they can see, and they do see them, are so "minor" in their opinions (because "all you do in the game is run around fragging people and so don't have time to look at the image quality" if I got that right) that they just don't care whether nVidia provides full trilinear like the Cats do or not. That's very, very different from your characterization of what they said.

...

So.....what [H] views as "minor" IQ degradation because of the lack of full trilinear support in this game, someone else might view as a "major" IQ difference.

You can repeat it as often as you like, but that's not what the article, or Kyle and Brent's forum comments, said. Unfortunately the article is down, but some quotes by Brent from the forum discussion:

"as we shown in the Non-AF screenshots there aren't any mip-map boundary difference"

"Actually the conclusion is that we can't tell any in-game mip-map boundary difference between NVIDIA's "Quality" setting and ATI's Application Preference setting in a Non-AF situation. The 5900u seems to have a SLIGHT sharper texture in this situation as well.

With 8XAF there is also no in-game mip-map boundary differences and the 9800 Pro has a slight sharper texture."

"But currently we are NOT seeing it differ from the quality of ATI's mip-map transitions in a regular game view."

"My stance is purely observation, you see what i saw with the screenshots. The same is also said with movement as we indicated.

I played this game, first in all, yes ALL, the maps with NO bots, running around, backing into corners, looking in open spaces, looking at the floor, the walls, the ceilings, slopes, slants, gradiants, moving back and forth sometimes in one spot or one path looking hard for mip-map boundaries."

"We are just saying that NVIDIA's current filtering has no IQ difference in a regular game view compared to ATI's. The only place it shows a difference is with the mip-map levels highlighted, and i don't know anybody that plays the game in that mode."

"remember, the mip-map boundaries are what was in question and is what we are saying are not noticeable between the two"

There's more; they can repeat themselves almost as much as you can, albeit not all in the same post. Meanwhile, even though the original article is down, the uncompressed pics are still available for download. Since you obviously do care about those minor differences that Kyle and Brent can see but choose to ignore (no doubt helped by Nvidia slush money), why don't you point them out for us in those screenshots?

As I said, I have no problem with nVidia offering a performance trilinear which is a mixture of tri and bilinear....none whatsoever. I understand what that is and am not curious about it (what's there to be curious about?)

I'm curious how well it is actually working in UT2003. (Kyle/Brent and Wavey seem to disagree about this, and neither has shown enough evidence for me to have a real sense of the actual IQ costs.) I'm curious whether there are specific features of UT2003 that make it particularly applicable for this optimization, or whether this optimization would be a good idea for more or all games. I'm curious why this sort of thing has been rejected in the past. I'm curious whether it could be improved in either the IQ or performance directions.
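For what it's worth, the knob being argued over can be pictured as the width of the blend band between adjacent mip levels: full trilinear blends across the whole interval, bilinear not at all, and a "performance" mix only near the transition. A simplified sketch of that model (my own illustration of the idea, not Nvidia's actual LOD logic):

Code:
def next_mip_weight(lod, blend_band=1.0):
    """Fraction of the next mip level mixed in at a fractional LOD.

    blend_band = 1.0 -> full trilinear: blend across the whole interval
    blend_band = 0.0 -> pure bilinear:  hard switch at the level boundary
    0 < blend_band < 1 -> reduced ("brilinear"-style) blend near the boundary only
    """
    frac = lod - int(lod)
    if blend_band == 0.0:
        return 0.0 if frac < 0.5 else 1.0
    start = 0.5 - blend_band / 2.0
    return max(0.0, min(1.0, (frac - start) / blend_band))

The narrower the band, the more of each mip interval is effectively bilinear, which is where the speed comes from and where colored mipmaps show the difference.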

My problem is with the fact that they *substitute* it for full trilinear, even when the application asks for full trilinear. Very simple.

What the game asks for is irrelevant. Nvidia doesn't have an "application preference" mode in their drivers. The issue is that Nvidia's "quality" mode doesn't do the same thing in all games, and that they led reviewers to believe that it always did full trilinear.

[H] at no point ever denies that nVidia is not doing full trilinear filtering in UT2K3--in fact, their entire article affirms and confirms it...;) Their only contribution otherwise is to state that they don't care, for one reason or another.

I think only for one reason--it's not visible in the game. And going by the criteria on which they conduct their reviews, that's the only reason that matters.

Missing the obvious here Dave H, again... First of all there are no similar driver hacks in the current Catalysts, are there? All one need do is to use the UT2K3.ini to turn on trilinear--and presto, the game is fully trilineared.

Gee how come you can't just turn on "quality" in the drivers? Doesn't that mean trilinear?? Why should you have to muck around in the .ini???
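For anyone who wants to try the comparison themselves: the switch WaltC is referring to lives in the renderer section of UT2003.ini. If memory serves, the key is UseTrilinear under [D3DDrv.D3DRenderDevice], but check your own ini, since I may be misremembering the exact name:

Code:
[D3DDrv.D3DRenderDevice]
UseTrilinear=True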

There are many, many obvious differences between the IHVs aside from the products they produce.

Don't disagree with you there. So why not focus on the Nvidia scandals that actually do hurt the end-user?

[Q: then why doesn't Unwinder's anti-detect impact ATI performance on more games? A: we have no proof it is detecting all game-specific optimizations in either ATI's or Nvidia's drivers]

Did you write the Anti-Detector software, DaveH? The guy who wrote the software claims it does the same thing for the Catalysts and the Dets. Argue with him if you like...

Completely false. But thanks for being a sarcastic jerk.

The guy who wrote the software said (http://www.beyond3d.com/forum/viewtopic.php?p=132527#132527):
ATIAntiDetector scripts is a bit more complicated. ATI use different ways of application detections so it's much more difficult to collect and block _all_ the application detection related pieces of code. At this time I was able to identify and block at least 8 application detection routines in the driver, but I cannot give you any warranties that there are no more detections left (this applies to NVAntiDetector too).

So actually, it doesn't do the same thing for the Cats as for the Dets. And he doesn't claim it "is detecting all game-specific optimizations in either ATI's or Nvidia's drivers", like I said. But you can argue with him if you like.

Has it ever struck you that their article is so subjective it's worthless? Listen, opinions abound about IQ. Whereas I run with 2x FSAA enabled, 16x AF in my games by default--some people state they prefer 0x FSAA/8xAF for their own reasons. Which of us is "right?" Correct answer is "neither" because it's a matter of subjective preference only.

If some people posted 81MB of screenshots comparing 2xFSAA to no FSAA I think it would be easy to tell the difference.

Look. IQ is subjective. I know you want to make it objective by replacing IQ with a big checklist of rendering features, but that doesn't serve any purpose. The goal of 3d rendering is to appear as realistic as possible to most people, and thus success can only be judged by a person commenting on how well he/she thinks this has been done. But "subjective" doesn't mean "meaningless". You are well aware of this, but as your only chance of winning this argument is to muddy the waters, you choose to ignore it.

The important point about this whole affair is this: nVidia has removed the option of full trilinear support from its drivers for UT2K3. Nothing else matters--at all. Had they not done this, there would be no issue whatever as no one would care what lesser IQ modes nVidia built into its driver support. There is no issue apart from this one in my view and as such [H]'s entire attempt at apology is a waste of epaper.

This is like the pot calling the tablecloth black.

They can "call on" nVidia all they like but until nVidia *does something* relative to the issue such statments are pompous and mean nothing, right?

Something like...

Kyle said (http://www.hardforum.com/showthread.php?s=d6fbedd5ab52349ab5d713b7698ba078&threadid=644163):
We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release.

Is that what you wanted?

Heh...would have been nice if [H] had ONCE "called on" nVidia to stop cheating in its drivers relative to benchmarks...! What did [H] do instead? Tell everyone to dump their benchmarks, that's what [H] did... That's pretty funny, Dave H....;)

I'm assuming you remember how strongly I argued against [H]'s position on 3dMark03? Or is there some other reason for bringing up this totally irrelevant subject??

Sigh--what would satisfy me is simply [H] stopping its infantile behavior of apologizing for nVidia and plainly stating that you can't compare nVidia's faux-trilinear to the Catalysts' full trilinear in terms of performance because it's not an apples-apples IQ comparison.

If it looks the same then it's by definition an apples-apples IQ comparison. It's not an apples-apples workload comparison, but that's not what [H] is interested in.

A subjective opinion that something is "almost as good" for reasons I've already stated doesn't suffice, no.

Their opinion is that it is as good. It's better, actually, with no AF. And their screenshots agree with them, subjectively speaking. Incidentally, you haven't stated any reasons why it doesn't suffice, except that Nvidia doesn't also offer a full trilinear mode, which is neither here nor there when all that's required is "an apples-apples IQ comparison".

Excuse me--I don't want to personally verify the old saying that "a fool and his money are easily parted"....;)

Presumably because you'd be broke awful quick.

And by the way, in the future it might be nice if you used "quotation marks" to enclose words that people "actually used" instead of "misleading paraphrases".
 
DaveBaumann said:
The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers.

If this is actually the case, is sitting on this information and not mentioning it supposed to be a good thing for everyone? And why get so annoyed by the "B3DPolice" for doing so?

How is it bad for everyone? Especially when several other credible sites are already applying extensive energy to the topic? IMO they did what was best for them in their efforts and direction. Let other sites apply effort in that arena; we will apply effort here.

Fred da Roza said:
Scorched said:
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.

Clearly Kyle was aware of the problem. So he posts his review and covers up the driver issues.

Why exactly is he trying to fix the problem with nVidia instead of reporting about it? Does he work for nVidia or is he a journalist? Did he try to fix the Quack problems with ATI or was he just interested in giving them a black eye? Clearly there is a double standard!

Actually [H] did attempt to resolve the problem with ATI well before they ever reported on it. ATI insisted for well over a month that [H]ard findings were inaccurate and attempted to cover it up. Any effort applied by anyone, whether here or elsewhere, that results in change or improvement in our experience as consumers is a good thing.
 
Althornin said:
DAVE H:
You don't find it misleading that the "quality" option shows 100% trilinear filtering in our aniso test apps? (The tunnel test).
Or that Quality apparently means full-level aniso in Morrowind and Freelancer? (according to the AMDMB look at the issue)

You don't think THAT might be enough to tell people that "quality" means trilinear????

No, not really. I'm more bothered by the fact that they seem to have explicitly told reviewers that "quality" means trilinear. That was either a lie or, if it turns out that their statement actually left them a tiny loophole, purposely misleading behavior. Or if they app-detected the tunnel test specifically for the purpose of turning trilinear on, and it was normally off, then that would clearly be beyond the pale.

As it is, their definition of "quality" seems to be "trilinear by default, and partial trilinear in those cases where we can verify that it has no noticeable effect on IQ and provides a substantial speed boost". I would much prefer that they made that definition explicit. And that they provided a way to override it and force full trilinear all the time. Their not having done so is certainly yet another example of slimy Nvidia behavior (although IMO far from the worst).

But that definition of "quality" seems both consistent and reasonable to me.

You don't find that to be dishonest? To change what a setting means for just one game? (and a heavily benchmarked one at that!)
By itself, this breaks one of nVidia's optimization rules - it's only good for a benchmark. If it was good enough for all games, then there wouldn't be any trickery...

It's not only good for a benchmark--UT2003 is a fairly popular game, you know. If they were only turning the optimization on during benchmark runs, then that would be breaking their optimization rules. Believe me, Nvidia is not swearing off application-specific optimizations for heavily-benchmarked games. (They would probably claim "for popular games", but I doubt any IHV has worried much about optimizing a popular game that isn't used in video card reviews.)
 
Blackwind said:
Actually [H] did attempt to resolve the problem with ATI well before they ever reported on it. ATI insisted for well over a month that [H]ard findings were inaccurate and attempted to cover it up. Any effort applied by anyone, whether here or elsewhere, that results in change or improvement in our experience as consumers is a good thing.

Kyle could have reported his findings. As he has said, it's been over 2 months just on this issue. How about the 3DMark03 issue, where nVidia has publicly defended their cheats and Kyle is also very aware of them? This is just a long string of cheats that nVidia is now trying to worm their way out of.
 
Fred da Roza said:
[snip]
Why exactly is he trying to fix the problem with nVidia instead of reporting about it? Does he work for nVidia or is he a journalist? Did he try to fix the Quack problems with ATI or was he just interested in giving them a black eye? Clearly there is a double standard!

What is worth quoting here are the comments he made in the conclusion to the Quack article, authored by Kyle and still available from the HardOCP site:

'We think that ATi should be producing Radeon drivers that are 3D engine specific and not game specific. Especially when the one targeted game is a widely used benchmark that people trust.'

What I want to know is: if that was the stance Kyle took with ATi over optimisations that, as he noted in the article himself, were hard to notice in game, then what is the difference here? How does nVidia's behaviour gain moral correctness when ATi's Quack did not?

Does someone want to quote that and ask him? I can't register to do it.

Philip
 
DaveBaumann said:
The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers.

If this is actually the case, is sitting on this information and not mentioning it supposed to be a good thing for everyone? And why get so annoyed by the "B3DPolice" for doing so?
Simple, cause it's BS. :)

Fred da Roza said:
Scorched said:
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.

Clearly Kyle was aware of the problem. So he posts his review and covers up the driver issues.

Why exactly is he trying to fix the problem with nVidia instead of reporting about it? Does he work for nVidia or is he a journalist? Did he try to fix the Quack problems with ATI or was he just interested in giving them a black eye? Clearly there is a double standard!
What I'm wondering about now is: if he was in secret discussions with nVidia fixing the "problem", then why did he and Brent write up that justification for the "almost-trilinear" filtering?!?

I hate it when he contradicts himself and sends mixed messages, I get so confused! ;)
 
Scorched said:
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.
I just made my first ever post in their forum, in the same thread.

Gee, I hope I don't get banned.
 
I'm curious why this sort of thing has been rejected in the past.

Wonder no more, DaveH - it was done by ATI in the not-so-distant past, and they got slammed for it. Upon the release of the 8500, IIRC, there was a driver set that didn't do "full trilinear", and ATI got hammered for it.
 
Reverend said:
Scorched said:
Kyle also posted this in the forum:

I did not feel comfortable saying this earlier, but I am in a WTF state of mind at the moment so....

The entire Trilinear issue was brought up with NVIDIA in face-to-face meetings over two months ago. Long before the B3DPolice came to the scene of the crime. Myself and the owner of another hardware site that is in the spotlight often talked specifically about Bi/Tri/AF with NVIDIA and how it was handled in the drivers. We specifically asked for NVIDIA to add a mode that allowed the application/game being used to set those settings as well as some others without entering a "debug" mode as their previous driver did (that was removed after our 5200/5600 article). AA etc..

NVIDIA has implemented this and tested this and has given me written re-verification as of Monday of this week that this feature will surely be included in their next driver release. I have seen the driver first hand, as of two weeks ago in the NVIDIA offices, and it is very simple in its implementations.

Also, we asked for a few other features a couple of months ago. I would say that about 75% of our suggestions were taken to heart and implemented in their upcoming driver set. Quite frankly, it is very likely going to be the most robust driver you have ever seen for a video card.

So all in all, it could be said that all of this was fixed before it started..... You will get true Trilinear if that is what you so desire. You have to keep in mind that a driver release is far from a trivial thing. It takes an incredible amount of resources for both ATI and NVIDIA to do so. So keep in mind, while it may seem very simple it is an incredibly demanding process.

Both ATI and NVIDIA get kudos for staying on top of drivers. The industry would be totally different if we did not have commitments from both in this arena.
I just made my first ever post in their forum, in the same thread.

Gee, I hope I don't get banned.

Here's the page with Reverend's post, to save giving Kyle a few hits. ;)
 