So much for Nvidia's internal guidelines

Bolloxoid

Newcomer
I'll just quote 3DGPU quoting Brian Burke:

I spoke with Brian Burke, PR Director for Desktop Products at NVIDIA regarding the UT2003 situation and was told that the drop in trilinear filtering levels does not degrade IQ, and therefore does not violate their new optimization methods.

So, their interpretation of the "leaked" optimisation guidelines is pretty flexible.

Link to the whole editorial:

http://www.3dgpu.com/modules/wfsection/article.php?articleid=70
 
Bolloxoid said:
I'll just quote 3DGPU quoting Brian Burke:

I spoke with Brian Burke, PR Director for Desktop Products at NVIDIA regarding the UT2003 situation and was told that the drop in trilinear filtering levels does not degrade IQ, and therefore does not violate their new optimization methods.

So, their interpretation of the "leaked" optimisation guidelines is pretty flexible.

Link to the whole editorial:

http://www.3dgpu.com/modules/wfsection/article.php?articleid=70
And you're surprised? C'mon, don't tell me you actually BELIEVED that PR BS...did ya? :oops:
 
Bolloxoid said:
I'll just quote 3DGPU quoting Brian Burke:

I spoke with Brian Burke, PR Director for Desktop Products at NVIDIA regarding the UT2003 situation and was told that the drop in trilinear filtering levels does not degrade IQ, and therefore does not violate their new optimization methods.

So, their interpretation of the "leaked" optimisation guidelines is pretty flexible.

Link to the whole editorial:

http://www.3dgpu.com/modules/wfsection/article.php?articleid=70

And, again, that puts us on the slippery slope of other IHVs feeling tempted to find crafty ways of trading IQ for performance so their products aren't unfairly represented during comparative benchmarks. Not good.
 
Bolloxoid said:
I'll just quote 3DGPU quoting Brian Burke:

I spoke with Brian Burke, PR Director for Desktop Products at NVIDIA regarding the UT2003 situation and was told that the drop in trilinear filtering levels does not degrade IQ, and therefore does not violate their new optimization methods.

So, their interpretation of the "leaked" optimisation guidelines is pretty flexible.

Link to the whole editorial:

http://www.3dgpu.com/modules/wfsection/article.php?articleid=70


Wow. I cannot contain my amazement and stupefaction... Every time I think we might be on the verge of a reformation, the guys at nVidia PR bring me solidly back to earth.

Assuming this is an accurate representation of what BB said (I can't figure out why the guys who write these pieces and call these people on the phone don't get direct quotes), it would seem to directly contradict an answer given in a recent DriverHeaven interview with Derek Perez:

Zardon: Will you ever re-add the option to force quality (trilinear etc) settings for AA/AF?

Derek: Yes we will – we hope to do so in a future driver rev.

http://www.driverheaven.net/dhinterviews/nvidia/

Sounds like the same issue to me...Admittedly, there's "weasel room" in that DP could always say "I wasn't talking about UT2K3".....but....come on...

If that's the answer frgmstr was given--that disabling full trilinear doesn't affect image quality--then no wonder he blew his top. Sadly, I'm afraid, they'll probably cite frgmstr's UT2K3 filtering article as a "reference" to prove their point--so I hope he stays a step ahead and deletes it.

I have to say that I'm disappointed, but hardly surprised, actually. Even so, public displays of incredible stupidity do shock me, even if they're anticipated. nVidia seems to have a real knack for doing it though, if you know what I mean.

In the DH interview, Perez lays out the "guidelines" as well. But if Burke is now saying those guidelines may only be interpreted and defined by nVidia, then it almost appears as if they were never anything more than a wholesale fabrication of nVidia PR. There's not a software engineer alive who makes his living writing 3d-card drivers who would be caught dead saying trilinear filtering doesn't make an image quality difference--heh. Certainly not at nVidia--a company which has been enabling and supporting full trilinear in its products for years precisely because of the benefits it brings to IQ. Remarkable.

The very best nVidia can hope for here is that Burke's comments were badly mangled in the retelling.

Edit:

I'm sure nVidia will "luv" this now that it has ponied up some fees and rejoined the FM Program:

3dgpu said:
We will not use any synthetic benchmarks to test performance.

That means 3DMark, CodeCreatures, ShaderMark, etc. Since there is no way for us to be sure of the validity of the tests, we will leave them by the wayside. We don’t play synthetic benchmarks anyway. We may use programs to compare IQ, like FSAATester.

Personally, I think this is tossing out the baby with the bathwater. It's a cryin' shame that FM didn't stick to its original audit report, but then again, FM's just digging its own grave here. IMO, there is absolutely nothing inherently "wrong" with synthetic benchmarks. With or without nVidia's co-operation, I think 3dMk03 is probably the most impressive synthetic benchmark for a 3d-card to date. It's too bad that FM can only measure the quality of its software by the number of paying IHVs in its program--really too bad. But it's their company, not mine.

*sigh* Synthetic benchmarks, like individual 3d games with differing engines, will return differing performance results, even on the same hardware. Might as well say that all 3d games are bad standards to use because the results differ among them, as to say all synthetics are bad for the same reason. The point to benchmarks and games is that you use *a lot of them* to get an aggregate picture of 3d hardware. It's just as wrong to look only at Doom3 performance as it is to look only at 3dMk03 performance--wrong, because either method is woefully incomplete. But that's not to say you should avoid looking at either as a part of a complete picture. My, my...what have we come to...
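
(Purely to illustrate that aggregate-picture point--the game list, the ratios, and the geometric-mean choice below are made up for the sake of example, not taken from any review:)

from math import prod

def geometric_mean(ratios):
    # Geometric mean of per-test performance ratios; no single game or
    # synthetic can dominate the way it would in a one-benchmark comparison.
    return prod(ratios) ** (1.0 / len(ratios))

# Hypothetical relative scores (card A / card B) across games and synthetics.
relative_scores = {
    "UT2003 flyby": 1.10,
    "Serious Sam SE": 1.05,
    "Comanche 4": 0.98,
    "3DMark03": 0.95,
    "ShaderMark": 0.90,
}

print("Aggregate ratio: %.3f" % geometric_mean(list(relative_scores.values())))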
 
I agree, there's nothing wrong with synthetic benchmarks, but unfortunately there is something wrong with the drivers we use for those synthetic benchmarks. This is essentially a way for us to show these companies that we're not going to let them get that edge over the competition through these "optimizations". Both Brian and I agree there should be no inherent optimizations for synthetic benchmarks at all. Games, fine, as long as it doesn't affect image quality. The idea of a synthetic benchmark is to test a certain product for performance and stability as it stands, not with tweaks and specific application parameters.

As for the trilinear issue, I think NVIDIA are essentially shooting themselves in the foot if they feel they can dictate how adjustable the policies are when it comes to these optimizations. They should just leave trilinear filtering enabled, and give the end user a choice to turn on adaptive filtering if they so desire. I mentioned to NVIDIA in a conference call that it would be a good thing if they developed a program that remembered specific settings for each game we play, so that when a game starts up it automatically enables/disables the settings you chose. The VP of Marketing, Dan Vivoli, said it was a good idea and that he'll pass it on. Here's hoping the idea is implemented sometime in the future.

Anyways, I'm hoping this editorial, which we'll provide a link to in all our reviews, sets the tone for the site, and hopefully companies like NVIDIA will see the light, stop making it more difficult for us reviewers to provide a fair review, and give gamers more control over the product they bought. It's a bloody shame that if you buy a $500 video card, you're playing games with bilinear filtering enabled on a new shooter like UT2003, or with some other IQ-altering setting, all because the competition has a better product at the time.
 
Well, it looks like the people who questioned the claim that an "optimization must produce the correct image" were justified. "Correct" according to whom, indeed :devilish:
 
Matt said:
I agree, there's nothing wrong with synthetic benchmarks, but unfortunately there is something wrong with the drivers we use for those synthetic benchmarks. This is essentially a way for us to show these companies that we're not going to let them get that edge over the competition through these "optimizations". Both Brian and I agree there should be no inherent optimizations for synthetic benchmarks at all. Games, fine, as long as it doesn't affect image quality. The idea of a synthetic benchmark is to test a certain product for performance and stability as it stands, not with tweaks and specific application parameters.

I agree with you here very much except on one particular point--I think the IHVs should be more directly targeted, specifically, instead of lumped together. For instance, let's say that you can use something like "Anti-Detect" to determine that one IHV isn't cheating in a benchmark, but it reveals another is. Or, let's say that one IHV looks to be running clean drivers in a D3d benchmark, but you can't tell about the other because the drivers have been encrypted and something like Anti-Detect won't work. If you were to run the benchmarks anyway, listing the compared products by IHV, but put 0's all across the chart for the cheating/encrypted IHV's products, with an asterisk denoting the reason in small print at the bottom of the chart--heh, I guarantee you'll get their attention very quickly, and very effectively. If every web site were to do this I think we'd all be amazed at how quickly the guilty IHVs would respond in a positive manner. Also, if you have a suspicion that a particular driver set may be cheating, but can't obviously prove it with Anti-Detect or some other method--fire off an email to the manufacturer and inquire about it--you might get some answers in a hurry.

What I really can't support is an approach which puts all IHVs, and all their products, in the same boat--when the same situations may not apply at all. This does no service to your readership, I think, because it works from an assumption that all parties are guilty, and equally guilty. I just don't think the record supports such a contention. Web sites routinely run reviews which focus on 5 fps differences between products and make much out of them--so focusing on this issue in a detailed and exacting fashion wouldn't involve more detail than the situation calls for, I think. And of course it would provide vastly more information to your readers than simply declaring everybody equally guilty and not running any synthetic benchmarks at all. That's not fair to the IHVs or your readers, IMO. If it's important that we know when a particular IHV's drivers cheat or inappropriately optimize for a benchmark, I think it's equally important that we know when they don't.

As for the trilinear issue, I think NVIDIA are essentially shooting themselves in the foot if they feel they can dictate how adjustable the policies are when it comes to these optimizations. They should just leave trilinear filtering enabled, and give the end user a choice to turn on adaptive filtering if they so desire. I mentioned to NVIDIA in a conference call that it would be a good thing if they developed a program that remembered specific settings for each game we play, so that when a game starts up it automatically enables/disables the settings you chose. The VP of Marketing, Dan Vivoli, said it was a good idea and that he'll pass it on. Here's hoping the idea is implemented sometime in the future.

Good idea...but if an IHV rigs its drivers not to respond to calls from the control panel, or calls from games, to set IQ options, and forces something else instead, I'm not sure an IHV-implemented option like you describe would operate any differently. Really, all that's needed in the UT2K3 situation is for nVidia to remove this "optimization" and, as you say, make it a matter of selection in the control panel. I think the approach you're suggesting would be fine if it removes the forced optimization from the drivers for UT2K3, but if not...?

Anyways, I'm hoping this editorial, which we'll provide a link to in all our reviews, sets the tone for the site, and hopefully companies like NVIDIA will see the light, stop making it more difficult for us reviewers to provide a fair review, and give gamers more control over the product they bought. It's a bloody shame that if you buy a $500 video card, you're playing games with bilinear filtering enabled on a new shooter like UT2003, or with some other IQ-altering setting, all because the competition has a better product at the time.

Agreed 100%, my only reservations being stated above.
 
Disappointing, but perhaps nV thinks this is the only way they can compete with ATi's more-adaptive AF (ATi's varies by angle, but nV can't do that in hardware, so they're forced to compete another way: with MIP-map transitions).

As long as reviews make consumers aware of this, I'm honestly OK with it. It's the deception and lame PR that irks me.
 
Mr Evans seriously stuffed up when he stated that the filtering-level tradeoff made no IQ difference in game.

What a load of B$, maybe he should try and RUN a game. I know I have and I can see a VERY large difference. Maybe Mr Evans is colourblind?

Maybe half-assed trilinear filtering is an okay tradeoff for Mr Evans but not an option for me.
I run my games with full trilinear filtering and 16x AF and damn, there is a pretty large difference on my R300 between 1-stage trilinear (quality AF) and full trilinear (not listed in the CP) using 16x AF.

If I just wanted to play my games well with mediocre IQ then I would have stuck with my NV20. There is a reason for me to upgrade to a high end card and it isn't to dumb down the IQ and play games at over 300fps. I upgraded only for the visuals, not pure speed with mediocre IQ. I wanted great performance with max IQ and the R300 lived up to its reputation.

Fortunately for me, I didn't wait for the NV30 or the NV35. I know I would have regretted it (especially after seeing it in action; I saw no IQ difference from the NV35 over my NV20 when both cards are at their peak IQ).
 
Pete said:
Disappointing, but perhaps nV thinks this is the only way they can compete with ATi's more-adaptive AF (ATi's varies by angle, but nV can't do that in hardware, so they're forced to compete another way: with MIP-map transitions).
NVIDIA does have what you call "adaptive" AF. The GeForce FX series introduced a new form of AF that is not rotationally invariant (actually, it's not really AF at all until about the second or third mip-level, but that's another story). But this isn't what people are talking about. People are talking about how trilinear (the default filtering mode in UT2003) isn't trilinear on the NVIDIA drivers.
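
(To illustrate what's at stake--a toy model, not NVIDIA's or anyone's actual driver code: full trilinear blends two adjacent mip levels across the whole transition, while a "reduced trilinear" mode only blends in a narrow band around the mip boundary and is effectively bilinear elsewhere.)

def mip_blend_weight(lod, reduction=0.0):
    # Fraction of the coarser mip level mixed in at a given level-of-detail.
    # reduction=0.0 -> full trilinear (blend across the whole interval);
    # reduction near 1.0 -> essentially bilinear (hard step at the mip boundary).
    frac = lod - int(lod)              # position between two adjacent mip levels
    lo = 0.5 * reduction               # start of the (narrowed) blend band
    hi = 1.0 - 0.5 * reduction         # end of the blend band
    if frac <= lo:
        return 0.0
    if frac >= hi:
        return 1.0
    return (frac - lo) / (hi - lo)     # linear blend inside the band

for lod in (2.1, 2.5, 2.9):
    print(lod, mip_blend_weight(lod), mip_blend_weight(lod, reduction=0.6))
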
As long as reviews make consumers aware of this, I'm honestly OK with it. It's the deception and lame PR that irks me.
Do consumers have these web reviews available in stores? Do consumers think to go read web reviews when the CEO of NVIDIA claims to have the fastest cards based on 3D Mark 2003 and UT2003?

-FUDie
 
FUDdie, I know nV's AF is also "adaptive," but it seems to me their "adaptivity" is focused more on MIP-map transitions, whereas ATi's is on angle relative to viewport.
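
(To be concrete about what I mean by "angle relative to viewport"--a toy sketch, not either IHV's real hardware logic: the maximum anisotropy degree is capped for surfaces sitting near 45 degrees off the screen axes, while axis-aligned floors and walls keep the full degree.)

def effective_anisotropy(requested_degree, surface_angle_deg):
    # Cap the AF degree for surfaces near 45 degrees off-axis; axis-aligned
    # surfaces keep the full requested degree.
    off_axis = abs((surface_angle_deg % 90) - 45)   # 45 when axis-aligned, 0 at 45 deg
    alignment = off_axis / 45.0                     # 1.0 aligned, 0.0 at 45 deg
    return min(requested_degree, max(2, round(requested_degree * alignment)))

for angle in (0, 22.5, 45, 67.5, 90):
    print(angle, effective_anisotropy(16, angle))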

As for reviews, if people don't know about the internet in general and Google in particular by now, I can't help them. :p I doubt Joe Schmoe/Blow/Sixpack knows what nV's CEO is saying, anyway.
 
Bouncing Zabaglione Bros. said:
digitalwanderer said:
And you're surprised? C'mon, don't tell me you actually BELIEVED that PR BS...did ya? :oops:

How are things going with Derek Perez? I thought you and he were sorting all these issues out?
Much better, we're not speaking again. :) We had a bit of a mix-up in an e-mail. He told me he couldn't tell me something because it was under NDA with nVidia. I mentioned to him that was total BS, since he WAS nVidia and if'n he wanted to tell me he could, and that pulling that cop-out was kind of lame and insulting. He sent me back a smiley in reply, and I kind of went off a bit, explaining to him that THAT was just the kind of attitude and condescending PR BS that the enthusiast crowd REALLY doesn't care for, and that next time he wants to flip me the bird, please not to do it with a smiley.

I haven't heard from him since, but I WAS banned at DriverHeaven for questioning their fluff piece with him. :rolleyes:

I'd say we've finished our discussions and have a perfect understanding of each other. :devilish:

WaltC said:
In the DH interview, Perez lays out the "guidelines" as well. But if Burke is now saying those guidelines may only be interpreted and defined by nVidia, then it almost appears as if they were never anything more than a wholesale fabrication of nVidia PR. There's not a software engineer alive who makes his living writing 3d-card drivers who would be caught dead saying trilinear filtering doesn't make an image quality difference--heh.
Fortunately there isn't a whole lot of technical knowledge in nVidia's PR department to let messy realities like that get in the way of a good PR spin. :rolleyes:

I'd be disgusted, but I'm not awake enough yet. On a happy note, what nVidia is/isn't doing is getting to be pretty common knowledge, and I expect it to become mainstream knowledge very soon. 8)
 
Matt said:
I agree, there's nothing wrong with synthetic benchmarks, but unfortunately there is something wrong with the drivers we use for those synthetic benchmarks.
You make it sound like it is the drivers' fault, when it is most definitely the fault of the company making the drivers.

Such companies know exactly what they're doing wrt synthetic benchmarks.

This is essentially a way for us to show these companies that we're not going to let them get that edge over the competition through these "optimizations". Both Brian and I agree there should be no inherent optimizations for synthetic benchmarks at all. Games, fine, as long as it doesn't affect image quality. The idea of a synthetic benchmark is to test a certain product for performance and stability as it stands, not with tweaks and specific application parameters.
"As long as it doesn't affect image quality"...

Who is the ultimate judge of whether image quality is affected in games? Who knows for sure?

How many games are you going to test, or can test, or are willing to spend the time to test, to see if image quality is affected through optimizations already in place that you probably don't know about?

Anyways, I'm hoping this editorial, which we'll provide a link to in all our reviews, sets the tone for the site, and hopefully companies like NVIDIA will see the light, stop making it more difficult for us reviewers to provide a fair review, and give gamers more control over the product they bought.
NVIDIA, or any other IHV, doesn't really care whether what it does will "stop making it more difficult for us reviewers to provide a fair review".

It is the reviewer's duty to find out if the IHVs are indeed making things more difficult for reviewers.

Wishing for a perfect world is never a good thing, because a perfect world doesn't exist. Everyone will take shortcuts, without mentioning them, if they can.

Wishing for a perfect world makes you lazy, or dependent on others to make discoveries, which basically also means you're lazy.
 
Pete said:
Disappointing, but perhaps nV thinks this is the only way they can compete with ATi's more-adaptive AF (ATi's varies by angle, but nV can't do that in hardware, so they're forced to compete another way: with MIP-map transitions).
Errm, ATi's AF is not more adaptive than Nvidia's AF. It falls more into the "lowering image quality to get higher framerates" category.
 
Reverend said:
Matt said:
I agree, there's nothing wrong with synthetic benchmarks, but unfortunately there is something wrong with the drivers we use for those synthetic benchmarks.
You make it sound like it is the drivers' fault, when it is most definitely the fault of the company making the drivers.

Such companies know exactly what they're doing wrt synthetic benchmarks.

This is essentially a way for us to show these companies that we're not going to let them get that edge over the competition through these "optimizations". Both Brian and I agree there should be no inherent optimizations for synthetic benchmarks at all. Games, fine, as long as it doesn't affect image quality. The idea of a synthetic benchmark is to test a certain product for performance and stability as it stands, not with tweaks and specific application parameters.
"As long as it doesn't affect image quality"...

Who is the ultimate judge of whether image quality is affected in games? Who knows for sure?

How many games are you going to test, or can test, or are willing to spend the time to test, to see if image quality is affected through optimizations already in place that you probably don't know about?

Anyways, I'm hoping this editorial, which we'll provide a link to in all our reviews, sets the tone for the site, and hopefully companies like NVIDIA will see the light, stop making it more difficult for us reviewers to provide a fair review, and give gamers more control over the product they bought.
NVIDIA, or any other IHV, doesn't really care whether what it does will "stop making it more difficult for us reviewers to provide a fair review".

It is the reviewer's duty to find out if the IHVs are indeed making things more difficult for reviewers.

Wishing for a perfect world is never a good thing, because a perfect world doesn't exist. Everyone will take shortcuts, without mentioning them, if they can.

Wishing for a perfect world makes you lazy, or dependent on others to make discoveries, which basically also means you're lazy.

I don't know how I can make it sound like it's not the company's fault. As far as I know, the company writes the drivers, not a 3rd party. If it's the drivers' fault, it's the company that wrote them that's to blame. Not sure what you're getting at here.

The ultimate judge is the collective voice of gamers. I'm certainly not the ultimate judge, nor are you, nor any single person. It's when websites and gamers posting in forums all lend their voices and say no trilinear filtering in UT2003 = not good.

I play all my games when testing, all the latest, and all the old ones I still have on my hard drive. I always like to see how technology has advanced and see if a game that stuttered before still stutters. I'd like to see if the FSAA implementation has improved, and if 8x anisotropic makes my games clearer. Of course time constraints are a factor, I can't spend 24 hours a day testing a product, but I do spend every free moment doing so, and a minimum of 2 weeks is my rule. Of course I can't find every single optimization in the drivers; I don't have access to the source code of the drivers any more than you do, but I do keep my eyes open for anyone posting any bugs or difficulties in games. I monitor my forums, along with 4 others, and a plethora of websites to see if anyone caught anything with the latest drivers.

I never wished for a perfect world, just for a sane one. :)
 
FUDie said:
Pete said:
FUDdie, I know nV's AF is also "adaptive," but it seems to me their "adaptivity" is focused more on MIP-map transitions, whereas ATi's is on angle relative to viewport.
I don't know if you're aware of it, but the GeForce FX chips have a completely different AF mode, unlike anything in the previous GeForce chips. See http://www.beyond3d.com/previews/nvidia/gffxu/index.php?p=20.

-FUDie

Those algorithms are outdated and outmoded.

Although I think the tri/bi mix was a good idea. Forcing bilinear in two modes is just nasty.
 
I want full trilinear filtering on all high-end cards, whether it be with or without AF.
I don't pay $800-$1000 for nothing.
 
StealthHawk said:
FUDie said:
Pete said:
FUDdie, I know nV's AF is also "adaptive," but it seems to me their "adaptivity" is focused more on MIP-map transitions, whereas ATi's is on angle relative to viewport.
I don't know if you're aware of it, but the GeForce FX chips have a completely different AF mode, unlike anything in the previous GeForce chips. See http://www.beyond3d.com/previews/nvidia/gffxu/index.php?p=20.

-FUDie

Those algorithms are outdated and outmoded.
What are you talking about? They are still in the hardware.

-FUDie
 