Toms speaks!!

See

http://www20.tomshardware.com/graphic/20030818/index.html

There are some superb statements in the conclusion.

The Futuremark optimizations, which are deactivated through the 3D Mark 2003 Patch v330, seem to be active again in the new driver. However, the extent of those optimizations is not clear to us, since THG is not a member of Futuremark's beta program and we therefore do not have access to the 3D Mark 2003 developer's version, which would be required to determine optimizations such as clip planes and unperformed buffer clears that have been under criticism. NVIDIA assures us, however, that these are no longer in the new driver; its own rules forbid it. It's questionable as to whether this will win back the trust of users in the benchmark 3D Mark 2003.

and my fave


It would be nice to hear ATI's official stance on driver optimizations. Although ATI already stated months ago that it would do away with "unfair" driver optimizations, they have not yet announced any pertinent guidelines.

Nvidia have given him some ace flow charts and everything!!! ATI have not produced the same excellent PR goodies!

Nvidia may have produced some guidelines but are they using them?

At least they finally bring the issue up
 
NVIDIA's new optimization guidelines represent a step in the right direction following the massive amount of criticism the company has received during the past few months. NVIDIA shows it is taking buyers' wishes and considerations into account and is taking action (similar to how it made efforts in the past to remedy the loud noise level of the FX 5800 Ultra's (NV30) fans).

Just like THG themselves spoke out against cheating :rolleyes: . Seems to me like another half-hearted attempt to appear impartial. Why would THG try and paint ATI with the same brush as Nvidia here?
 
I agree the whole article reads like they have been bringing a blow-by-blow account to their readers over the last few months, but if you were solely a Tom's Hardware news reader this article would be a bolt from the blue.

My own feeling is that at least it does not completely rid Nvidia of blame, and many people will hopefully investigate further. At the very least, Nvidia now realises that the "popular" sites can't go on ignoring it, since it has not blown over in the few weeks those sites were expecting.
 
Ozymandis said:
Anand, still quiet :rolleyes:
Not for long. I expect him now to be under a bit of pressure to come out and tell us all about how he's been aware of it and secretly investigating it and only chose to speak out now because nVidia failed to implement a change they promised in private conversations that they now claim they didn't, yadda-yadda-yadda-ya. :rolleyes:

Expect 'em all to start falling in line with the truth now, just like dominoes or leppers or something. ;)

(I shouldn't be sarcastic, this is actually a good thing and something that I've kind of wanted for a long time....but it just seems SO hypocriticalish too. :( )
 
digitalwanderer said:
Expect 'em all to start falling in line with the truth now, just like dominoes or leppers or something. ;)
Do you mean lepers? Or lemmings?

A very strange mental image, if you mean lepers.
 
RussSchultz said:
digitalwanderer said:
Expect 'em all to start falling in line with the truth now, just like dominoes or leppers or something. ;)
Do you mean lepers? Or lemmings?

A very strange mental image, if you mean lepers.
Brainfart, I meant lemmings.

Now I got the stupidest mental image of a lot of lepers jumping off a cliff with appendages falling off. (Just got done mowing the lawn, sorry. :rolleyes: )

Thanks for the correction, I now feel like a proper idiot. :)
 
Weird

Commending a company for releasing a pile of PR slides they clearly do not adhere to is beyond weird.

Then this:

It would be desirable for NVIDIA, as well as ATI, to offer a general quality option that would let the user select unoptimized graphics quality.

Hello, Earth to Tom's Hardware, there is a setting called "Application" in ATI's control panel app, just as they note in the preceding paragraph. Calling it an "optimisation" when the driver does bilinear filtering because the application requested bilinear filtering is truly weird.
 
nelg said:
NVIDIA's new optimization guidelines represent a step in the right direction following the massive amount of criticism the company has received during the past few months. NVIDIA shows it is taking buyers' wishes and considerations into account and is taking action (similar to how it made efforts in the past to remedy the loud noise level of the FX 5800 Ultra's (NV30) fans).

Just like THG themselves spoke out against cheating :rolleyes: . Seems to me like another half-hearted attempt to appear impartial. Why would THG try and paint ATI with the same brush as Nvidia here?

Heh...;) Sites like [H] and THG should really constrain themselves to looking at hardware other than 3d technology in the future--every time they tackle a 3d issue they screw it up royally.

Quotes like this are just plain amusing:

THG said:
However, the extent of those optimizations is not clear to us, since THG is not a member of Futuremark's beta program and we therefore do not have access to the 3D Mark 2003 developer's version, which would be required to determine optimizations such as clip planes and unperformed buffer clears that have been under criticism. NVIDIA assures us, however, that these are no longer in the new driver; its own rules forbid it.

Translated: "We have no earthly idea what we're talking about concerning these issues with 3dMK03, and we're far too busy to concern ourselves with such things, but nVidia assures us they are following their "new guidelines"--but we can't tell. Wish we could, but that's life."

Great. They don't know whether it's true or not--so why are they commenting in the first place?

I also love it when people say "nVidia told us" and "nVidia assures us...", etc. It would be far more interesting if, instead of citing an abstract legal entity known as the nVidia corporation, people actually thought to say, "John Doe at nVidia has the following to say, blah, blah." After all, it is presumably real people working in the company who presumably have real names and who are presumably "assuring" people of various things. It could be that the few people making such "assurances" just don't want their names connected with those assurances, but that's another subject...;) Man....

And, I doubt THG knows anything about the UT2K3 trilinear problem with the newest Dets--or the fact that 3dGPU has identified Brian Burke of nVidia as stating that nVidia--and nobody else--decides when nVidia meets its own "guidelines" and when it doesn't--and that nVidia has decided that the lack of full trilinear in UT2K3 does not constitute an IQ drop from full trilinear--and has also decided none of its customers needs full trilinear in UT2K3. Yep, such "assurances" as that really put nVidia on the right track of adhering to its own "guidelines", oh, yea.... :rolleyes:
 
Re: Weird

Bolloxoid said:
It would be desirable for NVIDIA, as well as ATI, to offer a general quality option that would let the user select unoptimized graphics quality.
Hello, Earth to Tom's Hardware, there is a setting called "Application" in ATI's control panel app, just as they note in the preceding paragraph. Calling it an "optimisation" when the driver does bilinear filtering because the application requested bilinear filtering is truly weird.
I think the optimisation they were trying to point out was that when you select a Quality AF override, you only get trilinear on the first texture stage. So if you have an old game with no AF controls, you'd think that forcing on Quality AF in the control panel would give you the best quality, but actually you'd only get bilinear in some places if the game was using multitexturing. There is no way to force both trilinear and AF on all texture stages if your game doesn't have an AF control.
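For anyone who wants that in API terms, here is a rough Direct3D 9 sketch (my own illustration, not anyone's actual game or driver code; the function name is made up) of what an application itself has to request to get trilinear plus AF on every stage it uses. A control panel override that only touches stage 0 leaves the later stages with the bilinear behaviour described above:

```cpp
// Rough sketch: an application requesting trilinear + anisotropic filtering
// on every texture stage it actually uses (Direct3D 9).
#include <d3d9.h>

void RequestFullQualityFiltering(IDirect3DDevice9* dev, DWORD stagesUsed, DWORD aniso)
{
    for (DWORD s = 0; s < stagesUsed; ++s)
    {
        // Anisotropic minification, linear magnification.
        dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // Linear filtering *between* mip levels, i.e. true trilinear.
        // D3DTEXF_POINT here instead is exactly the "bilinear on later stages"
        // behaviour being discussed.
        dev->SetSamplerState(s, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MAXANISOTROPY, aniso);
    }
}
```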
 
digitalwanderer said:
Ozymandis said:
Anand, still quiet :rolleyes:
Not for long. I expect him now to be under a bit of pressure to come out and tell us all about how he's been aware of it and secretly investigating it and only chose to speak out now because nVidia failed to implement a change they promised in private conversations that they now claim they didn't, yadda-yadda-yadda-ya. :rolleyes:

Expect 'em all to start falling in line with the truth now, just like dominoes or leppers or something. ;)

(I shouldn't be sarcastic, this is actually a good thing and something that I've kind of wanted for a long time....but it just seems SO hypocriticalish too. :( )

Funny to see all those old-school 'big' sites are gettin' the %# they deserve... :D

THG... ahhh... IMHO one of the most disgusting pieces of something in the whole industry.
Pick any subject, any year, any issue - you'll always find something... :(
 
T2k said:
Funny to see all those old-school 'big' sites are gettin' the %# they deserve... :D
Funnier still is the spin they're putting on it to try and come out smelling like a rose, a la THG's article acting like they've been putting up news about it all along and such.

I really am looking forward to the spins the big sites are gonna put on it almost as much as the actual news they report. :LOL:
 
Tom said:
And they should do this, if possible, through a global quality setting through which the optimizations can also be completely unselected. This could prevent conflicts with the settings that can be made with the applications, thus also allowing maximum quality in games that don't have settings for filtering.

The same goes for ATI. Here, real trilinear filtering is possible only if an application can invoke it. This feature should be made abundantly clear in the driver.

Yes, they both should have a setting in the drivers to force true, complete tri-linear filtering (and other settings for max eye-candy) in the control panel. Most users don't (know how to) edit an ini file, and with cards priced that high we should get max quality!!
 
IIRC though, isn't the first texture stage the only one trilinear needs to be applied to for many games, with UT2003 being the exception rather than the rule? Not only that, but if it were applied to all stages, wouldn't games which didn't require it still suffer a performance hit?

If this is true, and I'm not talking gibberish (trying to remember the wise lesson of B3D :)), then what ATi have done makes perfect sense (and we should really be bashing developers for not pulling their finger out and adding some REAL in-game options).
 
Quitch said:
IIRC though, isn't the first texture stage the only one trilinear needs to be applied to for many games, with UT2003 being the exception rather than the rule? Not only that, but if it were applied to all stages, wouldn't games which didn't require it still suffer a performance hit?

If this is true, and I'm not talking gibberish (trying to remember the wise lesson of B3D :)), then what ATi have done makes perfect sense (and we should really be bashing developers for not pulling their finger out and adding some REAL in-game options).
It's true that most games put their "main" texture, the most important one, on stage 0. Other stages are usually used for things that aren't as apparent to the user, like lightmaps, detail textures, or environment maps. So keeping trilinear on the main texture and bilinear on all the others doesn't produce as obvious an image quality change.

However that's really second-guessing the developers... no game has to put their main texture on stage 0, in which case the bilinear effect would be apparent. UT2003 uses many layers of textures so it's more apparent in that game, older titles just use one or two so it's less obvious. But if a game doesn't use the extra texture stages, it wouldn't have any performance impact to use trilinear on them because they'd never get sampled.
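To make the stage layout concrete, here's an illustrative Direct3D 9 sketch (texture names are placeholders, not from any real game): the main texture on stage 0, a lightmap on stage 1, and trilinear requested everywhere. Stages with nothing bound never get sampled, so the blanket trilinear request costs nothing on them:

```cpp
// Illustrative multitexturing setup: "main" diffuse texture on stage 0,
// lightmap on stage 1, trilinear requested on all fixed-function stages.
#include <d3d9.h>

void SetupMultitexture(IDirect3DDevice9* dev,
                       IDirect3DTexture9* baseTex,      // placeholder name
                       IDirect3DTexture9* lightmapTex)  // placeholder name
{
    dev->SetTexture(0, baseTex);      // stage 0: the texture the eye notices most
    dev->SetTexture(1, lightmapTex);  // stage 1: lightmap, much lower-frequency content

    for (DWORD s = 0; s < 8; ++s)
    {
        dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // trilinear
        // Stages 2..7 have no texture bound, so these states are never
        // exercised and cost nothing.
    }
}
```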
 
Myrmecophagavir said:
However that's really second-guessing the developers... no game has to put their main texture on stage 0, in which case the bilinear effect would be apparent. UT2003 uses many layers of textures so it's more apparent in that game, older titles just use one or two so it's less obvious. But if a game doesn't use the extra texture stages, it wouldn't have any performance impact to use trilinear on them because they'd never get sampled.

Which is a great illustration of why it's better to control IQ settings from the application rather than forcing them from the cpanel. If you include stage treatment controls in the cpanel then all you've done is move the guessing game from the IHV to the end user. Control from within the application is the only way to ensure things are done properly. Relative to ATi's approach, this is a completely moot issue, IMO.

nVidia is much different because the company has completely removed full trilinear support for UT2K3 from its drivers.
 
WaltC said:
Myrmecophagavir said:
However that's really second-guessing the developers... no game has to put their main texture on stage 0, in which case the bilinear effect would be apparent. UT2003 uses many layers of textures so it's more apparent in that game, older titles just use one or two so it's less obvious. But if a game doesn't use the extra texture stages, it wouldn't have any performance impact to use trilinear on them because they'd never get sampled.
Which is a great illustration of why it's better to control IQ settings from the application rather than forcing them from the cpanel. If you include stage treatment controls in the cpanel then all you've done is move the guessing game from the IHV to the end user. Control from within the application is the only way to ensure things are done properly. Relative to ATi's approach, this is a completely moot issue, IMO.

nVidia is much different because the company has completely removed full trilinear support for UT2K3 from its drivers.
Yep. I feel it's important that developers put these controls in-game. They shouldn't be so lazy as to rely on the control panel settings. In time, users can be "trained" to go to game settings instead of the control panel, but only if there's a concerted effort from developers. Come on Mr Vogel, lead the way...
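Something like this is all it would really take on the developer side, a hypothetical sketch (the enum and function names are invented for illustration) of an in-game filtering option mapped straight onto Direct3D 9 sampler states, instead of leaving the guessing to the control panel:

```cpp
// Hypothetical in-game texture filtering option applied by the game itself.
#include <d3d9.h>

enum class TextureFiltering { Bilinear, Trilinear, Anisotropic8x };

void ApplyFilteringOption(IDirect3DDevice9* dev, TextureFiltering mode, DWORD stagesUsed)
{
    for (DWORD s = 0; s < stagesUsed; ++s)
    {
        dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MINFILTER,
            mode == TextureFiltering::Anisotropic8x ? D3DTEXF_ANISOTROPIC
                                                    : D3DTEXF_LINEAR);
        // Bilinear = point-sampled mip selection; anything better = trilinear.
        dev->SetSamplerState(s, D3DSAMP_MIPFILTER,
            mode == TextureFiltering::Bilinear ? D3DTEXF_POINT : D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MAXANISOTROPY,
            mode == TextureFiltering::Anisotropic8x ? 8 : 1);
    }
}
```

With the game setting this itself, the driver only has to honour what was asked for, and the control panel never has to second-guess which stages matter.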
 
I sent an annoyed email to the article's author, asking for some editorial backbone. Prolly more people need to do that.

rms
 