ATI's and Futuremark's Response to Nvidia Claims

worm said:
Err.. I think this thread's subject is kinda wrong.
Yeah, it should read:

ATI's response to Nvidia's comments on Futuremark's claims of being a DX9.0 benchmark. :p
 
I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?

Maybe it would have been wiser if, rather than just recommending that WHQL drivers be used, the licence said that ONLY WHQL drivers may be used for public tests and comparison reviews.

"Furthermore, 3DMark03 includes advanced analytical tools to enable independent observers to catch any potential questionable driver optimizations. By taking a tough stance against any kind of driver optimization, the media can discourage this practice"

So... Worm, with this info, are Nvidia's 42.67/68 drivers using optimizations/cheats, and if so, how? Or did they just change the direction of their prayer mat? :LOL:
 
THe_KELRaTH said:
I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?

Where were you when the 8500 was released?!
 
SanGreal said:
THe_KELRaTH said:
I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?

Where were you when the 8500 was released?!

In context, the Quake3 optimization had been there for the R100 as well. Cheat or optimization?

IMO

Cheat = not rendering things properly or completely to gain speed.

Optimization = studying app code and adjusting drivers accordingly to respond better to the nature of that app, which may also have beneficial effects on other, similarly coded apps or ones driven by the same engine.

Should benchmarks be optimized for? In Utopia, not specifically, no - games should always be higher on the priority list. In the real world, what IHV isn't going to spend time optimising their code, even a little bit, for the most common benchmarks, be they synthetic or games?
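
To make the distinction concrete, here's a rough sketch of the kind of app detection a driver could do - purely illustrative, the names and the lookup table are my invention, not any IHV's actual code:

Code:
#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

// Hypothetical app detection: the driver keys a special code path off the
// executable's name. Whether that path is an "optimization" or a "cheat"
// depends entirely on whether it still renders everything properly.
enum class AppProfile { Generic, Quake3, Benchmark };

AppProfile detectProfile(std::string exeName) {
    // Windows filenames are case-insensitive, so normalise first.
    std::transform(exeName.begin(), exeName.end(), exeName.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    if (exeName == "quake3.exe")   return AppProfile::Quake3;
    if (exeName == "3dmark03.exe") return AppProfile::Benchmark;
    return AppProfile::Generic;
}

int main() {
    std::cout << (detectProfile("Quake3.exe") == AppProfile::Quake3) << '\n'; // 1
    std::cout << (detectProfile("Quack.exe")  == AppProfile::Quake3) << '\n'; // 0
}

Rename the executable and the special path silently goes away - which is exactly how the Quake3/Quack.exe renaming trick exposed the 8500's application-specific path.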
 
Here is my take on the issue of optimization, and my reasoning as to why 3DMark03 doesn't worry me as much in this regard as prior 3DMark applications.

The beginning of my text:

The ability to optimize for games presents a complex opportunity for "optimization". For this discussion, optimization can be thought of as either "invisible cheating" or "removing inefficiency". What is undesirable is when "benchmark specific" optimizations occur that are of the "invisible cheating" variety and apply exclusively to the benchmark alone - i.e., that target a program whose only function is to provide benchmark results. "Invisible cheating" can be valid, IMO, if it is general and not intended to distort comparisons (think of hidden surface removal).
...
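
A concrete illustration of the "general" kind: hidden surface removal skips work on pixels that would never be seen, and it is valid precisely because it applies to every app and the final output is identical. A toy sketch - nothing like real hardware, just the idea:

Code:
#include <iostream>
#include <vector>

// Toy depth-buffered rasterizer: the early depth test skips the expensive
// shading of any fragment that would end up hidden, yet the final image is
// pixel-for-pixel identical. That generality is what makes hidden surface
// removal an optimization rather than a benchmark-specific cheat.
struct Fragment { int x, y; float depth, color; };

void rasterize(const std::vector<Fragment>& frags,
               std::vector<float>& depthBuf,
               std::vector<float>& colorBuf, int width) {
    for (const Fragment& f : frags) {
        float& z = depthBuf[f.y * width + f.x];
        if (f.depth >= z) continue;              // hidden: skip all shading work
        z = f.depth;
        colorBuf[f.y * width + f.x] = f.color;   // stand-in for expensive shading
    }
}

int main() {
    const int W = 4;
    std::vector<float> depth(W * W, 1e9f), color(W * W, 0.0f);
    // Two fragments cover the same pixel; the farther one (depth 0.5,
    // submitted second) is rejected before any shading happens.
    rasterize({{1, 1, 0.2f, 3.0f}, {1, 1, 0.5f, 7.0f}}, depth, color, W);
    std::cout << color[1 * W + 1] << '\n';       // prints 3 - the nearer fragment won
}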
 
Randell said:
SanGreal said:
THe_KELRaTH said:
I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?

Where were you when the 8500 was released?!

In context, the Quake3 optimization had been there for the R100 as well. Cheat or optimization?

IMO

Cheat = not rendering things properly or completely to gain speed.

Optimization = studying app code and adjusting drivers accordingly to respond better to the nature of that app, which may also have beneficial effects on other, similarly coded apps or ones driven by the same engine.

Should benchmarks be optimized for? In Utopia, not specifically, no - games should always be higher on the priority list. In the real world, what IHV isn't going to spend time optimising their code, even a little bit, for the most common benchmarks, be they synthetic or games?

I think, IIRC, he's referring to the fact that when the Radeon 8500 was released it soundly beat the GF3 Ti500 in 3DMark01, yet lost in nearly every single game benchmark.
 
Yup, and let's not forget it wasn't using CAT drivers ;)

I consider a game cheat to be something like the missing fog. My reasoning is that it's back on in the WHQL drivers but disappears in all the recent betas.
It has a high impact on multiplayer FPS games, as a gamer with the fog cheat can see much further than a gamer with fog effects on.
I believe it was DICE, in a recent interview (Battlefield 1942), who classified it as a cheat when asked if players could turn it off ingame.
In this situation Nvidia are deliberately making sure their graphics cards give gamers an advantage - if it were a coding error, it wouldn't be on only in the WHQL drivers.

In a recent statement, Futuremark claim they have code that checks for 3DMark03-specific optimizations. If this is true, then why not supply this information about the 42.67/68 drivers and put an end to all the speculation?
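
If Futuremark really do have such checks, the simplest form would be image comparison: render a frame, read the framebuffer back, and diff it against a known-good reference. A minimal sketch of the idea - my guess at the approach, since Futuremark haven't published theirs:

Code:
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical verification pass: compare a frame read back from the card
// against a trusted reference image. A driver that drops fog, precision,
// or geometry shows up as pixels drifting past the tolerance.
bool frameMatchesReference(const std::vector<std::uint8_t>& rendered,
                           const std::vector<std::uint8_t>& reference,
                           int perChannelTolerance,  // small slack for rounding
                           double maxBadRatio)       // e.g. 0.001 = 0.1% of channels
{
    if (rendered.size() != reference.size() || rendered.empty()) return false;
    std::size_t bad = 0;
    for (std::size_t i = 0; i < rendered.size(); ++i)
        if (std::abs(int(rendered[i]) - int(reference[i])) > perChannelTolerance)
            ++bad;
    return double(bad) / double(rendered.size()) <= maxBadRatio;
}

Of course, this only catches cheats that change the output; a driver that detects the benchmark merely to swap in faster but visually identical code would sail straight through.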
 
In a recent statement, Futuremark claim they have code that checks for 3DMark03-specific optimizations. If this is true, then why not supply this information about the 42.67/68 drivers and put an end to all the speculation?
That's very wise advice - sort of like a machine that checks whether a person is lying or telling the truth :D
 
I think, IIRC, he's referring to the fact that when the Radeon 8500 was released it soundly beat the GF3 Ti500 in 3DMark01, yet lost in nearly every single game benchmark.

Did it "soundly beat" the ti500? Or "beat, but by a small amount."

What about games today? (Which is essentially where 3DMark2001 was trying to 'predict' performance.) Does the GeForce3 Ti500 beat the Radeon 8500 in "nearly every single benchmark", or is it the other way around? (I believe it's the latter, though it's hard to find GeForce3 benchmarks around today.)

Was 3DMark01 ultimately more correct in assessing Radeon 8500 vs. GeForce3?
 
Joe DeFuria said:
I think, IIRC, he's referring to the fact that when the Radeon 8500 was released it soundly beat the GF3 Ti500 in 3DMark01, yet lost in nearly every single game benchmark.

Did it "soundly beat" the ti500? Or "beat, but by a small amount."

What about games today? (Which is essentially where 3DMark2001 was trying to 'predict' performance.) Does the GeForce3 Ti500 beat the Radeon 8500 in "nearly every single benchmark", or is it the other way around? (I believe it's the latter, though it's hard to find GeForce3 benchmarks around today.)

Was 3DMark01 ultimately more correct in assessing Radeon 8500 vs. GeForce3?

That's a good point, Joe. From what I have read in reviews, the Ti500 now loses substantially to the Radeon 8500. In fact, I saw a few reviews where the Radeon 8500 actually gave the GeForce4 Ti4600 a run for its money with AA and AF turned off; the Radeon 8500 is a little more on par with the GeForce4 Ti4200, IMO.

But back to the driver issue: for the initial previews of the Radeon 8500, some sites (e.g. Tom's Hardware Guide and Anandtech) received some special drivers (wink wink, nudge nudge) from Nvidia that, according to Anandtech, were supposed to be released no more than a few days after the preview but didn't show up for nearly three months. Without these special drivers from Nvidia, the Radeon 8500 soundly beat the GeForce3 at launch IIRC, but there were very few up in arms over that.

http://www.anandtech.com/video/showdoc.html?i=1516

AnandTech said:
While originally intended to be released alongside NVIDIA’s fall product line, increasing pressure from their chief competitor forced NVIDIA to push the release of their Detonator 4 drivers earlier than expected. The drivers will be released this week by NVIDIA and carry a version number of 20.xx, we tested with 20.80. Do not ask us to send you the drivers, you will have to wait for NVIDIA’s release later this week.


Likely the drivers were buggy as hell anyway; the same goes for the ones used recently on 3DMark03 at [H]. We likely won't see these wink-wink, nudge-nudge drivers shipping with the GeForce FX either.

BTW, Nvidia did a bunch of optimising in their GeForce FX drivers - does anyone know if they gave up on trying to use PS 1.1? God knows how much faster PS 1.4 is by comparison; they must have gone with PS 1.4 on the GeForce FX to get the boost they needed. Does anyone know for sure, though?
 
Not sure if this is still the case, but back in June an 8500 could beat a Ti4600 at higher resolutions in JK2, and a Ti4200 in RTCW. Most of the time now it's a little below the Ti4200.
 