Doomtrooper
worm said: Err.. I think this thread's subject is kinda wrong.
Yeah, it should read:
THe_KELRaTH said: I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?
SanGreal said:
THe_KELRaTH said: I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?
Where were you when the 8500 was released?!
The ability to optimize for games presents a complex opportunity for "optimization". For this discussion, optimization can be thought of as either "invisible cheating" or "removing inefficiency". What is undesirable is when benchmark-specific optimizations occur that are of the "invisible cheating" variety and are applicable to that benchmark alone. "Invisible cheating" can be valid, IMO, if it is general and not intended to distort comparisons (think of hidden surface removal); what is not valid is targeting a program whose only function is to provide benchmark results.
...
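The hidden-surface-removal example above can be made concrete. The sketch below (all names invented for illustration) shows a "general" invisible optimization in the post's sense: shading is skipped for fragments that are provably hidden, so the image is identical for any application, but less work is done.

```python
# A minimal sketch (names invented) of a general "invisible" optimization:
# early depth rejection. Hidden fragments are never shaded, so the output
# is unchanged but the expensive work is reduced, regardless of which
# application submitted the fragments.

def render(fragments):
    """fragments: iterable of (x, y, depth, color); smaller depth = closer."""
    depth_buffer = {}   # (x, y) -> nearest depth seen so far
    color_buffer = {}   # (x, y) -> color of the nearest fragment
    shaded = 0          # counts the "expensive" shading operations
    for x, y, depth, color in fragments:
        if (x, y) in depth_buffer and depth >= depth_buffer[(x, y)]:
            continue    # provably hidden: skip shading, output unchanged
        shaded += 1
        depth_buffer[(x, y)] = depth
        color_buffer[(x, y)] = color
    return color_buffer, shaded

# Front-to-back submission: the occluded "blue" fragment is never shaded.
image, work = render([(0, 0, 1.0, "red"), (0, 0, 2.0, "blue")])
```

The same image comes out either way; the saving is invisible but general. A version of the same trick keyed to one benchmark executable would be the undesirable case the post describes.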
Randell said:
SanGreal said:
THe_KELRaTH said: I'd love to know ATI's comments on optimized/cheat drivers for benchmarks. Do they, or have they, done anything similar in the Cat drivers, for instance?
Where were you when the 8500 was released?!
In context, the Quake3 optimization had been there for the R100 as well. Cheat or optimization?
IMO
Cheat = not rendering things properly or completely to gain speed.
Optimization = studying app code and adjusting drivers accordingly to respond better to the nature of that app, which may also have beneficial effects on other, similarly coded apps or ones driven by the same engine.
Should benchmarks be optimized for? In Utopia, not specifically, no; games should always be higher on the priority list. In the real world, what IHV isn't going to spend time optimising their code, even a little bit, for the most common benchmarks, be they synthetic or games?
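The cheat/optimization line above was tested in practice by renaming executables: the Quake3 behaviour was exposed when reviewers renamed quake3.exe to quack.exe and the rendered output changed. A hypothetical sketch of that detection mechanism (profile names and settings are invented, not actual driver code):

```python
import os

# Hypothetical sketch of executable-name app detection, the mechanism at
# issue in the Quake3/"Quack" episode. Profile names and settings below
# are invented for illustration.

APP_PROFILES = {
    "quake3.exe": {"texture_quality": "reduced"},  # visible change: a cheat
    "ut2003.exe": {"shader_reorder": True},        # engine-wide tuning
}

def select_profile(exe_path):
    """Pick driver settings from the executable's file name alone."""
    name = os.path.basename(exe_path).lower()
    return APP_PROFILES.get(name, {})              # unknown app: generic path
```

Renaming quake3.exe to quack.exe defeats the lookup and drops the driver back to its generic path, which is exactly why the rename trick revealed the behaviour.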
In a recent statement from Futuremark they claim they have code that checks for optimizing for 3DMark2003. If this is true, then why not supply this information about the 42.67/68 drivers and put an end to all the speculation?
That's very wise advice, sort of like a machine that checks whether a person is lying or telling the truth.
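Futuremark has not said how such a check would work, so the following is only a guess at one plausible approach: compare the driver's rendered frame against a reference image and flag output that deviates too far. Thresholds and names are invented for illustration.

```python
# Purely a guess at an output-based check: compare a rendered frame
# against a reference image and flag excessive deviation. The tolerance
# and the allowed fraction of deviating pixels are invented values.

def frames_differ(reference, candidate, tolerance=4, max_bad_fraction=0.01):
    """reference, candidate: flat lists of 0-255 pixel values."""
    if len(reference) != len(candidate):
        return True
    bad = sum(1 for r, c in zip(reference, candidate) if abs(r - c) > tolerance)
    return bad / len(reference) > max_bad_fraction
```

A driver that keeps the image intact would pass; one that visibly drops work to gain speed would trip the deviating-pixel threshold.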
IIRC, he's referring to the fact that when the Radeon 8500 was released it soundly beat the GF3 Ti500 in 3DMark01, yet lost in nearly every single game benchmark.
Joe DeFuria said:
IIRC, he's referring to the fact that when the Radeon 8500 was released it soundly beat the GF3 Ti500 in 3DMark01, yet lost in nearly every single game benchmark.
Did it "soundly beat" the Ti500, or "beat, but by a small amount"?
What about games today? (Which is essentially the performance 3DMark2001 was trying to 'predict'.) Does the GeForce3 Ti500 beat the Radeon 8500 in "nearly every single benchmark", or is it the other way around? (I believe it's the latter, though it's hard to find GeForce3 benchmarks around today.)
Was 3DMark01 ultimately more correct in assessing Radeon 8500 vs. GeForce3?
AnandTech said:
While originally intended to be released alongside NVIDIA’s fall product line, increasing pressure from their chief competitor forced NVIDIA to push the release of their Detonator 4 drivers earlier than expected. The drivers will be released this week by NVIDIA and carry a version number of 20.xx; we tested with 20.80. Do not ask us to send you the drivers; you will have to wait for NVIDIA’s release later this week.