WTF? HardCOREware's views on cheating:

Althornin

Senior Lurker
Veteran
http://www.hardcoreware.net/reviews/review-158-1.htm
Let me give you a few precious pearls from this editorial:

Yes, you all may have heard about both NVIDIA and ATI being naughty about tweaking their drivers' quality settings with this mark, and Futuremark denouncing the action, but then releasing a joint statement with NVIDIA saying it was OK (at least on NVIDIA's part).
Supposedly the 330 patch eliminates the tweaks that both ATI and NVIDIA added to their drivers to drop quality levels when 3DMark03 is detected
I'm a journalist
Too bad he can't seem to do any research into the matter.
 
Doomtrooper said:
:LOL:

Quote:
I'm a journalist

A: No, a guy with a webpage :!:

I would add to that: "A guy with a web page who has delusions of journalism."

--'course, now if he'd said "I'm a yellow journalist," or "I'm a tabloid journalist"....well....how could you argue?
 
WaltC said:
Doomtrooper said:
:LOL:

Quote:
I'm a journalist

A: No, a guy with a webpage :!:

I would add to that: "A guy with a web page who has delusions of journalism."

--'course, now if he'd said "I'm a yellow journalist," or "I'm a tabloid journalist"....well....how could you argue?

Yellow journalist?

I'm a yellow journalist (as long as that means I urinate on myself). ;) :LOL:
 
K.I.L.E.R said:
Yellow journalist?

I'm a yellow journalist (as long as that means I urinate on myself). ;) :LOL:

Heh-Heh...not quite what I had in mind...;)

"Yellow journalism" is a term for sensationalism--and a "yellow journalist" is someone who writes wildly sensational, exaggerated stories. But I think you've just put a brand-new connotation on the term which just might stick...;)
 
Maybe we should talk about this :

I think what the graphics card industry (as well as us techie-journalists) need is an open source, DirectX9-based benchmark that implements optimized codepaths from EVERY GPU manufacturer. That means the two high performance players (NVIDIA and ATI) as well as all of the mainstream/wannabe guys (Add in SiS). Also, when a new player comes to market (S3: DeltaChrome), they should be given the same advantage as the others, thereby creating a level playing field. If the code is open, no one can sneak anything in (except Microsoft, by sneaking it into DirectX itself, but there's nothing we can do about that). It's there for the world to see and analyze if it's fair or not, and there's no cases of "well, X paid to be in the beta program; that's why they score higher than Y." Open source code would allow few secrets: not only would the results be available for public view and interpretation, but so would the code used to GENERATE these results. The first time someone tries to sneak something into it, someone else will see it, and blow the whistle.

Anyone else think this is just a bad idea ? If the source is open, everybody can see exactly what is done, so driver writers will know exactly what to look for and tweak. Also, if the full source is available you'll have tons of people playing with it and generating different versions, and nobody will know what to compare with what unless you stick with one single version.

Optimised code paths for all cards, why ? Why do people think a good benchmark would have to use different code on different hardware ? Should we re-introduce company specific APIs again so that you code to the metal of the hardware and make optimal use ? I don't think so. DX is a standard, you should use the standard, and all cards should handle that standard fine with no tweaks or changes needed per card or company. It's apples versus apples, not apples versus pears, kiwis, grapefruits, etc...
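
For concreteness, the per-vendor dispatch being asked for boils down to something like this (purely illustrative: the vendor IDs are real PCI IDs, everything else is invented). A single standard DX9 path would simply skip the switch and run the same code on every card:

Code:

// Purely illustrative sketch, not from any shipping benchmark.
#include <cstdint>

enum class Codepath { StandardDX9, NvidiaOptimized, AtiOptimized, SisOptimized, S3Optimized };

Codepath SelectCodepath(std::uint32_t pciVendorId, bool allowVendorPaths)
{
    if (!allowVendorPaths)
        return Codepath::StandardDX9;              // apples versus apples: same code for everyone

    switch (pciVendorId)
    {
    case 0x10DE: return Codepath::NvidiaOptimized; // NVIDIA
    case 0x1002: return Codepath::AtiOptimized;    // ATI
    case 0x1039: return Codepath::SisOptimized;    // SiS
    case 0x5333: return Codepath::S3Optimized;     // S3 (DeltaChrome)
    default:     return Codepath::StandardDX9;     // unknown or new vendors fall back
    }
}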

Anyway, bad idea IMHO...

We have no way of testing performance under FP16, FP24, or FP32 color modes.

That's because there is no such thing as FP16, FP24 and FP32... there is just FP (24+) under DX. Obviously you can PP everything, but should developers really have to figure out where to put it ? When I code a shader I sure don't worry about what accuracy some operation would need to be at. But maybe that's just me.
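
To make that concrete, here is roughly what the precision "choice" looks like from the developer side in DX9 HLSL (a hypothetical shader, not taken from 3DMark or anything shipping): 'float' means the spec minimum of FP24 or better, and 'half' is only a partial-precision hint the driver may honour at FP16 or simply ignore.

Code:

// Hypothetical DX9-era pixel shader held in a C++ string, purely to show where the
// partial-precision hint lives. Nothing here is from 3DMark.
const char* g_pixelShaderSrc =
    "sampler2D baseMap : register(s0);                                \n"
    "                                                                 \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                       \n"
    "{                                                                \n"
    "    // 'half' is only a hint: the driver may run this at FP16,   \n"
    "    // FP24 or FP32, whatever it likes.                          \n"
    "    half4  tint  = half4(0.9, 0.8, 0.7, 1.0);                    \n"
    "    // plain 'float' must get the DX9 minimum of FP24 or better. \n"
    "    float4 texel = tex2D(baseMap, uv);                           \n"
    "    return texel * tint;                                         \n"
    "}                                                                \n";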

A benchmark for stencil shadows will exist soon, in EgoSoft's X2 rolling benchmark, but nowhere else that I'm aware of.

Err, FableMark is a stencil shadows benchmark... SIGH...

Identical screenshots for IQ tests, ability to create color bars to easily see filtering level boundaries, no auto-tweaking of the program for whatever card is detected (though it might be more beneficial to allow this, and either enable or disable it). I love the built-in image-quality test idea. If some company is tweaking their drivers to lower quality levels in order to do better in the benchmark program, it would show up in that quality test screenshot.

Err, has this guy actually used 3DMark2003 ? This is exactly what it offers. You can select a specific frame and render it, and even render and compare it using the ref rast. What more would you want for IQ analysis ? The filtering test also allows you to look at the filtering level boundaries. So he asks for exactly what 3DMark2003 offers; did he ever use the benchmark to its full potential ?
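
And once you have two identically framed screenshots (one from the card, one from the ref rast), the check itself is trivial. A minimal sketch, with all the glue around it hypothetical rather than 3DMark's actual code:

Code:

// Minimal sketch: compare the same frame captured from the card under test and from
// the reference rasterizer, and report the worst per-channel difference.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstdlib>

// Both buffers are assumed to be RGBA8, same resolution, same frame number.
int MaxChannelDifference(const std::uint8_t* frameCard,
                         const std::uint8_t* frameRefRast,
                         std::size_t byteCount)
{
    int worst = 0;
    for (std::size_t i = 0; i < byteCount; ++i)
        worst = std::max(worst, std::abs(int(frameCard[i]) - int(frameRefRast[i])));
    return worst; // 0 means pixel-identical; large values flag reduced-quality "tweaks"
}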

IMHO he is just trying to grab some extra hits on his site...

K~
 
Kristof said:
Optimised code paths for all cards, why ? Why do people think a good benchmark would have to use different code on different hardware ? Should we re-introduce company specific APIs again so that you code to the metal of the hardware and make optimal use ? I don't think so. DX is a standard, you should use the standard, and all cards should handle that standard fine with no tweaks or changes needed per card or company. It's apples versus apples, not apples versus pears, kiwis, grapefruits, etc...

Anyway, bad idea IMHO...

Kristof, dunno if you missed this:

http://216.180.225.194/~beyond3d/forum/viewtopic.php?p=126626#126626

Anyhow, the problem IMHO is that we will then primarily be benchmarking the driver team.
 
Kristof said:
Maybe we should talk about this :


Anyone else think this is just a bad idea ? If the source is open, everybody can see exactly what is done, so driver writers will know exactly what to look for and tweak. Also, if the full source is available you'll have tons of people playing with it and generating different versions, and nobody will know what to compare with what unless you stick with one single version.

Optimised code paths for all cards, why ? Why do people think a good benchmark would have to use different code on different hardware ? Should we re-introduce company specific APIs again so that you code to the metal of the hardware and make optimal use ? I don't think so. DX is a standard, you should use the standard, and all cards should handle that standard fine with no tweaks or changes needed per card or company. It's apples versus apples, not apples versus pears, kiwis, grapefruits, etc...

Anyway, bad idea IMHO...

I agree completely with this and all of your points. He seems to think the whole purpose of a benchmark is to showcase best-case hardware performance, without regard to ensuring that the processors tested are all doing the same amount of work in an environment close to what you find in shipping games. He just doesn't understand that principle, or how it is germane to comparative hardware benching, apparently. The irony is that running 3DMark 03 in the 330 state is very, very much like running 98%+ of all of the 3D games that ship, since the hardware companies do not optimize their drivers heavily for them and the games themselves contain no highly optimized vendor paths. The purpose of such a benchmark is to benefit the consumer--not any particular hardware vendor--and it does so by illustrating likely-case performance instead of special-case performance.

Behind all of the assorted nonsense circulating about "what's wrong with 3DMark" you'll find nVidia, simply because the company doesn't want the average-case performance of its current hardware to become exposed in any fashion. It's understandable, but still deplorable. It's equally understandable, and unfortunate, that some people with web pages are confused about the issue. The best benchmarks primarily benefit the consumer--the worst ones benefit the hardware companies. How many consumers want a benchmark to describe product performance while running 2% of all titles shipped? Certainly, if you run nothing but that 2%, then such a benchmark would have value. I'd rather look at a benchmark which describes performance for the 98% of titles shipped that do not contain optimized vendor paths, and for which the drivers have not been heavily hacked on a special-case basis.
 
Kristof said:
Anyone else think this is just a bad idea ? If the source is open, everybody can see exactly what is done, so driver writers will know exactly what to look for and tweak. Also, if the full source is available you'll have tons of people playing with it and generating different versions, and nobody will know what to compare with what unless you stick with one single version.

Agreed on the second part: making it meaningful would require websites to explain which version they use. Not a big problem, but I can certainly see the "guys with webpages" getting some super-dooper "tweaked" versions from their friends the IHVs...

For the first part, recent history has proven that a closed-source benchmark was very vulnerable to "optimizations". At least, with an open-source benchmark, you can modify it at home and recompile it, which gives lots of ways of finding the various cheats^H^H^H^H^H^Hoptimizations (changing textures, shaders, camera paths...).
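
As a hypothetical example of how little it takes once you can recompile at home: nudge the scripted camera keyframes by a tiny random amount each run, and anything keyed to the exact original frames (static clip planes, pre-baked shader replacements) tends to fall apart visibly or show up as a suspicious score drop.

Code:

// Hypothetical one-function change an open-source benchmark would allow (names invented,
// no real benchmark API): jitter each camera keyframe slightly per run. Small enough not
// to change the workload, large enough to break frame-exact "optimizations".
#include <cstdlib>

struct Vec3 { float x, y, z; };

Vec3 JitterKeyframe(const Vec3& key, float magnitude = 0.05f)
{
    // Uniform jitter in [-magnitude, +magnitude] on each axis.
    const float rx = magnitude * (2.0f * std::rand() / float(RAND_MAX) - 1.0f);
    const float ry = magnitude * (2.0f * std::rand() / float(RAND_MAX) - 1.0f);
    const float rz = magnitude * (2.0f * std::rand() / float(RAND_MAX) - 1.0f);
    return Vec3{ key.x + rx, key.y + ry, key.z + rz };
}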

What the author doesn't understand is that the only way to get total exposure would be to have an open-source benchmark and open-source drivers. The second part just isn't going to happen, and for good reasons.
 
What I was thinking when I read that was not the benchmark being open source, but the drivers. In fact I was thinking it might be nice if we could see the driver code for the newer nVidia drivers, to see exactly what cheats do or do not remain.

The only way this would be exposed, though, is through decompilation...which is something that most web page owners probably don't know about, and also I for one wouldn't want to publish an article on "what I found decompiling their driver". Reverse engineering is something many do, but few discuss openly. In fact this is what's implied in how ATI found the latest cheat: "The ATI folks say they were studying the Detonator FX driver's new filtering routines, to see what they could learn, when they discovered this quirk..."--but when tipping tech-report.com off, they pointed out the renaming of 3dmark03.exe...

But open source drivers would allow one to detect who is cheating and who is not. Of course, if a company wants to cheat on a benchmark, they aren't going to want to get caught, so...
 
Son Goku said:
But open source drivers would allow one to detect who is cheating and who is not. Of course, if a company wants to cheat on a benchmark, they aren't going to want to get caught, so...
You'll never see open source drivers because there is far too much IP in them. Hell, you can get a good idea of how a chip works just by looking at the register specs, and drivers contain a lot more than that.
 
First, open source drivers won't happen, since that would release to the public many patents that are only licensed (but not owned) by nv, ati,...
Second, what is the point of having a standard DX if everyone is going to optimize their code their own way? I agree with you 100%, Kristof. ;)

later,
 
Kristof said:
Anyone else think this is just a bad idea ? If the source is open, everybody can see exactly what is done, so driver writers will know exactly what to look for and tweak.

Don't the BETA members of FM already have access to the source code ?

Also, if the full source is available you'll have tons of people playing with it and generating different versions, and nobody will know what to compare with what unless you stick with one single version.

I don't see this as being a problem as long as you do what SA proposed: keep one codepath fixed, let the IHVs make their own codepaths, and use both while testing. Then everybody would see what the IHV did to gain performance, and could have them explain how that leads to the supposed performance increase in question. Now, they could of course still cheat, since the code still has to pass through the drivers.

But if you allow the IHVs to do this, you'd create a situation where they couldn't really blame the benchmark if the performance is crappy. Thus, cheating (or should I say optimizing :)) would create a much bigger problem for an IHV if they were caught. And the risk of getting caught is rather big, since the code (I'm guessing that we're talking about the shader code only) is open source.
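
Roughly what that reporting could look like (hypothetical structure, names invented): every card runs the fixed reference codepath, vendor codepaths are optional extras, and both numbers get printed side by side so any gap has to be explained.

Code:

// Hypothetical sketch of the "fixed path plus vendor paths" reporting, nothing more:
// the reference score is always present, vendor-submitted paths are listed next to it.
#include <cstdio>
#include <string>
#include <vector>

struct CodepathScore {
    std::string codepath;   // "reference" or a vendor-submitted, publicly visible path
    float       score;      // whatever unit the benchmark uses
};

void PrintScores(const std::vector<CodepathScore>& results)
{
    for (const CodepathScore& r : results)
        std::printf("%-16s %8.1f\n", r.codepath.c_str(), r.score);
}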

Optimised code paths for all cards, why ? Why do people think a good benchmark would have to use different code on different hardware ? Should we re-introduce company specific APIs again so that you code to the metal of the hardware and make optimal use ? I don't think so. DX is a standard, you should use the standard, and all cards should handle that standard fine with no tweaks or changes needed per card or company. It's apples versus apples, not apples versus pears, kiwis, grapefruits, etc...

Anyway, bad idea IMHO...

There's a standard API but is there a "standard way" of using the API ?
What we do know is that you can use the standard API differently and gain performance even while doing exactly the same thing. Thus, how are you going to create a codepath that you are sure is optimal for all IHVs ? Maybe not optimal, but completely neutral ?

The probable answer is, you can't.
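
A hypothetical illustration of "same API, same work, different speed": the same draw calls and the same pixels on screen, but one submission order is sorted by texture and the other isn't, and different hardware pays a different price for the unsorted one.

Code:

// Hypothetical example of doing exactly the same thing through the same standard API in
// two ways: scene-order submission vs. sorting draws by texture to cut state changes.
// Which order counts as "neutral" depends on whose hardware you ask.
#include <algorithm>
#include <vector>

struct DrawCall { int textureId; int meshId; };

void SortByTexture(std::vector<DrawCall>& calls)
{
    std::stable_sort(calls.begin(), calls.end(),
                     [](const DrawCall& a, const DrawCall& b)
                     { return a.textureId < b.textureId; });
}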

And won't this problem be bigger and bigger the more advanced the hardware will become ?

Another thing: we're moving towards HLSLs (quickly ?) and then the "apples to apples" comparison will be a thing of the past anyway. Though hopefully only as far as the workload goes, and not the image quality.

/Note: Opinions from a non 3D programmer :)
 
Yes, you all may have heard about both NVIDIA being naughty about tweaking their drivers' quality settings with this mark, and Futuremark denouncing the action, but then releasing a joint statement with NVIDIA saying it was OK. If you haven't, here's the latest press release (the joint one):
He took ATI out of there now :?
 