Teasy said:
No, I don't know exactly how it works, but if you're asking me to believe it's cheat-proof then sorry, but I don't believe that. If someone drops their card's LOD through the drivers to extremely low values so the test looks bad but runs faster, then that's cheating to me. Can you stop that from happening? If someone overclocks their graphics card and doesn't mention it, then that's cheating. Do you stop that? (I know you don't.)
Hmmm.. Changing the LOD is IMHO not cheating. It lowers (or raises) the image quality to a certain degree, and by that you can gain some performance. Same goes for overclocking. Why would it be considered cheating to overclock? Even manufacturers "overclock" their cards to be better than others.. I don't see why these two things would even be considered cheating. In future 3DMarks we will try to get GPU MHz detection, which makes it possible to print out the speed of the card's core and RAM. Let's hope we get it implemented..
You're going to stop future 3DMarks working with this app in order to stop cheating? Why would you need to do that? The app changes your card's name to 3D Analyze.. surely you can just disallow any 3DMark score that's entered into your database where the renderer is called 3D Analyze. Actually stopping the benchmark from working with this app is silly IMO, because this app is useful for a lot more than cheating. In fact this app isn't even useful for cheating, considering it changes your renderer name to 3D Analyze.
Useful how exactly? We don't see it useful with 3DMark at all.
To suggest that developers should make their games so that this program doesn't work is just insane.
Well, there are games that have a sort of built-in "benchmark". They are also affected by this. Same goes for games designed for DX8. Using this utility only makes the game look bad, and maybe gets it a bad rep. Who knows..
I don't need to know how your database works.. again, how do you filter out overclocked graphics cards and driver settings that seriously hurt the look of the app to boost the score? You don't.
So what you are saying is that it is cheating to use FSAA too? Is changing image quality cheating nowadays?
And what comes to overclocking, come on! You should know better.
Once again, the app changes the renderer name to 3D Analyze; it's not hard to filter this out, or for anyone looking at the database to see that the test is using this app and is therefore cheating.
We just checked this, and the renderer name is not submitted with the data when submitting a project. Also, if the author changes version number or util name, the "detection" would fail. We need to get a string that will be there in all future versions, and we are working on it.
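To make the point above concrete, here is a minimal sketch of the kind of server-side filter being discussed. The field names, the submission format, and the "3D Analyze" marker string are all assumptions for illustration; MadOnion's actual schema is not public, and as noted, this only works if the renderer string is actually submitted and stays stable across versions of the utility.

```python
# Hypothetical sketch: flag submissions whose reported renderer string
# looks like 3D Analyze. Field names and data are invented examples.

def is_suspect(project: dict) -> bool:
    """Return True if the reported renderer matches the 3D Analyze marker."""
    renderer = project.get("renderer", "").lower()
    return "3d analyze" in renderer

submissions = [
    {"renderer": "RADEON 8500", "score": 8900},
    {"renderer": "3D Analyze v2.2", "score": 12400},
]

# Keep only submissions that do not carry the marker string.
clean = [p for p in submissions if not is_suspect(p)]
print([p["score"] for p in clean])  # → [8900]
```

The obvious weakness, which the post itself points out, is that a renamed or re-versioned utility slips straight past a substring check, which is why a string guaranteed to be present in all future versions matters.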
AFAIK this app is not intended for cheating, so future versions will not let you specify any graphics card name you want, as that would only help cheaters and nobody else. If you really know different, then give me some solid info on this. Because if this does happen, then you could simply try to ban the new version and leave the old version working.
All I can say here is that we will try to prevent this util (current and future versions) from working with 3DMark, in order to avoid _any_ problems in the future.
Go to the MadOnion database and put your results from a normal system up against the same system. I guarantee you'll get loads of systems the same as yours, all with significantly higher scores.
Why is it that people really think they know something, when they don't? Geez.. Listen up. You think we have only(!) submissions from people who have overclocked their systems? *BEEP* wrong answer. The fact is that most of the submissions come from "normal" users, who haven't overclocked, tuned, or lowered image quality to gain better scores. Most of our data is so-called "normal" data. We have peaks from guys like Macci, and then we have lows from some other peeps. Still, the vast majority of the projects are around the "normal" level. If you look at published scores, you might see mostly highs. The reason is that "normal" users very seldom publish their projects. Only the ones that are "into" benchmarking and tweaking want their scores to be comparable.
You can show me your beta program list till you are blue. You can try to brush aside the evidence all you want.. In the end it simply does not change what is pretty evident.
Your program FAVORS Nvidia cards. PERIOD. It has for years. Just because ATi had the lead for a while *in spite of your efforts* means nothing. After all, you guys quickly *fixed* that problem with the release of SE.
What exactly is your so called evidence?
After all you did not choose to support a PS 1.4 test that would clearly favor ATI.
We have the Advanced PixelShader test in 3DMark2001 SE, and here is what we have in the help file:
This is a new test included in 3DMark2001 SE, and it uses Pixel Shader version 1.4, introduced in DirectX 8.1. The same effect can also be achieved using Pixel Shader 1.0, but then rendering the water surface requires two passes. Graphics hardware that supports Pixel Shader 1.4 (or higher) renders the water in a single pass.
We don't favor any specific card or manufacturer. The test favors any card that supports PS 1.4.
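The single-pass vs two-pass point above can be sketched with back-of-the-envelope arithmetic: each extra pass re-rasterizes the water surface, so the PS 1.0 fallback roughly doubles the pixel work for that surface. The pixel count below is an invented illustration, not a measurement from the actual test.

```python
# Rough sketch (assumed numbers) of the cost difference between
# rendering the water in one pass (PS 1.4) vs two passes (PS 1.0).

WATER_PIXELS = 300_000  # assumed screen coverage of the water surface

def pixel_work(passes: int) -> int:
    """Total pixels shaded for the water; scales linearly with pass count."""
    return WATER_PIXELS * passes

ps14 = pixel_work(1)  # PS 1.4: single pass
ps10 = pixel_work(2)  # PS 1.0: two passes for the same effect
print(ps10 / ps14)    # → 2.0
```

This ignores per-pass overhead like re-submitting geometry and blending the second pass, which in practice makes the two-pass path somewhat worse than a clean 2x in pixel cost.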
Yes, the water has a bump-mapped reflection of the terrain around it, so you would definitely need to use bump mapping in conjunction with cube mapping.
This is actually possible on a GeForce/GeForce2 card, though it is very expensive to perform. If it were emulated on these video cards, they'd probably get in the range of 2-5 fps in the Nature demo.
and
Especially when you could have EASILY coded the nature demo to use EMBM
Not exactly. I have been told that you cannot have cube mapping with EMBM. You can map some texture with EMBM, but it'd be "2D" and always follow the camera, which would make it look dumb. Do correct me if I'm wrong.. Besides, we wanted to use the pixel shaders, as it is a game test that uses hardware PS and VS.
Now back to work.