Forcing DX8 features with a tool

BTW, it's very interesting to see that the so-called "very advanced" Nature test in 3DMark2002 hardly uses any PS & VS effects.....

In fact you could do the same effects (or very close to them) using DX7 hardware.

How does this really show the potential and flexibility of PS and VS?

Has anyone tried out the Advanced Pixel Shader test from the SE version? What effects are missing from this test when this app is used on DX7 hardware?


One more interesting thing... What performance would a GeForce 2 Ultra, the fastest DX7 card (I think!), get in the Nature test? Anyone think it would beat a GeForce 3...?
 
VS is used extensively, AFAIK (although Wom and Neeyik would be better placed to answer) but VS can also be adequately done via software - DirectX has a software VS path that is used by default if hardware VS is not detected.
 
which is why a 'Nature' scene appears in demo mode, which runs on all cards. That has no PS'd water, just VS grass/trees.
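
For what it's worth, the decision looks roughly like this in a DX8 app (illustrative sketch only, error handling omitted): check the caps, and if the HAL doesn't report hardware vertex shaders, fall back to the DirectX software vertex pipeline on the CPU.

Code:
// Rough sketch: pick hardware vertex processing if the HAL reports
// vertex shader 1.1 support, otherwise fall back to DirectX's
// CPU-based software vertex pipeline.
#include <windows.h>
#include <d3d8.h>

DWORD ChooseVertexProcessing(IDirect3D8* d3d)
{
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;   // GF3/4, Radeon 8500...
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;       // DX7-class cards: CPU does VS
}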
 
DaveBaumann said:
VS is used extensively, AFAIK (although Wom and Neeyik would be better placed to answer) but VS can also be adequately done via software - DirectX has a software VS path that is used by default if hardware VS is not detected.

Thanks, shows how much I know :oops:

So could you, with this app, force a GeForce 3/4 to use the software VS path?

It would be interesting to know how much faster a GeForce 3/4 would be when using hardware VS over software VS.
 
Well, you have 3 paths for GF3/4 & Radeon 8500: Software TnL, Hardware TnL & Pure Hardware TnL (using VS/PS, I believe).

So I always thought that to compare a GF3 to a GF2/GF1 you should use Hardware TnL, not Software TnL.
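
For the curious, a rough sketch of how those three paths map onto DX8 device creation (illustrative only; assumes an already-created IDirect3D8* d3d and window handle hWnd, error handling omitted):

Code:
#include <windows.h>
#include <d3d8.h>

// Pick ONE of the three "paths" as the behaviour flags for CreateDevice:
//   Software TnL      -> D3DCREATE_SOFTWARE_VERTEXPROCESSING (CPU does all VS)
//   Hardware TnL      -> D3DCREATE_HARDWARE_VERTEXPROCESSING (GPU does VS)
//   Pure Hardware TnL -> D3DCREATE_PUREDEVICE | D3DCREATE_HARDWARE_VERTEXPROCESSING
//                        (hardware VS plus less runtime emulation/validation)
IDirect3DDevice8* CreateDeviceWithPath(IDirect3D8* d3d, HWND hWnd, DWORD behaviourFlags)
{
    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice8* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      behaviourFlags, &pp, &device);
    return device;
}

So to the earlier question: yes, you could force a GF3/4 down the software VS path simply by creating the device with the software flag, and then compare that against the hardware flags on the same card.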
 
Hello all!

Worm,

As Nappe1 perfectly noticed, you cannot use this software to cheat, it clearly says "3D Analyzer" as a rendering device.

I fail to understand why game developers should prevent this app from running with their games. Why should they care if someone runs their T&L-only games on non-T&L cards? (This is just an example.)

I think this is getting ridiculous. Are we going to see a "coalition" of developers against a 350k app which does nothing but report some caps? What's next on the ban list, debuggers?
 
It's all about benchmarks. If you can disable certain features when benchmarks are run, you can cheat on the benchmarks. It's that simple.
 
Chalnoth said:
It's all about benchmarks. If you can disable certain features when benchmarks are run, you can cheat on the benchmarks. It's that simple.

Why does cheating at benchmarks matter?

My view of benchmarking is that it's a way of finding out which card is best for me, depending on its features and the system specs the tests were run on.

As long as the reviewer is honest, I don't see a problem.
 
Hellbinder[CE] said:
Please, I would wager that at least 50% of the scores in the hall of fame are spoofed in some fashion.
50% of the scores? Hehe.. You really think so? We work every day to check for cheated scores. In most cases they are traceable, and we delete them. The database we use for all our services is very accurate. Every time we find some new way of cheating, we start digging in the DB, and usually find a few cheated scores, which get deleted. There are some things you may not know about how we detect cheated scores. No, I won't list them here. If I did, someone would most probably use my info to make some new "hack" to cheat.

Hellbinder[CE] said:
Another thing this clearly brings to light: the ABSOLUTE bias and favoritism your company shows Nvidia. If there was any doubt before, it is GONE now. A simple water reflection that could easily be done with EMBM sends Nvidia scores to more than DOUBLE everyone else's hardware. I'm not leaving ATI out here either, but they are clearly just receiving the benefit as an afterthought.
Hehe.. You MUST be joking, right? I won't even bother to spend more minutes on replying to this ancient "claim". Still very funny though! If you really want to know which companies work with us and how, be my guest and visit this page:

http://www.madonion.com/betaprogram/

Hellbinder[CE] said:
The little respect I had for you people and your godforsaken benchmark is *completely* gone.
Well, that's all up to you.

Ok, I should have explained myself a bit better when it comes to the "other developers". You see, some games are used as "benchmarks" and they will also get affected by this. I didn't mean that I will go on a crusade here! Just exchange some words with some people. If they are not interested, or think it's cool to have such a util, that is totally up to them.

Also, games designed for DX8 and upwards are designed for DX8 and upwards. If I were a game developer myself, and someone used hacks to play my game, I would be pretty pissed. People would see the game wrong, not how I made it. That would be very annoying! o_O Now I already hear someone say "but isn't it good to get a wider audience, even though they use some hack to play?". In this case, no. If I had decided to go for DX8 and upwards, I would already be prepared for the fact that a lot of gamers who for some reason still have DX7 hardware would not be able to play. (Note: I don't currently develop any games!)

I doubt that anyone who has a DX7-class card would go out to the shop and buy a DX8 game, then go home and use the hack to play.. I simply can't believe anyone would do that!

The GameTest4 in 3DMark2001 SE uses both VertexShaders and PixelShaders pretty heavily. If you want more detailed info on the test, you can read the help file that comes with the software. There you will get pretty good info on what is what.

It's completely pointless to compare a GF2 Ultra (using the hack) and a GF3 in GameTest4. You simply can't compare them. The GF3 has to render VS & PS in hardware, whereas the GF2U renders neither: the CPU would take care of the VS, and no PS would even be rendered. That is like comparing bananas to apples. It simply doesn't work that way.

The lake in GT4 wouldn't be the same using any other technique. Maybe you could do something like it, but surely not the same.

As Nappe1 perfectly noticed, you cannot use this software to cheat, it clearly says "3D Analyzer" as a rendering device.
That's true, but it is only a matter of time before it says whatever gfx card the user has.. Trust me! ;)

Why does cheating at benchmarks matter?
It's pretty simple. A huge number of potential buyers of new hardware read reviews, benchmark scores etc. on the net, and make their decision based on the data they read. What if everything was crap? People would buy crap, something they didn't even want. Do they trust the reviewers? Yes they do. It's our job to make sure the benchmarks are reliable, accurate and provide useful data for the consumers.

I'm very tired (been working the whole day), and now I need some sleeeeeep... Sorry for any typos and .. bad behaviour! ;)
 
Good points Worm. I can see where you're coming from now…

I still think that it’s a useful app though, IMO.
 
Sorry for bringing the matter up, but it simply is an amazing tool, whether one likes it or not.
And frankly, it's not cheating but tweaking that matters to me!


I don't want to bore you, but I always felt so fooled by the Nature benchmark, because it looked so good in demo mode and was inaccessible as a benchmark.
Now I know that there is almost no PS-made water; but the water is just the weakest part of that benchmark, and the trees and leaves look so stunningly vibrant!
If the impact of the PS in Nature results in e.g. 25 fps (GF3 Ti 200) and 50 fps (GF4 Ti 4400), what happens when the PS part scales up?
That would paralyze almost any recent hardware, and it explains the call for much more GPU power by JC and MR!

BTW, I was hoping for a big boost from ISSE2 for SW T&L (100%), and with 4000 MHz of CPU power ante portas it would be interesting to tweak around a bit ;o)
any chance?
 
McElvis said:
Why does cheating at benchmarks matter?

My view of benchmarking is that it's a way of finding out which card is best for me, depending on its features and the system specs the tests were run on.

As long as the reviewer is honest, I don't see a problem.

Right, I don't believe it is a problem as far as reviews of video cards are concerned, but it could be an issue when you're comparing your benchmarks to those of other video cards on the Madonion website. That is, you might notice that some people are getting many hundreds of points higher than you, and it turns out the only reason they're able to do that is because of tweaks.

Of course, it's not that big of a deal, but it should still be considered.
 
Good luck using this wrapper on Doom III... it's an OpenGL program

OpenGL is supported in the app, so the makers of this app will likely end up supporting the same features in OpenGL as they support in D3D.

Excuse me if I'm understanding you incorrectly, but do you mean that by using this cheat, people with DX7-class cards can get a glimpse of pixel shaders on their systems? IF you mean that, you should go and hide! No pixel shaders are even drawn when using this "utility". They are simply not displayed. If I misapprehended your sentence, please forgive me.

"Bar" means apart from.. at least it does where I come from anyway. So I was saying they could run the Nature test apart from the pixel shaders, which are a very small part of the test anyway.

Excuse me but do you even know how our database works, how we filter out scores that are cheated, how the Hall of Fame is done, and what it is based on? Please do share it with me, as you clearly know.

No, I don't know exactly how it works, but if you're asking me to believe it's cheat-proof then sorry, I don't believe that. If someone drops their card's LOD through the drivers to extremely low, and the test looks bad but runs faster, then that's cheating to me. Can you stop that from happening? If someone overclocks their graphics card and doesn't mention it, then that's cheating; do you stop that? (I know you don't.)

True, but it has never been this easy. I have already got an email from the author, and we will work together to prevent this util from being used with 3DMark. I have a feeling that game developers will do the same.. There is a reason why, for example, DOOM III's minimum requirements are higher than any other game's..

You're going to stop future 3DMarks from working with this app in order to stop cheating? Why would you need to do that? The app changes your card's name to 3D Analyze.. surely you can just disallow any 3DMark score that's entered into your database where the renderer is called 3D Analyze. Actually stopping the benchmark from working with this app is silly IMO, because this app is useful for a lot more than cheating. In fact this app isn't even useful for cheating, considering it changes your renderer name to 3D Analyze.
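
To illustrate the point (hypothetical structures and field names, obviously not Madonion's real schema), the filtering amounts to something like this:

Code:
// Hypothetical example: drop submissions whose reported renderer string
// is the wrapper ("3D Analyze") instead of a real graphics card.
#include <string>
#include <vector>

struct Submission {
    std::string renderer;   // e.g. "GeForce3 Ti 500" or "3D Analyze"
    int         score;
};

std::vector<Submission> DropWrapperScores(const std::vector<Submission>& all)
{
    std::vector<Submission> kept;
    for (const Submission& s : all)
        if (s.renderer.find("3D Analyze") == std::string::npos)
            kept.push_back(s);   // keep only scores from real renderers
    return kept;
}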

To suggest that developers should make their games so that this prog doesn't work is just insane. I mean, what in the world is your reasoning for thinking anyone should do that? These are games, for god's sake. If someone wants to use an app to disable a feature or fake it to allow their card to run the game, then that's their choice; it's no skin off the developer's nose. I see no sense at all in someone like Carmack, for instance, writing Doom3 so it can't work with 3D Analyze.. it baffles me that you would even suggest this.

Again.. You seem to know how our database works and what kind of data there is, don't you? I haven't seen you in our offices, so how is it possible?

I don't need to know how your database works.. again, how do you filter out overclocked graphics cards and driver settings that seriously hurt the look of the app to up the score?.. you don't.

This "utility" enables ways to cheat in 3DMark and possibly other D3D benchmarks, so yes we can and will blame the utility.

Once again, the app changes the renderer name to 3D Analyzer; it's not hard to filter this out, or for anyone looking at the database to see that the test is using this app and is therefore cheating.

I admit that the "utility" is cool in some aspects, but I do question its usefulness and goals.

Yeah right, a program that is extremely useful in loads of ways, and even when used to cheat deliberately distinguishes itself by changing the renderer name, must definitely have the goal of cheating :rolleyes:

I'm going to work hard to get this util not to work in 3DMark, and try to encourage other developers to do the same. It's causing more harm than good. Trust me!

What are you talking about? Encourage developers to stop the app working in their games because it's doing more harm than good? Please explain yourself, because this makes no sense to me. WTF has it got to do with anyone how I or anyone else runs a game that we have paid for? Why should a dev not allow me to use an app in a game that simply allows me to log framerates/polycounts and fake features? I've got news for you mate, there is no cheating in games; it's a game, it's there to be played.

That's true, but it is only a matter of time when it says whatever gfx card the user has..

Oh I see, so now you're not trying to ban a current app, but you're in fact trying to ban an app that doesn't even exist yet (one that allows a graphics card name to be chosen). AFAIK this app is not intended for cheating, so future versions will not let you specify any graphics card name you want, as that would only help cheaters and nobody else. If you really know different, then give me some solid info on this. Because if this does happen, then you could simply try to ban the new version and leave the old version working.

It's pretty simple. A huge number of potential buyers of new hardware read reviews, benchmark scores etc. on the net, and make their decision based on the data they read. What if everything was crap? People would buy crap, something they didn't even want. Do they trust the reviewers? Yes they do. It's our job to make sure the benchmarks are reliable, accurate and provide useful data for the consumers.

I have a news flash for you. Firstly, reviews use graphs or tables made by the reviewer. The reviewer doesn't have to fake the test to show fake numbers; he just needs to make up the numbers and put them in a graph or table. Secondly, if they really want to fake a test they can already do that easily without this app (overclock the video card that they want to look best, overclock the system while doing the benches on the vid card they want to look best, drop LOD etc.).

but it could be an issue when you're comparing your benchmarks to those of other video cards on the Madonion website. That is, you might notice that some people are getting many hundreds of points higher than you, and it turns out the only reason they're able to do that is because of tweaks.

This happens anyway. Go to the Madonion database and take your results from a normal system and put them up against the same system. I guarantee you'll get loads of systems the same as yours, all with significantly higher scores. That's because people overclock their cards to the nth degree and tweak their drivers to drop texture detail etc. and up their scores without mentioning it; a lot of people do mention their overclocks, but not all of them. This cheating is already rife in the Madonion database, and because of this it's not a totally reliable database when trying to look at the relative performance of different graphics cards.
 
Teasy said:
Good luck using this wrapper on Doom III... it's an OpenGL program

OpenGL is supported in the app, so the makers of this app will likely end up supporting the same features in OpenGL as they support in D3D.
I think what you meant was "not supporting". :rolleyes:

So what will happen when the application attempts to call an extension function that doesn't exist in the driver? If the wrapper properly handles this, it will take a lot of work as there are tons of extensions out there that your card/driver may or may not support. Do you really think playing Doom III this way will be enjoyable, assuming it works at all?

Personally, I see this wrapper more useful as a way to disable features than to enable features. Then you can see what sort of impact different features have on different applications. Enabling features that your board doesn't support sounds pretty useless.
 
From the screens, all it does is report an extension as supported, and then do nothing when that extension is called.
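
Conceptually, something like this toy sketch (a guess at the mechanism only, not the actual 3D Analyze code; GL_ARB_fake_extension is a made-up name, and a real wrapper would need stubs with matching signatures per entry point):

Code:
// Toy wrapper idea: advertise an extension the driver doesn't have, and
// hand back a do-nothing stub when the app asks for its entry points.
#include <windows.h>
#include <GL/gl.h>
#include <string>

// The driver's real functions (a real wrapper would load these from the
// real opengl32.dll; here we just point at them directly).
static const GLubyte* (APIENTRY* realGetString)(GLenum)     = glGetString;
static PROC           (WINAPI*  realGetProcAddress)(LPCSTR) = wglGetProcAddress;

static std::string g_extensions;

const GLubyte* WrappedGetString(GLenum name)
{
    const GLubyte* real = realGetString(name);
    if (name != GL_EXTENSIONS)
        return real;
    g_extensions  = reinterpret_cast<const char*>(real);
    g_extensions += " GL_ARB_fake_extension";   // claim support we don't have
    return reinterpret_cast<const GLubyte*>(g_extensions.c_str());
}

static void APIENTRY NoOpStub() { /* do nothing */ }

PROC WrappedGetProcAddress(LPCSTR name)
{
    PROC real = realGetProcAddress(name);
    // If the driver doesn't export the function, return a stub so the app
    // doesn't crash on a NULL pointer; the calls simply have no effect.
    return real ? real : reinterpret_cast<PROC>(NoOpStub);
}

Which matches what's described above: the caps/extension string says "yes", and the actual calls just draw nothing.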
 
Hehe.. You MUST be joking, right? I won't even bother to spend more minutes on replying to this ancient "claim". Still very funny though! If you really want to know which companies work with us and how, be my guest and visit this page:

Nobody is joking or laughing :rolleyes: ....if you worked with half of these companies you wouldn't be able to advertise a Performance Analyzer that FOOLS consumers into thinking a GeForce 2 MX is faster than a Kyro II ;)

http://www.madonion.com/measurementservices/

Let's not even talk about 3DMark2001 SE, which was supposed to be a DX8.1 benchmark, yet the only DX8.1 card got zero points in any of the game tests that counted (Nature being one of them); instead we got to watch some fish swimming through water..

In case you and Madonion didn't realize, PS 1.4 is not a 'feature', it's part of DX8.1, and when John Carmack states he can see as much as a 30-40% speed improvement, I tend to believe him.


I'm very tired (been working the whole day), and now I need some sleep also ;)
 
Of course, the Radeon 8500 apparently isn't able to make use of this increased speed, as it still underperforms GF4's that have to do quite a few more passes.
 
Chalnoth said:
Of course, the Radeon 8500 apparently isn't able to make use of this increased speed, as it still underperforms GF4's that have to do quite a few more passes.

A test of light interaction speed initially had the 8500 significantly slower than the GF3, which was shocking due to the difference in pass count. ATI identified some driver issues, and the speed came around so that the 8500 was faster in all combinations of texture attributes, in some cases 30+% more. This was about what I expected, given the large savings in memory traffic by doing everything in a single pass.

http://www.webdog.org/plans/1/
 
Chalnoth said:
Of course, the Radeon 8500 apparently isn't able to make use of this increased speed, as it still underperforms GF4's that have to do quite a few more passes.

Funny. A few months ago, around nVnews, you were touting the GF3 as being better than the R8500. Looks like ATI's drivers and card don't suck as much as you think, huh?
I think it's great that now you have to start comparing the 8500 to the GF4.
Shows how much ATI has improved, and how little you credit them with that.
 