Forcing DX8 features with a tool

Althornin said:
Funny. A few months ago, around nVnews, you were touting the GF3 as being better than the R8500. Looks like ATI's drivers and card don't suck as much as you think, huh?

I still think the Ti500 is superior to the 8500 in today's games. I really have no way of knowing how it will do in DOOM3, and since I've only heard JC compare it to the GF4 (and not 3) recently, I cannot draw any conclusions about how it will do against the GF3.
 
Btw, that .plan file is rather old, but I could have sworn I remember him stating around the same time that he still felt that a GeForce3 Ti was a safer bet for DOOM3.
 
Chalnoth said:
I still think the Ti500 is superior to the 8500 in today's games.

Why, considering it doesn't win many benches (using the latest drivers), including game benchmarks? In fact, if you look at HardOCP's latest "review/preview" of the Xabre400 (http://www.hardocp.com/article.html?art=MzA5LDI=), you will see a 128MB 8500LE (yep, the LE version, clocked slower!!!) beating a GF3 Ti500 in Comanche 4 and Jedi Knight 2, but not Quake 3. That suggests to me that the 8500 IS superior in today's games: Quake 3 is old, not "today's games", while JK2 is based on the Q3 engine, is new, and there the GF3 Ti500 gets tromped by the 8500LE.

BTW: Any links you post to reviews with old drivers are pointless. Today's games are played on today's machines, which can have today's drivers, not 3-month-old drivers.
 
Althornin said:
Why, considering it doesn't win many benches (using the latest drivers), including game benchmarks?

A few things:

1. That was using the 128MB Radeon 8500 LE, which is apparently a bit superior to the stock Radeon 8500, particularly in games like JK2.

2. Even then, the Comanche 4 results were a near-tie, which still makes the overall result closer to a tie.

3. This really isn't comprehensive enough for a good analysis. Perhaps another four or so games and I'd call it good.
 
worm[MadOnion.com]

You can show me your beta program list till you are blue in the face. You can try to brush aside the evidence all you want. In the end it simply does not change what is pretty evident.

Your program FAVORS Nvidia cards. PERIOD. It has for years. Just because ATI had the lead for a while *in spite of your efforts* means nothing. After all, you guys quickly *fixed* that problem with the release of SE.

There is simply no excuse for Kyro cards to score below GF2 MX cards, or even GF2 Ti cards. Especially when you could have EASILY coded the nature demo to use EMBM, or made the pixel test a non-scoring test. After all, you did not choose to support a PS 1.4 test that would clearly favor ATI, even though you did the EXACT OPPOSITE for Nvidia just one release before. The absolute hypocrisy is in bold red ink on every single page.

It is totally ridiculous that the ONLY people who say it's not true are YOU and the Nvidians. Gee.... I WONDER WHY THAT IS....

:rolleyes:
 
Hellbinder[CE] said:
There is simply no excuse for Kyro cards to score below GF2 MX cards, or even GF2 Ti cards.

Why, pray tell? It's not like the Kyro cards will always outperform even an IMR with similar specs, let alone IMRs with specs far beyond it....
 
Hi Hellbinder,
Especially when you could have EASILY coded the nature demo to use EMBM
Pray, how? I don't see any way to create the same effect without using a (dynamically rendered?) cube map. But then, I haven't watched the nature bench in some time, so I might be wrong.

ta,
.rb
 
Yes, the water has a bump-mapped reflection of the terrain around it, so you would definitely need to use bump mapping in conjunction with cube mapping.

This is actually possible on a GeForce/2 card, though it is very expensive to perform. If it were emulated on these video cards, they'd probably get in the range of 2-5 fps in the nature demo.
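
Roughly, the effect boils down to perturbing the surface normal with the bump map, reflecting the view vector about that normal, and using the reflected direction to pick a cube-map face. Here's a minimal software sketch of that math in Python (all per-pixel values are made-up placeholders, and the helper functions are mine, not anything from 3DMark or D3D):

[code]
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def reflect(view, normal):
    # standard reflection formula: r = v - 2*(v.n)*n
    d = sum(a * b for a, b in zip(view, normal))
    return tuple(vc - 2 * d * nc for vc, nc in zip(view, normal))

def cube_face(direction):
    # a cube map is addressed by a 3D direction; the dominant axis picks the face
    axis = max(range(3), key=lambda i: abs(direction[i]))
    sign = '+' if direction[axis] >= 0 else '-'
    return sign + 'xyz'[axis]

# hypothetical per-pixel inputs
view = normalize((0.0, -1.0, 0.5))     # eye-to-surface direction
base_normal = (0.0, 1.0, 0.0)          # flat water surface
bump = (0.15, 0.0, -0.1)               # perturbation sampled from the bump map

normal = normalize(tuple(b + p for b, p in zip(base_normal, bump)))
r = reflect(view, normal)
print("reflect into cube face", cube_face(r))
[/code]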
 
Teasy said:
No, I don't know exactly how it works, but if you're asking me to believe it's cheat-proof then sorry, but I don't believe that. If someone drops their card's LOD through the drivers to extremely low, and the test looks bad but runs faster, then that's cheating to me. Can you stop that from happening? If someone overclocks their graphics card and doesn't mention it, then that's cheating; do you stop that? (I know you don't.)
Hmmm... Changing LOD is IMHO not cheating. It lowers (or raises) the image quality to a certain degree, and by that you can gain some performance. Same goes for overclocking. Why would overclocking be considered cheating? Even manufacturers "overclock" their cards to be better than others. I don't see why these two things would even be considered cheating. In future 3DMarks we will try to add GPU MHz detection, which makes it possible to print out the speed of the card's core and RAM. Let's hope we get it implemented.

You're going to stop future 3DMarks working with this app in order to stop cheating? Why would you need to do that? The app changes your card's name to 3D Analyze; surely you can just disallow any 3DMark score entered into your database where the renderer is called 3D Analyze. Actually stopping the benchmark from working with this app is silly IMO, because this app is useful for a lot more than cheating. In fact, this app isn't even useful for cheating, considering it changes your renderer name to 3D Analyze.
Useful how, exactly? We don't see it as useful with 3DMark at all.

To suggest that developers should make their games so that this prog doesn't work is just insane.
Well, there are games that have a sort of built-in "benchmark", and they are also affected by this. Same goes for games designed for DX8. Using this utility only makes the game look bad, and maybe gets it a bad rep. Who knows...

I don't need to know how your database works. Again, how do you filter out overclocked graphics cards, and driver settings that seriously hurt the look of the app to up the score? You don't.
So what you are saying is that it is cheating to use FSAA too? Is changing image quality cheating nowadays? ;) And as for overclocking, come on! You should know better.

Once again, the app changes the renderer name to 3D Analyze; it's not hard to filter this out, or for anyone looking at the database to see that the test was run with this app and is therefore cheating.
We just checked this, and the renderer name is not submitted with the data when a project is submitted. Also, if the author changes the version number or util name, the "detection" would fail. We need a string that will be there in all future versions, and we are working on it.
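
(For what it's worth, name-string detection really is that brittle. A hypothetical filter in Python, just to illustrate the failure mode described above; the function and the sample strings are mine, not MadOnion's:)

[code]
# hypothetical filter over submitted renderer strings
def looks_like_3d_analyze(renderer_name):
    return "3d analyze" in renderer_name.lower()

print(looks_like_3d_analyze("3D Analyze v2.2"))    # True: caught
print(looks_like_3d_analyze("SomeOtherName 1.0"))  # False: a simple rename slips through
[/code]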

AFAIK this app is not intended for cheating, and so future versions will not let you specify any graphics card name you want, as that would only help cheaters and nobody else. If you really know different, then give me some solid info on this. Because if this does happen, then you could simply try to ban the new version and leave the old version working.
All I can say here is that we will try to prevent this util (current and future versions) from working with 3DMark, in order to avoid _any_ problems in the future.

Go to the MadOnion database, take your results from a normal system, and put them up against the same system. I guarantee you'll find loads of systems the same as yours, all with significantly higher scores.
Why is it that people really think they know something when they don't? Geez... Listen up. You think we only(!) have submissions from people who have overclocked their systems? *BEEP* Wrong answer. The fact is that most of our submissions come from "normal" users, who haven't overclocked, tuned, or lowered image quality to gain better scores. Most of our data is so-called "normal" data. We have peaks from guys like Macci, and we have lows from some other peeps, but still the vast majority of the projects are around the "normal" level. If you look at published scores, you might see mostly highs; the reason is that "normal" users very seldom publish their projects. Only the ones that are "into" benchmarking and tweaking want their scores to be comparable.

You can show me your beta program list till you are blue in the face. You can try to brush aside the evidence all you want. In the end it simply does not change what is pretty evident.
Your program FAVORS Nvidia cards. PERIOD. It has for years. Just because ATI had the lead for a while *in spite of your efforts* means nothing. After all, you guys quickly *fixed* that problem with the release of SE.
:LOL: What exactly is your so-called evidence?

After all, you did not choose to support a PS 1.4 test that would clearly favor ATI.
We have the Advanced PixelShader test in 3DMark2001 SE, and here is what we have in the help file:

This is a new test included in 3DMark2001 SE, and it uses Pixel Shader version 1.4, introduced in DirectX 8.1. The same effect can also be achieved using Pixel Shader 1.0, but then rendering the water surface requires two passes. Graphics hardware that supports Pixel Shader 1.4 (or higher) renders the water in a single pass.

We don't favor any specific card or manufacturer. The test favors any card that supports PS 1.4.
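
To illustrate the one-pass vs. two-pass point with a toy example: with additive framebuffer blending, a PS 1.0-class part can build the same water colour over two passes that a PS 1.4-class part computes in one. A sketch in Python (the colour values are invented, and this is not the actual 3DMark shader):

[code]
# toy per-pixel water colour: reflection term + refraction term
reflection = (0.20, 0.30, 0.40)   # hypothetical cube-map sample
refraction = (0.05, 0.10, 0.15)   # hypothetical refracted terrain sample

# "PS 1.4" style: both terms combined in a single pass
single_pass = tuple(a + b for a, b in zip(reflection, refraction))

# "PS 1.0" style: pass 1 writes the reflection, pass 2 is additively
# blended on top -- same image, but the water geometry is drawn twice
framebuffer = reflection
framebuffer = tuple(d + s for d, s in zip(framebuffer, refraction))

assert framebuffer == single_pass
print(single_pass)
[/code]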

Yes, the water has a bump-mapped reflection of the terrain around it, so you would definitely need to use bump mapping in conjunction with cube mapping.

This is actually possible on a GeForce/2 card, though it is very expensive to perform. If it were emulated on these video cards, they'd probably get in the range of 2-5 fps in the nature demo.
and
Especially when you could have EASILY coded the nature demo to use EMBM
Not exactly. I have been told that you cannot have cube mapping with EMBM. You can map some texture with EMBM, but it'd be "2D" and always follow the camera, which would make it look dumb. Do correct me if I'm wrong... Besides, we wanted to use the pixel shaders, as it is a game test that uses hardware PS and VS.
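
The limitation being described is that EMBM only perturbs 2D texture coordinates with du/dv offsets from the bump map, so the "reflection" is a lookup into a flat texture rather than a 3D direction into a cube map (compare the cube_face sketch earlier in the thread). A tiny illustration, with made-up values:

[code]
# EMBM-style lookup: the bump map supplies 2D (du, dv) offsets that
# perturb the coordinates of a flat reflection texture
def embm_lookup(u, v, du, dv, scale=0.1):
    # the result is still just a 2D coordinate into one texture; there is
    # no 3D direction, hence no way to address a cube map's six faces
    return (u + scale * du, v + scale * dv)

print(embm_lookup(0.5, 0.5, du=0.2, dv=-0.1))   # roughly (0.52, 0.49)
[/code]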

Now back to work.
 
Btw, I just have to say that my system performs a fair bit higher than most others of its class (sometimes 1000 points above others), and I have only overclocked my processor.

Here's my project:
http://service.madonion.com/compare?2k1=3524356

Do a search consisting of:
Athlon 902-968MHz
All graphics chipsets
All operating systems
Default benchmark settings

It beats out a large number of Ti 4400 and Ti 4600 systems with slightly faster processors, without any tweaks of any sort. The main reason is that I have an nForce and run at a 266MHz FSB (the nForce has a DASP that brings my processor almost up to the speed of an Athlon XP).

You don't have to cheat to get a system that performs above many others.
 
worm[MadOnion.com] said:
We don't favor any specific card or manufacturer. The test favors any card that supports PS 1.4.

The test, as you call it, doesn't count toward any scoring, though; it's a demo. In other words, you might as well have added ATI's screensaver demo as your Pixel Shader test :rolleyes:

[attached screenshot: 2560-2423.jpg]


Additionally, you name this test an Advanced Pixel Shader test, YET PS 1.1 will run it; in other words, it's not so advanced, is it?
The screensaver shown above can only be done on 1.4, so if you are not going to include PS 1.4 in the scoring, then at least make a test/demo that can only be done on 1.4. And I refer to it as a demo because that's all your Advanced PS test is: a demo.

No offense, Worm, but can you honestly say MadOnion leans towards no specific vendor, yet link to and support a performance analyzer that would tell me, if I had a Kyro II, to get a GeForce 2 MX? :LOL:

[attached graph: UT_1024_min.gif]


[attached graph: SS_1024.gif]
 
Chalnoth said:
Isn't that what you always say? "The drivers are better now!"
Well, when people like you constantly rate a company based on 6-month-old performance, one has to correct you ALL THE TIME.
 
worm[MadOnion.com] said:

Once again, the app changes the renderer name to 3D Analyze; it's not hard to filter this out, or for anyone looking at the database to see that the test was run with this app and is therefore cheating.
We just checked this, and the renderer name is not submitted with the data when a project is submitted. Also, if the author changes the version number or util name, the "detection" would fail. We need a string that will be there in all future versions, and we are working on it.

AFAIK this app is not intended for cheating, and so future versions will not let you specify any graphics card name you want, as that would only help cheaters and nobody else. If you really know different, then give me some solid info on this. Because if this does happen, then you could simply try to ban the new version and leave the old version working.
All I can say here is that we will try to prevent this util (current and future versions) from working with 3DMark, in order to avoid _any_ problems in the future.


I will only change the version number in the ID string, so you can always parse for "3D-Analyze". I will never include features like "change the name of your card or render device", because the main purpose of this tool isn't (and never will be) cheating. I also told you a way (a DLL CRC check) to prevent the tool from working with future versions of 3DMark, so you can incorporate these changes in future versions of 3DMark.
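
In case it helps, a DLL CRC check of the kind mentioned above could look something like the Python sketch below: checksum the Direct3D DLL that actually got loaded, and refuse to run if it doesn't match a known-good value. Both the path and the expected checksum here are placeholders, not real values:

[code]
import zlib

def file_crc32(path):
    # checksum the file in chunks so a large DLL need not fit in memory
    crc = 0
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

# placeholder path and checksum, for illustration only
DLL_PATH = r"C:\Windows\System32\d3d8.dll"
KNOWN_GOOD_CRC = 0xDEADBEEF

if file_crc32(DLL_PATH) != KNOWN_GOOD_CRC:
    print("d3d8.dll differs from the expected build; a wrapper may be loaded.")
[/code]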


Regards,
Thomas
 