Wait for news from Unwinder (nvworld.ru)

Bambers said:
interesting but not hugely surprising, i always thought the 70-80 fps scores were a little too high.

Still, the 8500 went from 20ish at launch to about 45-50+ now; might be interesting to see if any of that is iffy.

The 8500 drivers at release were terrible under XP. They were lurching around like a drunken sailor on my system until ATI released a fix for 'systems with large amount of memory' in January '02. Performance was awesome across the board after that.
 
http://www.nvnews.net/articles/david_kirk_interview.shtml

David: I’ve heard this mentioned before, and I haven’t really dug into it too much, since it’s not a very realistic real-world example. The lit triangles test in 3Dmark2001 renders a lot of offscreen (invisible) triangles, which results in a lot of what I would call “meaningless computation”. So, I’m not sure exactly what capability is being measured in the test. It’s not visible. Often, benchmarks test Software or Hardware code paths that are not general and are not commonly used. We at NVIDIA don’t make it a practice to optimize our pipeline for specific benchmarks - we want to provide high quality and high performance on a wide variety of useful and entertaining applications, rather than just getting a good score. Ask yourself (or, better yet, ask ATI) why Radeon 8500 performs well on this one test, and poorly on many other 3DMark2001 tests.

http://firingsquad.gamers.com/hardware/kirkint/page3.asp

It's very easy for people to target synthetic benchmarks and optimize their drivers and hardware for them. That does not add anything to real world performance. You need to optimize your drivers to make game engines and API features go faster, not just specific game titles.
 
Think his excuse might be that he was smoking something hallucinogenic?

Pass it along, David...


73 elements and 50 shaders, hey?
/me thinks the driver guys at Nvidia really deserve a raise...
 
how many of the ~50 shaders are for real games? How many shaders does ShaderMark have?? :devilish:
 
Well, first results for R300 (no results for R200 drivers yet):
Unwinder:
Catalyst 3.4

3DMark Score 11676
Game 1 - Car Chase - Low Detail 167.6 fps
Game 1 - Car Chase - High Detail 61.3 fps
Game 2 - Dragothic - Low Detail 206.8 fps
Game 2 - Dragothic - High Detail 119.7 fps
Game 3 - Lobby - Low Detail 157.2 fps
Game 3 - Lobby - High Detail 71.1 fps
Game 4 - Nature 66.0 fps
Fill Rate (Single-Texturing) 961.3 MTexels/s
Fill Rate (Multi-Texturing) 2160.8 MTexels/s
High Polygon Count (1 Light) 54.8 MTriangles/s
High Polygon Count (8 Lights) 13.6 MTriangles/s
Environment Bump Mapping 129.5 fps
DOT3 Bump Mapping 128.0 fps
Vertex Shader 149.6 fps
Pixel Shader 199.6 fps
Advanced Pixel Shader 149.4 fps
Point Sprites 27.0 MSprites/s

Catalyst 3.4 + ATIAntiDetector

3DMark Score 11291
Game 1 - Car Chase - Low Detail 165.4 fps
Game 1 - Car Chase - High Detail 63.2 fps
Game 2 - Dragothic - Low Detail 207.5 fps
Game 2 - Dragothic - High Detail 119.3 fps
Game 3 - Lobby - Low Detail 155.5 fps
Game 3 - Lobby - High Detail 70.8 fps
Game 4 - Nature 47.1 fps
Fill Rate (Single-Texturing) 963.4 MTexels/s
Fill Rate (Multi-Texturing) 2154.2 MTexels/s
High Polygon Count (1 Light) 54.0 MTriangles/s
High Polygon Count (8 Lights) 13.7 MTriangles/s
Environment Bump Mapping 129.2 fps
DOT3 Bump Mapping 128.0 fps
Vertex Shader 149.1 fps
Pixel Shader 199.4 fps
Advanced Pixel Shader 149.6 fps
Point Sprites 27.0 MSprites/s

At least 3 pixel shaders are detected (1 @ PS 1.1 and 2 @ PS 2.0), plus texture detections (that's how GT4 in 3DMark2001 is detected)...
Full texture patterns & shader code are in the driver; although it's easy to see what they do, it's very hard to make a FULL anti-detect patch...
... what ATI does with these 2 PS 2.0 shaders is hardly just "shifting"...
....

any anti-ATi comments? :)
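The detection-and-substitution scheme described above can be sketched roughly as follows. Everything here (the function names, the shader strings, the use of an MD5 fingerprint) is invented for illustration and is not ATI's actual mechanism; the point is only the shape of the trick: fingerprint incoming shader code, swap in a hand-tuned replacement on a match, and note how a trivial perturbation by an anti-detect patch defeats the match.

```python
# Hypothetical sketch of benchmark-shader detection in a driver, and why an
# anti-detect patch (which trivially perturbs the shader text) exposes it.
import hashlib

# Replacement table a driver might ship: fingerprint -> hand-tuned shader.
KNOWN_SHADERS = {}

def fingerprint(shader_source: str) -> str:
    """Hash the shader text; real drivers also match texture patterns."""
    return hashlib.md5(shader_source.encode()).hexdigest()

def register_replacement(original: str, optimized: str) -> None:
    KNOWN_SHADERS[fingerprint(original)] = optimized

def compile_shader(shader_source: str) -> str:
    """Return the hand-tuned replacement if recognized, else compile as-is."""
    return KNOWN_SHADERS.get(fingerprint(shader_source), shader_source)

# Stand-ins for a benchmark's PS 1.1 shader and the driver's PS 1.4 rewrite.
benchmark_ps = "ps_1_1: tex t0; mul r0, t0, v0"
optimized_ps = "ps_1_4: texld r0, t0; mul r0, r0, v0"
register_replacement(benchmark_ps, optimized_ps)

# The driver silently swaps in its fast path for the recognized shader:
assert compile_shader(benchmark_ps) == optimized_ps

# An anti-detect patch changes the shader trivially (e.g. whitespace), so the
# fingerprint no longer matches and the generic code path runs instead:
patched_ps = benchmark_ps + " "  # semantically identical, different hash
assert compile_shader(patched_ps) == patched_ps
```

This is why Unwinder notes a full anti-detect patch is hard: the driver can match on many things at once (texture patterns as well as shader code), and every detection point has to be found and broken separately.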
 
For shame for shame. ;)

I'm not surprised. All the companies seem to do it. There's been a long history of 'cheating' in benchmarks from just about all of the companies, all the way back to the VGA days.
 
chavvdarrr, ATI already admitted to it. They stated it will be gone from the next driver. The same thing that caused the difference in 3DMark03 is causing this. At the end of the day, at least ATI owned up to it.
 
Huh? There's no problem with GT2-3 in 3DMark03.
jvd said:
chavvdarrr, ATI already admitted to it. They stated it will be gone from the next driver. The same thing that caused the difference in 3DMark03 is causing this. At the end of the day, at least ATI owned up to it.
 
Interesting that both ATI & Nvidia got a large improvement in Nature with optimizations (though Nvidia did the better job o_O ) without sacrificing image quality (or did they?). I'm wondering what exactly they did - shader optimizations probably won't cut it (as those are relatively simple PS 1.1 shaders) and GT4 is usually quite bandwidth limited (due to the alpha-tests).
 
Evildeus said:
Huh? There's no problem with GT2-3 in 3DMark03.
jvd said:
chavvdarrr, ATI already admitted to it. They stated it will be gone from the next driver. The same thing that caused the difference in 3DMark03 is causing this. At the end of the day, at least ATI owned up to it.

Actually, one of the tests drops 13% without the optimizations.
 
Damn shame.
Personally, I will wait till I hear what the cheats do to IQ before I decide which is worse, but I don't like either of them.
I also wonder which of them cheated "first"?
I.e., was it a case of "they cheated, we had to cheat to keep up"? (Which I don't buy; you can always just expose them. Of course, they did that this time, and look: ATI and 3DMark are getting all the bad press, so obviously it's not always the best path, eh?)
 
Does anyone know if this anti-detection patch script will ever be released to the public? I am most interested in seeing the results of 3DMark2001 with a GeForce FX card, and not a GF4 Ti4600. And of course, I'd like to see some extensive game testing.
 
StealthHawk said:
Does anyone know if this anti-detection patch script will ever be released to the public? I am most interested in seeing the results of 3DMark2001 with a GeForce FX card, and not a GF4 Ti4600. And of course, I'd like to see some extensive game testing.

Hey SH, I would also like to see the FX perform with Unwinder's anti-detection patch script.
 
Althornin said:
Damn shame.
Personally, I will wait till I hear what the cheats do to IQ before I decide which is worse, but I don't like either of them.
I also wonder which of them cheated "first"?
I.e., was it a case of "they cheated, we had to cheat to keep up"? (Which I don't buy; you can always just expose them. Of course, they did that this time, and look: ATI and 3DMark are getting all the bad press, so obviously it's not always the best path, eh?)

The ATI thing doesn't do anything to IQ, but there is a difference with it on. They found a way to speed up their rendering. I dunno; I will look for where I read it and post back ASAP.
 
There's no drop in GT2/3 in 3DMark03, which use PS 1.4, contrary to GT4, which uses PS 2.0. So there's some optimisation for PS 1.4 in 3DMark01 and not in 3DMark03.
jvd said:
Evildeus said:
Huh? There's no problem with GT2-3 in 3DMark03.
jvd said:
chavvdarrr, ATI already admitted to it. They stated it will be gone from the next driver. The same thing that caused the difference in 3DMark03 is causing this. At the end of the day, at least ATI owned up to it.

Actually, one of the tests drops 13% without the optimizations.
 
Evildeus said:
There's no drop in GT2/3 in 3DMark03, which use PS 1.4, contrary to GT4, which uses PS 2.0. So there's some optimisation for PS 1.4 in 3DMark01 and not in 3DMark03.
PS 1.4 is not used in any of the game tests in 3D Mark 2001. The advanced pixel shader test in 3D Mark 2001 uses PS 1.4, if supported, but it has no bearing on your score.

-FUDie
 
Well, it's not quite true; it's used in the Advanced Pixel Shader test. But you are right, I was confused about it ;). Thanks for correcting me.

Anyway, the "optimisation" on Nature doesn't seem to be the same as the one on GT4 in 3DMark03.
 
Anyone consider that ATi optimized the PS 1.1 shaders into PS 2.0 (or PS 1.4 for the 8500/9100) for the higher FPS w/o loss of IQ? It could simply be that the ATi cards use the highest PS version they support, and I don't see anything wrong with that in 3DM2K1. I hope they do it for my games too! :D

Mountain out of a molehill, IMHO. ;)

.02,
 
Yes, ATI is probably doing more than re-ordering in 3DMark2001; they are possibly even replacing the shader with a superior PS 1.4 version, since Futuremark never included PS 1.4 in scoring during the benchmark's two years of existence... a really big joke.

Is image quality reduced? I would say absolutely not... and I personally have no problem with optimizations that don't reduce quality, as those could also be used in games.
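To illustrate why a shader replacement need not change image quality: a multi-pass PS 1.1 effect and a single-pass PS 1.4 rewrite can compute the exact same per-pixel arithmetic, just with less pass overhead. A toy numeric sketch (the pass structure and all values are invented for the example, not taken from 3DMark's actual shaders):

```python
# Toy sketch: two rendering passes (modulate by lighting, then additively
# blend a glow) versus one fused pass computing the same arithmetic.
# Identical math means identical output, so IQ is preserved while the
# second pass's overhead (geometry resubmission, blending) disappears.

def two_pass(base, light, glow):
    framebuffer = [b * l for b, l in zip(base, light)]   # pass 1: base * light
    return [f + g for f, g in zip(framebuffer, glow)]    # pass 2: + glow (additive blend)

def single_pass(base, light, glow):
    # Fused shader: same operations, same order, one pass.
    return [b * l + g for b, l, g in zip(base, light, glow)]

base, light, glow = [0.5, 0.25], [0.8, 1.0], [0.1, 0.2]
assert two_pass(base, light, glow) == single_pass(base, light, glow)
```

Whether vendors should do this silently inside a benchmark is the real argument in this thread; the arithmetic itself can be lossless.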
 