You may have noticed that I've distinctly stayed away from any topics to do with 3DMark2003. Well, that's about to change.
Let me start by saying that, in large part, I agree with NVIDIA, HardOCP, etc. on their point that actual games need to be stressed in any review. I think that for the longest time, reviewers have used Quake3, 3DMark (whatever version), Serious Sam, and Unreal Tournament 2003, and have for the most part forgotten why a person normally buys a videocard when writing a review. I've felt that way for years, not just last week.
I remember the old days when Tom, for example, used to care about things like image quality, considerations for the gamer, and considerations for the professional, and included screenshots from various videocards running relatively current games of the time. Somewhere around the release of the Voodoo2, Tom mostly did away with the image quality comparisons, and I kind of miss that. Back in 1997, when I first visited his site, he used to include at least several images of current games and point out the differences.
Contrast that to today, where Lars (Tom's Hardware's videocard writer) doesn't usually include a single image in his reviews. It's basically a benchmarkfest. Shrug.
Anyway, I digress. Back to 3DMark. MadOnion/Futuremark first released a "3DMark" in 1998. At the time, their partners at Remedy (many of the people who worked on 3DMark99 also worked on Max Payne) were working on Max Payne using an engine called MAX-FX. I loved the first 3DMark, and since it used a game engine that would eventually appear in an actual shipping game, it was a valid benchmark for game performance. (Note, I'm not saying 3DMark EVER had a direct relationship with Max Payne performance, as a game engine can and does have features that never make it into a game.) I'm saying that there was and is a relationship between the people who make Max Payne and the people who make 3DMark, and you can see some of that influence in both the game and the benchmark.
3DMark2000 continued that trend, adding support for hardware T&L and other features common to DirectX 7.0 cards.
3DMark2001 began the divergence from the game engine. It included a DirectX 8.0 pixel shader benchmark, and as there was only one videocard with pixel shader support at the time (the GeForce3), some might have the opinion that it was biased towards NVIDIA cards. It's rather unfortunate that things didn't go as planned. With the cancellation of Rampage due to 3dfx's bankruptcy, and the cutting down of the Matrox "G800" project into the G550, instead of four DX8 cards from different manufacturers by the end of 2001, there were only two: NVIDIA's GeForce3 (note I am placing the GeForce3, Ti 200, and Ti 500 in the same category) and ATI's Radeon 8500. Had the other manufacturers introduced their cards on time, the picture of performance and support for 3DMark2001 would most likely have been entirely different.
When DirectX 8.1 came out, MadOnion had a choice. They could keep the score an apples-to-apples comparison and add a separate test for PS 1.4, or they could fold a new PS 1.4 test into the scored tests.
MadOnion's stated policy is to use features that two or more manufacturers support. As of today, only ATI supports PS 1.4 in shipping videocards. This, and the fact that they wanted an apples-to-apples comparison for the score, is why 3DMark2001 SE got a separate Advanced Pixel Shader test.
When 3DMark2003 was being designed, they faced a similar challenge, except this time they were basing the benchmark on DirectX 9.0. This meant that any card supporting 2.0 shaders would also support PS 1.4, as well as PS 1.3, 1.2, 1.1, and 1.0. It was therefore appropriate to use PS 1.4.
I'm sure that if they could have collapsed passes with PS 1.3 to improve performance while doing basically the same thing, they would have used it as a fallback as well. Unfortunately, PS 1.3 still has the 12-instruction limit of the earlier pixel shader versions, and you really can't collapse passes the way you can with PS 1.4.
This is why the PS 1.4 tests run two times or more faster on a Radeon 9700 Pro/8500 than on a GeForce4.
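To make the pass-collapsing point concrete, here's a rough back-of-envelope sketch in C. The numbers and the example effect are mine, not Futuremark's: PS 1.1-1.3 allow roughly 4 texture plus 8 arithmetic instructions per pass, while PS 1.4's two phases give roughly 12 texture plus 16 arithmetic instructions in a single pass, so an effect that overflows the older limit has to be re-rendered across extra passes.

#include <stdio.h>

/* Simplified model (my own illustration, not Futuremark's code):
   an effect that needs more instructions than a shader version allows
   per pass must be split across multiple passes, and every extra pass
   means resubmitting and re-rasterizing the same geometry. */
static int passes_needed(int tex_ops, int arith_ops,
                         int tex_per_pass, int arith_per_pass)
{
    int by_tex   = (tex_ops   + tex_per_pass   - 1) / tex_per_pass;   /* ceiling */
    int by_arith = (arith_ops + arith_per_pass - 1) / arith_per_pass; /* ceiling */
    return by_tex > by_arith ? by_tex : by_arith;
}

int main(void)
{
    /* Hypothetical per-pixel lighting effect: 6 texture reads, 14 ALU ops. */
    int tex = 6, alu = 14;

    printf("PS 1.1-1.3 (4 tex + 8 arith per pass):            %d passes\n",
           passes_needed(tex, alu, 4, 8));
    printf("PS 1.4 (two phases, ~12 tex + 16 arith per pass): %d passes\n",
           passes_needed(tex, alu, 12, 16));
    return 0;
}

Halving the pass count roughly halves the geometry and framebuffer work for that effect, which lines up with the two-times-or-more gap between the Radeon parts and the GeForce4 in those tests.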
Let's talk about the specific game tests.
Game 1: For a flight sim, it was rather boring to me, in all honesty. I'm not talking about single versus multitexturing here; I just think games like IL-2 Sturmovik have done it better.
Game 2: This was supposed to simulate an FPS like Halo, Doom 3, et al. Again, I found this test rather boring, and felt Unreal 2 has done a better job of providing a cool look. But then I'm not that enamored of FPS-type games right now, so that's understandable.
Game 3: Meant to represent an RPG. I loved the look of the sword and of the female character's hair. I also liked the look of the trolls and the lighting of their lair.
Game 4: Meant to show Pixel Shader 2.0 effects and serve as a DX9 benchmark. As such, it does a wonderful job with the water and the sky. The fact that parts of the benchmark use PS 1.4 is really irrelevant; if the developers had coded everything in PS 2.0, no card existing today (unless an R400/NV35 or whatever exists in prototype form) would run it at any kind of reasonable speed.
I'm not sure I agree with the weighting of the individual tests, but then again, they did try to give each test equal weight.
Do some people buy a card just to see how many 3DMarks it gets? Yes, but for the most part people buy faster videocards to run the games they play today better.
Is 3DMark2003 relevant? Yes. No other existing synthetic or game benchmark uses Pixel Shader 2.0 at all. When I asked NVIDIA when we might see a DX9 benchmark from them, their response was that they code for OpenGL to show off the full capabilities of their cards. And that's my problem with NVIDIA's argument. They think 3DMark does it wrong? Fine. Show us some demos of the way you think benchmarks should be done! ChameleonMark, TreeMark, and other previous demos are good examples of this, and they do in fact show off some very nice effects.
The fact that at ATI's SF 9700 launch they stated that the 9700 Pro was getting over 2x the performance of NVIDIA's GeForce4 in ChameleonMark (per Dave Nalasco, a very straight-up guy at ATI) says a lot about that benchmark.
Continued....