Well, I look at 3DMark 2003 as having the same functionality as ShaderMark: it's designed to test a specific thing.
Up to there we agree. It's after that point that we part ways.
If we had a game today containing dx9 shaders, with no software fallback modes to render said effects where compliance isn't met, what would your best guesstimate be as to how it would turn out? CPU bound or GPU bound?
I wouldn't call 3dmark 2003 an indicator of future games.
I see it as a guesstimate. If I were hard-pressed to make the same guesstimate myself, I'd probably arrive at the same conclusion as 2k3.
Ironically, issues get constantly recycled in this thread, and Battle of Proxycon has already been mentioned. Let's see: it makes extensive use of stencil ops. If you object that strongly to 2k3's result in that one singled-out test, then just use PowerVR's Fablemark demo instead. If you then see that accelerator X does rather poorly in either or both, it's an indication of how poorly that accelerator will run games that make extensive use of stencil shadows; unless you turn stencil shadows off in, say, Doom3, X will suffer severe performance penalties.
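For those wondering why such tests hammer stencil throughput specifically: every shadow volume gets rasterized twice per light (once for back faces, once for front faces) with color and depth writes disabled, so the stencil buffer takes the whole load. Below is a minimal OpenGL sketch of a depth-fail ("Carmack's reverse") stencil shadow pass; the drawSceneAmbient/drawSceneLit/drawShadowVolumes helpers are hypothetical placeholders of mine, not anything out of Doom3 or 2k3 itself.

[code]
#include <GL/gl.h>

/* Hypothetical application-side helpers, assumed to exist: */
extern void drawSceneAmbient(void);   /* depth + ambient-only pass       */
extern void drawSceneLit(void);       /* per-light diffuse/specular pass */
extern void drawShadowVolumes(void);  /* extruded shadow volume geometry */

void renderWithStencilShadows(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    /* 1. Lay down depth and ambient lighting. */
    drawSceneAmbient();

    /* 2. Rasterize the shadow volumes into the stencil buffer only:
          no color writes, no depth writes, just stencil traffic. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glEnable(GL_CULL_FACE);

    glCullFace(GL_FRONT);                    /* back faces visible...  */
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);  /* ...increment on z-fail */
    drawShadowVolumes();

    glCullFace(GL_BACK);                     /* front faces visible... */
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);  /* ...decrement on z-fail */
    drawShadowVolumes();

    /* 3. Re-light only the pixels left with stencil == 0 (unshadowed). */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glDepthFunc(GL_EQUAL);                   /* same geometry, same depth  */
    drawSceneLit();                          /* additive blend in practice */
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
    glDisable(GL_STENCIL_TEST);
}
[/code]

The point being: the two drawShadowVolumes() passes produce no visible pixels at all, yet every covered fragment costs a stencil read-modify-write. That's exactly where a card with weak stencil throughput falls flat, in Proxycon, Fablemark and Doom3 alike.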
If one knows what conclusions to draw from each individual synthetic test, then it can become a useful tool for pure testing purposes.
I realize there've been notions in this thread that FM doesn't make all of this clear enough to the layman. They essentially do, but there's no way they can repeat the same thing over and over again. Besides, it's in their best interest to keep users conducting silly score-pissing contests, just as much as it's in the IHVs' interest, since they extensively market advanced featuresets which are hardly of any immediate use to the gamer.
But dx9 doesn't totally clear the CPU of any responsibilities, the way 3dmark2003 seems to suggest.
As I said before: all their applications start out GPU limited and then become CPU limited over time. It happens in demanding games too, and it most likely won't stop occurring either. The margin might be small, but developers like Carmack have that luxury.
Scroll up and look again at the GF2U vs. GF3 example I posted, and tell me that things were actually any different.
As for people running older 3D hardware with older 3D software who complain that '03 isn't "indicative" of their performance running older games, with no interest in IQ apart from resolution and color depth, heh... if these folks try running new games with AA/AF and shader function requirements on their GF2s, they'll see quickly how accurate '03 really is...
Agreed on the IQ-improving features, yet I disagree on the supposed accuracy of 2k3 in that department.
Shader performance is completely irrelevant there.
That said, anyone who buys a recent high-end card and runs it at 800*600*16 with no AA/AF deserves to be shot.
By the way, it also depends on what exactly you mean by a GF2; alas, if it isn't a higher-end model, even UT2k3 in dx7 mode is a tough cookie for that one.
PS: FYI, I ran it on my PC once (2-3 days after the release): something like 1200 pts.
Ironically, I have a dx9.0-compliant accelerator here and I haven't run it yet. I don't even have the application, to be honest.
Real-world gaming and 3DMark03 scores are very different. Just by looking at 3DMark03 scores, one could easily be misled into believing that the FX5600 is much better in gaming than the TI4800SE.
Fine, then why not blame NVIDIA for releasing a relatively underwhelming mainstream product?
Isn't there a chance that if you contact NV and ask them what the point is in getting a 5600 over a GF4Ti, part of the answer will be dx9 functionalities?
Can I please have an educated answer to my earlier question concerning the GF2U and GF3 in early 2001?
In contrast, would you say the same if the comparison were between a Radeon8500 and a Radeon9500PRO? (Same ballpark: former mid-to-high end compared to a recent mid-range part.)
The only other thing behind those wild theories concerning FM's benchmarks is IHV-specific interest. When the benchmark favours IHV A, it's a perfectly legitimate and accurate application both for said IHV and its followers, while it's completely irrelevant and unrepresentative of IHV B. Should the tables turn down the line, it's just the same story in reverse.
In the meantime, if you take a CPU-bound game like RtCW, say, and run dx8.1 cards against dx9.0 cards at 1024*768*32, it shouldn't come as a surprise that both will show pretty close if not identical performance. Pick a higher resolution, add AA/AF, and the tables will turn quite a lot.
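To make the CPU-bound vs. GPU-bound distinction concrete: the usual sanity check is to vary the pixel load and watch the frame time. A GPU-bound app speeds up almost linearly as you drop resolution; a CPU-bound one barely moves. A minimal sketch of that check, where avgFrameMs() is a made-up stand-in for an actual timedemo run (the 20 ms CPU floor and the per-megapixel cost are invented numbers, purely for illustration):

[code]
#include <stdio.h>

/* Stand-in for a real timedemo run: models frame time as a fixed CPU
   cost plus a GPU cost proportional to pixels drawn. Both numbers are
   made up for illustration only. */
static double avgFrameMs(int w, int h)
{
    const double cpuMs    = 20.0;  /* fixed per-frame CPU work    */
    const double msPerMPx = 2.0;   /* GPU cost per million pixels */
    return cpuMs + msPerMPx * ((double)w * h) / 1e6;
}

int main(void)
{
    double hi = avgFrameMs(1600, 1200);  /* 4x the pixels of 800*600 */
    double lo = avgFrameMs(800, 600);

    /* If quartering the pixel count barely changes the frame time,
       the CPU (or driver overhead) is the wall, not the GPU. */
    if (lo > 0.85 * hi)
        printf("CPU bound: %.1f ms vs %.1f ms\n", lo, hi);
    else
        printf("GPU bound: %.1f ms vs %.1f ms\n", lo, hi);
    return 0;
}
[/code]

With those invented numbers the 800*600 run still lands within roughly 12% of the 1600*1200 one, i.e. CPU bound; crank up the per-pixel cost (higher resolution plus AA/AF) and the GPU becomes the wall again, which is exactly the RtCW scenario above.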
Needless to say, if one doesn't care about the strengths of today's high-end cards, it's pretty senseless to waste money on an upgrade in the first place.