3DMark 2001 doesn't reflect gaming performance either... a GeForce 2 MX scores higher than Kyro cards, yet in MAX PAYNE it gets killed by the Kyro
I hope you aren't going to compare completely different scenes (even though it's the same engine), right? How can you compare, for example, the Dragon test (Game 2), the Car test (Game 1) or Nature (Game 4) with Max Payne?
How could they perform the same when the content is completely different?
I can only speculate that in 3DMark a GeForce 2 MX is faster than a Kyro (is it?) because there are far more polygons involved, while in Max Payne the polygon count is ridiculously low.
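Just to make that back-of-envelope reasoning explicit, here's a minimal sketch. Every throughput and scene number in it is made up purely for illustration (none of them are real specs for either card); the only point is that the bottleneck flips with the scene's polygon count.

# Rough frame-time model: the frame costs whatever the slower stage costs,
# triangle setup or pixel fill. All numbers below are hypothetical.

def frame_time_ms(tris, pixels, overdraw, tri_rate, fill_rate, deferred):
    geometry_ms = tris / tri_rate * 1000.0
    # A tile-based deferred renderer (Kyro-style) only shades visible pixels,
    # so overdraw costs it (almost) nothing.
    shaded = pixels if deferred else pixels * overdraw
    fill_ms = shaded / fill_rate * 1000.0
    return max(geometry_ms, fill_ms)

# Hypothetical card A: higher triangle throughput, brute-force fill.
# Hypothetical card B: lower throughput, but deferred (sheds overdraw).
A = dict(tri_rate=20e6, fill_rate=700e6, deferred=False)
B = dict(tri_rate=12e6, fill_rate=500e6, deferred=True)

high_poly = dict(tris=150_000, pixels=1024 * 768, overdraw=2.5)  # 3DMark-like
low_poly = dict(tris=10_000, pixels=1024 * 768, overdraw=3.5)    # Max Payne-like

for name, scene in [("high-poly scene", high_poly), ("low-poly scene", low_poly)]:
    fps_a = 1000.0 / frame_time_ms(**scene, **A)
    fps_b = 1000.0 / frame_time_ms(**scene, **B)
    print(f"{name}: card A {fps_a:.0f} fps vs card B {fps_b:.0f} fps")

With those made-up numbers card A wins the high-poly scene (it's geometry-bound) and card B wins the low-poly one (it's fill-bound and overdraw is free for it), which is exactly the kind of ranking flip we're arguing about.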
...this benchmark was showing a P4 Willamette beating an AMD Thunderbird... OK, but in the real world, in Unreal Tournament, a CPU-intensive game, it was getting killed
You can be sure that when a game needs bandwidth the P4 will be faster, and when a game needs raw FPU power the Athlon will be faster.
So tell me exactly what is accurate about this benchmark. IMO it's whoever pays the most cash, as it certainly doesn't reflect games today.
Yes, it certainly doesn't reflect games today, there's no doubt about that: today's games have a ridiculously low polygon count compared to 3DMark. Still, I'm sure that if you try the "game demo", the playable car test, it will likely reflect the performance seen during the benchmark.
In the Advanced Pixel Shader test, even doing a single pass versus the double pass a GeForce3/4 has to do, the R8500 is slower; this is exactly what JC verified and said in his interviews.
There's another point made by fanATics: that 3DMark 2001 should have included a test that wouldn't run at all on a non-PS 1.4 card (then I guess the whole of 3DMark should have used pixel shaders in every test and made the benchmark incompatible with every Kyro, GeForce 2 and the rest of those cards). I honestly don't know if that would be better; my view is that, if it weren't for the R8500's own shortcomings, the Advanced Pixel Shader test would be an ADVANTAGE for ATi, since it would show roughly a 2x performance improvement over a PS 1.3-or-lower card because of the double pass those need (in fact an R9700 completely destroys the GeForce 4 in the adv. pixel shader test).
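The ~2x claim is just pass-count arithmetic; here's a minimal sketch, assuming each pass costs about the same on otherwise equal hardware (an obvious simplification, and the per-pass cost below is invented):

# Same effect: 1 pass on a PS 1.4 path, 2 passes on a PS 1.1-1.3 path.
# If the cards were otherwise identical and purely shader/fill-bound,
# the single-pass card should come out ~2x ahead. Hypothetical cost only.

cost_per_pass_ms = 4.0

ps14_frame_ms = 1 * cost_per_pass_ms  # single pass
ps13_frame_ms = 2 * cost_per_pass_ms  # double pass

print(f"PS 1.4 path: {1000 / ps14_frame_ms:.0f} fps")
print(f"PS 1.1-1.3 path: {1000 / ps13_frame_ms:.0f} fps")
print(f"expected advantage: ~{ps13_frame_ms / ps14_frame_ms:.1f}x")

And the flip side: if the R8500 still loses while doing half the passes, then its per-pass throughput in that test must be less than half of the GeForce's, which is basically what the JC comment above boils down to.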
NVIDIA's own Chameleon demo and BenMark show a bigger performance difference in the R9700's favour than 3DMark does, so what's the point? Maybe NV deliberately UN-optimized their own tests, waiting for ATi to release the new card, so it would show something like a 3x difference even in BenMark (38.x MPolys vs around 111M)? I guess there's a conspiracy going on among nVidia's own workers; I think Mulder should investigate.
Don't get me wrong, those things can and DO happen; I followed the whole SysMark crap (the new SysMark having the "Athlon-friendly" tests removed) and that accusation sounds reasonable, but it's a different story compared to 3DMark.