Comment on graph by Neeyik:
Of course newer games use more fillrate than older games.
Too bad you didn't show that graph on a PII 350 with a Radeon 9800/GeForce FX 5900. Then they'd all be CPU bound, even Splinter Cell.
The problem isn't that they made 3DMark more GPU bound than before; that's good.
The problem is that they made it less CPU bound than before. In fact, it's less CPU bound in most of its benchmarks than any game I've seen in years.
Future games will no doubt use more fillrate per frame, but they'll use more CPU as well, not less.
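To make "CPU bound" vs. "GPU bound" concrete, here's the rough mental model I'm working from, as a quick Python sketch. This is my own illustration, not anything from 3DMark itself, and the millisecond figures are invented: each frame costs some CPU time and some GPU time, and whichever side takes longer sets the frame rate.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    # CPU and GPU work on a frame largely in parallel, so the frame
    # time is set by whichever side takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

# A GPU-heavy test today: the GPU is the wall, so a faster CPU barely registers.
print(fps(cpu_ms=2.0, gpu_ms=40.0))   # 25.0 fps, GPU bound
# The same CPU work against a much faster GPU: now the CPU sets the pace.
print(fps(cpu_ms=2.0, gpu_ms=1.0))    # 500.0 fps, CPU bound
```

Pair a Radeon 9800 with a PII 350 and cpu_ms balloons while gpu_ms shrinks, which is exactly why everything on that graph would flip to CPU bound.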
At this point I think I'm just summarizing what a lot of others have said.
3DMark 03 tests the video card, to the exclusion of almost all else.
At any point in time over the last 5 years I could have made a very similar graph, with newer games being more fillrate dependent. Only a few new games have come out CPU dependent over the years; most games are a combination of both when they are released.
What was the difference between Quake 2 and Quake 3? Well, more fillrate used, and more CPU used. It was actually an oddity, because neither requirement was double what Q2 seemed to need, and Q3 was more CPU bound than Q2 at release.
UT to UT2003? More of both: lots more fillrate, and a lot more CPU required.
Over time, GPU speed has gone up much faster than CPU speed, and will probably continue to do so. That makes all older games become CPU bound over time.
Certainly, 3DMark 03 will flip from GPU bound to CPU bound eventually, like every GPU-bound game before it.
Unfortunately, that will take almost forever with this version of 3DMark.
If CPU speeds double and GPU speeds quadruple every 2 years, it will take nearly 5 to 6 years, since most of 3DMark 03 doesn't seem able to strongly distinguish a 3 GHz CPU from a 350 MHz one.
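For what it's worth, here's that back-of-the-envelope arithmetic as a Python sketch. The growth rates and the 3 GHz / 350 MHz headroom are the assumptions stated above, not measurements: if even a 350 MHz CPU doesn't bottleneck the benchmark today, the GPU:CPU speed ratio has to grow by that ~8.5x factor before the CPU side takes over.

```python
import math

# Assumed growth rates: CPUs double and GPUs quadruple every 2 years,
# so the GPU:CPU speed ratio doubles per 2-year period.
cpu_headroom = 3000 / 350    # ~8.6x: even a 350 MHz CPU isn't the bottleneck today
ratio_doubling = 2           # (4x GPU gain) / (2x CPU gain) per period
period_years = 2

# CPU bound once the GPU:CPU ratio has grown by the current headroom factor.
periods = math.log(cpu_headroom, ratio_doubling)
print(f"~{periods * period_years:.1f} years")  # -> ~6.2 years
```

Which lands right around that 5-to-6-year mark.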
By then all of today's games, next year's games, and the games of '05 will most likely be CPU bound.
3DMark is still a great tool. But it is like ALL benchmarks:
A benchmark is useful ONLY when you know what it is measuring, and apply it as a tool to measure that.
If you are using it to measure game system performance, then it is the wrong tool for the job.
If you are using it to compare video card performance, it's a good tool (provided there's no cheating, but that's another discussion).
No benchmark, no matter how flawed it is in what it truly measures compared to what it was intended to measure, is useless*. It just becomes useful for something different from what was intended.
*OK, so it is possible that one is so broken it does nothing, returns the same number for all tests, returns results randomly, etc.