Interesting (3DMark2003) article at Ace's Hardware

Slides said:
So you want Ace's to ignore what FM says on their main page and take into account some hidden whitepaper?

the whitepaper was the first thing released by FM wrt 3dmark03...

I fail to see your point... I am contending Ace's should do more research before painting with a broad, sweeping stroke and erroneously branding the product as something it is not... it is their article and therefore they are accountable...

FM has its PR stuff on its website... I am not saying this is right or wrong... I am saying Ace's should perhaps have posted something wrt the whitepaper and what FM contends...
 
Maybe you should contact Ace's about why they don't include the whitepaper info in their article, since I don't want to speculate on their behalf. But what is clear is that FM likes to present their benchmark as an indication of gaming performance by labeling it "The Gamers Benchmark" and trying to appeal to gamers' visceral senses.

With the exception of the PDFs (which 90% of users will never read), 3DMark is not presented as a purely synthetic benchmark. What Ace is trying to do is make sure more people understand what 3DMark really is and do not misrepresent 3DMark scores as something they are not.

The bottom line is that the majority of people who use 3DMark are not 3D tech geeks, but gamers.
 
Slides said:
What Ace is trying to do is make sure more people understand what 3DMark really is and do not misrepresent 3DMark scores as something they are not.

Sure. But it would have been fairer for AH to note that making 3DMark03 more GPU-dependent was an intentional design decision that was well documented and publicly discussed. After noting that, the merits and disadvantages of such a decision could have been discussed, but the article falls short of that.

They perhaps should also have mentioned that Futuremark recommends 3DMark2001 for evaluating performance on existing DX8-class titles, which was what they were doing with their game benches.

Looking at these points, the case of FM misrepresenting their products becomes a little thinner.

The bottom line is that the majority of people who use 3DMark are not 3D tech geeks, but gamers.

Yes, which is why a more in-depth look would have been warranted.
 
The bottom line is that the majority of people who use 3DMark are not 3D tech geeks, but gamers.

I'd say it's even debatable that the vast majority of Futuremark users are in fact gamers.

You may excuse the interruption; carry on.
 
I'm not, and if I am not mistaken, neither is the Ace's article debating what purpose 3DMark03 is designed for. It's largely meant to be a synthetic GPU benchmark. The problem here is the rather upfront marketing claims by FM, which may not be 100% accurate, describing 3DMark as representative of overall system gaming performance. Semantics, you might say, but this can mislead people into thinking that video cards play a greater role in gaming performance than is actually the case in most games played at "normal" resolutions.
 
To quote what I said in the Ars thread on this article:

I'm not surprised by the results, particularly as 3DM01SE has gone from GPU-limited to CPU-limited (someone at Anand's hit 13K 3DM01SE with an OC'ed GF3 and a--get this--2.8GHz Athlon XP ). I don't think anyone but Ace's is going to buy a $300 video card to stick in a five-year-old PC. Heck, I'm not sure PCs that old will have the power supply to accommodate a 9700. Futuremark's website statement needs to be qualified if they think such an extreme case as this will occur often enough to warrant preventing support calls--but I doubt they will.

3DMark is aimed ahead of the curve, so it shouldn't be surprising that it's GPU-bound, especially since until a few months ago there were only a handful of cards that could even complete the entire benchmark suite. The 6fps I get in GT2 & GT3 is a pretty clear indication that my 9100, not my XP1700+, is holding me back.
 
This silliness has been mostly picked over, but one additional point:

Games have various options that can be changed in order to make the workload more GPU or CPU bound. (In general the base settings are the most CPU bound, and additional options put more stress on the GPU without changing that on the CPU. There are exceptions, though.) However, this tradeoff only specifies a limited range, which in the case of a typical performance intensive game might go from mediocre performance at lowest settings with a CPU and GPU that were mainstream 2-3 years before release, to any-improvement-is-not-noticeable (i.e. ~100fps) at highest settings with a CPU and GPU that are top-of-the-line about a year after release.
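The settings tradeoff described above can be sketched with a toy model (my own simplifying assumption, not anything from the thread): if CPU and GPU work overlap, the frame rate is set by whichever side takes longer per frame, and quality settings like resolution mostly inflate only the GPU side.

```python
# Toy frame-time model: the slower of the CPU and GPU sides sets the
# frame rate (assumes their work fully overlaps, which real engines
# only approximate).

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second given per-frame CPU and GPU costs in ms."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Base settings: CPU-bound, so a faster GPU would not help.
base = fps(cpu_ms=10.0, gpu_ms=6.0)    # 100 fps, limited by the CPU

# Higher resolution/AA scales only the GPU cost; the CPU cost stays
# put, and the workload flips to GPU-bound.
high = fps(cpu_ms=10.0, gpu_ms=25.0)   # 40 fps, limited by the GPU
```

This is why the same game can look CPU-bound in one review and GPU-bound in another: the settings decide which term of the `max()` dominates.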

The important point here is that GPUs increase in performance much faster than CPUs do. In recent years it might be fair to say that CPU performance has roughly doubled every 18-24 months, while GPU performance (in the discrete market, at least) has roughly doubled every 9-12 months. That is, GPU performance increases as the square of CPU performance.
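Under the doubling periods assumed above (say 18 months for CPUs and 9 for GPUs, the low end of each range), the "square" relationship is just exponent arithmetic, since the GPU fits twice as many doublings into any span of time:

```python
# Growth factors under assumed doubling periods (18 months for CPUs,
# 9 months for GPUs, per the rough figures in the post above).

def growth(doubling_months: float, months: float) -> float:
    """Performance multiplier after `months`, doubling every `doubling_months`."""
    return 2.0 ** (months / doubling_months)

cpu = growth(18, 36)   # 2^2 = 4x over three years
gpu = growth(9, 36)    # 2^4 = 16x over the same three years

# Twice the doublings means the GPU factor is the CPU factor squared,
# whatever the time span.
assert gpu == cpu ** 2
```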

Let that sink in a bit.

Now let's move on.

If you think about it, this means that later games inevitably stress the GPU much much much more than earlier games. (Just to be clear, this only applies to the sort of graphics intensive games we benchmark--the ones that assume a discrete GPU in the target machine.) It has nothing to do with graphics "inevitably" or "inherently" being more complex than physics, AI, etc. Believe me, developers are just as capable of thinking up stuff to suck up extra CPU cycles as they are stuff to suck up extra GPU power. (Which is to say, quite capable.)

It's just that they need to target their game at the machines that will actually exist to run it.
 
Still, Dave and Dave,

3DMark2003 is skewed. That a PII 350 MHz can keep up with the high-end Athlons and P4s is just plain ridiculous and shows how overdone the GPU-stressing part is. 3DMark2003 has no relation at all to games, and won't have in the foreseeable future.

If they marketed it as a GPU-only benchmark I wouldn't have any problems, but they market it as a gamers' benchmark. And it simply doesn't reflect any of today's games. And as I have said, I don't foresee games becoming fully GPU limited (OK, on older cards), because all other parts of games do also become more complex. A game is NOT only about graphics.
 
sonix666 said:
If they marketed it as a GPU-only benchmark I wouldn't have any problems, but they market it as a gamers' benchmark.

As has been pointed out, they specifically state in the whitepaper that the game tests are GPU benchmarks much more than GPU/platform benchmarks. There's a CPU test included if you want to stress the platform. Anyways, if they wait until DX10 to come out with a new 3dMark, 3dMark03'll be plenty CPU-bound before it's through.

I realize that you hadn't read the whitepaper before, but any tech web site certainly ought to have, and the OEMs who use 3dMark in GPU buying decisions certainly have and understand its use.

If Ace's wanted to point out this apparently little-known fact to the millions of gamers who are apparently getting duped into downloading a free benchmark, they could have saved the trouble of all the benchmarks and just posted an article about the whitepaper.

And it simply doesn't reflect any of today's games.

Good, it's not supposed to. Or did you think today's games used unified per-pixel lighting with shadow buffers? Or heavy PS/VS 2.0??

3dMark01 reflects today's games (or perhaps those of a few months back).

And as I have said, I don't foresee games becoming fully GPU limited (OK, on older cards)

The 9700 Pro is an older card from the perspective of the games 3dMark03 is trying to model.

because all other parts of games do also become more complex. A game is NOT only about graphics.

I never said it was. The CPU-stressing part of a game's workload will become more complex to match the improvement in CPU performance around the time the game is released. The GPU-stressing part of a game's workload will become more complex to match the improvement in GPU performance around the time the game is released. That's a given. If that doesn't happen, the game was badly targeted or badly programmed.

That fact alone means that future games are always GPU limited by today's standards, and past games always CPU limited.
 
And as I have said, I don't foresee games becoming fully GPU limited (OK, on older cards), because all other parts of games do also become more complex.

Q3a as an example was abysmally fillrate limited upon release. Is it the same today?

Theories like the above, that 3dmark2003 is completely unrealistic, will break down as soon as it becomes CPU limited in the foreseeable future, just like all of FM's former applications did.

IMO FM's stuff has always had the same value across all their applications. Sadly enough, many use it for juvenile pissing contests with no real meaning, but that's a chapter of its own. The whole thing actually gets more attention than it's worth.

There are also popular games out there that users use as a measuring stick, despite the fact that they don't play them anymore or that they might be more than a couple of years old.

Finally, if IHVs played fair there wouldn't even be much material for debates over whether application X, Y or Z reflects reality or not. I personally prefer an application that makes my VGA sweat to one pulling out a senseless 300+ fps score just for the heck of it, but that's just me.
 
Errr... what do we want: a GPU benchmark, or a benchmark that shows how well one game with a specific CPU load works? Should we get Futuremark to add sound to the benchmark again?

Game 1 is an old-style game, and like other older-style games on the latest hardware, it shows its CPU limit. Games 2 and 3 do not match any of today's games, but they do largely match the render model of future titles such as DoomIII (similar principle), and Game 4 has always been the most extreme test case. So what's wrong with these tests? IMHO the only thing we can complain about is that there are no true DX9 VS/PS2.0 games out there which could prove that 3DMark2003 is actually modeling that type of game correctly.

And for a mainly 3D graphics card benchmark, I am very happy to see it is so little dependent on the CPU... while still doing a physics model.

Oh, and can someone please explain to me why AcesHardware had to complicate things by having not only variable CPUs but also variable GPUs in the overview... if they wanted to prove CPU independence they should have stuck to one graphics board and not polluted things by comparing the 9700 and 9600 on the 2 key systems in the analysis :rolleyes:

Just my 2c...

K-
 
I really thought that part of the point of 3DM03 WAS to remove the platform and pretty much just test the GPU.......

The point was taken - in many, many places - that 3DM03 is NOT a game, so don't try to make it one! ;) It is a synthetic benchmark that's testing GPUs, not systems, period!

Is anyone buying a $400+ video card to play today's games well with absolutely no thought of what the future :rolleyes: might have in store for it? Just what does a current-game benchmark tell you? Well, that the card will run current games! DOH! It tells you not much about how that card will work in the future! :LOL:

Hope everyone enjoys this episode of Fun with Fonts! ;)
 
Well, I notice that both NVNews and The Inquirer are currently reporting on the Ace's article. The Inquirer seems surprised that it indicates that 3DM2003 is very GPU-bound.

The commentary on NVNews, on the other hand, seems to think that it proves 3DM is not a viable benchmark after all:

Pretty Damning!

Personally, I feel the Ace's article is misleading. I thought 3DM was supposed to represent how future games will play (I didn't know about Futuremark's PR stuff saying 'current games'). The only worry is that all the NVIDIA-centric sites are going to take this as proof that NV were right all along with their comments/'optimisations'.
 
I must agree with Kristof on his views. 3dmark03 is built to test graphics hardware for the future. The old DX7 test (game 1) is totally CPU bound, much like the tests of earlier versions of 3dmark now that the hardware has finally caught up. It doesn't take a genius to guess that a faster CPU would increase results across the board, sometimes greatly, in instances where the GPU is not stressed.

Given a year or two, 3dmark03 scores will end up somewhere around where current 3dmark01 scores are now, when the tech catches up. The need is for DX9 game development and benchmarks, as well as more vendors with capable hardware... c'mon PowerVR, S3, SiS.
 
Personally, I feel the Ace's article is misleading.

I doubt though (and I think it deserves a special note) that it was done on purpose. Aceshardware specializes in CPU reviews more than anything else.

Of course a bit more research would have helped, but it's not the first time that a review or a reviewer gets led to the wrong conclusion in the end, experienced with the subject or not.

Oh, and can someone please explain to me why AcesHardware had to complicate things by having not only variable CPUs but also variable GPUs in the overview... if they wanted to prove CPU independence they should have stuck to one graphics board and not polluted things by comparing the 9700 and 9600 on the 2 key systems in the analysis.

Excellent call. I found that highly annoying while reading it too. Alternatively, he could have run all CPUs with both accelerators in all tests.
 
Personally, I feel the Ace's article is misleading. I thought 3DM was supposed to represent how future games will play (I didn't know about Futuremark's PR stuff saying 'current games'). The only worry is that all the NVIDIA-centric sites are going to take this as proof that NV were right all along with their comments/'optimisations'.

I think it's a perfectly valid article and not misleading at all.

From FM's homepage:

The high quality game tests, image quality tests, sound tests and others give you an extremely accurate overview of your system’s current gaming performance. Benchmarking has never been so good!

And as has been said already, how many 3DMark users actually read the whitepaper? My guess is not that many, and since that's probably the case, I think they should consider changing that text a bit.
 
DaveBaumann said:
However, as far as I'm concerned as a reviewer for a 3D-specific site, if an application is a 3D graphics benchmark and it's actually stressing 3D graphics, then that's perfect! There are, after all, plenty of games out there to represent game-playing performance! :)

Thanks for pointing me here, Dave (from the other area).

I have to also respectfully disagree, and agree. First, I agree that it is good to stress the GPU (especially wrt a 3D review, I mean, duh), but I disagree that future games will not be CPU intensive. There may be a wave of mindless games, but I certainly hope that the AI and other parts of the game (like the physics you pointed out) advance to the point that the CPU is working as hard as the GPU; after all, that is the point, isn't it? To be fun and not just pretty.

People do think of it as a gaming benchmark, and not just a 3d benchmark even if they should not.
 
Sxotty said:
I have to also respectfully disagree, and agree. First, I agree that it is good to stress the GPU (especially wrt a 3D review, I mean, duh), but I disagree that future games will not be CPU intensive. There may be a wave of mindless games, but I certainly hope that the AI and other parts of the game (like the physics you pointed out) advance to the point that the CPU is working as hard as the GPU; after all, that is the point, isn't it? To be fun and not just pretty.

Yes. And the rendering techniques used in 3DMark highlight that you can create complex gaming scenes and character animations with very little CPU overhead - now games can use the excess overhead for those things. The point being that, at present, they aren't, because they are still too bogged down dealing with rendering work that they don't need to do with modern graphics boards.
 
People that still have 3DMark03 installed: run all four Game Tests with Software T&L, then compare to your Pure HW T&L results :LOL:

Specifically Vertex Shader Speed.

I know that on my 'average' system... XP 2100 @ 2200, nForce 2... my R300 was 2.5 times faster. So if game developers WOULD use the card more, then CPU cycles could be used for something else.
Dell is a major reason for CPU-limited titles; they ship garbage video cards with 3GHz processors and the developers have no choice.
 
The high quality game tests, image quality tests, sound tests and others give you an extremely accurate overview of your system’s current gaming performance. Benchmarking has never been so good!

CPU tests are included in that statement. Sure the final score is based on game tests, but maybe benchmarkers should try to interpret the results more carefully.
 