Interesting (3DMark2003) article at Ace's Hardware

Ichneumon said:
It does not seem to me that it should be so difficult to comprehend the point that neeyik was trying to make that "Benchmark for Gamers" does not equal "Game Benchmark".

Let's not act as if you don't know how FM markets 3DMark03.

FM's website said:
The high quality game tests, image quality tests, sound tests and others give you an extremely accurate overview of your system’s current gaming performance.

Neeyik said:
Correct - so why then does it get used so frequently as a benchmark? Why can Apple use a non-benchmarking program to claim higher performance than a PC for their products? Misuse of an application, in order to review or test a product, is probably the biggest and most consistent flaw in computing journalism.

I don't know; is Apple correct in its claims? If it is, then it's not a misuse of Photoshop. And Adobe does not claim that Photoshop can be used as an accurate benchmark of digital graphics performance. FM does make such claims.

As for reviewers putting too much emphasis on 3DMark, they share the blame along with every other company that is promoting 3DMark as the only benchmark standard.
 
slides, the fact is you have not the slightest clue what games will be like in the future. FM, on the other hand, studied the issue and came to the conclusion that games will become much more GPU bound. And they have evidence to support it (i.e. those graphs neeyik posted). Sure, FM could be totally, completely wrong, but they might not be; wait two years before getting pissed. Also, graphics cards advance much faster than CPUs, so over time it will get more and more CPU bound.
 
Slides said:
Neeyik said:
Have a look at this little graph:

[image: boundingtests.png]

I'm not sure what that graph is supposed to represent.

It's a graph of realized fillrate (i.e. fps * resolution) vs. resolution (that'd be the unlabeled x-axis; presumably it goes from 640*480 on the left to 1600*1200 on the right). It can tell you a great deal about the performance characteristics of a graphics workload. (That's why this is the sort of graph used in Wavey's reviews.)

A straight diagonal line going up at 45 deg. represents the situation where fps remains constant even as the resolution is increased. This means the game is not at all bound by the pixel rendering half of the GPU pipeline, even at high resolutions. In other words, the bottleneck is in those parts of the rendering process that don't see their workload increase as the resolution goes up: the game is either platform (CPU) bound or geometry bound. In the case of Q3A and even UT03, we know the game is definitely not geometry bound on a 5600 Ultra. Thus the graph shows that UT03 is CPU bound at low resolutions, and Q3A is CPU bound even all the way up to 1600*1200, particularly at low quality.

A straight flat line represents the situation where the realized fillrate remains exactly the same even as the resolution goes up. If the game were at all bottlenecked by those parts of the rendering workload that are constant per frame and not dependent on resolution--CPU and geometry workload--then as fps went down, realized fillrate would go up. As this is not the case for Splinter Cell and UT03 at high resolution, the graph shows that they are GPU limited (in particular, fillrate or possibly bandwidth limited) at these settings.

(Note BTW that one would expect a slight rise in realized fillrate at high resolutions even in a completely fillrate-limited situation, because at high resolutions more textures will be magnified than minified, which means less of a bandwidth hit and much less of a fillrate hit per pixel. Trust me on this. The fact that the Splinter Cell graph is at a slightly positive angle doesn't mean that it's not completely fillrate limited even at low resolution.)

UT03, and Q3A HQ to a lesser degree, show the "expected" behavior for a "well balanced" game (but of course a game can only be well balanced with respect to a particular target machine), namely, CPU/geometry bound at low resolution (steep line) transitioning (curve) to fillrate bound (near-horizontal line) at high resolution.

Q3A LQ is completely CPU bound.

Splinter Cell is completely fillrate (GPU) bound.
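Dave's line-shape rules above can be sketched as a quick script. All fps numbers below are invented purely to mimic the three shapes in the graph, and the 0.9/1.2 thresholds are arbitrary illustrations, not anything from the article:

```python
# Classify a graphics workload from fps measured at several resolutions,
# following the "realized fillrate" (fps * pixels) reasoning above.
# All fps numbers and thresholds here are made up for illustration.

RESOLUTIONS = [(640, 480), (800, 600), (1024, 768), (1280, 960), (1600, 1200)]

def realized_fillrate(fps, width, height):
    """Pixels per second actually delivered: fps * resolution."""
    return fps * width * height

def classify(fps_by_res):
    """CPU/geometry bound if fps stays flat as resolution rises (fillrate
    climbs at ~45 deg); fillrate bound if realized fillrate stays flat
    (fps drops in step with pixel count); otherwise mixed."""
    fill_lo = realized_fillrate(fps_by_res[0], *RESOLUTIONS[0])
    fill_hi = realized_fillrate(fps_by_res[-1], *RESOLUTIONS[-1])
    if fps_by_res[-1] > 0.9 * fps_by_res[0]:   # fps barely moved
        return "CPU/geometry bound"
    if fill_hi < 1.2 * fill_lo:                # fillrate barely moved
        return "fillrate (GPU) bound"
    return "mixed: CPU bound at low res, GPU bound at high res"

# Made-up numbers shaped like the graph's three cases:
q3a_lq   = [210, 208, 205, 202, 200]  # flat fps -> diagonal fillrate line
splinter = [60, 39, 24, 15, 10]       # fps ~ 1/pixels -> flat fillrate line
ut03     = [120, 118, 100, 70, 48]    # steep line curving toward flat

for name, fps in [("Q3A LQ", q3a_lq), ("Splinter Cell", splinter), ("UT03", ut03)]:
    print(name, "->", classify(fps))
```

The same classification falls out that the graph shows: Q3A LQ reads as CPU bound, Splinter Cell as fillrate bound, and UT03 as the mixed "well balanced" case.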
 
Freak'n Big Panda said:
slides, the fact is you have not the slightest clue what games will be like in the future. FM, on the other hand, studied the issue and came to the conclusion that games will become much more GPU bound. And they have evidence to support it (i.e. those graphs neeyik posted). Sure, FM could be totally, completely wrong, but they might not be; wait two years before getting pissed. Also, graphics cards advance much faster than CPUs, so over time it will get more and more CPU bound.

In 2 years we will likely have a new version of 3DMark. :?:

Thanks to Dave for the explanation.
 
Well, we might, but if we do it will aim for 2007 games, not the games of 2005. The next 3DMark will be released near the release of DX10, and I don't know if that will be coming out in '05; maybe, but I would guess early-to-mid '06. Splinter Cell is totally GPU bound, as you can see, and I would assume Doom 3 will be as well from some alpha testing I've done :) I got absolutely no performance increase (not one fps, seriously) going from a 1600+ to a 2400+.

EDIT: that's using a GF3 classic; newer GPUs (i.e. Radeon 9700+) would probably gain something, but guru3d says otherwise. In the 5800 Ultra review they published a while back, they too gained not one fps going from a 1800+ to a 2.4 GHz P4, even at 640x480.
 
Slides said:
AzBat said:
I'm not getting excited over it. I just ignored it.

Good for you, let's hope all 3DMark03 users also learn to ignore it.

It amazes me how you turned what AzBat said to your advantage. It shows me, though, that you seem ignorant of what he really meant.

AzBat meant he ignored Ace's Article.. or at least that's how I picked it up.

I guess you also didn't get that 3DMark2001 did the same thing when it was released. Most GPUs couldn't handle it, and then Intel and AMD released faster CPUs, which increased the score. Nvidia and ATI did the same, and so you ended up with scores getting ridiculously high. So FM decided to create a new benchmark where they could implement future technology to try to get a score that was GPU bound.

Now, Ace's did the article that made FM look inaccurate, but Ace's never took into account the fact that the games they used were not GPU bound either. If they had used Splinter Cell and all the systems had the same graphics card, i.e. a 9700 Pro, then maybe the results could have been given more credit.

I just noticed something from the Ace's article, right at the bottom.

Understanding is the key to benchmarking, and hopefully this small article has provided our readers with a greater insight into 3DMark03. While benchmarking, we must always remember not to draw any conclusions from its results beyond what the benchmark actually measures. From what we know right now, it appears 3DMark03 heavily favors video card performance over other elements of a given system, and so we must take this into consideration when interpreting its results. As I mentioned earlier, we have performed a number of benchmarks and we will be continuing our analysis of these results in the future. So, as always, we'll keep you posted.

See, so it seems Ace's also noticed that the benchmark is actually GPU bound and not CPU bound.

There you have it.

US
 
Slides said:
Ailuros said:
We have already established that, and the Ace's article proves this further.

The comparison methodology and conclusions are flawed. I don't see where Ace's article has established anything.

How so? 3DMark03 has little relevance to current games? Do you agree or disagree?

Reread the review, what parts he used, how he combined them, and the conclusion. I'm not going to rechew the same points again and again.
 
Slides said:
Freak'n Big Panda said:
slides, the fact is you have not the slightest clue what games will be like in the future. FM, on the other hand, studied the issue and came to the conclusion that games will become much more GPU bound. And they have evidence to support it (i.e. those graphs neeyik posted). Sure, FM could be totally, completely wrong, but they might not be; wait two years before getting pissed. Also, graphics cards advance much faster than CPUs, so over time it will get more and more CPU bound.

In 2 years we will likely have a new version of 3DMark. :?:

Thanks to Dave for the explanation.

Certainly. And 3DMark2003 will have flipped from GPU bound to CPU bound, like every other GPU-bound game before it.

We could also go into far more speculative territory about PS/VS 3.0 next-generation cards and their moving closer to CPU-like logic, but I'm afraid it'll create more confusion.

There will always be a need for a powerful platform/CPU/host RAM combination together with a recent card to play recent games.

How do you know that if you combine, let's say, an R300 with a 6 GHz CPU, it will not turn out CPU bound in 3DMark2003?
 
Ailuros said:
How do you know that if you combine, let's say, an R300 with a 6 GHz CPU, it will not turn out CPU bound in 3DMark2003?
If it's GPU limited with a 2 GHz CPU, then changing to a 6 GHz CPU isn't going to change that. Now, if you had a 6 GHz R300, then you could end up CPU limited. Please don't try clocking your R300 to 6 GHz to find out! :D (Unless you promise to buy a new R300 to replace the one you fried ;))
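OpenGL guy's point can be made concrete with a toy bottleneck model (every number below is invented): per-frame time is roughly max(CPU-side time, GPU-side time) when the two overlap, so speeding up the CPU of an already GPU-limited system changes nothing:

```python
# Toy bottleneck model: frame time is dominated by the slower of the
# CPU-side and GPU-side work, since they largely overlap in a real
# pipeline. All timings below are invented for illustration.

def fps(cpu_ghz, cpu_work_cycles, gpu_frame_ms):
    """Frames per second when CPU work and GPU work run in parallel."""
    cpu_ms = cpu_work_cycles / (cpu_ghz * 1e9) * 1000.0
    frame_ms = max(cpu_ms, gpu_frame_ms)
    return 1000.0 / frame_ms

CPU_WORK = 10e6       # 10M cycles of game/driver work per frame (made up)
GPU_FRAME_MS = 25.0   # the card needs 25 ms per frame here (made up)

# GPU-limited at 2 GHz: the CPU side takes 5 ms, the GPU side 25 ms.
print(fps(2.0, CPU_WORK, GPU_FRAME_MS))   # 40.0 fps
# Tripling CPU speed changes nothing while the GPU is the bottleneck:
print(fps(6.0, CPU_WORK, GPU_FRAME_MS))   # still 40.0 fps
```

Only shrinking the GPU-side 25 ms (a faster card, or a lower resolution) would raise the frame rate in this regime; conversely, a slow enough CPU would eventually push `cpu_ms` past 25 ms and flip the system to CPU limited.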
 
Unknown Soldier said:
See, so it seems Ace's also noticed that the benchmark is actually GPU bound and not CPU bound.

Exactly, and most current games remain CPU bound, Splinter Cell being a rare exception. I have no idea about future games, nor have I questioned how future games will respond to GPU advances.

But 3DMark03 is not a good indicator of general gaming performance on most current games. It is a good indicator of video card performance. Whether this will translate to future gaming performance remains to be seen.

And I thought AzBat meant he ignored 3DMark03 scores.
 
OpenGL guy said:
Ailuros said:
How do you know that if you combine, let's say, an R300 with a 6 GHz CPU, it will not turn out CPU bound in 3DMark2003?
If it's GPU limited with a 2 GHz CPU, then changing to a 6 GHz CPU isn't going to change that. Now, if you had a 6 GHz R300, then you could end up CPU limited. Please don't try clocking your R300 to 6 GHz to find out! :D (Unless you promise to buy a new R300 to replace the one you fried ;))

OT: unless my memory is weaker than I thought, did I state somewhere that I actually own an R300? You weren't wrong to set that straight; I'm just wondering if it's pure coincidence ;)

But 3DMark03 is not a good indicator of general gaming performance on most current games.

Where did anyone say or imply that?

Au contraire, I saw ridiculous claims in this thread that DX9.0 cards are hardly any faster in today's games than DX8.1 cards.
 
Ailuros said:
OpenGL guy said:
Ailuros said:
How do you know that if you combine, let's say, an R300 with a 6 GHz CPU, it will not turn out CPU bound in 3DMark2003?
If it's GPU limited with a 2 GHz CPU, then changing to a 6 GHz CPU isn't going to change that. Now, if you had a 6 GHz R300, then you could end up CPU limited. Please don't try clocking your R300 to 6 GHz to find out! :D (Unless you promise to buy a new R300 to replace the one you fried ;))

OT: unless my memory is weaker than I thought, did I state somewhere that I actually own an R300? You weren't wrong to set that straight; I'm just wondering if it's pure coincidence ;)
Hehe. My comment about frying an R300 wasn't directed at you but forum readers in general :)
 
Slides said:
Whether this will translate to future gaming performance remains to be seen.

Only to the extent that it "remains to be seen" whether GPU performance will increase at a faster rate than CPU performance. Which is an extraordinarily safe bet.
 
Dave H said:
Slides said:
Whether this will translate to future gaming performance remains to be seen.

Only to the extent that it "remains to be seen" whether GPU performance will increase at a faster rate than CPU performance. Which is an extraordinarily safe bet.
If IHVs continue to deliver low-cost parts that suck in performance compared to their high-end parts, developers will continue to take this into account, because the majority of game buyers do not own high-end cards. Because they cost extra (duh).

Surely this should be clear to everyone... why are we talking about this?

FM's 3DMark03 is nothing more than "Games will probably look, and perform, like our Game Tests if developers decide to do things the way the Game Tests do them. The Game Tests use the latest 3D tech because it is available. Period, nothing more than that."

Again, why are we talking about this? Just for the sake of it? :rolleyes:
 
Again, why are we talking about this? Just for the sake of it?

Probably yes :rolleyes:

Hehe. My comment about frying an R300 wasn't directed at you but forum readers in general.

Like there ever would be a chance to clock an R300 that high, heh :p
 
Forgive me if I have got the wrong end of the stick; however, I would like to put in my 2c.

It seems to me Ace's (a generally excellent site) has taken exception to Futuremark's marketing comments:

By combining full DirectX®9.0a support with completely new tests and graphics, 3DMark03 Pro continues the legacy of being the industry standard benchmark. The high quality game tests, image quality tests, sound tests and others give you an extremely accurate overview of your system’s current gaming performance.

(copied from the aces article)

Now, I have not read the white paper on 3DMark, but it appears to me (from statements made here and elsewhere) that it says they have tried to make the program more GPU dependent.

They seem to be reading the above comment as "the 3DMark score at the end of the program represents an accurate overview of your current gaming performance", or something like that.

I do not get the impression that they think it's bad to have a GPU benchmark; they are just upset (as are people on this forum and Ace's) that it does not fit with the above statement, "extremely accurate overview of your system's current gaming performance."

The statement actually says "the high quality game tests" and "image quality tests" and "sound tests" and "others". I presume the CPU tests come under "others" (though it would have been good to mention them!), so Futuremark is not claiming in the above statement that only the tests run by Ace's (i.e. game tests 1-4) are representative of current gaming performance, any more than they claim that the CPU tests in isolation are representative of current games, or any of the other tests for that matter.

So the game tests are currently a GPU test (perhaps less so in future?), but the BENCHMARK is not: it has a quite clearly titled CPU section, which perhaps should be given more weight in people's minds alongside the "3D" mark score. EDIT: for current or older games.
 
Vortigern_red said:
I do not get the impression that they think it's bad to have a GPU benchmark; they are just upset (as are people on this forum and Ace's) that it does not fit with the above statement, "extremely accurate overview of your system's current gaming performance."

Bah, they're bringing it up because of all the hubbub regarding nVidia/FM at the moment, and they know it'll create quite a stir and attract a lot more hits. That is WHY they wrote it, as certainly any reviewer worth a damn knows what the benchmarks show. (And if they were actually concerned, they would annotate said benchmark accordingly in their reviews. Some do.)

Does it have applicability? Sure, but certainly there's no hidden menace behind it, and they show less than they figure. 2003 has plenty of processor dependency, as test one shows to the extreme, and tests 2 and 3 to lesser degrees. This, of course, is factored into the final score as well, which is why the 350/9700 and 2.8/9600 even out. The CPU is obviously a factor, just not the weighted factor.

The testing looks very slapdash and hurried, and should have been much more in-depth to come up with better results. First off, the inclusion of the 8500 is rather a red herring, since FM would not recommend you use 2003 to benchmark that card anyway; I would much rather have seen the broadest scale they could offer in just the DX9 cards (say a 256MB 9800 Pro, a 9700 Pro, and a 9600 non-Pro, which is a smaller scale but fits within the designs and recommendations of the benchmark), with each card compared across all the systems, highlighting the ones they wanted but enabling viewers to follow all the trends. Also, they mention a point regarding the Gunmetal bench vs. the heavier DX9 tests in 2003, but fail to point out the discrepancy in Gunmetal itself, where a "GPU-limited" test has two wildly different GPUs scoring the same between the 350 and 1.4 GHz CPUs, and shows only slightly more increase between the 2.8/9600 and 1.4/9700. Quite obviously both tests are not wholly GPU-limited, and have different methodologies that show some similarities and some differences.

Not to mention we are given only a vague listing of parts across all the test systems--which obviously have wide differences--and are left to guess whether they shared enough equivalents to be testing the GPUs as much as possible, and not other parts of the architecture as well.

Basically, the end result is an unsatisfying article with meager conclusions toward a result that anyone worth a damn already understands, and anyone foolish enough to purchase off one benchmark is too hopeless to help anyway--especially considering they'd have to take pains to ignore that the name of the company is "Futuremark."

If they'd wanted to do something that could be much MORE useful, they could have researched how 3DMark2001 performed between systems when it was new, how those systems performed on the games of the time, and how those same systems perform on games now. That would come a lot closer to determining whether FM is on the ball about following gaming trends and testing accordingly.

As it is, all we get is the knowledge that "benchmarks are finicky" and "the more you know, the better"--which hopefully anyone who'd be reading these sites knows anyway. 8)
 
It seems to me that Ace's only reinforced what everybody already knew: that 3DMark03 is primarily a VPU benchmark. Of course it is perfectly accurate to state that '03 is a "gamer's benchmark", because the features it tests are found, and will be found, only in 3D games. Whatever CPU you use is entirely incidental and beside the point, IMO. Ace's might as well have run a GF1 on a P4 3.2GHz/Athlon 3200+ and tried to make something of that as well. I'm not really sure what Ace's point was, actually...
 
slides said:
Unknown Soldier said:
Slides said:
AzBat said:
I'm not getting excited over it. I just ignored it.

Good for you, let's hope all 3DMark03 users also learn to ignore it.

It amazes me how you turned what AzBat said to your advantage. It shows me, though, that you seem ignorant of what he really meant.

AzBat meant he ignored Ace's Article.. or at least that's how I picked it up.

And I thought AzBat meant he ignored 3DMark03 scores.

No. Unknown Soldier was right, I ignored Ace's article. The last paragraph that he mentioned showed that Ace's understood part of 3DMark03, but I still think they got it all wrong. That's why I ignored it.

Tommy McClain
 
Reverend said:
FM's 3DMark03 is nothing more than "Games will probably look, and perform, like our Game Tests if developers decide to do things the way the Game Tests do them. The Game Tests use the latest 3D tech because it is available. Period, nothing more than that."

Again, why are we talking about this? Just for the sake of it? :rolleyes:

I'll go with Reverend's post on this. No point in arguing when it's obvious what 3DMark03's purpose is and is not.
 