Interesting (3DMark2003) article at Aces Hardware

Sxotty said:
I have to respectfully disagree, and agree. First, I agree that it is good to stress the GPU (especially in a 3D review, I mean, duh), but I disagree that future games will not be CPU intensive. There may be a wave of mindless games, but I certainly hope that the AI and other parts of the game (like the physics you pointed out) advance to the point that the CPU is working as hard as the GPU. After all, that is the point, isn't it? To be fun and not just pretty.

People do think of it as a gaming benchmark, and not just a 3D benchmark, even if they shouldn't.

Well, you have to balance this. You cannot have AI or physics so complicated that on a low-end CPU you end up with "dumber" bots. Also, what about network games? They make up a rather good chunk of a game's life (and yes, it depends on the game itself). For example, your CPU is not spending time calculating AI cycles while you are playing in your UT2k3 clan match. Ideally, CPU requirements would rise with processor technology so we could have the best of both. But to say that one is going to take over as the major bottleneck is a guess at best, and IMHO not a very good one....
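The "CPU requirements rise with processor technology" idea can be sketched as budget-scaled AI: give the bots as much planning depth as the per-frame CPU budget allows, so a slow CPU gets coarser plans instead of an unplayable game. This is a purely hypothetical illustration (all names and numbers are made up), not any real engine's API:

```python
# Hypothetical sketch: scale AI search depth to the available per-frame
# CPU budget, so slower CPUs get coarser (not absent) AI. All names and
# cost figures here are illustrative assumptions.

def plan_ai(frame_budget_ms: float, cost_per_level_ms: float = 0.5,
            max_depth: int = 8) -> int:
    """Pick the deepest AI search level that fits the per-frame budget."""
    depth = int(frame_budget_ms // cost_per_level_ms)
    return max(1, min(depth, max_depth))

# A slow CPU leaves ~1 ms per frame for AI; a fast one leaves ~4 ms.
print(plan_ai(1.0))  # shallow planning on a low-end CPU
print(plan_ai(4.0))  # deeper planning on a high-end CPU
```

The same scheme caps out at `max_depth`, which is the "best of both" point: beyond that, extra CPU headroom is simply left free for physics or networking.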
 
Yes. And the rendering techniques used in 3DMark highlight that you can create complex gaming scenes and character animations with very little CPU overhead; now games can use the excess CPU time for those things. The point being that, at present, they aren't, because they are still bogged down dealing with rendering work they don't need to do with modern graphics boards.
Hey, we can make a game that uses such advanced AI that it is totally CPU bound and the GPU makes no difference at all. I doubt that will be a realistic approach for the average game. Just about as realistic as your example.
 
Bjorn said:
And as has been said already, how many of the 3DMark users actually read the whitepaper? My guess is not that many, and since that's probably the case, I think they should consider changing that text a bit.

As any judge will tell you, ignorance is not an excuse.
Here is a no-brainer: do not test sub-DX9-class hardware with a DX9-class benchmark. If 3DMark03 shows that two DX9-class cards perform within 20% of each other, do you really think that would not translate into a comparable difference in a real-world DX9 game test?
 
Oh, and can someone please explain to me why Ace's Hardware had to complicate things by varying not only the CPU but also the GPU in their overview... if they wanted to prove CPU independence, they should have stuck to one graphics board and not polluted things by comparing a 9700 and a 9600 on the two key systems in the analysis.

Excellent call. I found that highly annoying while reading it too. Alternatively, he could have run all CPUs with both accelerators in all tests.



Actually, it makes perfect sense. Say you have a dated system and want to upgrade in order to play current games. Assuming you have a fixed budget, there are two ways to go: you either get a moderate combo for around $250 and buy a compromise video card such as a Radeon 9600 (non-Pro), OR you spend your whole budget on a top-notch video card and keep your old system. Futuremark, being a purely synthetic graphics benchmark (not necessarily a bad thing), tells you to go for the expensive video card, foreseeing that future titles will be mostly VPU limited in nature.

AH raised the issue because they observed contradicting results with current titles. Their mistake was not performing a more comprehensive analysis by including some shader-intensive titles in their tests. They (most probably intentionally) only demonstrated the examples (CPU/system-bound titles) that contradict Futuremark's predictions.
 
DaveBaumann said:
I think we all knew older games were CPU intensive; however, ask yourself whether this will be as much the case for newer titles. Look at the difference a high-end card makes in comparison to a mainstream board in a relatively new title such as Splinter Cell; this is going to be even more evident when we see some more PS2.0 game benchmarks (soon).

However, the fact that the 3D benchmark is stressing the 3D element is a "Yay" as far as I'm concerned.


Being in the 3D business, you wish to see that trend in the game industry. I, being only an enthusiast, share the same wishes, but I just don't see it coming. There will always be games that compromise on visual quality in favor of other aspects, thus stressing the CPU/chipset instead of the graphics card.

Not everybody has an R350 to play games with, but most people do have fast CPUs. Most people are playing games with moderate cards on moderately fast systems. It is easier, and sometimes cheaper, to upgrade the system instead of putting all your money into the VPU. Developers know that. That's why they released a software renderer for UT2003. Am I wrong?
 
Slides said:
That's great. But before you accuse Ace's of misleading people, you should take a second look at FM's own statements wrt 3DMark03. Most people who use 3DMark will not read their white paper or whatever it is. They'll run it and compare the scores with others, and use it as a measurement of their system's gaming performance.

lol .. I guess then that you didn't know that just the Game Tests were used in 3DMark2001 to get a final score.

Guess u got a bit of learning to do then.

Ace's should've known this too if they've ever benchmarked anything with any of the Futuremark products.

I haven't read the whitepaper but it's been common knowledge that the GT's are used for the score.

3DMark2k3 fills my need. I see what my GF3 can run (graphics-wise), what my GPU actually does (score-wise), and, when I do the sound tests, what my PC's performance actually is (in how much fps difference it makes).

I used to have a P4 1.7 GHz, and with my GF3 (non-Ti) I would get 1013 with the 43.45 drivers. Now I have an AMD NF2 3200+ with 333 synced, and with my GF3 I get 1053 max (please note this is 3DMark2003 standard, not 320). So while I got a 40-point increase, what I'll have to do in the future to increase my score is upgrade to an NV35/40 (highly unlikely) or an ATI 9700/9800 Pro (very likely).

US
 
Yes. And the rendering techniques used in 3DMark highlight that you can create complex gaming scenes and character animations with very little CPU overhead; now games can use the excess CPU time for those things. The point being that, at present, they aren't, because they are still bogged down dealing with rendering work they don't need to do with modern graphics boards.

Though I do somewhat agree that 3DMark03 is indicative of future 3D card performance, I don't see it being a good indicator of actual games.

Some reasons why:
1. I seriously doubt that 3DMark is actually calculating any AI at all; it's just relying on scripted events, which leads into 2.
2. It's apparent that there's no reason to do any dynamics calculation whatsoever. Everything is scripted, so if you're trying to bench a video board you might as well put the least possible load on the CPU.
3. Since everything is likely a scripted event, other things that would normally be going on in a given game don't have to be calculated by the CPU or loaded into system memory.

Of course, if I were developing 3DMark and I had a dynamics engine available, I would evaluate my scenes and "bake" them to a cache of sorts. Since I know none of my scenery will change except for the camera angle (and even then few people would notice, but for the sake of argument), this wouldn't be a problem. There's no reason to recalculate collisions if the same thing is happening over and over again.
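That "baking" idea can be sketched in a few lines: run the deterministic simulation once, cache the per-frame results, and play them back at near-zero CPU cost on every run. A hypothetical sketch (the physics stand-in and all names are made up):

```python
# Hypothetical sketch of "baking" a deterministic simulation: compute the
# physics once, cache each frame's result, then play it back as a cheap
# lookup. The simulate() body is a stand-in, not a real engine.

def simulate(frame: int) -> float:
    """Stand-in for an expensive physics step (height of a falling object)."""
    t = frame / 60.0
    return 100.0 - 0.5 * 9.81 * t * t

# Bake: precompute every frame of the fixed, scripted fly-through.
baked = [simulate(f) for f in range(600)]

# Playback: a list lookup instead of a physics solve, frame after frame.
def playback(frame: int) -> float:
    return baked[frame]

assert playback(0) == simulate(0)  # identical result, trivial CPU cost
```

Since the camera path and events never change between runs, the cache is always valid, which is exactly why a benchmark can afford to leave the CPU idle where a real game cannot.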

I don't expect DX9 games will be as free of the CPU as 3DMark2003 is. Maybe some time in the future we'll have "console-on-a-card" type devices that leave the CPU almost 100% idle, but that's some measure away.

Of course, this is why I never considered synthetics an important representation of real game performance, though they certainly make fantastic feature tests and can reflect how the card itself will do under heavy load.

As for whether Futuremark or Ace's Hardware is misleading, it's easy for me to see both sides. Imagine the poor bastard who drops 400 dollars on a 9800 Pro to play Half-Life 2, only to discover that his PIII 500 still runs it at 20 fps tops no matter what the resolution.
On the flip side, 3DMark2003 does specifically come with tools to benchmark most every aspect of a typical gaming rig. And ignorance is no excuse, as they say.
 
Slides said:
The bottom line is that the majority of people who use 3DMark are not 3D tech geeks, but gamers.

and only geeks read Ace's Hardware, which is why not doing more research and writing a more credible article is inexcusable.

If you want to compare CPUs, look at the CPU test scores.
 
Unknown Soldier said:
lol .. I guess then that you didn't know that just the Game Tests were used in 3DMark2001 to get a final score.

Guess u got a bit of learning to do then.

Huh? When did I say that anything other than game tests goes into the scores? That is exactly why 3DMark scores are representative not of overall system performance but of video card performance.

Ace's should've known this too if they've ever benchmarked anything with any of the Futuremark products.

I haven't read the whitepaper but it's been common knowledge that the GT's are used for the score.

Who is disputing this? The point being made is that the game tests are not entirely representative of current games. Which is true.

3DMark2k3 fills my need. I see what my GF3 can run (graphics-wise), what my GPU actually does (score-wise), and, when I do the sound tests, what my PC's performance actually is (in how much fps difference it makes).

Well, OK, but for me 3DMark03 is not a good indicator. I still use a GF2, and while I run my games at 1024x768 with no AA or AF, I still get reasonable performance in many current games. If I were to just look at 3DMark scores, this would seem impossible, and I would need at least a 9600 to get anywhere. Of course, I realize I cannot hang onto my GF2 for future games, so for that purpose 3DMark scores would be more accurate.
 
Randell said:
and only geeks read Ace's Hardware, which is why not doing more research and writing a more credible article is inexcusable.

If you want to compare CPUs, look at the CPU test scores.

Ace’s came to the correct conclusion by labeling 3DMark03 as a synthetic GPU test. The only issue anyone can raise is that Ace’s only looked at FM’s PR claims on the main page and not the whitepaper.
 
Slides said:
Huh? When did I say that anything other than game tests goes into the scores? That is exactly why 3DMark scores are representative not of overall system performance but of video card performance.

It's not meant to represent the overall system .. just the GPU performance.

Slides said:
Who is disputing this? The point being made is that the game tests are not entirely representative of current games. Which is true.

It's also not meant to represent current games .. but future games.

Slides said:
Well, OK, but for me 3DMark03 is not a good indicator. I still use a GF2, and while I run my games at 1024x768 with no AA or AF, I still get reasonable performance in many current games. If I were to just look at 3DMark scores, this would seem impossible, and I would need at least a 9600 to get anywhere. Of course, I realize I cannot hang onto my GF2 for future games, so for that purpose 3DMark scores would be more accurate.

Exactly, which is where Futuremark is actually aiming: new or future graphics cards.

US
 
Given the nature of the benchmark, I would say forward-looking video card performance is measured more than forward-looking game performance.

See my above post as to why.
 
Actually, it makes perfect sense. Say you have a dated system and want to upgrade in order to play current games. Assuming you have a fixed budget, there are two ways to go: you either get a moderate combo for around $250 and buy a compromise video card such as a Radeon 9600 (non-Pro), OR you spend your whole budget on a top-notch video card and keep your old system. Futuremark, being a purely synthetic graphics benchmark (not necessarily a bad thing), tells you to go for the expensive video card, foreseeing that future titles will be mostly VPU limited in nature.

I'd really love to see somebody pair a P2 350 MHz setup with a 9700 Pro. What about the motherboard or the PSU, as two quick thoughts, despite the fact that it's completely senseless.

Most of you just read the final FM score -- as does Ace's -- and draw your conclusions from there. Apparently it hasn't lit a lightbulb in anybody's head yet that each individual test conducted in the benchmark represents something.

Accordingly:

P2 350MHz:      9 CPU Marks,   1.1 fps (CPU test 1),  0.2 fps (CPU test 2)
Celeron 1.4GHz: 147 CPU Marks, 17 fps (CPU test 1),   2.5 fps (CPU test 2)
P4 2.8GHz:      557 CPU Marks, 60.1 fps (CPU test 1), 10.2 fps (CPU test 2)

Now, while most agree that in order to play recent games adequately you need a good combination of a well-performing CPU and GPU, the irony is that conclusions get drawn entirely from a meaningless number that represents a summary. Nope, the CPU test in there is definitely decorative and FM implemented it just for kicks.

I guess 9 CPU Marks, or 0.2 fps in the second CPU test, don't tell a story at all. Good luck playing a game like, say, UT2k3 with that one. LOL ;)
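The point those per-test numbers make can be captured with a toy bottleneck model (a sketch for illustration only, not 3DMark's actual scoring formula): if the CPU and GPU work in parallel on successive frames, the frame rate is capped by whichever stage is slower, so no GPU upgrade can rescue a 9-CPU-Mark system:

```python
# Toy bottleneck model (illustrative assumption, NOT 3DMark's formula):
# with CPU and GPU stages pipelined, frame time is dominated by the
# slower of the two per-frame costs.

def effective_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frames per second when the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A top-end GPU (5 ms/frame) paired with a P2-350-class CPU (100 ms/frame)
# is still stuck at 10 fps; the GPU upgrade alone cannot help.
print(effective_fps(100.0, 5.0))  # CPU-bound: 10.0 fps
print(effective_fps(10.0, 5.0))   # faster CPU: 100.0 fps
```

The per-frame millisecond figures are invented for the example, but the shape of the argument is exactly the one made above: a summary score hides which side of the `max()` you are on.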
 
I'm seeing a lot of "3DMark03 is worthless" comments in forums lately, but looking at the Doom 3 benchmark at HardOCP,
I see that the R9800 Pro is about 50% faster than the R9500 Pro and the GFFX 5900 Ultra is about 80% faster than the GFFX 5600 Ultra. Doesn't that translate well to the 3DMark03 scores? Doom 3 is pretty GPU limited.
Looking at R9700/9800 vs NV30/35 3DMark03 scores, however, does not give the right picture (at least not in the HardOCP Doom 3 benchmark), but that may be due to IHV-specific optimizations.

Or am I totally off here?
 
Unknown Soldier said:
It's not meant to represent the overall system .. just the GPU performance.

Exactly, so FM should better clarify what their benchmark is supposed to do, since a lot of people are acting surprised and shocked after reading Ace's article.

Unknown Soldier said:
It's also not meant to represent current games .. but future games.

I know this, you know this and now Ace's readers know this too. But do 3DMark03 users really know this? Has FM come out and said that it's meant for future games and NOT necessarily current games? I don't know, I haven't spent enough time investigating this issue.

Unknown Soldier said:
Exactly, which is where Futuremark is actually aiming: new or future graphics cards.

Then why exactly is it billed as "The Gamers' Benchmark" when it is clearly debatable whether it is an accurate representation of gaming performance?

Look, I appreciate 3DMark03's value as a video card benchmark, but people continue to misread its intent by thinking that upgrading video cards will give them a huge increase in performance, when in many current games this is not the case. I think FM could have been a little more upfront about this.
 
Well then, it looks like we're almost all in agreement, for the most part.

Though it's more of a video card benchmark than a game benchmark.
 
Look, I appreciate 3DMark03's value as a video card benchmark, but people continue to misread its intent by thinking that upgrading video cards will give them a huge increase in performance, when in many current games this is not the case.

Agreed as far as the misuse goes, from users and reviewers alike.

As for new-generation cards not giving a huge increase in performance, what exactly is the performance difference between, let's say, R8500/GF4 Ti and 5900/9800 in your opinion? Minimal?

Again, 3DMark2001 was just as GPU limited with the NV20 when it launched as 2003 is today. Wasn't the GF2 Ultra in some cases embarrassingly faster than the GF3 at the latter's launch?

Wasn't 3DMark2001 one of the few tests in which the GF3 could take a significant lead over the GF2 Ultra? Wasn't the GF3 significantly faster than the Ultra as games became more complex?

Though it's more of a video card benchmark than a game benchmark.

What's a "game benchmark"? If you want a game benchmark, use real games to measure performance, and leave synthetic applications to make "lucky guesstimates" in relative terms about what the future might hold. Whether it's accurate or not should be determined once games actually start to use even partially DX9.0 shaders, and that's still going to take time.
 
Frankly, ever since running across 3DMark, I've always known, accepted, and expected that it was far more a video card measurement than an overall system one, and I approved. I'm not sure what people are so confused about now, nor what they're up in arms about. Plus, I don't think Ace's testing process was very conducive to any real conclusions. For an article like this, taking the stance they are, they need a far more in-depth analysis; otherwise it mainly looks like they have some kind of tiff with Futuremark and are going about it in a shady manner.
 
Slides said:
I know this, you know this and now Ace's readers know this too. But do 3DMark03 users really know this? Has FM come out and said that it's meant for future games and NOT necessarily current games? I don't know, I haven't spent enough time investigating this issue.
Yes, they have.
Slides said:
Then why exactly is it billed as "The Gamers Benchmark" when it is clearly debatable whether it is a accurate representation of gaming performance.
*Sigh* Because it is a benchmark for "gamers", simple as that. It's not for office workers, it's not for internet connection testing, it's not for Photoshop users. The only people interested in buying and using 3D graphics cards are people who play 3D games, aka "gamers". Perhaps I should have said more about this in the help file, but I had obviously made the mistake of crediting too many people with enough intelligence to see this for themselves.
 
Well, as it is future-proof, it's for future gamers :D.
I would say it's simply for people with a video card, i.e. everybody ;)
 