Higher 3DMark05 scores = Better Equipped for Future Games?

3DMark shows me if I have a better rig than others for Future Games:

  • No, definitely
  • Not sure
  • What A Stupid Poll (give reasons)

Total voters: 270
DaveBaumann said:
The likelihood is that the issues you describe are just as likely to be present then as they are in the future, regardless of the design of the engine.

I disagree. You are basically saying that it is likely that the entire pixel-shading (with normal maps) approach in 3DMark05 is going to be used in the next one. It should be obvious that this is not the case.
 
So, future games are not going that way?

(Bear in mind what directions future APIs are going in and what features are likely to be licensed for use in those APIs...)
 
Don't mind me, I'm just here to watch.

 
Scali said:
I think your logic is flawed.
3DMark03 already predicted the performance of Doom3/Source. 3DMark05 is one step further. And there is no 'validity interval' of 2 years or anything. Where do you get that 'two years' from?
Even if there is another 3DMark released after this one, before new engines arrive, 3DMark05 could still be useful, if the new 3DMark does not test the exact same things (which it probably won't, the next 3DMark will probably have even heavier workload and use next-gen hardware features).

I see what you're saying, but maybe 3DMark05 is a bit too forward-looking. Is anyone going to own the same hardware that they do now in two years? If anything, performance in Doom3 and HL2 will provide the best insight into how your hardware will perform in games over the next 18 months.
 
trinibwoy said:
I see what you're saying, but maybe 3DMark05 is a bit too forward-looking. Is anyone going to own the same hardware that they do now in two years? If anything, performance in Doom3 and HL2 will provide the best insight into how your hardware will perform in games over the next 18 months.

You have a point there; 3DMark05 is a long way ahead of games at this moment. But it is the first big measurement tool for 3.0 shaders and shadow mapping. 3DMark always tries to support all the new features as quickly as possible, and games sometimes have trouble keeping up... Just look at 3DMark03... It used normal mapping and stencil shadows everywhere... It predicted Doom3- and HL2-style games... but Doom3 was only just released, and we're still waiting for HL2.
3DMark seems to be the only thing that can keep up with the incredible speed at which hardware evolves. Guess the company isn't called Futuremark for nothing ;)
 
To some extent, yes. Obviously, if you have two computers and one of them scored 1000 points more, then it's most likely to perform at least a little better in most games (especially if the games use shadow maps rather than stencil shadows). But it is only a game simulation. It doesn't take into account the AI, input, network performance, and general overall variations between game engines.

It gives a rough idea, but it isn't accurate if you ask me.
 
Miksu said:
Okay, I just finished the 22-page thread in the other part of the forums. The discussion here seems to be a direct continuation of that now-closed thread.

My answer to poll would have been "Yes, somewhat".

Yeah, I agree. I voted maybe, as I don't think it will be a definite yes; who knows what other bottlenecks will show up. But in general, the idea that the faster your card is, the better you should be able to handle future games does make some sense.
 
Bjorn said:
jvd said:
I'm also not sure if the heavy vertex shader usage will happen in future games

That's actually a concern of mine also. It doesn't seem to match what we've heard from the developers. Haven't Carmack and Sweeney both mentioned that vertex performance won't be a problem in their upcoming engines?

I'd still like to see what happens at higher res, with AA and AF, though.

Vertex performance won't really be an issue for at least another generation. Unreal 3 uses around 400k polys, while GT1 in 3DMark05 is using 1 to 2 million polys per scene.
 
Hmm, if you mean the near future, then no. The X800's vertex power won't matter, because no developer will design a game that actually makes use of it. They will stick to low poly counts so everyone with a $50 video card can run it, or they will just do all the skinning on the CPU. But eventually, when these types of games do come out, then yes, if you're still running an XT it will be faster than the NV40.
 
The heavy emphasis on vertex performance gives me pause for thought, even though I believe a good emphasis on a mixture of vertex/pixel shader performance will ultimately give a performance indication of future games.

I am not convinced Futuremark's take on vertex performance will indeed be the correct one, especially in two years. Well, game devs will probably be considering the 9200/5200 as the lowest common denominators, like they did the GeForce 4 MX for this year.
 
ChrisRay said:
The heavy emphasis on vertex performance gives me pause for thought, even though I believe a good emphasis on a mixture of vertex/pixel shader performance will ultimately give a performance indication of future games.

I am not convinced Futuremark's take on vertex performance will indeed be the correct one, especially in two years. Well, game devs will probably be considering the 9200/5200 as the lowest common denominators, like they did the GeForce 4 MX for this year.

Which is basically what I just said :rolleyes:
 
hovz said:
ChrisRay said:
The heavy emphasis on vertex performance gives me pause for thought, even though I believe a good emphasis on a mixture of vertex/pixel shader performance will ultimately give a performance indication of future games.

I am not convinced Futuremark's take on vertex performance will indeed be the correct one, especially in two years. Well, game devs will probably be considering the 9200/5200 as the lowest common denominators, like they did the GeForce 4 MX for this year.

Which is basically what I just said :rolleyes:

And your point? What makes you automatically assume I read your posts? Actually, I don't ever read your posts, because you're always looking for a fight, except when you quote me, such as this. I thought I'd remind you that your post is not the be-all and end-all of this discussion.

Chris
 
Back in the day of 3DMark2001, I was one of those people who'd sneer at 3DMark. To be more accurate, I sneered at those who made a game out of getting as high a score as they possibly could. I followed the overly simple logic that "I don't play 3DMark", therefore the scores didn't mean anything. I used it to see if anything funky was going on with my system, but not to measure performance.

Then I ran across a link to the top 10 cards by average score. I noticed 3DMark ranked them exactly how I'd rank them based on in-game performance. That's when I first started to think that maybe the score tells you something after all.

Fast forward to 3dMark03... the perfect benchmark at the perfect time. The fx was fatally flawed wrt pixel shading but no game available at the time could reveal just how deep the flaws ran.

I still remember the thread where someone noted that the sky's brightness was off in Nature test. It was eventually pinned down to drivers lowering precision to fp16. I think everyone on these boards should know the story well enough from that point on... just one hack or shader replacement after another. And it was easy to surmise (well, it was easy for some of us ;) ) that nVidia would need to use similar techniques to get good performance out of heavily shaded games.

Now, with 05, it's still too soon to judge its value. I don't think it's as critical now as 03, because it seems neither the 6800 nor the X800 have any FX-scale fatal flaws to uncover. I'm also leery about issues such as DST and other concerns that have been raised.

I'm not sure how to vote: I'm still in wait and see mode concerning the usefulness of 3dmark05, but 03 was awesomely good.


woah, this was a post of nearly WaltC proportions :oops:
But at least it's on topic and I'm not bickering with anyone. ;)
 
ZoinKs! said:
Fast forward to 3dMark03... the perfect benchmark at the perfect time. The fx was fatally flawed wrt pixel shading but no game available at the time could reveal just how deep the flaws ran.

Now, with 05, it's still too soon to judge its value. I don't think it's as critical now as 03, because it seems neither the 6800 nor the X800 have any FX-scale fatal flaws to uncover. I'm also leery about issues such as DST and other concerns that have been raised.

Well, the performance disparity is quite something. This could be the "fatal flaw" you are speaking of, and like 3DM03, we'll start to see that problem replicated in real-world games as soon as developers start trying to push the envelope.

Is it possible that 3DM05 is showing ATI once again made a more balanced part than Nvidia? The X800PE's general performance gives a faster/better game overall than the 6800U's ability to hit peaks in individual technique benchmarks.
 
Bouncing Zabaglione Bros. said:
ZoinKs! said:
Fast forward to 3dMark03... the perfect benchmark at the perfect time. The fx was fatally flawed wrt pixel shading but no game available at the time could reveal just how deep the flaws ran.

Now, with 05, it's still too soon to judge its value. I don't think it's as critical now as 03, because it seems neither the 6800 nor the X800 have any FX-scale fatal flaws to uncover. I'm also leery about issues such as DST and other concerns that have been raised.

Well, the performance disparity is quite something. This could be the "fatal flaw" you are speaking of, and like 3DM03, we'll start to see that problem replicated in real-world games as soon as developers start trying to push the envelope.

Is it possible that 3DM05 is showing ATI once again made a more balanced part than Nvidia? The X800PE's general performance gives a faster/better game overall than the 6800U's ability to hit peaks in individual technique benchmarks.


This would be the case if games were going to come out using super high polygon levels, but that's not going to happen for at least two years. If Unreal 3 is using around 400k, with a max of 1 million per viewable area, and that's not going to be out for another year, it's going to be a while before you see true vertex performance coming into play. As was mentioned earlier, lower-end cards still have to be supported to some degree.

This can be done by (roughly sketched in the code below):

A) dropping some complex shaders
B) dropping polygon counts and keeping complex shaders
C) a combination of both, where higher-end cards can turn on the complex shaders
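
Just to make that concrete, here is a minimal sketch of how a game might pick between those options at startup. The GpuCaps struct, its field names, and the thresholds are all made up for illustration; a real engine would fill something like this in from its driver/caps query.

```cpp
#include <cstdio>

// Hypothetical capability summary, filled in elsewhere from a driver/caps
// query; the field names and thresholds here are illustrative only.
struct GpuCaps {
    int   pixelShaderMajor;     // 1, 2 or 3 for ps_1_x / ps_2_0 / ps_3_0 class hardware
    float relVertexThroughput;  // rough vertex rate relative to some baseline card
};

struct QualitySettings {
    bool  complexPixelShaders;  // options A/C: drop or enable the heavy shaders
    float polygonLodScale;      // options B/C: scale down model detail on slower cards
};

// Pick a tier along the lines of the A/B/C options above: low-end cards lose
// the complex shaders and get reduced-poly models, mid-range cards keep the
// shaders but trim geometry, high-end cards turn everything on.
QualitySettings chooseQuality(const GpuCaps& caps) {
    QualitySettings q{};
    if (caps.pixelShaderMajor < 2) {               // GeForce 4 MX / 9200-class fallback
        q.complexPixelShaders = false;
        q.polygonLodScale     = 0.5f;
    } else if (caps.relVertexThroughput < 1.0f) {  // shader-capable but weak on geometry
        q.complexPixelShaders = true;
        q.polygonLodScale     = 0.75f;
    } else {                                       // high-end: full shaders, full detail
        q.complexPixelShaders = true;
        q.polygonLodScale     = 1.0f;
    }
    return q;
}

int main() {
    GpuCaps lowEnd{1, 0.3f};  // pretend caps for an old low-end card
    QualitySettings q = chooseQuality(lowEnd);
    std::printf("complex shaders: %s, LOD scale: %.2f\n",
                q.complexPixelShaders ? "on" : "off", q.polygonLodScale);
    return 0;
}
```

Option C is just the first two branches combined: the low-end path takes both cuts, the high-end path takes neither.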
 
ZoinKs! said:
Then I ran across a link to the top 10 cards by average score. I noticed 3DMark ranked them exactly how I'd rank them based on in-game performance. That's when I first started to think that maybe the score tells you something after all.

Yeah, that was pretty cool. I wonder why they don't publish the ranking thingy anymore.
 
I think 2.0 shaders will be the default; most cards will support that even on the low end in the next year or two. I would think having different rendering paths based on shader performance would be a pain when trying to do a rational global setup: it takes more resources to have multiple models for everything, and it probably complicates collision detection and skinning. Regardless, I think they can have more detailed models in general than what exists today in everyday games, outside the one or two A titles. You don't have to get insane with polygon counts to improve things, just get rid of triangle shoulders on characters and stuff like that. Hopefully in two years a lot of people will be on PCI Express platforms, which should get rid of a lot of the really old cards.
 
I voted in the "silly poll", not because this is a silly poll as such, but because I personally believe that 3DMark is a demonstration of the current DX9.0 features and of how well a card runs that demonstration.
Coders write code to do the same job differently...

Off topic... one thing that really gets me each time I see '05 is the point where the monster jumps the ship: where it exits the water and where it enters the water, there's no attempt to model what happens. No waves, no ripples, no refractive spray!
 