HardOCP is not going to use 3DMark03?

McElvis

A good write-up and interesting results from HardOCP...

www.HardOCP.com

But the most interesting is the conclusion...

With all that we have seen so far, it does not appear to actually give us any indication of how video cards are going to compare in any real games. It produces an overall 3DMark score which is taken from unbalanced game tests. Furthermore, as we have seen directly above in the benchmarks, drivers can be optimized to run the game tests faster. The problem is that this is just a benchmark and not based on any real-world gaming engines. Therefore, while you may get a higher 3DMark number, you will not see ANY increase in performance in any games that are out there.

In closing, Kyle has informed me that [H]ard|OCP will not be using the overall 3DMark03 score to evaluate video cards.

What do people think?
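
For anyone unsure what an overall score "taken from unbalanced game tests" means mechanically: a composite benchmark number is typically just a weighted sum of the individual game-test frame rates. The weights and frame rates below are made up for illustration (this is not Futuremark's actual formula), but the sketch shows how heavily weighted tests can dominate the final number while a lightly weighted one barely moves it.

```python
# Illustrative sketch only: hypothetical weights and fps, NOT Futuremark's formula.

def overall_score(fps, weights):
    """Composite score as a weighted sum of per-test average frame rates."""
    return sum(fps[test] * weights[test] for test in fps)

# Hypothetical per-test average frame rates for two cards:
card_a = {"GT1": 150.0, "GT2": 40.0, "GT3": 35.0, "GT4": 10.0}
card_b = {"GT1": 100.0, "GT2": 45.0, "GT3": 40.0, "GT4": 25.0}

# Hypothetical weights; if the shader-heavy tests (GT2-GT4) dominate,
# being fast in GT1 buys very little overall score.
weights = {"GT1": 5.0, "GT2": 30.0, "GT3": 30.0, "GT4": 30.0}

print(overall_score(card_a, weights))  # 3300.0
print(overall_score(card_b, weights))  # 3800.0 -- wins despite losing GT1 by 50%
```

With numbers like these, card B posts the higher overall score even though card A is 50% faster in GT1, which is exactly why a single composite can mislead if the weighting doesn't match the games you actually play.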
 
We aren't going to rely on the overall result

the specific feature tests may be used where appropriate

we'd rather use actual games whenever possible
 
I agree. This will hopefully put the companies back on track to optimizing drivers for gaming performance instead of for meaningless benchmarks used in dick-measuring contests. We don't pay $300+ for a video card to "play" a benchmark, do we? I play games, and what I want is good performance in them.
 
You know, it still cracks me up when I hear people say things like "I lost 100 3dmarks with the new set of drivers, I am going back to the old set. These drivers suck", without even realizing the benefits in-game that the new drivers might give.

Edit: Hmmm... seems a bit out of context. What I was trying to say is that I agree, 3dmark scores are taken way too seriously.

Brent, seeing how the new drivers from NV are doing wonders for 3dmark03, have you run any game benches with the new drivers? Do they work wonders for games too?
 
Hmm... I think these benchmarks are more representative than 3dmark ever was, though the proportions may not be.

The workload, as far as I can tell (not having actually run it yet myself, just going by the Rage3D comparison ;) ), is balanced towards achieving a set level of quality if a card can manage it at all, and as such is quite representative of the performance disparity between the cards in question. In actual games, the quality would be reduced to achieve acceptable performance, and that would not be a helpful practice in a benchmark. If games running on an 8500 tried to use all the features that the same games running on a 9700 would use to get 30 fps, I'd think they would crawl just as much as 3dmark03 does.

So I'm actually of the mind that, given the nature of 3D gaming's upcoming evolution with regard to shaders, and the significant shader performance growth likely to be offered in future cards, 3dmark03's approach is the best way for a benchmark to behave when focusing on that type of functionality. While the data is scarce and hasn't had a chance to demonstrate applicability, a 9700 Pro scoring nearly 4 times an 8500's score seems very reasonable to me.

Then again, I don't have my ego wrapped up very tightly at all in my "3dmark size", so I'm sure the vast majority of users will not share my opinion. However, other valid reasons besides ego bruising :p might exist for disagreement... :?:
 
Good idea, Brent. The only real good that ever came from 3DMark was the results of the synthetic benchmarks, where the CPU had only minimal impact on performance.
 
Brent said:
We aren't going to rely on the overall result

the specific feature tests may be used where appropriate

we'd rather use actual games whenever possible

I'd say the overall result is a good indication of using all the features in combination... the problem is not the result, it is the assumption that has been fostered that it is representative of games at the time. In this case, with 3D gaming evolution taking a relatively predictable path in the future (as far as the focus on shader performance goes, if not the methodology and architectures for achieving it), I'd say 3dmark03 is looking to be much more applicable to future gaming (when looked at as something completely different from a gaming benchmark) than 3dmark 2001 was.

I'd say being able to scale "fallback modes" to see at what level older cards might perform acceptably could be a good idea (see the sketch after this post), but that would just bring us back to 3dmark 2001 (which was one big "fallback mode").

I'll just point out, as worm probably would, that 3dmark 2001 is still there for that...

But perhaps I speak too hastily on a benchmark I still haven't run yet. :p
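
To make the "fallback mode" idea above concrete, here is a minimal sketch of the kind of ladder a game engine (unlike a fixed-quality benchmark) walks down: probe what the card supports and render with the best path it can actually run. The capability names and render paths are invented for the example, not taken from any real engine or API.

```python
# Hypothetical illustration of a render-path "fallback" ladder.
# Feature names and ordering are made up for the example.

RENDER_PATHS = [
    ("ps_2_0 path", {"pixel_shader_2_0"}),      # full-quality path (9700-class)
    ("ps_1_4 path", {"pixel_shader_1_4"}),      # reduced path (8500-class)
    ("fixed-function path", set()),             # runs anywhere, lowest quality
]

def pick_render_path(card_caps):
    """Return the first (highest-quality) path whose requirements the card meets."""
    for name, required in RENDER_PATHS:
        if required <= card_caps:  # set subset test: card has every needed feature
            return name

print(pick_render_path({"pixel_shader_2_0", "pixel_shader_1_4"}))  # ps_2_0 path
print(pick_render_path({"pixel_shader_1_4"}))                      # ps_1_4 path
print(pick_render_path(set()))                                     # fixed-function path
```

A real game falls down this ladder so older cards stay playable; a benchmark that held every card to the top path would make them "crawl", which is the trade-off being debated here.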
 
Hi there,
McElvis said:
A good write-up and interesting results from HardOCP...
In closing, Kyle has informed me that [H]ard|OCP will not be using the overall 3DMark03 score to evaluate video cards.

What do people think?
I applaud Kyle / [H]. The individual scores are interesting, but the 3DMarks? Naaaa.

ta,
-Sascha.rb
 
3dmark has always been a biased benchmark, so personally I don't see what all the fuss is about now. When the GF3 was released, 3dmark was biased to absurd proportions towards Nvidia, and I don't remember it being dropped from any of the big hardware sites then. So why are certain sites so vocal in their complaining now?
 
duncan36 said:
3dmark has always been a biased benchmark, so personally I don't see what all the fuss is about now. When the GF3 was released, 3dmark was biased to absurd proportions towards Nvidia, and I don't remember it being dropped from any of the big hardware sites then. So why are certain sites so vocal in their complaining now?

Because of the very obvious Nvidia smear campaign that is going on. Nvidia is putting pressure on the press because the benchmark is much fairer towards hardware vendors than in the past. Maybe FutureMark turned down NV dollars due to the beta program they outline on their site. Either way, since NV is/was a member, they had equal opportunity to voice this concern and influence the test design a long time ago. :!:
 
duncan36 said:
3dmark has always been a biased benchmark, so personally I don't see what all the fuss is about now. When the GF3 was released, 3dmark was biased to absurd proportions towards Nvidia, and I don't remember it being dropped from any of the big hardware sites then. So why are certain sites so vocal in their complaining now?

Maybe it's because the GeForce3 was the fastest card with the most features when it was released? Not some deliberate attempt to make the most powerful card look any 'better' by writing a benchmark for it?

Or would you rather 3DMark target a Voodoo3-level card and make all the GeForce3's features irrelevant?
 