My response to the latest HardOCP editorial on benchmarks...

This entire argument was not heard from Crusher or Evildeus during the launch of 3DMark 2001. Why is that?
Scoring is the same: if you were unable to run the DX compliance tests (last year it was Nature, now it's Mother Nature) you were penalized.
The DX7 cards scored very low compared to DX8 cards in 2001, same as this version.
Finally, there are no other DX9 engines out there to use as a benchmark, so for now this is all we have to compare technical efficiency.
The only difference this year is that the one IHV that is whining, and was part of the Beta program, doesn't have a DX9-compliant card on the market yet. If you are too blind to see that, then try taking the tunnel-vision glasses off.

Do you think Nvidia is going to release PR about how great the benchmark is when they don't even have a compliant card on the market for a consumer to buy yet...and the one they do have is being cancelled?

My my how the roles reverse.
 
Doomtrooper said:
This entire argument was not heard from Crusher or Evildeus during the launch of 3DMark 2001. Why is that?

Well, for one thing I didn't register here until March 2002. For another, the features being tested in 3DMark2001 were a lot more predictable insofar as their use in games. With the amount of programmability in DX9, there's a vastly greater number of ways of performing the same type of task. A hardware light is a hardware light in any program, and T&L functionality is the same across all cards; the only thing that differs is the performance. Pixel shader version 1.1 was also fairly limited in what it allowed. Pixel shader 2.0, vertex shader 3.0, and the various ways of breaking them down to support cards with lesser versions are completely different situations. To take the specific way 3DMark03 implements these features and performs fallbacks for older cards, and say it's indicative of the overall ability of one card compared to another, is ridiculous. Especially if you're going to extend that comparison to try to make it relevant to games, which was the whole purpose of 3DMark03 to begin with--showing what cards will be capable of in terms of features and performance in future games.

And I do believe I've read you proclaiming many times that synthetic benchmarks are useless, Doomtrooper, why has your role reversed so much?
 
Can I ask what game we could use today to tell us how well current-generation cards do using advanced shaders?
 
Doomtrooper said:
Doesn't have to be here, Evildeus. I follow your posts on 'other' forums and ummm..they tell the whole story ;)
Really? But at the time I wasn't interested in benchmarks :p

And I don't know what the arguments were ;)
 
nelg said:
Can I ask what game we could use today to tell us how well current-generation cards do using advanced shaders?

Can I ask why you so desperately need to know how well current generation cards do using advanced shaders if no games today use them? :) I thought just about everyone here was against the idea of buying hardware because it's "futureproof".

edit:

I should reiterate what I've said before; I don't remember which thread it was in. If 3DMark03 can help bring games that look like Game Test 4 to market sooner, I'm all for that. I would love to be playing games with that level of detail. But we all know games like that probably aren't going to show up in the lifetime of any cards available today, and I don't think anyone would expect current video cards to perform acceptably in those games when they do come (and if you believe 3DMark03 is an accurate portrayal of such games, indeed they won't). So why then, given that and my previous arguments, are we supposed to care how cards compare to each other in this benchmark?
 
Crusher,

No, I said I wanted modern games used in reviews more, and not to use 3DMark scores to determine a card's value. I would be happy to trade a DX9 game for a synthetic benchmark, but unfortunately we are forever caught in the hell called DX7.
I also stated 3DMark does have uses, like image-quality comparisons and technology 'previews', but it doesn't relate to gameplay, and I stand by that. And again, with the hardware so far ahead of game engines, we really don't know how a DX9 engine would perform; this is our ONLY gauge at the moment.

I never whined about the GeForce 3 gaining extra points when I had a Radeon, as I was aware the scoring is also based on DX8 compliance. I would expect the same from people here.
 
Can I ask why you so desperately need to know how well current generation cards do using advanced shaders if no games today use them?

Because games WILL use them at some point. (DX8 level shaders and above). What is wrong with trying to get a grasp on how "relatively well" today's hardware runs tomorrow's games?

Everyone has their own timetable for upgrading. Estimating how well cards will perform "in the future" should absolutely be a FACTOR in your buying decision. How much of a factor depends on the person, their buying habits, etc.

Can I ask you: if we don't need to know how well current-generation cards do advanced shading, then why should we care at all about ANY synthetic tests? (Not just the overall 3DMark score, but specific feature tests?)
 
Hrmm... I just realized something. I read the Deus Ex 2 preview at IGN and noticed they've got it down for a release date of June 2003. That one little piece of information has motivated me to upgrade my system more than a 3DMark score ever could. Not only that, but rather than base the kind of system I'm going to buy on 3DMark scores, I'm going to wait until the game comes out and people start talking about what hardware runs it best. So I say it again: 3DMark03 is promoting irrelevant comparisons of current cards using future features that might never be implemented the way it implements them. What good does it do me? Perhaps if it could predict how well unreleased hardware is supposed to perform as well as people think it can predict how unreleased games are supposed to be made, it might have some usefulness.
 
Another demalion monster post

First let me distinguish between DX "functionality" and performance "level". The "functionality" is the set of features that can be rendered using the card, and the "level" refers to a card designed to the performance standard of that DX generation.

Crusher said:
Well, you can look at it a couple of ways.

First, you can say that my computer getting 30 3DMarks shows how useless it's going to be on DX9 games. The fact that it's going to be useless on DX9 games is probably true. However, that score is based only on Game Test 1, since that's the only test my computer can run. Game Test 1 is supposed to be a DX7 benchmark.

A DX 7 "functionality" portion of a DX 9 "level" benchmark. This does not mean that a DX 7 "level" card will necessarily perform well, but that it is testing the DX 7 "functionality" of DX 9 "level" cards. I view it as an input to the dynamic of the overall score that is not computationally bound, but other than that I tend to agree that in isolation it is exactly as useless/useful as the 3DMark2001 tests have been, in my opinion (I'm not a fan of the focus of 3DMark2001 scoring).

Well, last time I checked, the GeForce 2 was one of the best DX7 cards around.

DX 7 level cards, maybe.

I think the peak framerate I saw displayed was 8 fps, and it was usually below 3 fps. That's not indicative of any DX7 game I know.

Well, it might be if you ran all your games using only DX 7 functionality at maximum settings with it. That strikes me as a valid difference between benchmark and game behavior. The thing is, there are DX 7 functionality games that came out that require DX 8 level performance to utilize maximum settings. You simply don't use those settings, and you are neglecting that, as a result, an apples-to-apples comparison will run into CPU limitations before you could demonstrate the difference in performance capability of a DX 9 level card, let alone the cards that come after it, which the benchmark is trying to represent.

Even UT2003 and NOLF 2 run much better than that on my computer, and they're about the most demanding DX7 games I've seen.

Do you really run them at maximum settings?

That tells me that Game Test 1 is clearly not indicative of the type of situation it's meant to portray.

I disagree about the type of situation it is meant to portray. Futuremark recommends you use their prior DX 7 level and DX 8 level (but mostly DX 7 functionality) benchmarks for better representation. You have to admit it makes some sense that they recommend this in lieu of Game 1, doesn't it? Well, if you only care about getting fps like you would in your DX 7 games from the same era.

If GT1 can't properly judge the abilities of a DX7 card running a DX7 game, how is it going to properly judge the abilities of a DX8 or DX9 card running a DX7 game?

GT1 tries to represent (not judge) the ability of the card in question when performing DX 7 functionality. DX 8 level cards, DX 9 level cards, and beyond have to be represented, and as a result many DX 7 level cards' performance will be low. It is the non-computationally-bound representation in the benchmark... as I've said, I consider it dismissible in isolation as a DX 7 functionality test, but it makes sense as a contributing factor in the scoring.

The idea of this properly judging the abilities of any card running a DX 7 game is what I agree this test does not do, and what I think prior 3dmark benchmarks did not do. Fortunately for 3dmark03, however, there are some other tests that this test makes sense to be associated with. I'll discuss that when I mention shader testing.

Now you might say, "who cares? there are DX7 games out to test with if you want to know how well a card works with DX7 games. 3DMark03 is supposed to compare cards with DX8 & DX9 level games."

No, it is supposed to benchmark DX 8 and 9 level cards, with DX 8 and DX 9 functionality and a bit of DX 7 functionality representation. The DX 7 functionality test is scaled to DX 8 and DX 9 performance levels, and DX 7 performance level cards suffer as a result. This is natural and logical in my view.

That's all well and good, except that Game Test 1 still counts towards the final score on DX8 and DX9 cards.

To represent a card's ability to handle a simple non-computational workload. Since not all games will be as computationally bound as the rest of the benchmark suite, this makes sense in my opinion. Another way to view it is as simply a simple performance yardstick that by necessity scales from DX 7 to DX 9, and beyond. The low fps values for a GF 2 don't seem too surprising to me in this regard.
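To make the scoring dynamic described above concrete, here is a minimal sketch of a weighted composite score. The test names, weights, and fps figures are entirely hypothetical (this is not Futuremark's actual formula); the point is only that a low-fps, non-computationally-bound test can still contribute to the composite number, and a card that can run only that one test ends up with a tiny score.

```python
# Illustrative sketch of a weighted composite benchmark score.
# Weights and fps values are hypothetical, NOT the real 3DMark03 formula.

def composite_score(fps_by_test, weights):
    """Sum of per-test fps, each scaled by an arbitrary weight.

    Tests the card cannot run simply contribute nothing."""
    return sum(weights[test] * fps for test, fps in fps_by_test.items())

weights = {"GT1": 7.0, "GT2": 7.0, "GT3": 7.0, "GT4": 15.0}

# A hypothetical DX9-level card: runs everything at modest fps.
dx9_card = {"GT1": 120.0, "GT2": 30.0, "GT3": 25.0, "GT4": 20.0}
# A hypothetical DX7-level card: can only run GT1, and slowly.
dx7_card = {"GT1": 8.0}

print(composite_score(dx9_card, weights))  # 1525.0
print(composite_score(dx7_card, weights))  # 56.0
```

Note that the DX7-level card's result is dominated entirely by the one test it can run, which mirrors the "30 3DMarks on a GeForce 2" situation discussed above.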

So here we have a clear example of how the final 3DMark03 score is being based in part on an irrelevant and inaccurate test.

I don't agree with the summary in the way you mean it: Though this test does seem irrelevant and inaccurate to trying to judge DX 7 game performance, that is not its purpose. In fear of being accused of semantics games later :p, I'll clarify again here that "judge" and "represent" have significantly different meanings in my discussion.

Then you might extend the argument to say that, if the DX7 test is not indicative of even the worst-case scenarios of DX7 applications, how can you trust that the DX8 and DX9 tests will be any more accurate?

That is a pure circumstantial association, where you say "this person in the family is totally unlike that person in the family, but this person is dishonest, so how can you trust the rest of the family?"

Indeed, NVIDIA's argument is that the methods of rendering scenes in the last 3 game tests are not efficient, and not what game developers are going to be doing in the future.

And there can be a whole valid discussion of this, but you are trying to circumvent it with the prior illogic. I think I've had this discussion at length elsewhere, but Dave H has a much more succinct summary somewhere around...

And why would you expect them to be? How can a company designing an application in 2002 predict what methods game developers are going to be using in 2003 and 2004? I don't think it's possible.
The road you travelled to get here is very winding. My argument, repeated very briefly, is that what has changed is that there is now a common factor for measurement that can make a very useful prediction, and that is shader performance. All GPUs moving forward will concentrate performance on executing shaders as quickly as possible, and similar to how one benchmark of fairly representative assembly instructions can show performance differences between CPUs with good representation, I think this serves to simplify a lot of the issues of predictability for 3DMark03.
Now, the efficiency of their implementation is an interesting question, discussed at length elsewhere. I don't consider the matter settled, but Futuremark has proposed some credible rebuttals to NVIDIA's comments (I think), and we could perhaps test their validity.
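The CPU analogy above can be sketched as a toy microbenchmark: time one fixed workload and compare the single resulting number across machines. The arithmetic mix below is an arbitrary illustrative stand-in for a "representative" instruction stream, not a real workload:

```python
import time

def representative_mix(n):
    """A fixed, arbitrary mix of arithmetic standing in for a
    'representative' instruction stream, as in the CPU analogy."""
    acc = 0.0
    for i in range(n):
        acc += i * 1.0001   # multiply-add
        acc = acc % 1e9     # keep the accumulator bounded
    return acc

def relative_speed(n=100_000):
    """Run the same fixed workload and time it. Comparing this number
    across machines gives a single relative-performance figure, the way
    a shader-heavy benchmark would for GPUs."""
    start = time.perf_counter()
    representative_mix(n)
    return time.perf_counter() - start

print(f"workload took {relative_speed():.4f}s")
```

The predictive value of such a number depends entirely on how representative the chosen mix is, which is exactly the point of contention in this thread.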

And that brings up the other twist--the fact that the majority of game developers inherently try to do their best to make sure games perform acceptably on all makes of video cards.

Exactly the difference between games and apples-to-apples benchmarks... the two have to behave differently when seeking a wide range of scalability (for the DX 8 functionality tests, this results in some DX 8 level cards performing as poorly as your GF2 does in the first test).

If a developer does things the way 3DMark03 does them and finds tremendous discrepancies between how different cards from different vendors and/or different generations perform those actions, they will probably change the way they're doing things.

Games do change the way they are doing things. You act like you play all your games on a GF2 at maximum settings. 3DMark03 does adapt, but being a benchmark where equivalent output is the goal, its adaptations are to expose functionality so that performance can be measured, not to drop functionality and scale back to achieve acceptable performance.

As I've said before, I think 3dmark03 has turned into one big collection of both simple and complex synthetic tests, and that due to dependence on shaders they've managed to, pretty likely in my estimation at this time, get it right.

You can argue the extent to which things will change, or the number of developers who will ultimately make such decisions, but that doesn't change the fact that it's a relevant variable.

I don't think people should confuse 3dmark scores with fps values, and I don't think they should have in the past. That doesn't mean this new benchmark is useless, though.

...

Don't reviews have a responsibility to protect the public from themselves in this regard?

Snipped a bunch of text I agree with. Yes, they do. I think they should do this by education.

In contrast, I have a problem with, for example, recognizing that responsibility only after an article that makes the very same points and leaves exactly the same questions unanswered as a message an interested party has been sending around to reviewers. This same interested party who used to benefit from reviewers ignoring that responsibility, and whose message has gaping inconsistencies and plentiful misinformation.

Compelling arguments can and have been made for both sides, but I think ultimately history and logic suggest that the decisions [H] is making with their reviewing philosophy are more or less going in the right direction. And that's hard to say coming from someone who's never been particularly fond of [H] as a whole ;)

However I view the above, I must consistently point out that I recognize I may be wrong in my evaluation, and that the 3DMark03 score may still not be representative. I've given my reasons above and in plenty of places why I don't think I am... take it for what it's worth, and I look forward to investigations that confirm things either way.
 
Joe DeFuria said:
Can I ask you: if we don't need to know how well current-generation cards do advanced shading, then why should we care at all about ANY synthetic tests? (Not just the overall 3DMark score, but specific feature tests?)

I think that along with the vast amount of programmability in current hardware has come the innate quality that it's impossible to do direct comparisons via synthetic tests, or to make any predictions on how any methods chosen to do so would correlate to future games. It's like saying that because we were able to solve first-order polynomial equations with 2 variables in the past, we should easily be able to construct equivalent first-order polynomial equations now that we have 500 variables.
 
Additionally, I just thought of a rather disturbing possibility for the misuse of 3DMark's results. It's possible that some developers who are beginning to incorporate DX8/9 features might use 3DMark's performance results as a baseline comparison. They may implement certain features, compare the performance between cards, and say "well, it's in line with what 3DMark shows, I guess that's good enough," when in fact there may be other, better ways of implementing those features. But what company would spend valuable development time searching for those better methods when the ones they're currently using are in line with what the public expects according to 3DMark's results?
 
Has it been established yet whether Nvidia's updated drivers used in the 3DMark03 test on HardOCP are using a cheat / 12-bit precision, or whether the 81% (GT2) and 70% (GT3) improvements are genuine?
 
Crusher said:
nelg said:
Can I ask what game we could use today to tell us how well current-generation cards do using advanced shaders?

Can I ask why you so desperately need to know how well current generation cards do using advanced shaders if no games today use them? :) I thought just about everyone here was against the idea of buying hardware because it's "futureproof".

I can't speak for anyone here but myself. I am in the market for a new video card (prob. a 9500 Pro) and would just like to get the most for my money. I went from a Pentium 166 to a P4 1.8A, so maybe I do hold on to my stuff longer than most people. To me it makes no difference to play Unreal at 300 FPS; anything over 50 will do fine. The question I have is: if a new game comes out 1-2 years from now that does use this tech, will it be playable? So, does it matter to me? YES! I know there is no such thing as futureproof, but some things have more legs than others.

BTW I will take it the answer to my original question is no.
 
I think that along with the vast amount of programmability in current hardware has come the innate quality that it's impossible to do direct comparisons via synthetic tests, or to make any predictions on how any methods chosen to do so would correlate to future games.

Again, Crusher, you seem to be arguing that it is absolutely POINTLESS to even run any synthetic tests at all? That they hold no value whatsoever?

Additionally, I just thought of a rather disturbing possibility for the misuse of 3DMark's results.

Well, hell, I can think of a thousand disturbing ways for anyone to abuse any benchmark test, synthetic, game, or otherwise. Let's ban them all!
 
The question I have is: if a new game comes out 1-2 years from now that does use this tech, will it be playable? So, does it matter to me? YES! I know there is no such thing as futureproof, but some things have more legs than others.

You must be some whacko to actually consider future gaming performance as a factor in your decision! ;)

To be clear, 3DMark won't really tell you if your card will be able to play those games 1-2 years from now. What it DOES do, is tell you which cards have the best "chance" at doing that.

Benchmarking many of today's games, you would think that the GeForce4 Ti and Radeon 9500 Pro have about an equal chance... with the GeForce4 Ti perhaps even having a better one. 3DMark says the 9500 Pro has a much greater chance.

I'd wager that the "non-actual-game but synthetic" 3DMark03 is making the better prediction.
 
I'm saying that it appears to me that synthetic tests whose purpose is to compare the relative performance of features in modern video cards are losing their meaning in terms of how they relate to actual games. What good does it do you to know that one card is faster than another at rendering stencil shadows while calculating the geometry redundantly at intermediate steps in the rendering process, if no game is ever going to do that? Does it tell you how fast their vertex shaders are relative to one another? Or does it tell you how much each card stalls when redundant processing is thrown in at different points? Or does it say something completely different? The one thing you know it doesn't say is how fast Doom 3's vertex processing will be. Does it matter how well a card can render a single-textured background with quad-textured models if no game is ever going to use a single-textured background with quad-textured models?

It seems to me like the only thing 3DMark03 is good for is telling you how well a video card is going to do at running 3DMark03. Whereas 3DMark2001SE seemingly had a little more relevance to the games of the past couple of years, and a much better chance at predicting how games released after it would be rendered, I see very little hope for 3DMark03 to be able to do this. So I guess if it's important to you how well you can run 3DMark03, then the scores are relevant to you. If it's more important how fast games that aren't made yet will run, about the only thing you can do is wait for those games to be made.

I guess the last point I would make is: if you really want a synthetic benchmark to compare the features of current video cards, then the benchmark should test each feature separately and completely independently of the others. By choosing game-like tests to compute the scores used to compare products, 3DMark03 inherently shows they are focused not on feature comparisons but on gaming comparisons, and at that they fail miserably. If their scored tests for vertex and pixel shaders were more like their fillrate test, then I could see some legitimacy in the results. Instead, they've tried to fix the flaws 3DMark2001 had (dependence on other parts of the system) in a day and age when the type of benchmarking methodology 3DMark2001 used (make it similar to how they think games will be) is inadequate.
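An isolated, single-feature test of the kind Crusher is describing can be sketched in miniature: measure one primitive operation by itself, so the result is not confounded by the rest of the pipeline. The two "features" compared below are arbitrary arithmetic stand-ins, purely for illustration:

```python
import time

def isolated_feature_test(op, reps=1_000_000):
    """Time one primitive operation in isolation and return ops/second,
    so the result isn't confounded by other parts of a composite scene."""
    start = time.perf_counter()
    for _ in range(reps):
        op()
    return reps / (time.perf_counter() - start)

# Two stand-in 'features' compared in isolation (illustrative only).
rate_mul = isolated_feature_test(lambda: 3.0 * 7.0)
rate_pow = isolated_feature_test(lambda: 3.0 ** 7.0)
print(f"mul: {rate_mul:.0f} ops/s, pow: {rate_pow:.0f} ops/s")
```

The trade-off, as the thread notes, is that such isolated numbers say nothing about how features interact in a real game scene, which is presumably why Futuremark chose game-like composite tests instead.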
 