Help me understand the AF/[H] controversy

WaltC said:
The "funny" part for me is that one of my long-standing objections a couple of years ago to using 3DMk01 was that it was based on the Max Payne engine, which was only one game engine out of the myriad game engines 3D games are built around.

Actually I thought it was kind of nice because it gave some kind of relevance to a real game, and although you and others will disagree, I believe you can extrapolate how your machine/card would do in other games (relatively speaking). Since 3DMark03 no longer uses a game engine it can have no relevance to any game at all. I'm not complaining about that. I like the fact that it's more of a synthetic GPU benchmark and that it puts more stress on the video card. However, I can understand why people would complain that it has no relevance to any games sans game engine. What Futuremark needs to do is build a game engine just for 3DMark and then sell or give the engine to game developers, provided they still want 3DMark to have relevance with games.

WaltC said:
That's why I think the whole "anti-3dMk03" crusade is dishonest and misrepresentative on its face.

Agreed.

Tommy McClain
 
Well, I'd only say to look at the difference between the performance of games built around the UT2K3 engine--like Unreal2 and UT2K3. Same engine, big performance differences in some cases. Look at all of the various games built on the Q3 engine--big differences between the games. And using the Max Payne engine for 3DMK01 is not the same thing at all as running Max Payne the game, for a number of reasons, as I'm sure you'll agree. At best you can say that 01 was "based on" the Max Payne engine, not that it performs identically with Max Payne the game.

Different games can use varying degrees of rendering precision available in the same game engine, for instance. Game engines themselves differ widely in performance because of such factors. Then there are differences in texture use, resolution, ad infinitum--all of which will impact performance to one degree or another.

Let's also not forget the differences in IHV's drivers when it comes to one game or another, even if based on the same engine. Is running Soldier of Fortune II going to help me figure out how my system will run Q3? Will running UT2K3 OpenGL allow me to extrapolate my system running Q3, etc.? That's what I'm talking about, basically, which is why I was never much impressed that running 3DMk01 would be "more realistic" because it was "based on" a game engine used in a single 3D game.

Hey, though, it's OK if we disagree on this one...;)
 
digitalwanderer said:
I agree, but I don't disagree with Kyle's take on benching entirely either. I think that pushing for in-game benching is a GOOD THING(tm).

It is a bad thing if this ideology carries with it the simplistic and superficial mentality of reporting only the performance within certain applications and not making analyses, generalisations and predictions about the card's potential in general. My main objection to HardOCP's approach is just this: that they plan (and encourage others) to just give readers impressions of gameplay in certain select titles.

This kind of approach is OK for less in-depth reviews, but more professional reviewers must continue to analyse the card's performance, both strengths and weaknesses, beyond individual applications. HardOCP's suggestions are completely at odds with this.

A more sophisticated approach to reviewing video cards requires the understanding that benchmark results are indicators of the card's potential in different situations, and that based on these indicators you make educated inferences about the card's performance in general. I don't want reviewers to disregard this and just be content with describing gameplay in a couple of titles.
 
WaltC said:
Well, I'd only say to look at the difference between the performance of games built around the UT2K3 engine--like Unreal2 and UT2K3. Same engine, big performance differences in some cases. Look at all of the various games built on the Q3 engine--big differences between the games. And using the Max Payne engine for 3DMK01 is not the same thing at all as running Max Payne the game, for a number of reasons, as I'm sure you'll agree. At best you can say that 01 was "based on" the Max Payne engine, not that it performs identically with Max Payne the game.

Different games can use varying degrees of rendering precision available in the same game engine, for instance. Game engines themselves differ widely in performance because of such factors. Then there are differences in texture use, resolution, ad infinitum--all of which will impact performance to one degree or another.

Let's also not forget the differences in IHV's drivers when it comes to one game or another, even if based on the same engine. Is running Soldier of Fortune II going to help me figure out how my system will run Q3? Will running UT2K3 OpenGL allow me to extrapolate my system running Q3, etc.? That's what I'm talking about, basically, which is why I was never much impressed that running 3DMk01 would be "more realistic" because it was "based on" a game engine used in a single 3D game.

Hey, though, it's OK if we disagree on this one...;)

;)

Walt, I'm not totally disagreeing with you though. I agree that each game, each card, each machine and each driver has varying effects on the performance. So yes, if you want to know the performance in Quake 3, then it's BEST to test that game, on that card, that machine and that driver. What I was trying to get at was that I don't believe the 3DMark overall score is totally useless in this regard. For those that don't or can't get performance numbers for the game or games they want, the 3DMark overall score can be a general indicator of performance.

However, I will say that the latest cards do more to make this type of comparison difficult, since their performance relative to each other is too close to call. So in a way, the latest hardware is making it more and more difficult, if not impossible, to use the overall score as a general indicator of performance in games. Eventually, I believe that if both ATI and NVIDIA continue down this route, then the overall score will become useless and Futuremark will either have to abandon it or change it in a way that lets it continue to have relevance.
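To make the "general indicator" idea concrete, here's a minimal C++ sketch of how an overall score can be built as a weighted sum of per-test framerates. The test names, framerates and weights are all illustrative assumptions on my part, not Futuremark's actual formula:

```cpp
#include <iostream>
#include <vector>

// One sub-test result: a measured framerate plus a weight saying how
// much this test counts toward the overall number.
struct TestResult {
    const char* name;
    double fps;     // average framerate in this game test
    double weight;  // hypothetical weight -- NOT Futuremark's real formula
};

// Overall score as a weighted sum of per-test framerates.
double overallScore(const std::vector<TestResult>& results) {
    double score = 0.0;
    for (const TestResult& r : results) {
        score += r.fps * r.weight;
    }
    return score;
}

int main() {
    // Illustrative numbers only: two hypothetical cards that trade wins.
    std::vector<TestResult> cardA = {
        {"GT1", 150.0, 7.0}, {"GT2", 30.0, 30.0},
        {"GT3", 32.0, 30.0}, {"GT4", 24.0, 30.0},
    };
    std::vector<TestResult> cardB = {
        {"GT1", 120.0, 7.0}, {"GT2", 35.0, 30.0},
        {"GT3", 28.0, 30.0}, {"GT4", 26.0, 30.0},
    };
    std::cout << "Card A: " << overallScore(cardA) << "\n"
              << "Card B: " << overallScore(cardB) << "\n";
}
```

And that's exactly where my worry comes in: in this made-up example, card A wins the overall score while card B wins two of the four sub-tests. Once cards trade wins like that, a single weighted number can rank them either way depending on the weights, and it stops predicting any particular game.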

Tommy McClain
 
CorwinB said:
WaltC, don't forget that according to Nvidia, FRAPS is evil too... :)

Yes, I saw that, too... I think nVidia's position on benchmarks is pretty clear--as long as a benchmark reflects what nVidia wishes it to reflect, it's fine. When it doesn't, it's banished to the "enemies" list :D What an uncomplicated view nVidia has of the 3D industry--that all of the industry's software institutions exist as nothing more than extensions of nVidia, and that if they aren't, they serve no purpose (ducking and running...)
 
Myrmecophagavir said:
CorwinB said:
WaltC, don't forget that according to Nvidia, FRAPS is evil too... :)
Really? What do they have to say about it?

There's a thread here somewhere. Basically NVIDIA said it questions the accuracy of FRAPS as well as the accuracy of the people using it.
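For what it's worth, the core of what FRAPS measures is not exotic: count buffer flips against wall-clock time. Here's a minimal C++ sketch of the idea, purely as an illustration of the concept and not FRAPS's actual implementation (the real tool hooks the API's present/flip call to get its per-frame tick):

```cpp
#include <chrono>

// Minimal sketch of a FRAPS-style framerate counter. Call onFrame()
// once per rendered frame; it reports frames per wall-clock second
// each time a one-second window completes.
class FpsCounter {
    using Clock = std::chrono::steady_clock;
    Clock::time_point windowStart_ = Clock::now();
    int frames_ = 0;

public:
    // Returns the FPS for the window just closed, or -1.0 while the
    // current window is still accumulating frames.
    double onFrame() {
        ++frames_;
        std::chrono::duration<double> elapsed = Clock::now() - windowStart_;
        if (elapsed.count() >= 1.0) {
            double fps = frames_ / elapsed.count();
            windowStart_ = Clock::now();
            frames_ = 0;
            return fps;
        }
        return -1.0;
    }
};
```

If there's a legitimate accuracy quibble, it's about averaging: a one-second window smooths over stutter, so two runs with the same FPS number can feel very different in play. Questioning "the accuracy of the people using it" is another matter entirely.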
 
Since 3DMark03 no longer uses a game engine it can have no relevance to any game at all.

So no game, now or ever, will use pixel shaders?

There is a terrifying trend where people think that if it isn't an engine, it has nothing to do with games. Yet what 03 does is give a general overview, which is GREAT! That's exactly what you SHOULD want! Telling you how a card performs in a highly optimised environment which is coded in such a way as to help the card is about the worst way to benchmark imaginable, yet it is the way so many people would have us go.

A benchmark which suggests how a card will perform in all things is far more useful than something that tells us how it will in one (followed by the reviewer telling us this reflects all).
 
Quitch said:
Since 3DMark03 no longer uses a game engine it can have no relevance to any game at all.

So no game, now or ever, will use pixel shaders?

There is a terrifying trend where people think that if it isn't an engine, it has nothing to do with games. Yet what 03 does is give a general overview, which is GREAT! That's exactly what you SHOULD want! Telling you how a card performs in a highly optimised environment which is coded in such a way as to help the card is about the worst way to benchmark imaginable, yet it is the way so many people would have us go.

A benchmark which suggests how a card will perform in all things is far more useful than something that tells us how it will in one (followed by the reviewer telling us this reflects all).

No, I think many people here think that devs will code optimized shaders for NVIDIA hardware. Or like Valve, they will give the green light for IHVs to optimize the game's shaders. In other words, many people think that NV3x's "real power" won't be an issue.
 
Quitch said:
So no game, now or ever, will use pixel shaders?

There is a terrifying trend where people think that if it isn't an engine, it has nothing to do with games. Yet what 03 does is give a general overview, which is GREAT! That's exactly what you SHOULD want! Telling you how a card performs in a highly optimised environment which is coded in such a way as to help the card is about the worst way to benchmark imaginable, yet it is the way so many people would have us go.

A benchmark which suggests how a card will perform in all things is far more useful than something that tells us how it will in one (followed by the reviewer telling us this reflects all).

Right, and the performance differential between 3D game engines can be so great, and so conditional, that there is simply no substantive difference between comparing two game engines against each other and comparing a synthetic benchmark with a game engine. If it is true that a synthetic benchmark does not equate to a game engine, it is also true that one game engine does not equate to another game engine. When a benchmark professes to be based on a game engine, there's the real problem of just how loose that "basing" is, to name just one problem with the concept. If the benchmark takes serious liberties with the engine in order to meet the needs of the benchmark, then the claim has no foundation in the first place.

Also, benchmarks which claim to be based on game engines are of necessity backward-looking rather than forward-looking. For instance, in the two years or so that people used 3dmk01, which was promoted as being "based on" the Max Payne D3D engine, the assumption had to be that game engines, APIs, and 3D hardware did not progress during that period in order for the "game engine" theory to have any merit relative to the benchmark. But of course the industry sits still for no benchmark--heh--and hence 3dMk03 was created, which takes an entirely different approach.

Who wants to buy 2d games anymore, or run cpu-dependent software rendering? Everything is 3d these days. And 3d is most certainly the province of the vpu/gpu, and not the cpu. So the vpu/gpu dependency of 3dMk03 was inevitable, and we'll see the same sort of dependency manifest in games requiring DX9 and up to run. If you don't mind the fact that you can't describe how a given system will run HL2 by running Aquanox, or your system's performance in Doom III by running NWN, etc., then running 3dMk03 should present no dilemma whatever.

So what's caused some people to forget that a benchmark is a benchmark and a game is a game, that it's OK to look at both, and that this is the way things have always been? IMO, nVidia's handling of its problems post GF4 is the root of the situation.
 
I think it must be said, with regard to Max Payne and 3DMark2001, that 3DMark2001 came out before Max Payne and uses more advanced effects. Game Tests 1, 2 and 4 were nothing like Max Payne, and the high-detail version of GT3 did a number of things that Max Payne didn't.

Either Max Payne has one of the most flexible engines of all time, or Futuremark wrote significant amounts of custom code themselves.
 
Getting back towards the original point, I see two issues that have come up repeatedly.

1) [H]ardOCP's stance that Nvidia's pseudo-trilinear settings (sketched below), and its overriding of the user's preferences, are acceptable.
2) Kyle Bennett's willingness to ban from his forums those who are willing to stand up to his interpretation.
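On point 1, for anyone who hasn't followed the threads: "pseudo-trilinear" (sometimes nicknamed "brilinear") means the driver blends between adjacent mipmap levels only in a narrow band around each mip transition and falls back to plain bilinear everywhere else, instead of blending across the whole LOD range the way true trilinear does. Here's a minimal C++ sketch of just the blend-weight math; the band width is an illustrative guess on my part, since the actual driver value isn't public:

```cpp
#include <algorithm>
#include <cmath>

// Full trilinear: the blend weight between mip N and mip N+1 is simply
// the fractional part of the computed LOD, across the entire range.
double trilinearWeight(double lod) {
    return lod - std::floor(lod);
}

// "Pseudo-trilinear"/"brilinear": blend only inside a narrow band
// centred on the mip transition; outside it, snap to a single mip
// level, which is effectively cheaper bilinear filtering. The band
// width of 0.2 is an assumption for illustration, not NVIDIA's value.
double pseudoTrilinearWeight(double lod, double band = 0.2) {
    double f = lod - std::floor(lod);
    // Remap [0.5 - band/2, 0.5 + band/2] onto [0, 1], clamping outside.
    double t = (f - (0.5 - band / 2.0)) / band;
    return std::clamp(t, 0.0, 1.0);
}
```

Outside the band only one mip level is sampled (four texel taps instead of eight), which is where the speed comes from, and the compressed weight ramp is why filtering testers show sharp mip bands. The objection isn't to the technique existing as an option; it's that the driver applied it even when the user had explicitly selected full trilinear.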
 
Rugor said:
Getting back towards the original point, I see two issues that have come up repeatedly.

1) [H]ardOCP's stance that Nvidia's pseudo-trilinear settings (sketched below), and its overriding of the user's preferences, are acceptable.
2) Kyle Bennett's willingness to ban from his forums those who are willing to stand up to his interpretation.
One thing you forgot:

3) Most people seem to think this is bad/wrong. ;)
 
Rugor said:
Sorry Dig, if I may call you that,

I thought that didn't need to be stated.
Feel free to call me what ya will, I know me moniker is a little lengthy. :)

I know it should be implied, but people from nVidia might read this thread and I'm afraid they all seem to be a bit rusty when it comes to the right/wrong, good/bad thing. ;)
 
Yeah I think you're right.

Looking back, I think it started getting obvious when the Gf4MX was announced--a DX7 part with a name that made people think it was DX8 or better.

Nowadays I don't trust anything they say about their cards, though I still think the Gf4 Ti cards were good products.
 
Rugor said:
though I still think the Gf4 Ti cards were good products.
I totally agree with that one, I do love the GF4 ti series and still think they're great cards.

I have a busted pc to fix for a friend while he's on vacation next week, and I just pulled his GF4 ti4200 and swapped it for me V5 on me secondary rig and gotta admit it's a nice step up for that system. :)
 