ATI engineering must be under a lot of strain. MS funding?

Chalnoth said:
Let me put it another way, OpenGL Guy. There are two ways to define a "DX9 game."

1. The game uses some effects that can only be seen at their best when using PS 2.0 or higher.

2. The game requires PS 2.0 or higher because the developers decided to incorporate the use of PS 2.0 into the game in a fundamental way.

I don't get you Chalnoth...
1) You have been told by someone "in the know" that a game uses PS 2.0 shaders for effects that cannot be done without PS 2.0. Granted, there are fallbacks of lesser quality, or simply a lack of the effect on older cards, but that in no way reduces the significance of the DX9-only features, thereby quite clearly making it a game worthy of the DX9 label. You cannot experience all the game has to offer without a DX9 card, period.

2) Doom3. I assume you harp on Doom3 because you expect Nvidia to perform exceedingly well in it, but that really isn't my point. There is *nothing* in Doom3 that requires DirectX 9-level functionality. The only advantage, from what I understand, is speed when using PS/VS 2-level OpenGL extensions... Hrm... I guess there is the ability to run with floating point color too, which may be a fair enough determining factor, but I don't understand where the "incorporate the use of PS 2.0 into the game in a fundamental way" is in Doom3. There is nothing fundamental about DX9-level functionality simply being faster, when a DX8/8.1-level card gets the same quality rendering and all the effects. What is "DX9 game" about that?!
 
Chalnoth said:
OpenGL guy said:
Huh? Didn't I state earlier that TRAOD is a DX9 game? Did you know the game uses PS 2.0 shaders? Did you know some of these shaders are 48 instructions long? (There may be others that are longer; I'm just going from memory.)
There are two ways to look at what constitutes a "DX9" game. Does it use DX9-level shaders for gimmicks? Or does it use DX9-level shaders as a fundamental element of the game?

I see nothing to suggest that the Tomb Raider effects are anything more than a gimmick. So, by one definition, I could easily say that Tomb Raider is not truly a DX9 game.

By another, if it just uses any shaders that can only be done properly in DX9 (either for precision or other reasons), then I could call it a DX9 game. Anyway, you can't use both definitions at the same time. You have to use one or the other.

This is what I was calling Joe on. It seemed to me he was attempting to apply a double standard to two different games. I say that DOOM3 is at least as much a DX9 game as the new Tomb Raider will be. But it all depends on the definition you're using at the time. I hope that I'll always be specific about which definition I'm attempting to use.

Perhaps I'm missing something here. But, um, who really cares whether the gfFX5200 performs well in "DX9 games?"

As has been stated earlier, just check out Gunmetal. This has been touted as a DX9 game by NVIDIA, and it's part of the TWIMTBP program (heck, Tomb Raider is too), although most people here consider it a DX8 game. The fact is that the gfFX5200 doesn't perform very well in the Gunmetal benchmark. If this isn't a "true" DX9 game, how could it be fathomable that the gfFX5200 would perform better in a true DX9 environment, which is more intensive? Logically, it should perform worse. Taking into consideration the weakness of NVIDIA's DX9 shaders, and the fact that the gfFX5200 only has a single pixel shader, it shouldn't be much of a stretch to realize how the gfFX5200 will perform in true DX9 games.
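For context on why a 48-instruction shader forces the issue: each pixel shader model caps how many instructions a shader may contain, so anything past the PS 1.x limits simply cannot run without a DX9 part. A minimal sketch, using the approximate slot limits from the DX8/DX9 pixel shader specs (the exact split between arithmetic and texture ops is simplified here):

```python
# Illustrative sketch: approximate per-shader instruction limits for
# DirectX pixel shader models. PS 1.1 allows 8 arithmetic + 4 texture
# ops; PS 1.4 roughly doubles that over two phases; PS 2.0's base
# profile allows 64 arithmetic + 32 texture ops.
PS_LIMITS = {
    "1.1": (8, 4),    # 8 arithmetic, 4 texture address ops
    "1.4": (16, 12),  # two phases of up to 8 arithmetic + 6 texture ops
    "2.0": (64, 32),  # DX9 base profile
}

def minimum_ps_model(arith_ops: int, tex_ops: int) -> str:
    """Return the lowest pixel shader model whose limits fit the shader."""
    for model in ("1.1", "1.4", "2.0"):
        max_arith, max_tex = PS_LIMITS[model]
        if arith_ops <= max_arith and tex_ops <= max_tex:
            return model
    return "unsupported"

# A 48-instruction shader like TRAOD's overflows every PS 1.x limit:
print(minimum_ps_model(48, 0))   # -> 2.0
print(minimum_ps_model(8, 4))    # -> 1.1
```

So regardless of which definition of "DX9 game" one prefers, shaders of that length can only be expressed as PS 2.0 programs.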
 
Chalnoth said:
There are two ways to look at what constitutes a "DX9" game. Does it use DX9-level shaders for gimmicks? Or does it use DX9-level shaders as a fundamental element of the game?

The answer is: it doesn't matter.

If it uses DX9 shaders to do effects: whether you consider them to be "gimmicks" or "pervasive image quality enhancements", that doesn't change the fact that they are DX9 shaders.

If a game like Doom3 makes "pervasive use of stencil, dot-3, and cube maps", that does not constitute DX9-level shaders, no matter how non-gimmicky the use of those effects is.
 
Joe DeFuria said:
The answer is: it doesn't matter.

If it uses DX9 shaders to do effects: whether you consider them to be "gimmicks" or "pervasive image quality enhancements", that doesn't change the fact that they are DX9 shaders.
I don't think I ever said "pervasive image quality enhancements."

I'm not even talking about enhancements at all with regards to that definition. I'm talking about the graphics engine being built from the start to assume that every card running the game will have those shaders. This doesn't necessarily mean the game will look better. What it means more is that those shaders can go beyond affecting how the game looks and move on to affecting how it is played.

Improving the look is a gimmick (particularly when talking about water effects, DOF, etc.). Using the look to alter gameplay is more fundamental. Of course, the altering of the gameplay doesn't necessarily need to happen, it's the assumption that every person playing the game must have card X (or better) that I'm talking about here.
 
Ichneumon said:
I don't get you Chalnoth...
1) You have been told by someone "in the know" that a game uses PS 2.0 shaders for effects that cannot be done without PS 2.0. Granted, there are fallbacks of lesser quality, or simply a lack of the effect on older cards, but that in no way reduces the significance of the DX9-only features, thereby quite clearly making it a game worthy of the DX9 label. You cannot experience all the game has to offer without a DX9 card, period.
You people are not getting it. Not at all.

Two definitions.

By one, it's a DX9 game.

By the other, it's not.

I was using the definition, for the purposes of this thread that it is.
 
Oh, one final thing. From what JC has stated, there may well be some quality improvements from using the fragment program effects, specifically in the area of precision for specular highlights. Since this information is very old, and the game is not shipping for a few months yet, we really don't know how much the fragment program effects will change the look of the game.
 
Chalnoth said:
I'm not even talking about enhancements at all with regards to that definition. I'm talking about the graphics engine being built from the start to assume that every card running the game will have those shaders.

I don't get you.

Certainly, Doom3 is not built like that.

TR's DX9 effects ARE built like that. The game itself is not built using DX9 as a minimum spec, nor am I claiming that, nor am I claiming that's a "requirement." Whether the DX9 code is there for "a gimmicky effect", or a "core basis" for the engine is irrelevant.

This doesn't necessarily mean the game will look better. What it means more is that those shaders can go beyond affecting how the game looks and move on to affecting how it is played.

Who cares? The GPU doesn't give a rat's ass about how the game is played. All it cares about is how it looks.

Of course, the altering of the gameplay doesn't necessarily need to happen, it's the assumption that every person playing the game must have card X (or better) that I'm talking about here.

Why are you talking about it? No one else is.

We're talking about the assumption that every card using visual effect X must have DX9 shader support.
 
Joe DeFuria said:
Chalnoth said:
I'm not even talking about enhancements at all with regards to that definition. I'm talking about the graphics engine being built from the start to assume that every card running the game will have those shaders.
I don't get you.

Certainly, Doom3 is not built like that.
Which is why I said DOOM3 is not a DX9 game by this definition.

But it is a DX9 game by another definition: it will use DX9 effects to improve the visual experience (by current information, that means more for performance than anything else, but there are possible visible precision improvements...).

TR's DX9 effects ARE built like that.
This directly contradicts your next statement:
The game itself is not built using DX9 as a minimum spec,
All I'm asking is that you apply the same standards to each game. The way I see it, you're only claiming DOOM3 is not a DX9 game because so many others have claimed it's not. You're not even thinking about the definition that you're using to apply that label.
 
Chalnoth said:
Which is why I said DOOM3 is not a DX9 game by this definition.

I don't see how Doom is DX9 by any definition.

But it is a DX9 game by another definition: it will use DX9 effects to improve the visual experience (by current information, that means more for performance than anything else, but there are possible visible precision improvements...).

It doesn't use DX9 effects to improve visual experience. It uses a DX9 level render path to improve performance.

TR's DX9 effects ARE built like that.
This directly contradicts your next statement:
The game itself is not built using DX9 as a minimum spec,

No contradiction there. Or do you not understand the difference between the "core game engine" and an "effect"?

All I'm asking is that you apply the same standards to each game.

I am.

Doom3 does not require a DX9 renderer to achieve a new level of graphical goodness. You could have a DX7 or DX8 renderer...and if it had the raw horsepower, you'd be good to go.

TR does require a DX9 renderer to be able to see all the advanced effects.

The way I see it, you're only claiming DOOM3 is not a DX9 game because so many others have claimed it's not.

Including Carmack, of course. :rolleyes: Seriously, I'm claiming Doom3 is not a DX9 game...because you don't need a DX9 renderer to be able to "turn on" any effects.
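The "you don't need a DX9 renderer to turn on any effects" point is exactly how Doom3's back ends work: every path renders the same scene, and the ARB2 (fragment program) path just does it in fewer passes and at higher precision. A rough sketch of that fallback logic, with hypothetical capability flags simplified from Carmack's public .plan descriptions:

```python
# Rough sketch (hypothetical flags, simplified): how a Doom3-style
# renderer might pick a back end. Every path draws the same effect
# set; the higher paths need fewer passes and run at higher precision.
def pick_render_path(has_fragment_programs: bool,
                     has_ps14_extensions: bool,
                     has_register_combiners: bool) -> str:
    if has_fragment_programs:      # ARB_fragment_program: DX9-level hardware
        return "ARB2"              # single-pass interaction, float precision
    if has_ps14_extensions:        # Radeon 8500-class hardware
        return "R200"
    if has_register_combiners:     # GeForce3/4-class hardware
        return "NV20"
    return "NV10"                  # GeForce1/2-class fallback, more passes

print(pick_render_path(True, False, False))   # -> ARB2
print(pick_render_path(False, False, True))   # -> NV20
```

Note there is no branch that refuses to run or withholds an effect outright; the DX9-level path is an optimization target, not a gate.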
 
Joe DeFuria said:
It doesn't use DX9 effects to improve visual experience. It uses a DX9 level render path to improve performance.
There's no difference. Improving performance is improving the visual experience. The essential effect is the same: the game will look and play significantly better on DX9-level hardware. Anything above that, and the new chip's features will be left unused (unless DOOM3 is updated, of course, to correspond with the release of the NV40 and R420).

Anyway, I'm going to construct a full argument in my next post. Might take me a few mins.
 
I don't see how it could be a DX9 game when it's been said time and time again that the graphics engine was built around what the original GeForce SDR is capable of doing.
 
The focus of this argument is the statement:

The GeForce FX 5200 is not fast enough to be worth the label, "DX9 video card."

So, first, the question:
Why do we want a low-cost DX9 video card? I doubt that any regular posters on this board will ever bother to purchase one. Some of us may recommend one for family or friends, but, for the most part, we will have no contact with these low-cost products.

What we want, I should think, is an improvement in the way games play on our mid-to-high-end video cards. Now, game developers may add effects to games that only those with high-end video cards will be able to see. But they will not let something that fundamentally changes the way a game is played remain optional. It will have to be required.

So, for a feature to fundamentally change the way a game is played, support for that feature must be required to play the game. If the feature is only optional, then it cannot, by definition, change the gameplay. It must only be an "Oh, that looks pretty cool" kind of feature. I call that a gimmick. That doesn't mean I don't want these features; they are a necessary first step. What it does mean is that the role of the low-end DX9 card comes into play with required features, not optional features.

This leads me to my next point. Game developers will only list a feature as required if there is a baseline of low-end hardware that supports it. Here is where the low-cost hardware comes in. If there is a large installed base (or just a very cheap card with the specific feature, which means under $40-$50), then, and only then, can such a game be released (side note: this may change if IHVs start sponsoring game development for new architectures, a move I would support).

Not every new game released requiring that specific feature (or another feature from the same generation of video cards) will use the feature to improve gameplay. I would say, for instance, that DOOM3 vs. Unreal Tournament 2003 falls perfectly into this contrast. UT2003 used hardware T&L to improve the graphics, not the gameplay. DOOM3 is set to use per-pixel lighting (a feature from the same generation...) in conjunction with the stencil buffer to change the entire feel of the game.

Next, on the definition of what makes a DX9 game a DX9 game.

There is a logical progression of how features are used. First they are implemented in visual quality improvements, what I would call gimmicks. This use increases in frequency and in the demand on the video cards until it is finally a required feature to play games at all.

What will eventually define the impact of the low-end video card supporting the specific feature will be what sorts of games can be built to require that video card as a minimum required to play. This, of course, is not directly measurable, but the best indication, in this case, will be from so-called "DX9 games" that use DX9 as an optional feature.
 
Chalnoth said:
Joe DeFuria said:
It doesn't use DX9 effects to improve visual experience. It uses a DX9 level render path to improve performance.
There's no difference. Improving performance is improving the visual experience.

There is a clear difference. They improve the visual experience in different ways.

The essential effect is the same: the game will look and play significantly better on DX9-level hardware.

Because of DX9 capability, or the raw performance level of DX9 generation cards?
 
Joe DeFuria said:
The essential effect is the same: the game will look and play significantly better on DX9-level hardware.
Because of DX9 capability, or the raw performance level of DX9 generation cards?
The idea is that the performance improvement will be more than it would be in what could be termed a "DX8 game."

That is, the 5200's performance standing should improve relative to DX8-level cards with the release of games that use DX9 for performance improvements. This will likely be the case with DOOM3.

Another way to look at it is this: if we look at two video cards whose performance is essentially the same in DX8 games, then DX9 games are those where the DX9 card pulls ahead (visual quality or performance: one is but a different aspect of the other). If this does not happen with the 5200, then I will accept that it is not a worthy DX9 video card.

Update:
And remember, if a DX9 effect adds significantly to the amount of computations done, one with a 5200 may choose to run the game at lower resolution with the effect enabled. This means that whether or not it is a DX9 card in games like the new Tomb Raider will likely be a judgement call. It will be much more clear in games that use DX9 effects to improve performance, not to add new effects entirely.
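The test proposed above is easy to state precisely: measure a card's standing relative to some DX8-level baseline card in DX8 games versus DX9 games, and see whether the ratio improves. A small sketch with hypothetical frame rates (not benchmark data from the thread):

```python
# Illustrative sketch (hypothetical numbers) of the proposed test:
# a card "earns" the DX9 label if its standing relative to a
# DX8-level baseline improves in DX9 games, i.e. the gain comes
# from DX9 capability rather than raw speed alone.
def relative_standing(card_fps: float, baseline_fps: float) -> float:
    return card_fps / baseline_fps

def worthy_dx9_card(dx8_game: tuple, dx9_game: tuple) -> bool:
    """Each tuple is (card_fps, baseline_fps) in that class of game."""
    return relative_standing(*dx9_game) > relative_standing(*dx8_game)

# e.g. a card that merely matches the baseline in DX8 games but
# pulls ahead once a DX9 render path is in use:
print(worthy_dx9_card(dx8_game=(60, 60), dx9_game=(45, 30)))  # -> True
```

On this reading, absolute frame rate is beside the point; only the change in relative standing indicates whether the DX9 feature set is doing any work.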
 
I may respond to the rest of this tomorrow, but I'm off to bed now. In the meantime, consider this:

Chalnoth said:
What we want, I should think, is an improvement in the way games play on our mid-to-high-end video cards. Now, game developers may add effects to games that only those with high-end video cards will be able to see. But they will not let something that fundamentally changes the way a game is played remain optional. It will have to be required.

Back to Doom3. By all accounts, the Doom3 experience will be vastly different on GeForce3/4 and Radeon 8500 cards and higher, vs. Geforce2, Geforce4 MX, and older ATI cards.

So apparently, it is not "required" to have the low level cards support such "experience changing features", and they can indeed be "optional."

Half-Life2 is another title which is supposed to scale pretty drastically. We'll have to wait and see on that one.

If the feature is only optional, then it cannot, by definition, change the gameplay. It must only be an "Oh, that looks pretty cool" kind of feature. I call that a gimmick.

Hmmm...Doom3's NV2x/R2xx and higher codepaths are "gimmicks" then?
 
Chalnoth, do you work as a statistician, or do you like to complicate things and come up with definitions of your own just as a hobby?
 
Joe DeFuria said:
Back to Doom3. By all accounts, the Doom3 experience will be vastly different on GeForce3/4 and Radeon 8500 cards and higher, vs. Geforce2, Geforce4 MX, and older ATI cards.
I doubt it. The only difference is that the NV1x and R1xx cards will typically not be run with specular highlights enabled, and they'll run, naturally, at lower resolution. Those things won't drastically change the gameplay, the feel of the game. It won't look as good, but that's a different idea.

What I'm talking about here is the essential idea of the game. In particular, the stencil shadows are, supposedly, going to be used to significant cinematic effect to affect the atmosphere of the game (to scare the hell out of you). All users who can run the game will get this effect.

Specular highlights, higher resolution, etc. are just secondary effects. The fact that higher-generation cards (DX8, DX9, etc.) will benefit more from this game than older games makes it an arrow to future games that will require those more advanced effects.

Hmmm...Doom3's NV2x/R2xx and higher codepaths are "gimmicks" then?
Yes.
They will improve the way the game looks, as well as the performance.
They will not significantly alter the way the game is played.
 
SvP said:
Chalnoth, do you work as a statistician, or do you like to complicate things and come up with definitions of your own just as a hobby?
Heh, I'm just finishing up my Bachelor's degree in physics.

And I'm not trying to complicate things so much as attempting to get others to explain their value judgements. In particular, I think people (Joe) are unjustly making a negative judgement call on DOOM3 without thinking about the double standard that they are using.

Anyway, my point is simple. Before making any sort of judgement call, one needs to first make definitions. Not applying those definitions uniformly makes for an error in judgement.
 
That is, the 5200's performance standing should improve relative to DX8-level cards with the release of games that use DX9 for performance improvements. This will likely be the case with DOOM3.

That I want to see. I haven't yet seen a 5200 Ultra able to outperform an NV25 in preliminary Doom3 benchmarks; I'll be generous and leave the 64-bit versions of the former out of the discussion.

As for real DX9 games, only time will tell. But to be honest, I don't even expect today's high-end DX9 cards to be able to cope adequately with true DX9 games, let alone a budget iteration of those.

I'd say someone is lucky if he gets 30fps with high detail at 1024*768 in Doom3 with a 5200 Ultra, unless we mean some weird version of point-sampling AF here.

Still, if I were a budget gamer today, I'd in all honesty pick an NV34, just because it supports multisampling. Apart from that, its usability and future prospects are just as limited as with any other budget card.

What you pay is what you get, and I couldn't care less about useless checkbox features either.
 