ATI engineering must be under a lot of strain. MS funding?

Chalnoth said:
Whether or not you bother to write a fallback is something else entirely...

Whether or not you CAN write a fall-back and still have adequate performance with comparable quality is entirely the point.

Obviously, with Doom3, you can. Precisely because it's NOT a DX9 level renderer, or a renderer with "DX9 level" effects. You CAN fall-back to DX8 code paths and still maintain both quality and performance.

Is it possible with TR?

Don't really know. In the developer's estimation, it's either not possible (because the fall-back involves too many performance or quality compromises), or the code in a sub-DX9 shader is too complex (expensive) to develop and/or support.

The bottom line is, the developer has the final say.
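
For illustration, here's roughly what the path selection we keep arguing about looks like in practice. This is a hypothetical C++ sketch; all the names are invented, not id's or Core's actual code:

```cpp
// Hypothetical sketch of renderer path selection with fall-backs.
// All names are invented; this is not id's or Core's actual code.
#include <cstdio>

enum class RenderPath { ARB2_DX9, NV20_DX8, FixedFunction };

struct GpuCaps {
    bool hasPixelShader20;  // DX9-class (PS 2.0 / ARB fragment programs)
    bool hasPixelShader11;  // DX8-class (PS 1.x / register combiners)
};

// The developer decides which fall-backs are worth writing at all;
// anything below the lowest implemented path simply isn't supported.
RenderPath choosePath(const GpuCaps& caps) {
    if (caps.hasPixelShader20) return RenderPath::ARB2_DX9;
    if (caps.hasPixelShader11) return RenderPath::NV20_DX8;
    return RenderPath::FixedFunction;
}

int main() {
    GpuCaps dx8Card{false, true};  // a DX8-class card: PS 1.1, no PS 2.0
    std::printf("selected path: %d\n",
                static_cast<int>(choosePath(dx8Card)));
    return 0;
}
```

The whole argument is about what happens inside the DX8 path: whether the effects there can match the DX9 path's quality and speed, or whether they're cut down.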
 
Chalnoth said:
All I see is that Core never wrote some of the effects for PS 1.x.

From what I've heard, one can fully replicate Renderman in PS 1.1. The main problems are precision and performance, so improved precision and performance are precisely the things that going for PS 2.0 (or the OpenGL equivalent) will get you.
So by your logic, any card that supports PS 1.1 is really a PS 2.0 card, because you can just use multipass to create the effects.
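
And the precision problem with multipass isn't hand-waving. On DX8-class hardware, every extra pass round-trips the intermediate result through an 8-bit-per-channel framebuffer, while a PS 2.0 shader keeps the whole computation in float registers. A rough C++ simulation of just the arithmetic (this is not real shader code, only the quantization effect):

```cpp
// Why multipass on PS 1.1-class hardware loses precision compared to a
// single PS 2.0 pass: intermediate results are quantized to 8 bits per
// channel between passes. The "shader" here is just pow(x, 16) split
// into four squaring passes.
#include <cmath>
#include <cstdio>

// Simulate an 8-bit render-target write/read between passes.
float quantize8(float v) {
    int i = static_cast<int>(v * 255.0f + 0.5f);
    if (i < 0) i = 0;
    if (i > 255) i = 255;
    return i / 255.0f;
}

int main() {
    float x = 0.9f;  // e.g. N.H for a specular highlight

    // "PS 2.0" style: one pass, full float precision throughout.
    float single = std::pow(x, 16.0f);

    // "PS 1.1 multipass" style: square four times, but each pass
    // round-trips through the 8-bit framebuffer.
    float multi = x;
    for (int pass = 0; pass < 4; ++pass)
        multi = quantize8(multi * multi);

    std::printf("single-pass (float): %f\n", single);
    std::printf("multipass  (8-bit):  %f\n", multi);
    std::printf("error:               %f\n", std::fabs(single - multi));
    return 0;
}
```

After only four passes the error is already comparable to a full 8-bit step, and it compounds as the shaders get longer.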
 
I said:
Obviously, with Doom3, you can. Precisely because it's NOT a DX9 level renderer, or a renderer with "DX9 level" effects. You CAN fall-back to DX8 code paths and still maintain both quality and performance.

One thing I would like to add, is that even though the "standard / officially supported" Doom3 rendering paths are not DX9 level, this doesn't mean that it's impossible for Doom3 to be used in a way that can be considered a DX9 level benchmark.

Carmack has made reference to some "unsupported" features in the ARB2 (DX9) path. It seems like he will play with this path a bit, presumably because the DX9 architecture allows him new flexibility. There will be things in the ARB2 path he will not think of implementing on a lower path for whatever reason (quality/performance, ease of implementation, etc.). Depending on what these end up being, we may be able to "turn on these features" and benchmark using that path and those settings, and that may be a legitimate DX9 benchmark.

Of course, it then becomes debatable whether or not those settings still make it a "real game" benchmark, if it uses features that are not actually officially supported by the game. But at worst, it may be a new "synthetic DX9 benchmark using a real game engine" for those who like to quibble about such differences.
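
If anyone did want to try it, the measurement side is the easy part: force the extra features on and time a fixed demo loop. A hypothetical harness in C++ (the flag and function names are mine, not Doom3's):

```cpp
// Hedged sketch of benchmarking "unsupported" features: force the
// highest path's extras on, then time a fixed run. renderFrame() is a
// stand-in for the engine's actual per-frame work.
#include <chrono>
#include <cstdio>

// Stand-in for rendering one frame on a given path configuration.
void renderFrame(bool useArb2Extras) {
    (void)useArb2Extras;  // ... draw calls would go here ...
}

int main() {
    const int kFrames = 1000;
    const bool useArb2Extras = true;  // the "unsupported" DX9-level toggles

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < kFrames; ++i)
        renderFrame(useArb2Extras);
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    std::printf("%.1f fps with ARB2 extras %s\n",
                kFrames / secs, useArb2Extras ? "on" : "off");
    return 0;
}
```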
 
Joe DeFuria said:
One thing I would like to add, is that even though the "standard / officially supported" Doom3 rendering paths are not DX9 level, this doesn't mean that it's impossible for Doom3 to be used in a way that can be considered a DX9 level benchmark.
Here I'd like to say two things.

Yes, it's obviously possible to add 3rd-gen shaders with no fallback to 2nd-gen or fixed function. But why is this necessary for it to be considered "DX9?"

Secondly, while it doesn't directly relate to this discussion, I find it interesting. In a recent interview at GameSpy ( http://www.gamespy.com/quakecon2003/carmack/ ), JC states that he plans at least one more engine, one based upon the 3rd-generation shader hardware. It is interesting that he seems to think that this engine will be exceedingly scalable, and I tend to agree. I think that GPU architectures are fast approaching the point where future tech generations primarily mean performance improvements under various circumstances, not actual feature additions (though there is the major step yet to be made of flexible HOS).
 
Chalnoth said:
Yes, it's obviously possible to add 3rd-gen shaders with no fallback to 2nd-gen or fixed function. But why is this necessary for it to be considered "DX9?"

To be clear, having a fall-back does not invalidate an app from being DX9. What does is having a fall-back that does not significantly impact image quality or performance. If image quality and performance are not impacted by the fall-back, then that means DX9 hardware is not the "enabler" of that effect.

Secondly, while it doesn't directly relate to this discussion, I find it interesting. In a recent interview at GameSpy....

I pretty much agree. I think DX10 level hardware has the potential to be the "dawn of the last generation" of 3D graphics hardware. That is, beyond DX10, I'm not anticipating major overhauls to the rendering paradigm for some time; just more performance, and fine-tuning flexibility here and there.

There's still lots of room for increased performance of course....
 
Joe DeFuria said:
To be clear, having a fall-back does not invalidate an app from being DX9. What does is having a fall-back that does not significantly impact image quality or performance. If image quality and performance are not impacted by the fall-back, then that means DX9 hardware is not the "enabler" of that effect.
Well, herein comes the judgement call. I'd prefer not to make any judgement: if it uses the shader programming interface at all.... (and so on). Specifying a cutoff is always just too flexible and subjective.

I pretty much agree. I think DX10 level hardware has the potential to be the "dawn of the last generation" of 3D graphics hardware. That is, beyond DX10, I'm not anticipating major overhauls to the rendering paradigm for some time; just more performance, and fine-tuning flexibility here and there.
Well, I think that the internal architectures may differ hugely (I hope that the programming interfaces are built so as to make this possible!), but I don't think the programming interface will change much.
 
Chalnoth said:
Well, herein comes the judgement call. I'd prefer not to make any judgement: if it uses the shader programming interface at all.... (and so on). Specifying a cutoff is always just too flexible and subjective.

Which is why, as I said, we rely on the developer's judgement. Carmack said "minor" quality improvement. That's his judgement, not mine.

Well, I think that the internal architectures may differ hugely (I hope that the programming interfaces are built as to make this possible!), but I don't think the programming interface will change much.

Agreed.
 
Joe DeFuria said:
Which is why, as I said, we rely on the developer's judgement. Carmack said "minor" quality improvement. That's his judgement, not mine.
We won't always have that.
 
Chalnoth said:
Joe DeFuria said:
Which is why, as I said, we rely on the developer's judgement. Carmack said "minor" quality improvement. That's his judgement, not mine.
We won't always have that.

Then we just might have to (shock) rely on our own judgement in those cases.

In any event, I hope that I have satisfied your "calling me out" wrt having an "inconsistent" view of TR and Doom3 and DX9. You don't have to agree with me of course, but I'm certainly not being inconsistent with my reasoning.
 
Joe DeFuria said:
In any event, I hope that I have satisfied your "calling me out" wrt having an "inconsistent" view of TR and Doom3 and DX9. You don't have to agree with me of course, but I'm certainly not being inconsistent with my reasoning.
I do have to add one more thing.

I still think that from what I've read, DOOM3 is quite a bit further ahead than, say, Tomb Raider in terms of technology. It's just a vastly more advanced rendering scheme, with shaders incorporated in the basic rendering forms, not just added on as fluff.

Those shaders are currently limited by the fact that the game is designed to work on 1st-gen GPU's, and a specific design philosophy.

Anyway, I think it should count as a "DX9 game" simply because it reaches, as JC said, optimal implementation on 3rd-gen hardware.
 