How does ATi support shadow buffers under OpenGL? If they have the hardware, why can't they support it under D3D? Is it the automatic compare for the depth texture vs the projection?
They do work. At least for me.
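For what it's worth, the "automatic compare" being asked about is what the ARB_depth_texture / ARB_shadow extensions expose under OpenGL; whether a given driver advertises them is another matter, but the application-side setup is generic. A minimal sketch (not ATi driver code, size and filtering choices are just illustrative):
[code]
// Generic ARB_depth_texture / ARB_shadow setup (application side).
// Assumes glext.h supplies the ARB tokens.
#include <GL/gl.h>
#include <GL/glext.h>

GLuint CreateShadowMap(int size)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // A depth-only texture: this is the "shadow buffer".
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24_ARB, size, size, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);

    // The "automatic compare": the interpolated R coordinate (projected
    // light-space depth) is compared against the stored depth, and the
    // sample returns 1.0 where lit, 0.0 where shadowed.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB,
                    GL_COMPARE_R_TO_TEXTURE_ARB);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC_ARB, GL_LEQUAL);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
[/code]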
chavvdarrr said: Come on, why should ATi know about every single benchmark?!
Imagine the situation:
AMD releases Barton 3200+, lots of sites make reviews... and here comes Intel saying:
"oh, but its not fair, we didn't know, we could prepare making more optimised CC2003SE benchmark suite...."
AMD replies:
"But iP4 has SSE2 instructions... for apples2apples comparison we demand that no-SSE2 optimised apps be used"
Intel:
"And no 3dnow! too"
AMD:
"P4 has wider bus - 200x4 vs 200x2, NO bandwidth limited tests!"
Intel:
"Barton has more cache (640 vs 512) - no cache limited tests!"
......
......
These are the current results in the Doom3 engine, like it or not.
BenSkywalker said: For the particular implementation utilized in SC there are no color values, nor are there any soft shadows used, which differs considerably from the particular example that nVidia has in that doc. For instance-
That doesn't appear to be happening in SC, nor are there color values being utilized for the shadow maps generated, which changes D3DFMT_D24S8 to D3DFMT_D16.
The depth comparison is done prior, although I see what you are saying in terms of it not being the same as in Stencil Buffer implementations. There is no code in this shader that actually does any depth comparison to decide if something is in shadow!
[code]
tex t2
[/code]
tex -- Loads the destination register with color data (RGBA) sampled from a texture.
In terms of what SC in particular is doing, it appears to be considerably simplified from what is possible using shadow buffers under D3D on nV hardware, and it removes some of the out-of-spec functionality seen in the sample code. I see what you are saying now in terms of the differences, although the example you are using seems to be quite a bit beyond what is happening in SC in particular.
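For readers following along, the general shape of the nVidia-style shadow buffer path under D3D is: create a depth-format texture, render from the light into it, then bind it with projected texture coordinates so the chip does the depth compare when the tex instruction samples it. A rough D3D8-era sketch only, based on the technique in nVidia's doc and not on SC's actual code; sizes, names and the dummy colour surface are assumptions, error handling omitted:
[code]
#include <d3d8.h>

IDirect3DTexture8* g_pShadowMap = NULL;   // depth-format texture (hypothetical name)

void CreateShadowBuffer(IDirect3DDevice8* pDev)
{
    // The "shadow buffer" stores only depth; no color values at all.
    pDev->CreateTexture(512, 512, 1, D3DUSAGE_DEPTHSTENCIL,
                        D3DFMT_D16, D3DPOOL_DEFAULT, &g_pShadowMap);
}

void RenderShadowPass(IDirect3DDevice8* pDev, IDirect3DSurface8* pDummyColor)
{
    IDirect3DSurface8* pDepthSurf = NULL;
    g_pShadowMap->GetSurfaceLevel(0, &pDepthSurf);

    // Render the occluders from the light's point of view; only depth matters.
    pDev->SetRenderTarget(pDummyColor, pDepthSurf);
    // ... draw occluders ...
    pDepthSurf->Release();
}

void BindShadowBuffer(IDirect3DDevice8* pDev, DWORD stage)
{
    // Binding a depth-format texture and sampling it with projected
    // coordinates is the out-of-spec part: the hardware compares the
    // projected Z against the stored depth and returns the shadow result.
    pDev->SetTexture(stage, g_pShadowMap);
    pDev->SetTextureStageState(stage, D3DTSS_TEXTURETRANSFORMFLAGS,
                               D3DTTFF_COUNT4 | D3DTTFF_PROJECTED);
}
[/code]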
How does ATi support shadow buffers under OpenGL? If they have the hardware, why can't they support it under D3D? Is it the automatic compare for the depth texture vs the projection?
I haven't really looked with particular interest at all the SC benchmarks in the various NV35-vs-R9800PRO256 shootouts, but did any of the reviewers mention that shadow projection (available to both cards) was used instead of shadow buffers (available only to NV cards)?
saf1 said: Question -
If Splinter Cell has a specific path for Nvidia hardware, why did it do so poorly in the recent NV35 benchmark? Poor shader performance?
There appears to be as large a gap between the 9800 Pro and the NV35 in Splinter Cell as there is between the NV35 and the 9800 Pro in Doom 3.
Just wondering.
Reverend said: WRT Andy's and Ben's discussion about SC and shadows in the game -- there are some pertinent points raised about shadow implementations per se, but there appears to be a lack of understanding about how SC is coded to use both implementations (texture projection vs buffer). I'd like to talk about this, but Ubisoft's programmer told me quite a number of days ago that he wanted to participate (for the first time) at B3D to explain SC's shadow paths. I'll leave it to him while reminding him of this specific discussion in this thread at the same time.
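In the meantime, for anyone unsure what the texture-projection path means in general terms (emphatically not SC's actual code, and the names and stage states here are only illustrative): the occluder is rendered into an ordinary colour texture from the light's point of view and that texture is projected onto the receivers, with no per-pixel depth compare involved, which is why it runs on everyone's hardware.
[code]
#include <d3d8.h>
#include <d3dx8.h>

// Generic projected-shadow-texture fallback sketch.
void ProjectShadowTexture(IDirect3DDevice8* pDev,
                          IDirect3DTexture8* pShadowTex,     // colour texture holding the occluder silhouette
                          const D3DXMATRIX& camToLightProj)  // assumed precomputed: camera space -> light projective texture space
{
    // Generate texture coordinates from camera-space position and run them
    // through the light's projective matrix so the silhouette lands on the receivers.
    pDev->SetTransform(D3DTS_TEXTURE0, &camToLightProj);
    pDev->SetTextureStageState(0, D3DTSS_TEXCOORDINDEX,
                               D3DTSS_TCI_CAMERASPACEPOSITION);
    pDev->SetTextureStageState(0, D3DTSS_TEXTURETRANSFORMFLAGS,
                               D3DTTFF_COUNT3 | D3DTTFF_PROJECTED);

    // Darken the receiver's lighting wherever the silhouette projects.
    pDev->SetTexture(0, pShadowTex);
    pDev->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
    // ... draw receivers ...
}
[/code]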
I guess we could also hack our driver under D3D to support nVidia's non-compliant behaviour, but why would we support something as woefully out-of-spec as that?
Because SplinterCell does
Not a valid argument; it is the developer's fault for not sticking to an agreed standard.
SC is a special case really, being an XBOX port, and Ubisoft is (personally, and subjectively... I think) thrilled by the XBOX's "allocation" of the shadow buffer implementation Ubisoft used.
Doomtrooper said: Because SplinterCell does
Not a valid argument; it is the developer's fault for not sticking to an agreed standard... "The way it's meant to be played" should not take priority over the standards all the other IHVs follow.
BenSkywalker said: I see what you are saying, and want to clarify a few points on my end. First is the color values relating to the shadow: check out the smiley shadow in the doc you linked to. To me, it appears that they can pull off colored shadows if they want to spend the time doing it. Soft shadow edges - where can these be seen? Could you point me to a good example of soft shadow edges when using shadow buffers?
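On the soft-edge point, the usual answer with shadow buffers (a general note, not something specific to SC) is percentage-closer filtering: several binary depth compares around the sample point are averaged instead of taking a single hard pass/fail, which is effectively what filtering the depth-compare results gives you. A tiny CPU-side sketch of the idea, with hypothetical names for the shadow map data and coordinates:
[code]
#include <algorithm>

// Percentage-closer filtering sketch: average several binary depth
// comparisons so shadow edges fade instead of aliasing hard.
float ShadowPCF(const float* shadowMap, int mapSize,
                float u, float v, float receiverDepth)
{
    int baseX = static_cast<int>(u * mapSize);
    int baseY = static_cast<int>(v * mapSize);

    float lit = 0.0f;
    for (int dy = 0; dy < 2; ++dy)
    {
        for (int dx = 0; dx < 2; ++dx)
        {
            int x = std::min(std::max(baseX + dx, 0), mapSize - 1);
            int y = std::min(std::max(baseY + dy, 0), mapSize - 1);

            // Each tap is a binary test: is the receiver closer than the
            // nearest occluder the light saw at this texel?
            lit += (receiverDepth <= shadowMap[y * mapSize + x]) ? 1.0f : 0.0f;
        }
    }
    // The average of the four tests lies between 0 and 1, which is what
    // produces the soft edge.
    return lit * 0.25f;
}
[/code]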
Reverend said: I do not really agree with the "standard" reference because I feel that it is down to the developer to consider the risks (and personally, I think Ubi did a great balancing job of looking after all sorts of hardware owners). IMO your comment is a sulking reflection of what hardware you prefer, but you have every right to feel that way.
Ubisoft tried to give everyone a good-looking game across a variety of PC hardware, within the limits of what each piece of hardware's capabilities affords... and we really shouldn't complain.
I do believe, however, that people should be concerned by attempts to circumvent agreed standards.