FP16 and market support

Well, my opinion is, simply using a feature for a few token effects does not make a title a "designed for DX9" title. What if I wrote a game that used a single shader on a single polygon out of the entire game? Would that justify using it in a benchmark? Doom3 is an example of a game designed from the ground up for normal mapping and stencils. The same cannot be said of older games that added token support for bump mapping or stenciled shaders (e.g. Giants).

TR:AOD is a game that has a few special effects added, but it looks inconsistent. You might look at the water or DOF and say "cool", then you look at 90% of the game content and go "blah". It's no different from the token DOF/water effects slapped into many console titles.

IMHO, Top Spin tennis, a DX8 Xbox title, looks way better in terms of effects because the game is designed from the ground up with shaders, self-shadowing, bloom, and mega-polygon crowds and stadiums everywhere. It doesn't feature a few water shaders slapped on top of a blurry, bland game world.

I don't want shaders to be used just as a "gimmick", the same way specular or reflections ended up being used on a few token things to make uber-shiny surfaces.
 
DemoCoder said:
TR:AOD is a game that has a few special effects added, but it looks inconsistent. You might look at the water or DOF and say "cool", then you look at 90% of the game content and go "blah". It's no different from the token DOF/water effects slapped into many console titles.
Shows what (little) you know about the game's graphics.
IMHO, Top Spin tennis, a DX8 Xbox title, looks way better in terms of effects because the game is designed from the ground up with shaders, self-shadowing, bloom, and mega-polygon crowds and stadiums everywhere. It doesn't feature a few water shaders slapped on top of a blurry, bland game world.
Neither does TRAOD.
 
I've never played TR:AOD, and don't really have any desire to. I don't care if it's a good or bad game, because that doesn't have anything to do with its usefulness as a benchmarking tool. All it needs for benchmarking is to put a decent rendering load on the GPU, and to generate repeatable results. TR:AOD uses DX9 shaders and has (in the .49 patch at least) a benchmarking mode that gives good repeatable results.

Benchmarking isn't the same as playing, and just because you don't enjoy playing a game doesn't mean it won't serve as a benchmark.
 
OpenGL guy said:
Shows what (little) you know about the game's graphics.

Well, putting aside your vested interest in the game as a DX9 demonstrator or ATI title: the game is simply not impressive graphically. 3DMark03 looks better. If there are impressive DX9 effects everywhere, I sure as hell can't see 'em. The game got mediocre reviews, most of the feedback on the IGN and GameSpot forums says the graphics are mediocre for the performance given, and to boot, the film studio even blamed Core for the bad box office results of the film (ridiculous, since the film sucked as well, but at least they perceived the game sucked too).

Do you see fanboys gushing over the game's graphics like they do about D3 or HL2? This simply is not a killer-app demonstrator for DX9. The textures are so blurry that they really didn't need to implement a floating-point depth-of-field shader, since everything looks out of focus anyway.

None of this has anything to do with the game as a benchmark per se, but don't try to tell me the game is a showcase of what developers who utilize DX9 heavily will be able to deliver.

Neither does TRAOD.

Well, it sure looks like it. Looks like they did a lot of DX9 work for nothing. Maybe you can point me to a screenshot that I'm supposed to be impressed with, something on par with HL2, or even Max Payne 2. Are you telling me that most surfaces in TR:AOD have DX9-level shaders applied to them?
 
DemoCoder said:
Well, it sure looks like it. Looks like they did a lot of DX9 work for nothing. Maybe you can point me to a screenshot that I'm supposed to be impressed with, something on par with HL2, or even Max Payne 2.
Please explain why a game has to impress YOU before it counts as making use of DX9 features.

Just because a game uses the latest features doesn't magically make the artists or level designers better.

I'm surprised you can't see that.
Who cares if they did a lot of DX9 work for nothing?
What bearing does that have on its use as a benchmark?
I bet I could take the most graphically profound game you can imagine and make it look mediocre by changing the textures to blander ones, etc. How that magically makes it not DX9, you'll have to explain.
 
The game does not appear to use any complicated shaders on most surfaces. It doesn't even appear to use much normal mapping.

What qualifies as a real DX9 benchmark then? Can anyone get into the game? If I throw a wood shader into Quake3, would it be applicable? OpenGL Guy's trying to hoodwink me into believing the game is a heavy shader user. Well then, point me to a screenshot and provide some evidence that the game is a heavy user (besides post-process gimmicks).
 
Well then, point me to a screenshot and provide some evidence that the game is a heavy user (besides post-process gimmicks).

Why do shaders have to be visible as shaders? The best special effects in the world are the ones you don't notice (and the worst are the ones you do).

Whether or not anyone can play the game is irrelevant for benchmarking purposes. AFAIK the game does make use of a great many shaders; unfortunately, it seems they don't fit your criteria for DX9 (i.e. good-looking) shaders.
 
Well, putting aside your vested interest in the game as a DX9 demonstrator or ATI title

Dude, it's a "The Way It's Meant To Be Played" game. It's nVidia that has a vested interest in the game. As long as it uses DX9 features and has a benchmark mode, it's a DX9 benchmark. It's just as valid a DX9 benchmark as Half-Life 2 will be, and it's more valid than Doom 3 as a DX9 benchmark. Doom 3 is basically a DX7 game using DX8 and DX9 to reduce the special effects to fewer passes. Yet I'm sure Doom 3 will blow away most DX8 and DX9 games that come out.
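
(To make the "fewer passes" point concrete, here is a rough, hypothetical Python sketch of pass counts, with made-up pass names rather than id's actual renderer: a fixed-function DX7-style path needs a separate additive-blend pass for each lighting term of each light, while a DX8/DX9-class shader path can fold those terms into one pass per light.)

Code:
# Toy pass-count comparison; hypothetical pass names, purely illustrative.

def dx7_style_passes(num_lights):
    # Fixed-function: every lighting term is its own additive blend pass.
    passes = ["depth_prepass"]
    for light in range(num_lights):
        passes += [f"light{light}_stencil_shadow",
                   f"light{light}_bump_diffuse",
                   f"light{light}_specular"]
    return passes

def shader_style_passes(num_lights):
    # Programmable path: bump + diffuse + specular evaluated in a single pass.
    passes = ["depth_prepass"]
    for light in range(num_lights):
        passes += [f"light{light}_stencil_shadow",
                   f"light{light}_combined_shading"]
    return passes

for n in (1, 4, 8):
    print(f"{n} lights: {len(dx7_style_passes(n))} fixed-function passes "
          f"vs {len(shader_style_passes(n))} shader passes")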
 
jvd said:
Well, putting aside your vested interest in the game as a DX9 demonstrator or ATI title

Dude, it's a "The Way It's Meant To Be Played" game. It's nVidia that has a vested interest in the game. As long as it uses DX9 features and has a benchmark mode, it's a DX9 benchmark. It's just as valid a DX9 benchmark as Half-Life 2 will be, and it's more valid than Doom 3 as a DX9 benchmark. Doom 3 is basically a DX7 game using DX8 and DX9 to reduce the special effects to fewer passes. Yet I'm sure Doom 3 will blow away most DX8 and DX9 games that come out.

Doom3 is OpenGL based, not Direct3D based (it probably will make use of DirectX for sound and input however). How anyone expects an OpenGL game to demonstrate Direct3D effects is a little beyond me.
 
DemoCoder said:
OpenGL guy said:
Shows what (little) you know about the game's graphics.
Well, putting aside your vested interest in the game as a DX9 demonstrator or ATI title.
My vested interest? You're joking, I hope. nvidia is the one who put their TWIMTBP logo on this game.
The game is simply not impressive graphically. 3DMark03 looks better. If there are impressive DX9 effects everywhere, I sure as hell can't see 'em. The game got mediocre reviews, most of the feedback on the IGN and GameSpot forums says the graphics are mediocre for the performance given, and to boot, the film studio even blamed Core for the bad box office results of the film (ridiculous, since the film sucked as well, but at least they perceived the game sucked too).
I can't help how it looks. I can tell you that there are shaders with more than 40 instructions in use all the time. I'm sure someone else can provide shader dumps.
Do you see fanboys gushing over the game's graphics like they do about D3 or HL2? This simply is not a killer-app demonstrator for DX9. The textures are so blurry that they really didn't need to implement a floating-point depth-of-field shader, since everything looks out of focus anyway.
None of this matters a bit. Does the game use shaders intensively? Yes. Does the game look good? No. So what? Blame the artists or level designers or something.
None of this has anything to do with the game as a benchmark per se, but don't try to tell me the game is a showcase of what developers who utilize DX9 heavily will be able to deliver.
Did I ever imply that it should be? No. I am merely pointing out that the game does make heavy use of advanced shader techniques. Maybe not to the best effect, but that's irrelevant.
Neither does TRAOD.
Well, it sure looks like it. Looks like they did a lot of DX9 work for nothing. Maybe you can point me to a screenshot that I'm supposed to be impressed with, something on par with HL2, or even Max Payne 2. Are you telling me that most surfaces in TR:AOD have DX9-level shaders applied to them?
I can't remember the exact statistics, but there is a significant amount of PS 2.0 stuff on each and every frame. Again, I'm sure someone else can provide shader dumps.
 
OK, why don't you tell me what shaders are being used on the wood, Lara's hair, jeans, skin, the door, etc. in this screenshot: http://www.gamespot.com/pc/adventure/tombraidertheangelod/screens.html?page=129 (I picked the first one, just to avoid accusations of selection bias). I'd like to know, because I am baffled as to how bland this game looks. Can you pick a shot that doesn't include water, caustics, or mirrors, one that demonstrates the other shaders (i.e. the ones covering the rest of the game)? Here's another one: http://www.gamespot.com/pc/adventure/tombraidertheangelod/screens.html?page=118 Is anything in this scene DX9-specific (i.e. requires PS 2.0 to do)?

Since OpenGL guy seems to know a lot about this game, why not explain some of the advanced material shaders they are using? What kind of amazing shaders did they design to make Lara's skin, hair, and clothes suck so badly, and the game's rocks, stone, metal, and wood look like they came out of a 1997 game?

Look, I can believe they have some shaders to do advanced water, reflection, DOF, bloom, etc. But are those in every frame? I'm not trying to start an argument with you on purpose, but the PC version of this game doesn't look that much better (in terms of lighting and effects) than the PS2 version.
 
radar1200gs said:
...snip...snip...

Hey, where are those numbers at? Or have you conceded that you were indeed spreading FUD?
 
DemoCoder said:
Well, my opinion is, simply using a feature for a few token effects does not make a title a "designed for DX9" title
Well, I think that's the real problem. A "designed for DX9" title is perhaps your own definition or categorization of this game (or any other game you have in mind for this category).

TR:AOD is a game that has a few special effects added, but it looks inconsistent.
Like many other games. I personally thought Tiger Woods 2004 looked quite nice... but it doesn't qualify as a "designed for DX9" (or is it DX8? I don't know, and it doesn't really matter) title. I think it uses some VS and PS effects for the trees and water, but that's it, and it still looks good... wonder why...

I don't want shaders to be used just as a "gimmick", the same way specular or reflections ended up being used on a few token things to make uber-shiny surfaces.
Well, shaders are a gimmick, Ray! :)

Seriously though, as developers get more and more used to DX9 and as DX9 hardware becomes prevalent, we shouldn't be expecting too much too soon (taking into account how long ago TR:AOD was released). Lord, if only there were an abundance of per-pixel-lit games -- a basic DX7 feature, and one that really enhances the image quality of PC games.
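
If it helps to see what per-pixel lighting buys you, here is a toy numpy sketch (my own illustration, not tied to any game or API): diffuse lighting computed only at the two ends of a span and interpolated (Gouraud-style) flattens out the response, while interpolating the normal and lighting every sample keeps the brightening in the middle where the surface faces the light.

Code:
import numpy as np

# Toy comparison of per-vertex (Gouraud) vs per-pixel Lambert lighting
# along one interpolated span. Purely illustrative numbers.

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

light_dir = normalize(np.array([0.3, 0.2, 1.0]))

# Two endpoint normals that lean away from each other; the surface in
# between curves toward the light.
n0 = normalize(np.array([-0.8, 0.0, 0.6]))
n1 = normalize(np.array([ 0.8, 0.0, 0.6]))

t = np.linspace(0.0, 1.0, 9)[:, None]      # interpolation factor across the span

# Per-vertex: light the endpoints, then interpolate the *result*.
l0 = max(float(n0 @ light_dir), 0.0)
l1 = max(float(n1 @ light_dir), 0.0)
gouraud = (1 - t) * l0 + t * l1

# Per-pixel: interpolate the *normal*, renormalize, light every sample.
n = normalize((1 - t) * n0 + t * n1)
per_pixel = np.maximum(n @ light_dir, 0.0)

for a, b in zip(gouraud.ravel(), per_pixel.ravel()):
    print(f"gouraud={a:.3f}  per-pixel={b:.3f}")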

This simply is not a killer-app demonstrator for DX9.
I don't think anyone said it is.

It just uses DX9 features, that's it, and with its proper benchmark mode it qualifies as a benchmark, no more, no less.

Ray, in case you missed it, have a read (the page and the Word doc download within): http://www.beyond3d.com/downloads/traod/

Oh, and here's what the shaders look like, courtesy of mboeller: http://www.beyond3d.com/forum/viewtopic.php?t=7543&postdays=0&postorder=asc&start=60
 
Jeez... isn't anyone else getting tired of this crap? The ONLY reason FP16 exists is that one major player hasn't got the ability to run FP24 at all, and while it can run FP32, it hasn't the power to do so, PERIOD! M$ had no choice but to allow FP16 as a hint in order to keep this one manufacturer "in the game", so to speak. The reasons for that manufacturer's choice, on the other hand, were to try to keep others "out of the game", so to speak.

If all DX9 video cards were capable of running FP24 at decent speed, we wouldn’t be having this discussion. If nVidia had followed M$’s DX9 specifications, rather than trying to do an end run around them, there would be no FP16. FP16 will only last as long as nVidia supports what is a badly conceived and executed family of video cards.
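
To put rough numbers on the precision gap, here is a back-of-the-envelope Python sketch. It only rounds the mantissa and ignores exponent range and denormals, so treat it as a model rather than exact hardware behaviour: FP16 (_pp) carries a 10-bit mantissa, R300's FP24 is usually described as s16e7 (16-bit mantissa), and FP32 carries 23 bits.

Code:
import math

# Rough model of mantissa rounding at each shader precision.
# FP16 = s10e5, FP24 (R300) = s16e7, FP32 = s23e8.
# Only mantissa width is modelled; exponent range and denormals are ignored.

def quantize(x, mantissa_bits):
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)                   # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)     # +1 accounts for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

value = 1.0 / 3.0                          # e.g. an interpolated texture coordinate
for name, bits in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
    q = quantize(value, bits)
    print(f"{name}: {q:.9f}  (error {abs(q - value):.2e})")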

And no amount of apologist fanboy FUD will change this.
 
If the rumors that R420/R423 will support a higher precision are true, and _PP support comes along with that higher precision (which I believe it will, since for R300 to easily support _PP it would have had to be FP12, which is below the minimum Microsoft specifies), then I will LMAO. It will be hugely entertaining to see the fanboys explain how it all came about, given FP24's superiority.

As I said before, you should see what nVidia's FP32/FP16 is truly capable of with NV40. It's a shame how NV3x ended up, but nVidia were correct to just get on with NV40 rather than waste resources and effort trying to fix the unfixable. In the end NV3x is extremely competitive in everything bar DX9, and DX9 isn't exactly setting the gaming world on fire at present.
 
(which I believe it will, since for R300 to easily support _PP it would have had to be FP12, which is below the minimum Microsoft specifies), then I will LMAO

Get a clue, Greg. Please paddle in the shallow end, where your understanding best suits you.
 
radar1200gs said:
If the rumors that R420/R423 will support a higher precision are true, and _PP support comes along with that higher precision (which I believe it will, since for R300 to easily support _PP it would have had to be FP12, which is below the minimum Microsoft specifies), then I will LMAO.

WTF????

Just where on earth have you pulled that one from?

The ATi guys have already said that if they go to FP32 it will be the same 'layout' as R300 - as in one precision through the pipeline, not multiple.... Heck, they already said that *if* MS had made FP32 the minimum precision in DX9, the R300 could have handled that with 'relatively minor' alterations.... die size would have gone up and hence yield per wafer would have decreased, causing the cost of production to rise slightly... but they could have/would have done it.... Multiple precisions were never in the picture.



OKAY - given a hypothetical situation, however, where R4xx is a multiple-precision chip, why oh why would you suggest that they would use FP12 anyway? Wouldn't FP24/32 make more sense....???? Or *even* FP16/32 like NV????

BESIDES, R300 has bugger all to do with the future of multiple precisions anyway; it was, is, and will remain FP24, and there is no way to change that - it is IN THE HARDWARE! Any and all _pp hints in place are ignored and everything is run at full precision.

Excuse me while I fall off my chair laughing at you.
 
Dio said:
So the point is, it's not just 'a few simple rules'.
It is. You just have to be aware of "creeping errors," and know how to spot them when they occur. I don't think this will be a problem for the majority of shaders.
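
For what it's worth, here is a toy numpy sketch of what "creeping errors" look like (my own example, not tied to any particular shader): accumulate a small step a thousand times at half precision and at single precision and compare the drift.

Code:
import numpy as np

# Accumulate 1000 steps of 0.001 at half (FP16) and single (FP32) precision.
# Each FP16 addition near 1.0 snaps to a grid of roughly 0.0005, so the
# error creeps in step by step; FP32 stays very close to the true sum.

step16, step32 = np.float16(0.001), np.float32(0.001)
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(1000):
    acc16 = np.float16(acc16 + step16)     # explicit cast keeps the sum in FP16
    acc32 = np.float32(acc32 + step32)

print("FP16 sum:", float(acc16))           # noticeably short of 1.0
print("FP32 sum:", float(acc32))           # very close to 1.0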
 