NVIDIA Are the Industry Standard Shader Driving Force?

It does not matter whether the wrapper is broken or not. The fact is, not everything is being rendered as it should be; therefore (as a developer like Mr Huddy should well realise) you cannot claim that Dawn runs faster on ATi than on nVidia - you are comparing apples with oranges...
 
FUDie said:
For those who will claim that NVIDIA's shaders are superior I ask "then why can you not render applications with 32-bit floats and instead drop back to FX12?" If your reply is that 32-bit floats aren't required I ask "then why are there image quality differences between your output and refrast? After all you claim to have the superior product..."

To be fair, he said that nVidia will have the standard shaders, not that their shaders are superior :)

It remains to be seen whether the GeForce FX 5200 will actually help us as gamers or hold us back.
 
I never said NV30's shaders were superior - I simply stated that some would have you believe ATi's shaders were superior. If this is the case, they should have little trouble doing everything nVidia shaders can, and more - hence a little hair and some eyelashes should, in theory, pose little trouble...
 
Radar, why shouldn't he say that Dawn runs faster on ATI boards? It does.

Because of image quality differences? Missing visuals? Isn't that how Nvidia has been running 3DMark? Or ShaderMark? Or the Splinter Cell benchmark? Isn't Nvidia the one who wanted reviewers to choose lower quality settings for Nvidia cards for benchmarks in reviews? Isn't Nvidia the one who has their crap AA run against ATI's excellent AA? Heck, I've seen half a dozen reviews now where Nvidia's AA mysteriously didn't even turn on for some of the AA benches even though it was chosen in the control panel. Let's not forget the mystery of the missing dragon. Or the mystery of the disappearing pixel-shaded water in Splinter Cell.

That's no different than ATI not rendering Dawn's hair exactly the same, or missing eyelashes. I think it's perfect. It's an Nvidia demo, so it should be run the Nvidia way. Missing visuals. Heh. Isn't that "the way it's meant to be benched"? ;)

Or I could go the Nvidia PR route:

"Hey, those aren't missing visual effects. That's one of our newest greatest features. We intelligently remove what we don't think is important. You may think it's silly, but believe me, our customers will love it. They will love the performance gains for the minor loss in image quality. These are the things we think our customers want, so we give it to them."

OR


"9 out of 10 demo's are developed with our card in mind. Pretty soon it should be 10 out of 10. Therefore, they are developed our way. We don't believe that eyelashes should be rendered. Nobody uses eyelashes in real games. If any developer wants to sell his game he better make sure he does it our way. NO EYELASHES."
 
As has been pointed out already in this thread, the issue may not be symptomatic of anything to do with ATI's shaders but of the wrapper - work out whether it's the wrapper that's causing these issues first.

However, what are these issues you're talking about? Seems that the hair is rendering fine here:

http://www.rage3d.com/articles/atidawning/

As for the 5200, don't expect that it will set any kind of precedent developmentally speaking. Its DX9 shaders are so godawfully slow that I doubt developers will use this as a DX9 target, but will instead still look to higher-end boards than this when targeting actual DX9 performance.
 
whql said:
As has been pointed out already in this thread, the issue may not be symptomatic of anything to do with ATI's shaders but of the wrapper - work out whether it's the wrapper that's causing these issues first.

However, what are these issues you're talking about? Seems that the hair is rendering fine here:

http://www.rage3d.com/articles/atidawning/

As for the 5200, don't expect that it will set any kind of precedent developmentally speaking. Its DX9 shaders are so godawfully slow that I doubt developers will use this as a DX9 target, but will instead still look to higher-end boards than this when targeting actual DX9 performance.

Okay, dealing with your 5200 comments first: contrary to the beliefs of most enthusiast forum dwellers (who make up 15% of the total gaming market at best), on the PC platform the lowest common denominator rules (developers want to maximise sales and will target the most common hardware out there before bothering with support for more modern hardware). As far as DX9 hardware goes, the 5200 is the lowest common denominator and will far outsell all higher models from any manufacturer combined. ATi currently has no answer to this. Developers will have to take this into account when designing their games.

It's ridiculously difficult to track down a decent screenshot of Dawn on the 'net, unfortunately (and somewhat surprisingly), and I don't have half a year to wait for NV30emulate to do its thing on my poor old GF3.

http://www.skenegroup.net/temp/dawn6.jpg
Dawn on a 5800

http://www.rage3d.com/articles/atidawning/image.php?pic=shots/1024x768_4aa16af_1.jpg&comment=
ATi dawn

Notice the ATi model has coarser hair and the lighting/coloring in it isn't quite right. Note also the total absence of eyelashes.

http://www.nvnews.net/vbulletin/attachment.php?s=&postid=129299
Dawn eyelash detail on NV3x

http://www.nvnews.net/vbulletin/attachment.php?s=&postid=144526
Dawn eye detail on ATi (taken from a larger image)

http://www.harlow.nildram.co.uk/misc/pix/ATI/smarties/9700PRO best quality.bmp
The source image for the ATi eye detail above
 
Okay, dealing with your 5200 comments first: contrary to the beliefs of most enthusiast forum dwellers (who make up 15% of the total gaming market at best), on the PC platform the lowest common denominator rules (developers want to maximise sales and will target the most common hardware out there before bothering with support for more modern hardware). As far as DX9 hardware goes, the 5200 is the lowest common denominator and will far outsell all higher models from any manufacturer combined.

Actually, if you read our developer interview you'd realise that was a slightly simplistic statement. Developers code to multiple levels at multiple times - they have to span from the high end to the low end, and that usually means multiple different levels of support as well.

Don't assume that because the 5200 is "DX9" that it will stop there. The developer will still code to the high end, which may mean better utilisation of DX9 for those high-end boards. The level of DX9 performance of the 5200 may mean that developers still opt to put it in a lower-end bracket with lower feature levels - they will still look to high-end boards to utilise better shaders. For instance, Half-Life 2 will scale from DX6 to DX9 boards, and they are even putting in place textures and poly counts that present high-end boards cannot run - they didn't just sit there and look at what the lowest level was and stop there; the entire thing is coded to meet a range of hardware.
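For what it's worth, "multiple levels" usually boils down to nothing more exotic than a caps check at startup - something along the lines of this rough sketch (the tier names and thresholds here are made up purely for illustration, it isn't code from any particular engine, and a real renderer layers performance heuristics on top of the raw caps):

Code:
#include <windows.h>
#include <d3d9.h>

// Illustrative only: pick a rendering path from what the driver reports,
// not from what the "average" board is assumed to be.
enum ShaderTier { TIER_FIXED_FUNCTION, TIER_PS_1_1, TIER_PS_2_0 };

ShaderTier PickShaderTier(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return TIER_FIXED_FUNCTION;              // no HAL device: lowest path

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return TIER_PS_2_0;                      // DX9-class part
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return TIER_PS_1_1;                      // DX8-class part
    return TIER_FIXED_FUNCTION;                  // DX6/DX7-class part
}

A 5200 reports PS2.0 and so lands in the top tier by default; whether a developer then demotes it to the DX8 path for speed is a per-game judgement layered on top, not something decided by freezing shader usage at the slowest DX9 part.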
 
radar:
If I understand you correctly, your statement is that because the 5200 is (or will be) one of the most widely used DX9 cards out there, developers will use nVidia's DX9 shader capabilities/features in their games. (Sorry if I misunderstood you.)
It's not true, because the 5200 is NOT CAPABLE of running any DX9 shader. Sure, it has the feature, but it hasn't got the performance.
So, if developers keep in mind that the 5200 is the most widely used card, THEY WON'T USE any DX9 in their games until something better comes to the low-end market... fortunately that won't happen, as Dave stated...
 
robert_H said:
It's not true, because the 5200 is NOT CAPABLE of running any DX9 shader. Sure, it has the feature, but it hasn't got the performance.
It is capable - you said it yourself. You're just making a judgement on whether or not the speed is acceptable.

However, we don't know the degree of PS2.0 usage in near-future games, so we can't begin to know whether or not the DX9 capabilities of the 5200 are useless. We know it's slower than everything else out there, but that doesn't make it useless--especially if all it means is the end user needs to run at 800x600 without AA.

If there's a large population of 5200's out there, the average developer will keep it in mind.
 
It is capable - you said it yourself. You're just making a judgement on whether or not the speed is acceptable.

exactly
From what we have seen so far (I don't mean only 3DMark, but every other bench or program which uses PS2.0), the whole FX line is very weak, but the 5200 is simply useless in terms of PS performance. That's my opinion; that doesn't mean I'm right ;)

You're right that maybe the user will run it at 800x600 with no AA, and maybe (just maybe) there will be some games using PS2.0 that will be playable at a resolution like that.

I'm just saying it's absolutely impossible to BASE the future of shaders (the usage of shaders, the amount of them used, and the way these shaders are written) on these cards, as radar stated. But OK, maybe I'm wrong; this thread is not about the 5200, so I'm out :)
 
The 5200 series (NV34) was a stroke of genius on nVidia's part.

The only pity is that they never launched NV34 in September and explained the manufacturing difficulties on the 0.13 micron process better. If they had done that, chances are nVidia would be in better shape trust-wise than they are now.

Market share that parts like the 5200 afford you is a wonderful thing. It is why Half-Life 2 will scale all the way down to a TNT2.
 
radar1200gs said:
....
Okay, dealing with your 5200 comments first: contrary to the beliefs of most enthusiast forum dwellers (who make up 15% of the total gaming market at best), on the PC platform the lowest common denominator rules (developers want to maximise sales and will target the most common hardware out there before bothering with support for more modern hardware). As far as DX9 hardware goes, the 5200 is the lowest common denominator and will far outsell all higher models from any manufacturer combined. ATi currently has no answer to this. Developers will have to take this into account when designing their games.
....

There's some truth to this in so far as developers not restricting their software to the newest hardware. However, that's a far cry from saying that developers don't support the newest hardware at all--because they certainly do. And of course the baseline for 3D support is constantly ratcheting upwards, which is why you can't do very much today with a V3 or a TNT2 when it comes to even rudimentary 3D games.

Actually, products at the lower end of the 3D-card spectrum don't comprise the bulk of the 3D-gaming market--they are for people whose primary need is 2D display with only occasional 3D-game usage. I.e., they are for people whose primary use of their platform is not 3D gaming.

How well I recall when nVidia started pushing hardware T&L support. Although few developers actually supported it in any major way, this did not stop nVidia from pushing it and publishing strange lists of upcoming games said to support it (few of them actually did). Saying that a game scales downward to support lower-performance hardware does not mean that it won't support newer functionality if it is present. I can't think of a single hardware T&L supporting game that failed to run in the absence of hardware T&L, though.
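The reason most of those titles kept working is that the fallback is fairly trivial: you check one caps bit and, if hardware T&L is absent, let the runtime transform vertices on the CPU instead. A rough sketch of that pattern - shown with the DX9 interface for convenience, though DX7/DX8-era code looked much the same, and the function name here is just for illustration:

Code:
#include <windows.h>
#include <d3d9.h>

// Decide where vertex processing (transform & lighting) should run.
// Without the hardware caps bit, the D3D runtime does it on the CPU
// and the game still renders - just more slowly.
DWORD ChooseVertexProcessing(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;  // GPU T&L available
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;      // CPU fallback
}

The flag that comes back is simply passed to CreateDevice; nothing about the fallback stops a title from using the hardware path when it is there.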

So your argument about what ATi has "to match" the 5200 on the lower end is contradictory, I think. When you can demonstrate that games are running on the 5200 but not on ATi's equivalently priced cards then you will have something to talk about. I mean, on the lower end of the spectrum, what makes you think developers won't target ATi's functionality instead of nVidia's? That would seem to be in line with your presumption.
 
I can't think of a single hardware T&L supporting game that failed to run in the absence of hardware T&L, though.
I can. NOLF 2.

So your argument about what ATi has "to match" the 5200 on the lower end is contradictory, I think. When you can demonstrate that games are running on the 5200 but not on ATi's equivalently priced cards then you will have something to talk about. I mean, on the lower end of the spectrum, what makes you think developers won't target ATi's functionality instead of nVidia's? That would seem to be in line with your presumption.
The simple fact is the 5200 will become the baseline standard by sheer force of numbers. The reason is simple - cost. At the 5200's price point, ATi can't touch nVidia.
 
So radar1200gs, were you trying to say that because ATi's cards are running the Dawn Demo with a hacked wrapper that was not created by them, and the wrapper has problems with the demo, that ATi's shaders have something wrong with them? :rolleyes:
 
IIRC, Dave mentioned that ATI had plans to bring the 9600 series into the mainstream market with boards in the sub-$100 range. If so, I don't feel that the 5200 will be able to set any de facto standards.
 
Squega said:
So radar1200gs, were you trying to say that because ATi's cards are running the Dawn Demo with a hacked wrapper that was not created by them, and the wrapper has problems with the demo, that ATi's shaders have something wrong with them? :rolleyes:
No. Christ, don't you people read? He said it (the wrapper) was having problems and wasn't rendering the same thing that the NVIDIA cards were--pieces of the model, for example, were missing. Therefore, it's disingenuous to point to the fact that it's running faster.
 
Squega said:
So radar1200gs, were you trying to say that because ATi's cards are running the Dawn Demo with a hacked wrapper that was not created by them, and the wrapper has problems with the demo, that ATi's shaders have something wrong with them? :rolleyes:

No, I never said anything of the sort, although you would reasonably expect a card widely touted as being superior to the competition to run the Dawn demo without problems, especially when a developer claims the "superior" card's advantage is clearly in the shaders.

No, my beef was more with Mr Huddy trying to compare the speed of a hacked and incomplete Dawn on ATi with the real Dawn on NV3x. You can't compare apples and oranges. Mr Huddy should know that.
 
No, my beef was more with Mr Huddy trying to compare the speed of a hacked and incomplete Dawn on ATi with the real Dawn on NV3x. You can't compare apples and oranges. Mr Huddy should know that.

Ah, alright. Thanks for clearing it up. Now back to the figh.....debate! :D
 