glw said:
The NV30 and NV40 targets seem to produce the same output as R300, so I've omitted them. I'm more interested in the comparison between R300 and R420. This isn't every shader; I picked out some of those that require multiple passes with the R300 family.
Bouncing Zabaglione Bros. said:
glw said:
The NV30 and NV40 targets seem to produce the same output as R300, so I've omitted them. I'm more interested in the comparison between R300 and R420. This isn't every shader; I picked out some of those that require multiple passes with the R300 family.

If you've got the numbers handy, I'd be quite interested in also seeing the NV30/NV40 numbers.
Chalnoth said:
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.
jvd said:
Chalnoth said:
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.

Are you talking about the NV30/NV40 profiles running on the R300/R420? If so, those profiles most likely use FP16 a lot, which wouldn't change the scores on the R3x0 and R420 because they both run at FP24 the whole way through.
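jvd's FP16/FP24 point is easy to make concrete. Below is a minimal HLSL sketch (a hypothetical shader, not output from Cg's actual profiles): the half type is a request for partial precision, which NV3x can honour with FP16 registers, while R3x0/R420 compute everything at FP24 and simply ignore the hint, so the choice of profile can't move their scores.

    // Hypothetical example: "half" asks for partial precision.
    // NV3x can execute these ops at FP16; R3x0/R420 always run at FP24,
    // so on ATI hardware the hint is a no-op and the output is unchanged.
    half4 main(float2 uv : TEXCOORD0,
               uniform sampler2D normalMap,
               uniform half3 lightDir) : COLOR
    {
        // Unpack a [0,1] normal map into [-1,1]
        half3 n = (half3)(tex2D(normalMap, uv).rgb * 2.0 - 1.0);
        half  d = saturate(dot(n, lightDir)); // FP16 on NV3x, FP24 on R3x0/R420
        return half4(d.xxx, 1.0);
    }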
John Reynolds said:
But, yes, IMO Dave needs to break out the ban stick and remove a handful of fuckwits from this board.

What I'm talking about wouldn't be an optimization, just simply exposing instruction lengths/features. I mean, seriously, if they're going to have NV30/NV40 profiles at all, they might as well reflect the hardware.

glw said:
Of course that's obvious. Maybe if Nvidia make Cg optimise properly for ATI chips, then ATI will get ASHLI running well on Nvidia chips?
That's not what he meant at all. See his post below yours.

Albuquerque said:
Ok, so just one more post: I would surely agree, if you're lacking features that are mainstream -- please show me a mainstream SM3 application, or an announced SM3 application that has no fallback to PS 1.x (let alone PS 2.x). "Performance is no substitute for core features. See: 3dfx, Ruby." And in reverse, featureset is no excuse for poor performance either.

radar1200gs said:
The thing is SM3.0 will allow you to fall back, but something like PS1.4 won't.

FUDie said:
Wrong again, Radar. PS 1.4 allows just as much of a fallback to PS 1.x as PS 3.0 does for PS 2.0: i.e. none whatsoever. The developer has to explicitly code the shaders and app to allow the fallback.

ChrisRay said:
Maybe he is suggesting something different. For example, several Shader 3.0 operations can fit into Shader 2.0 code, and Shader 2.0 can sometimes fall back to Shader 1.4 code. But in many cases Shader 1.4 can't fall back to 1.1.

That's not correct. PS 1.4 has a limit of 16 (I think) instructions over 2 phases; how are you planning on converting your PS 2.0 shader with 4 "phases" (sets of dependent texture reads) and 50 instructions to a single-pass PS 1.4 shader?

ChrisRay said:
I could be completely wrong, but the way Skuzzy described it to me, most current Shader 2.0 operations can be done in Pixel Shader 1.4, albeit with less precision, but still in a single pass. But many Shader 1.4 operations can't be done in Shader 1.1 without multiple passes. Maybe he's assuming Shader 3.0/2.0 would be similar.

What you're saying about PS 1.4 and 1.1 applies just the same to PS 2.0 and 1.4. Converting PS 3.0 shaders to 2.0 shaders is non-trivial. First, there are some operations in 3.0 that are not in 2.0 (gradients, conditionals, etc.). Use of some of these operations is guaranteed to require extra passes, if it can be done at all. HLSL is not going to multipass these shaders, as it's just a high-level language with a compiler; the application must explicitly support it.
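To make the "gradients, conditionals" point concrete, here is a short HLSL sketch (a hypothetical shader, not one from this thread): it uses the gradient intrinsics and a dynamic branch, both of which exist in the ps_3_0 instruction set but not in the base ps_2_0 target, so a 2.0 version has to be restructured rather than just recompiled.

    // Hypothetical ps_3_0-only shader: ddx/ddy (gradient instructions) and a
    // genuinely dynamic branch are unavailable in plain ps_2_0.
    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D tex) : COLOR
    {
        float2 dx = ddx(uv);                 // gradient ops: not in ps_2_0
        float2 dy = ddy(uv);
        float detail = length(dx) + length(dy);
        if (detail > 0.01)                   // dynamic branch: ps_3_0
            return tex2D(tex, uv) * 0.5;     // cheap path when heavily minified
        return tex2D(tex, uv);
    }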
Chalnoth said:
Why would developers bother to support PS 1.4 now? Most hardware that supports PS 1.4 also supports 2.0, so I don't see any reason for much work to be done in PS 1.4.
ChrisRay said:
Obviously when I said "some" I did not mean all. That applies for Pixel Shader 2.0.

So it's OK for you to look at worst case scenarios of PS 1.4 vs. 1.1 but not PS 2.0 vs. 1.4?

ChrisRay said:
It depends on the shader. Not all Shader 2.0 shaders push Shader 2.0 to its limits. In many cases Shader 1.4 can handle a lot of the code being written for Shader 2.0. That's where the point applies. You are taking worst case scenarios and trying to compare them to "best" case scenarios. But there are many cases where Shader 1.1 can't produce a comparable result to Shader 1.4 without multiple passes.

ChrisRay said:
I'm aware that in worst case scenarios Shader 2.0/3.0 can be quite different, but in many cases an operation designed for a single pass in Shader 3.0 is going to do a single pass in Shader 2.0 code. When the Far Cry patch gets released, we'll probably see this. Someone just might opt to use a Shader 3.0 profile even when it can be done in Shader 2.0.

I'm sure that Far Cry will look no different because the content was not written to take advantage of PS 3.0.

ChrisRay said:
Same with 2.0/1.4: someone might write a shader with a 2.0 profile and still easily be able to write the same shader in a 1.4 profile.

Or not.

ChrisRay said:
However, with the move to Shader 2.0, Shader 1.4's adoption is slowly moving forward as well. I imagine many Shader 2.0 games will have Shader 1.4 fallbacks. I also don't believe Shader 1.4 is as "dead" as some people would like you to believe.

It's moving forward because the number of platforms that support PS 1.4 is growing. For a long time only ATI supported PS 1.4.

ChrisRay said:
It is my honest opinion that Shader 1.4's slow adoption was because it was so different at the time, and such a move forward, that it was not adopted as quickly. As we move forward, Shader 1.4 will become more usable, as most hardware beyond the GeForce 4 supports it now. I believe the same will happen with Shader 3.0.

It was slow to be adopted because only one IHV supported it! NVIDIA finally supported it with the GeForce FX 5x00 line, but it was slower than it could have been because the chip wasn't that great at floating point operations, which it had to use to meet the precision requirements of PS 1.4.
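The "developer has to explicitly code the fallback" argument is worth a concrete sketch. Below, one hypothetical specular highlight is written twice in HLSL: the ps_2_0 version uses pow() directly, while the ps_1_4 version swaps it for a precomputed lookup texture, since the 1.4 instruction set can't absorb the expanded arithmetic. Nothing here falls back automatically; the application has to ship and select both versions itself.

    // Hypothetical ps_2_0 version: arithmetic specular power is fine here.
    float4 SpecularPS20(float2 uv : TEXCOORD0,
                        uniform sampler2D normalMap,
                        uniform float3 halfVec) : COLOR
    {
        float3 n = tex2D(normalMap, uv).rgb * 2.0 - 1.0;
        float  s = pow(saturate(dot(n, halfVec)), 32.0);
        return s.xxxx;
    }

    // Hand-written ps_1_4 fallback: pow() is replaced by a dependent read
    // into a 1D texture where powTable[x] = x^32 was precomputed by the app.
    float4 SpecularPS14(float2 uv : TEXCOORD0,
                        uniform sampler2D normalMap,
                        uniform sampler1D powTable,
                        uniform float3 halfVec) : COLOR
    {
        float3 n = tex2D(normalMap, uv).rgb * 2.0 - 1.0;
        float  d = saturate(dot(n, halfVec));
        return tex1D(powTable, d).xxxx;   // dependent read, PS 1.4 phase 2
    }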
ChrisRay said:
In many cases 2.0 will probably be just as good as 3.0, but not all. My only gripe with ATI is them telling devs not to code 3.0 for Nvidia hardware. I find that somewhat hypocritical since they have intentions of supporting it in the future.

They were telling devs to avoid it because it most likely won't be strong on NV's hardware this generation (most likely from feedback they had from other devs). Did you also disagree with Nvidia telling devs not to use PS 2.0, and instead to use DX8 PS 1.1/1.4, because their own hardware sucked at PS 2.0 (whilst they simultaneously marketed and sold those parts as full DX9)?
ChrisRay said:
Actually I think it's best to look at both scenarios, and acknowledge the pros and cons of each. The same could be said for Shader 1.4/2.0 games lately, wouldn't you agree? In many cases games aren't taking advantage of 2.0 and are capable of a 1.4 shader fallback.

I wouldn't say so. Didn't someone show some of the shaders from Tomb Raider a while ago? How can you convert a 48 instruction PS 2.0 shader to a PS 1.4 one?

ChrisRay said:
I think it's a little bit of both. If Shader 1.4 was worthless, people wouldn't use it anyway despite how much hardware supports it.

Who said it's worthless? I am saying it's not useful to think of PS 1.4 as a fallback to 2.0. If your content can be written in PS 1.4 then by all means use it.

ChrisRay said:
Isn't that kind of what I said? As we move forward, Shader 3.0 will probably be supported in the same fashion as Shader 1.4 is moving forward now. To say Shader 1.4 is a good thing and Shader 3.0 is not is an argument that doesn't make sense to me.

You completely missed my point. I never said that PS 1.4 is bad. I am arguing with your claim that PS 2.0 shaders can be written in PS 1.4!
ChrisRay said:
Then you can easily use it as a fallback. Why not do it? Current games seem to do it.