R420 may beat NV40 in Doom 3 with anti-aliasing

:oops: Well it doesn't look like it uses SIN or COS at
all, which rather surprises me. :?

You can get ASHLI from the developer page of ATI's
site; you may well want to check what I posted. :)
 
glw said:
The NV30 and NV40 targets seem to produce the same output as R300, so
I've omitted them. I'm more interested in the comparison between R300
and R420. This isn't every shader, I picked out some of those that
require multiple passes with the R300 family.

If you've got the numbers handy, I'd be quite interested in also seeing the NV30/NV40 numbers.
 
Bouncing Zabaglione Bros. said:
glw said:
The NV30 and NV40 targets seem to produce the same output as R300, so
I've omitted them. I'm more interested in the comparison between R300
and R420. This isn't every shader, I picked out some of those that
require multiple passes with the R300 family.

If you've got the numbers handy, I'd be quite interested in also seeing the NV30/NV40 numbers.

They are the same as the R300, which is why I didn't include them.
 
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.
 
Chalnoth said:
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.

Are you talking about the NV30/NV40 profiles running on the R300/R420?

If so, they most likely use FP16 a lot in those profiles, and thus it wouldn't change the scores on the R3x0 and R420, because they both do FP24 the whole way through?
 
Chalnoth said:
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.

Of course that's obvious; maybe if Nvidia make Cg optimise properly for
ATI chips then ATI will get ASHLI running well on Nvidia chips? :)

Besides, I posted those figures as they indicate that an R420 has some
real advantages at a shader level over the R300 generation. If they
can add support for the new instructions, and perhaps the new
co-issue permutations, and possibly the new optimised F-buffer,
then perhaps the R420 will do quite a bit better than has been
expected so far. I might even give up waiting for a 6800
and get an R420 instead. :)
 
jvd said:
Chalnoth said:
Which would seem to indicate that they don't bother to use the more advanced shading capabilities of the NV30/NV40 at all in those profiles.

Are you talking about the NV30/NV40 profiles running on the R300/R420?

If so, they most likely use FP16 a lot in those profiles, and thus it wouldn't change the scores on the R3x0 and R420, because they both do FP24 the whole way through?

There are no performance figures at all, just the program lengths. ASHLI
can generate the code without running it; in fact I didn't run the shaders
at all, I just went through the more complex programs noting down the
passes and instructions produced. ASHLI's a cool bit of software which
will take in Renderman/HLSL/GLSL and compile to ARB_fp & ARB_vp, or
VS/PS 2.0 and 2.x. There doesn't appear to be 3.0 support yet, which is
not surprising given the source. :) Personally I'm most interested in GLSL
as it has the greatest scope for optimisation. Presumably the driver
can match or beat the ARB_vp/fp output. If this is true then the R420
will be a much better card for GLSL support than the R3x0, though
not as good as the NV40 or P20.
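To give a feel for what ASHLI is up against, here is a hypothetical HLSL fragment (the sampler and constants are invented for illustration; this is not ASHLI output). It unrolls to one texture read plus a handful of ALU ops per octave. At eight octaves it still fits in one pass, but scale the octave count or the lighting model up and it soon exceeds R300's per-pass limits of 64 ALU and 32 texture instructions, which is exactly where ASHLI's multipass virtualisation kicks in; R420's longer program limits take far more in a single pass.

    // Hypothetical fractal-sum pixel shader (illustrative only).
    sampler2D NoiseTex : register(s0);

    float4 main(float2 uv : TEXCOORD0, float3 n : TEXCOORD1) : COLOR
    {
        float v   = 0.0;
        float amp = 0.5;
        // Unrolled by the compiler: one texture read plus a few ALU
        // instructions per octave, so the count scales linearly.
        for (int i = 0; i < 8; i++)
        {
            v   += amp * tex2D(NoiseTex, uv).r;
            uv  *= 2.017;   // next octave
            amp *= 0.5;
        }
        float3 light = normalize(float3(0.3, 0.8, 0.5));
        float  diff  = saturate(dot(normalize(n), light));
        return float4(v.xxx * diff, 1.0);
    }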
 
John Reynolds said:
But, yes, IMO Dave needs to break out the ban stick and remove a handful of fuckwits from this board.

Agreed. The noise-to-signal ratio is getting out of hand.
 
glw said:
Of course that's obvious; maybe if Nvidia make Cg optimise properly for
ATI chips then ATI will get ASHLI running well on Nvidia chips? :)
What I'm talking about wouldn't be an optimization, just exposing the instruction lengths/features. I mean, seriously, if they're going to have NV30/NV40 profiles at all, they might as well reflect the hardware.

As far as Cg is concerned, ATI still has the opportunity to write their own optimized compiler. But now that GLSL is available, I don't see much reason to use Cg, except for programming for old hardware under OpenGL.
 
ChrisRay said:
FUDie said:
radar1200gs said:
Albuquerque said:
Ok, so just one more post:

Performance is no substitute for core features. See: 3dfx, Ruby.
I would surely agree, if you're lacking features that are mainstream -- please show me a mainstream SM3 application, or an announced SM3 application that has no fallback to PS 1.x (let alone PS 2.x) :)

And in reverse, featureset is no excuse for poor performance either ;)
The thing is, SM 3.0 will allow you to fall back, but something like PS 1.4 won't.
PS 1.4 allows just as much of a fallback to PS 1.x as PS 3.0 does for PS 2.0: i.e. none whatsoever. The developer has to explicitly code the shaders and app to allow the fallback.

Wrong again, Radar.
Maybe he is suggesting something different. For example, several Shader 3.0 operations can fit into Shader 2.0 code, and Shader 2.0 can sometimes fall back to Shader 1.4 code. But in many cases Shader 1.4 can't fall back to 1.1.
That's not what he meant at all. See his post below yours.
I could be completely wrong, but the way Skuzzy described it to me, most current Shader 2.0 operations can be done in Pixel Shader 1.4, albeit with less precision, but still in a single pass.
That's not correct. PS 1.4 has a limit of 16 (I think) instructions over 2 phases; how are you planning on converting your PS 2.0 shader with 4 "phases" (sets of dependent texture reads) and 50 instructions to a single-pass PS 1.4 shader?
But many Shader 1.4 operations can't be done in Shader 1.1 without multiple passes. Maybe he's assuming Shader 3.0/2.0 would be similar.
What you're saying about PS 1.4 and 1.1 applies just the same to PS 2.0 and 1.4. Converting PS 3.0 shaders to 2.0 shaders is non-trivial. First, there are some operations in 3.0 that are not in 2.0 (gradients, conditionals, etc.). Use of some of these operations is guaranteed to require extra passes, if it can be done at all. HLSL is not going to multipass these shaders, as it's just a high-level language with a compiler; the application must explicitly support it.

-FUDie
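To make the gradients/conditionals point concrete, here is a minimal hypothetical HLSL sketch (the sampler and inputs are invented; nothing here is from a real game). Compiled for ps_3_0 it is straightforward; for a ps_2_0 target the ddx/ddy line will not compile at all, since the dsx/dsy instructions only appear with ps_2_x caps and ps_3_0, and the conditional has to be flattened into computing both sides and selecting one with a compare.

    sampler2D BaseTex : register(s0);

    float4 main(float2 uv : TEXCOORD0,
                float3 n  : TEXCOORD1,
                float3 l  : TEXCOORD2) : COLOR
    {
        float4 c = tex2D(BaseTex, uv);

        // Gradients: no ps_2_0 equivalent exists, so this line alone
        // rules out a ps_2_0 compile.
        float2 duv = ddx(uv) + ddy(uv);
        c.rgb *= saturate(1.0 - 40.0 * length(duv)); // crude minification fade

        // Conditional: a real dynamic branch under ps_3_0; a ps_2_0
        // target must evaluate both paths and pick one with cmp.
        float ndl = dot(normalize(n), normalize(l));
        if (ndl > 0.0)
            c.rgb *= ndl;   // lit path
        else
            c.rgb *= 0.05;  // cheap ambient-only path

        return c;
    }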
 
What you're saying about PS 1.4 and 1.1 applies just the same to PS 2.0 and 1.4. Converting PS 3.0 shaders to 2.0 shaders is non-trivial. First, there are some operations in 3.0 that are not in 2.0 (gradients, conditionals, etc.). Use of some of these operations is guaranteed to require extra passes, if it can be done at all. HLSL is not going to multipass these shaders, as it's just a high-level language with a compiler; the application must explicitly support it.

Obviously when I said "some" I did not mean all. That applies for Pixel Shader 2.0.

That's not correct. PS 1.4 has a limit of 16 (I think) instructions over 2 phases; how are you planning on converting your PS 2.0 shader with 4 "phases" (sets of dependent texture reads) and 50 instructions to a single-pass PS 1.4 shader?

Depends on the shader. Not all Shader 2.0 shaders are pushing the limits of Shader 2.0. In many cases Shader 1.4 can handle a lot of the code being written for Shader 2.0. That's where the point applies. You are taking worst-case scenarios and trying to compare them to "best"-case scenarios.

But there are many cases where Shader 1.1 can't produce a comparable result to Shader 1.4 without multiple passes. I'm aware that in worst-case scenarios Shader 2.0/3.0 can be quite different, but in many cases an operation designed for a single pass in Shader 3.0 is going to do a single pass in Shader 2.0 code. When the Far Cry patch gets released, we'll probably see this. Someone just might opt to use a Shader 3.0 profile even when it can be done in Shader 2.0.

Same with 2.0/1.4: someone might write a shader with a 2.0 profile and still easily be able to write the same shader in a 1.4 profile.

However, with the move to Shader 2.0, Shader 1.4's adoption is slowly moving forward as well. I imagine many Shader 2.0 games will have Shader 1.4 fallbacks. I also don't believe Shader 1.4 is as "dead" as some people would like you to believe.

It is my honest opinion that Shader 1.4's slow adoption was because it was so different at the time, and such a move forward, that it was not adopted as quickly. As we move forward, Shader 1.4 will become more usable, as most hardware beyond the GeForce 4 supports it now. I believe the same will happen with Shader 3.0.
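Both halves of this argument fit in one hypothetical sketch. A bread-and-butter diffuse bump shader like the one below (samplers invented for illustration) compiles for either a ps_2_0 or a ps_1_4 HLSL target: the normal unpack maps onto PS 1.4's _bx2 modifier and the whole thing fits comfortably inside its two phases of roughly eight arithmetic instructions each, so the 1.4 fallback is nearly free. Add a few more lights, a normalize, or deeper dependent reads, though, and it stops fitting, which is FUDie's 50-instruction case.

    sampler2D BaseTex   : register(s0);
    sampler2D NormalTex : register(s1);

    // Simple per-pixel diffuse bump mapping; l is the tangent-space
    // light direction. Small enough for ps_1_4 as well as ps_2_0.
    float4 main(float2 uv : TEXCOORD0, float3 l : TEXCOORD1) : COLOR
    {
        float3 nrm = tex2D(NormalTex, uv).rgb * 2.0 - 1.0; // _bx2-style unpack
        float  d   = saturate(dot(nrm, l));
        return tex2D(BaseTex, uv) * d;
    }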
 
Why would developers bother to support PS 1.4 now? Most hardware that supports PS 1.4 also supports 2.0, so I don't see any reason for much work to be done in PS 1.4.
 
Chalnoth said:
Why would developers bother to support PS 1.4 now? Most hardware that supports PS 1.4 also supports 2.0, so I don't see any reason for much work to be done in PS 1.4.

I believe using Shader 1.4 on Nvidia hardware (specifically the FX line) can be a good thing because of its ill-fated Shader 2.0 performance in many cases.

That's a very valid reason. The FX line exists and is entrenched in the market. You might as well try and support it the best you can. The FX line is basically being treated as a high-end DirectX 8.0 solution and a low-end DirectX 9.0 solution. So Shader 1.4 code could probably produce many results comparable to Shader 2.0 code on that hardware.

I find it ironic that the cards most likely to be running 1.4 shaders are actually the FX line.

Chris
 
ChrisRay said:
What you're saying about PS 1.4 and 1.1 applies just the same to PS 2.0 and 1.4. Converting PS 3.0 shaders to 2.0 shaders is non-trivial. First, there are some operations in 3.0 that are not in 2.0 (gradients, conditionals, etc.). Use of some of these operations is guaranteed to require extra passes, if it can be done at all. HLSL is not going to multipass these shaders, as it's just a high-level language with a compiler; the application must explicitly support it.
Obviously when I said "some" I did not mean all. That applies for Pixel Shader 2.0.
That's not correct. PS 1.4 has a limit of 16 (I think) instructions over 2 phases; how are you planning on converting your PS 2.0 shader with 4 "phases" (sets of dependent texture reads) and 50 instructions to a single-pass PS 1.4 shader?
Depends on the shader. Not all Shader 2.0 shaders are pushing the limits of Shader 2.0. In many cases Shader 1.4 can handle a lot of the code being written for Shader 2.0. That's where the point applies. You are taking worst-case scenarios and trying to compare them to "best"-case scenarios.

But there are many cases where Shader 1.1 can't produce a comparable result to Shader 1.4 without multiple passes.
So it's ok for you to look at worst-case scenarios of PS 1.4 vs. 1.1 but not PS 2.0 vs. 1.4?
I'm aware that in worst-case scenarios Shader 2.0/3.0 can be quite different, but in many cases an operation designed for a single pass in Shader 3.0 is going to do a single pass in Shader 2.0 code. When the Far Cry patch gets released, we'll probably see this. Someone just might opt to use a Shader 3.0 profile even when it can be done in Shader 2.0.
I'm sure that Far Cry will look no different because the content was not written to take advantage of PS 3.0.
Same with 2.0/1.4: someone might write a shader with a 2.0 profile and still easily be able to write the same shader in a 1.4 profile.
Or not.
However, with the move to Shader 2.0, Shader 1.4's adoption is slowly moving forward as well. I imagine many Shader 2.0 games will have Shader 1.4 fallbacks. I also don't believe Shader 1.4 is as "dead" as some people would like you to believe.
It's moving forward because the number of platforms that support PS 1.4 is growing. For a long time only ATI supported PS 1.4.
It is my honest opinion that Shader 1.4's slow adoption was because it was so different at the time, and such a move forward, that it was not adopted as quickly. As we move forward, Shader 1.4 will become more usable, as most hardware beyond the GeForce 4 supports it now. I believe the same will happen with Shader 3.0.
It was slow to be adopted because only one IHV supported it! NVIDIA finally supported it with the GeForce FX 5x00 line, but it was slower than it could have been because the chip wasn't that great at floating-point operations, which they had to use to meet the precision requirements of PS 1.4.

-FUDie
 
So it's ok for you to look at worst-case scenarios of PS 1.4 vs. 1.1 but not PS 2.0 vs. 1.4?

Actually, I think it's best to look at both scenarios and acknowledge the pros and cons of each.

I'm sure that Far Cry will look no different because the content was not written to take advantage of PS 3.0.

The same could be said for Shader 1.4/2.0 games lately, wouldn't you agree? In many cases games that aren't taking advantage of 2.0 shaders are capable of a 1.4 shader fallback.

It's moving forward because the number of platforms that support PS 1.4 is growing. For a long time only ATI supported PS 1.4.

I think it's a little bit of both. If Shader 1.4 were worthless, people wouldn't use it anyway, despite how much hardware supports it.

It was slow to be adopted because only one IHV supported it! NVIDIA finally supported it with the GeForce FX 5x00 line, but it was slower than it could have been because the chip wasn't that great at floating-point operations, which they had to use to meet the precision requirements of PS 1.4.

Isn't that kind of what I said? As we move forward, Shader 3.0 will probably be supported in the same fashion that Shader 1.4 is moving forward now. To say Shader 1.4 is a good thing and Shader 3.0 is not is an argument that doesn't make sense to me.

Yes, the FX line supports it. But I don't think it's entirely fair to criticise ATI or Nvidia for not supporting either function; it would seem ATI has plans to support Shader 3.0. All we can really go by is what we currently have available right now. Currently there is plenty of Pixel Shader 1.4-capable hardware, and in many cases it's proving to be just as good as 2.0.

In many cases 2.0 will probably be just as good as 3.0, but not all. My only gripe with ATI is them telling devs not to code 3.0 for Nvidia hardware. I find that somewhat hypocritical, since they have intentions of supporting it in the future.
 
ChrisRay said:
In many cases 2.0 will probably be just as good as 3.0, but not all. My only gripe with ATI is them telling devs not to code 3.0 for Nvidia hardware. I find that somewhat hypocritical, since they have intentions of supporting it in the future.
They were telling devs to avoid it because it most likely won't be strong on NV's hardware this gen (most likely from feedback they had from other devs). Did you also disagree with Nvidia telling devs not to use PS 2.0, and instead to use DX8 PS 1.1/1.4, because their own hardware sucked at PS 2.0 (whilst they simultaneously marketed and sold those parts as full DX9)?
 
ChrisRay said:
So it's ok for you to look at worst-case scenarios of PS 1.4 vs. 1.1 but not PS 2.0 vs. 1.4?

Actually, I think it's best to look at both scenarios and acknowledge the pros and cons of each.
I'm sure that Far Cry will look no different because the content was not written to take advantage of PS 3.0.
The same could be said for Shader 1.4/2.0 games lately, wouldn't you agree? In many cases games that aren't taking advantage of 2.0 shaders are capable of a 1.4 shader fallback.
I wouldn't say so. Didn't someone show some of the shaders from Tomb Raider a while ago? How can you convert a 48-instruction PS 2.0 shader to a PS 1.4 one?
It's moving forward because the number of platforms that support PS 1.4 is growing. For a long time only ATI supported PS 1.4.
I think it's a little bit of both. If Shader 1.4 were worthless, people wouldn't use it anyway, despite how much hardware supports it.
Who said it's worthless? I am saying it's not useful to think of PS 1.4 as a fallback to 2.0. If your content can be written in PS 1.4 then by all means use it.
It was slow to be adopted because only one IHV supported it! NVIDIA finally supported it with the GeForce FX 5x00 line, but it was slower than it could have been because the chip wasn't that great at floating-point operations, which they had to use to meet the precision requirements of PS 1.4.
Isn't that kind of what I said? As we move forward, Shader 3.0 will probably be supported in the same fashion that Shader 1.4 is moving forward now. To say Shader 1.4 is a good thing and Shader 3.0 is not is an argument that doesn't make sense to me.
You completely missed my point. I never said that PS 1.4 is bad. I am arguing with your claim that PS 2.0 shaders can be written in PS 1.4!

-FUDie
 
I wouldn't say so. Didn't someone show some of the shaders from Tomb Raider a while ago? How can you convert a 48-instruction PS 2.0 shader to a PS 1.4 one?

I saw those results; I don't remember who posted them. But they didn't look all that much different. The biggest difference was between 1.1 and 1.4/2.0. (Was it Neeyik who made that comparison? And were the results with or without DoF enabled?) I remember seeing them. But here's my question for you: if the shader doesn't require more instructions than a 1.4 shader, then you can easily use it as a fallback. Why not do it? Current games seem to do it.

The argument here seems to be that the same cannot be said for Shader 3.0. Early Shader 2.0 titles have all offered 1.4 shaders as a fallback (with the exception of Far Cry).

Who said it's worthless? I am saying it's not useful to think of PS 1.4 as a fallback to 2.0. If your content can be written in PS 1.4 then by all means use it.

Well, you're the one who said it's only supported because the hardware supports it now, so you tell me. I think it's perfectly valid to consider 1.4 a fallback to 2.0 if the instructions can be done within the boundaries of 1.4 as well.

I am not quite sure where you are going with the argument that Shader 1.4 is only here because of the hardware. I disagree, and think Shader 1.4 support will continue to thrive as long as 2.0 support is out, considering the gap between 1.4 and 2.0 isn't nearly as wide as between 1.1 and 1.4.

You completely missed my point. I never said that PS 1.4 is bad. I am arguing with your claim that PS 2.0 shaders can be written in PS 1.4!

Many of them can. Are you saying all the data in HL2 and Tomb Raider is incorrect? Most current content on Shader 2.0 hardware seems to show that it can also be written in 1.4. I'm "aware" that Shader 2.0 offers more flexibility than 1.4, but in many cases 1.4 has shown to be a sufficient fallback to 2.0 (though not always).

The same fallbacks will probably occur with Shader 2.0 and 3.0. Are the Far Cry devs wrong in saying Shader 3.0 effects will fall back to Shader 2.0?


They were telling devs to avoid it because it most likely won't be strong on NV's hardware this gen (most likely from feedback they had from other devs). Did you also disagree with Nvidia telling devs not to use PS 2.0, and instead to use DX8 PS 1.1/1.4, because their own hardware sucked at PS 2.0 (whilst they simultaneously marketed and sold those parts as full DX9)?

Yes, I do disagree with one vendor telling devs how to code for another's hardware. In most cases it would seem that devs who coded for the R300 got the results they wanted.

Look at Far Cry, especially the indoor scenes where Pixel Shader 2.0 is used heavily. Did Nvidia's poor Shader 2.0 performance stop the Far Cry devs from utilizing it on R300 hardware? I won't get into speculation about the NV40's Shader 3.0 performance.

But I completely disagree with one IHV telling devs how to code for another vendor's hardware. It's not ATI's concern how devs code for Nvidia hardware, and vice versa. They should be concerned with getting optimal results out of their own hardware, not working to castrate someone else's hardware.
 
Many, if not most, "PS 2.0" shaders in use today can be written in PS 1.4, FUDie. We're still a ways away from most games using shaders of more than 20 instructions across the board, or more than one level of dependent texturing.
 
ChrisRay said:
Then you can easily use it as a fallback. Why not do it? Current games seem to do it.

Precision is a potential reason - PS 1.x is integer, while PS 2.0 and above is float. Not all PS 1.4-capable parts have the same range under PS 1.4, either.
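A hypothetical sketch of where that bites (samplers and scale factors invented): overbright lightmap accumulation. Under ps_2_0 the temporaries are float, FP24 or better, so the intermediate sum of up to about 8.0 survives intact; under PS 1.x the same math runs in fixed point clamped to the part's PixelShader1xMaxValue range, which is only guaranteed to reach about 8.0 on PS 1.4 parts and can be far smaller on PS 1.1 parts, with fractional precision that also varies per chip, so the output differs from card to card.

    sampler2D BaseTex : register(s0);
    sampler2D LightA  : register(s1);
    sampler2D LightB  : register(s2);

    float4 main(float2 uv : TEXCOORD0, float2 luv : TEXCOORD1) : COLOR
    {
        // Two overbright lightmaps scaled by 4: the intermediate can
        // reach ~8.0. Exact in ps_2_0 floats; clamped and quantised to
        // the hardware's fixed-point range under PS 1.x.
        float3 light = (tex2D(LightA, luv).rgb + tex2D(LightB, luv).rgb) * 4.0;
        float3 c = tex2D(BaseTex, uv).rgb * light;
        return float4(c * 0.25, 1.0); // tone back into displayable range
    }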
 