Shaders are currently rubbish as video cards have other bottlenecks.

K.I.L.E.R


Playing games with FSAA and AF seems to kill framerates. So how do shaders improve performance with those two options enabled? And procedural rendering: how would it affect current implementations of AF and FSAA?
When will we see 100% PR games?
What amounts of power would we need?
Will we have infinite processing power?
 
The raison d'etre of shaders wasn't to improve performance (certainly before 1.4, there isn't much you can do to improve performance by reducing passes anyway). Shaders and FSAA/AF are separate things.

Shaders exist to let people do stuff they couldn't do before without a lot of CPU intervention (vertex shader) and to do stuff that was impossible at any decent framerate (pixel shader).
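
To make that concrete, here's a rough C sketch (purely illustrative, with made-up names, not anyone's actual engine code) of the per-vertex transform loop the CPU used to run every frame; a vertex shader is essentially this loop body executed on the GPU instead:

    /* Illustrative only: the kind of per-vertex work a vertex shader
       takes off the CPU. Hypothetical types/names, not a real API. */
    #include <stddef.h>

    typedef struct { float x, y, z, w; } Vec4;
    typedef struct { float m[4][4]; } Mat4;

    /* Transform one position by a 4x4 matrix (row-vector convention). */
    static Vec4 transform(const Mat4 *m, Vec4 v)
    {
        Vec4 r;
        r.x = v.x*m->m[0][0] + v.y*m->m[1][0] + v.z*m->m[2][0] + v.w*m->m[3][0];
        r.y = v.x*m->m[0][1] + v.y*m->m[1][1] + v.z*m->m[2][1] + v.w*m->m[3][1];
        r.z = v.x*m->m[0][2] + v.y*m->m[1][2] + v.z*m->m[2][2] + v.w*m->m[3][2];
        r.w = v.x*m->m[0][3] + v.y*m->m[1][3] + v.z*m->m[2][3] + v.w*m->m[3][3];
        return r;
    }

    /* Pre-hardware-T&L style: the CPU walks every vertex each frame.
       A vertex shader runs the body of this loop on the GPU. */
    void transform_all(const Mat4 *wvp, const Vec4 *in, Vec4 *out, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = transform(wvp, in[i]);
    }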

I'm not sure what you mean by 'procedural rendering' in this context. If you mean a RenderMan-style 'shaders are everything' model, VS/PS 2.0 gets pretty close to being able to run all the shaders.

By definition, we won't have infinite processing power.

As to when we have 'enough' - well, if you say the gold standard in 3D is Shrek, Final Fantasy, Ice Age etc. then we've a long way to go. But the games guys will get close to the same image quality with much less horsepower needed.
 
Shaders and FSAA/AF are separate things.

I realise that, but ATI seem to be doing loads of things with shaders, and I wonder when we are going to use ONLY shaders to render complex scenes with FSAA and AF.
 
Maybe the solution is for IHVs to come out with:

1) products with the latest API-determined technologies, sold at a higher price, while at the same time;
2) also producing cards without the latest API-determined technologies that just have, hopefully, better AA and AF algorithms, and selling these at a lower price?

... all the while using the same (= latest) process technology to get the highest clock speed possible?

:rolleyes:

I see what you're getting at, K.I.L.E.R, but costs are a big factor.
 
If anything I'd say that was backwards. To promote shader use, the shaders have to be there. If the majority of cards don't have shaders, the games won't use them. We're suffering badly from this right now.
 
I meant 'us' as in the general public. I stay away from any ATI-centric posts; I get more sleep that way :). Just the facts, ma'am.

I guess I read it so quickly the smileys hadn't loaded :)
 
Actually, the primary reason to introduce shaders, and more particularly more complex shaders than we have today, is performance. The secondary reason is precision.

Fixed-function rendering can actually do quite a lot...but it is very slow at it.

As for how shaders affect FSAA and anisotropic filtering: they shouldn't affect either at all. The only exception would be a shader that uses some sort of kill instruction, which would result in the same aliasing seen with alpha tests.
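
A rough sketch of why kill/alpha test aliases (plain C standing in for what the hardware does per pixel; the names are invented):

    /* Illustrative only: the per-fragment decision behind alpha test
       or a shader 'kill' is binary, so there is no partial coverage
       left to filter. */
    typedef struct { float r, g, b, a; } Color;

    /* Returns 1 if the fragment survives, 0 if it is killed outright. */
    int alpha_test(Color texel, float threshold)
    {
        /* All-or-nothing: a texel with a = 0.49 vanishes completely
           while a = 0.51 is fully opaque, producing a hard silhouette
           through the middle of a polygon, where edge-based
           multisample FSAA can't help. */
        return texel.a >= threshold;
    }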
 
Fixed function can't do anything that requires texture dereferencing or more than 8-bit precision.
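
By 'texture dereferencing' I take it he means dependent reads, where the result of one lookup becomes the coordinates for the next. An illustrative C sketch (invented helper names, nearest-neighbour sampling kept deliberately simple):

    /* Illustrative only: a dependent texture read in plain C. */
    typedef struct { float r, g, b, a; } Texel;
    typedef struct { int w, h; Texel *data; } Texture;

    /* Nearest-neighbour sample with wrap addressing. */
    Texel sample(const Texture *t, float u, float v)
    {
        int x = (((int)(u * t->w)) % t->w + t->w) % t->w;
        int y = (((int)(v * t->h)) % t->h + t->h) % t->h;
        return t->data[y * t->w + x];
    }

    /* EMBM-style: a perturbation map's texel decides where the
       second lookup lands. */
    Texel dependent_read(const Texture *bump, const Texture *env,
                         float u, float v)
    {
        Texel d = sample(bump, u, v);          /* first lookup          */
        return sample(env, u + d.r, v + d.g);  /* coords depend on it   */
    }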
 
I would argue that performance and precision are secondary. The primary reason shaders exist is that they add many capabilities: they let you do stuff that was previously impossible or very inconvenient.
 
K.I.L.E.R said:
Shaders and FSAA/AF are separate things.

I realise that, but ATI seem to be doing loads of things with shaders, and I wonder when we are going to use ONLY shaders to render complex scenes with FSAA and AF.


Not only does every modern ATI card support PS 1.4 and above, but the AF and FSAA of its product range are second to none in the mid-to-high end.

What you should ask yourself is what NVIDIA is bringing to the market. The only NVIDIA cards you can buy today are pixel shader 1.3 and under, and they offer AF and FSAA inferior to the ATI line-up.
 
What about using shaders to render everything, including textures?
FSAA will have to be done through the shaders then, right?
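
For what it's worth, a procedural 'texture' is just a function evaluated per fragment instead of an image fetched from memory. An illustrative C stand-in (made-up names) for the classic checker example:

    /* Illustrative only: a 'texture' computed entirely in code. */
    #include <math.h>

    typedef struct { float r, g, b; } Rgb;

    Rgb checker(float u, float v, float scale)
    {
        /* Odd/even cell decides the colour; no texture memory is
           touched. */
        int cu = (int)floorf(u * scale);
        int cv = (int)floorf(v * scale);
        Rgb white = { 1.0f, 1.0f, 1.0f };
        Rgb grey  = { 0.3f, 0.3f, 0.3f };
        return ((cu + cv) & 1) ? white : grey;
    }

Note that a pattern like this is point-sampled, so it aliases unless the shader band-limits it itself; presumably that's where the 'FSAA through the shaders' question comes in.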
 
Is there any real advantage to going to shader-only rendering? That seems like a massive step away from what we're doing now... I'm not sure it will come to pass.
 
Reverend said:
Maybe the solution is for IHVs to come out with:

1) products with the latest API-determined technologies, sold at a higher price, while at the same time;
2) also producing cards without the latest API-determined technologies that just have, hopefully, better AA and AF algorithms, and selling these at a lower price?

... all the while using the same (= latest) process technology to get the highest clock speed possible?

:rolleyes:

I see what you're getting at, K.I.L.E.R, but costs are a big factor.

Sounds like what ATI and nVidia are just about to do, judging from the rumors on this forum.
 
Nagorak said:
Is there any real advantage to going to shader-only rendering? That seems like a massive step away from what we're doing now... I'm not sure it will come to pass.

Cinematic quality?! Both ATI and NV are saying this is now possible :D
 
If both NV and ATI are supporting shaders on mid- and high-end products, why not use shaders exclusively? By Xmas this year there will be a sh*&load of DX8/9 parts out there. It would be great if there were chipsets with integrated graphics that supported shaders. There must be something planned for release by Xmas this year.
 
Nagorak said:
Is there any real advantage to going to shader-only rendering? That seems like a massive step away from what we're doing now... I'm not sure it will come to pass.

IMO the problem isn't with the shaders but with the methodology. Vertex shaders today are similar to doing these things (T&L) in the old days on the CPU, on hardware that could only rasterise. The concepts of T&L aren't there, which makes using them conceptually harder, IMO. I do think shaders are the way to go -- they just need to go this one step further (they've already gone from assembly level to high level; now they need the 3D concepts).

I've been doing quite a bit of thinking about this and about vertex shaders in general, and I'm planning to write something up and send it to Microsoft, in the hope that it'll do some good.
 