Chalnoth said:
Most surfaces in my room aren't reflecting much. Most could easily be approximated with decent specular lighting.
I don't think "let's approximate everything in a simple way" is the way of the future. We approximate those things with specular now, but I don't expect us to still be doing this 10 years from now. I am trying to look forward, not back.
But this is little more than a "mix and match" style of shader. You don't need to develop entirely new shaders for most objects, and can simply link pre-programmed pieces together. I don't see this as being very different than the idea of a unified shader (since the only difference is in implementation, and thus performance characteristics, not in art development). I believe this is, in fact, what UE3 does.
Depends on what you call a shader. From the hardware's point of view, a shader is still a single independent program. So whether you link the pieces together and compile them at runtime or not, you are still using multiple shaders to increase the diversity of materials.
Of course these shaders can be partly recycled.
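To make that concrete, here is a minimal C++ sketch of linking pre-programmed pieces together at runtime. Nothing in it comes from UE3 or any other engine; the fragment names and the GLSL-like snippets inside the strings are made up purely for illustration. The composed source would then be handed to the driver's shader compiler.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Each "fragment" is a reusable snippet of shader code that contributes one
// part of the final colour (diffuse term, specular term, procedural detail).
const std::string kDiffuseLambert =
    "color += albedo * max(dot(N, L), 0.0);\n";
const std::string kSpecularPhong =
    "color += specColor * pow(max(dot(R, V), 0.0), shininess);\n";
const std::string kProceduralNoise =
    "albedo *= 0.8 + 0.2 * noise3(worldPos * noiseScale);\n";

// Splice the selected fragments into one shader body, in order.
std::string ComposeShader(const std::vector<std::string>& fragments) {
    std::string source = "void main() {\n  vec3 color = vec3(0.0);\n";
    for (const std::string& f : fragments)
        source += "  " + f;
    source += "  gl_FragColor = vec4(color, 1.0);\n}\n";
    return source;
}

int main() {
    // A "wood" material mixes the shared lighting pieces with its own
    // procedural term; a plain plastic material would simply skip the noise.
    std::string woodShader =
        ComposeShader({kProceduralNoise, kDiffuseLambert, kSpecularPhong});
    std::cout << woodShader;  // in practice: compile and cache the result
}
```

The artist-facing side looks like mixing and matching blocks, but the hardware still ends up running one compiled program per material, which is why I'd still call each combination its own shader.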
But Doom3 went the other way, restricting the artists to pretty much one material everywhere. That was nice on early shader hardware, but I don't see it as the solution for the not-so-distant future. It seems like most engines, including UE3.0, 3DMark05, Half-Life 2 and the Artificial Reality engine, are going down the path of using many shaders to create a large diversity of materials.
As for procedural effects, what did you have in mind?
Depends on how powerful the hardware is. We can already do procedural wood and marble textures today, and you'd be surprised how useful noise and trig functions are in all kinds of surface (or volume) emulations: fire, water, smoke, clouds, you name it. As hardware gets more powerful, we can implement more sophisticated functions. Ideally, we could model the entire world around us procedurally.
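As a concrete example of the noise-plus-trig approach, here is a small self-contained C++ sketch of the classic marble trick: sum a few octaves of noise into a turbulence value and use it to perturb the phase of a sine wave. The hash-based value noise below is just a stand-in for proper Perlin noise, and all the constants are arbitrary.

```cpp
#include <cmath>
#include <cstdio>

// Cheap hash giving a pseudo-random value in [0, 1) per integer lattice point.
static double Hash(int x, int y) {
    unsigned int n = static_cast<unsigned int>(x + y * 57);
    n = (n << 13) ^ n;
    n = n * (n * n * 15731u + 789221u) + 1376312589u;
    return (n & 0x7fffffffu) / 2147483648.0;
}

// Smoothly interpolated 2D value noise.
static double ValueNoise(double x, double y) {
    int xi = static_cast<int>(std::floor(x));
    int yi = static_cast<int>(std::floor(y));
    double fx = x - xi, fy = y - yi;
    fx = fx * fx * (3.0 - 2.0 * fx);  // smoothstep weights
    fy = fy * fy * (3.0 - 2.0 * fy);
    double a = Hash(xi, yi),     b = Hash(xi + 1, yi);
    double c = Hash(xi, yi + 1), d = Hash(xi + 1, yi + 1);
    return a + (b - a) * fx + (c - a) * fy + (a - b - c + d) * fx * fy;
}

// Marble: noise octaves ("turbulence") perturb the phase of regular stripes,
// which is what produces the veins.
static double Marble(double x, double y) {
    double turbulence = 0.0, amp = 1.0, freq = 1.0;
    for (int octave = 0; octave < 4; ++octave) {
        turbulence += amp * ValueNoise(x * freq, y * freq);
        amp *= 0.5;
        freq *= 2.0;
    }
    return 0.5 + 0.5 * std::sin(x * 4.0 + turbulence * 6.0);
}

int main() {
    // Print a small patch of intensities; on shader hardware this function
    // would run per pixel instead, with no stored texture at all.
    for (int j = 0; j < 8; ++j) {
        for (int i = 0; i < 16; ++i)
            std::printf("%4.2f ", Marble(i * 0.25, j * 0.25));
        std::printf("\n");
    }
    return 0;
}
```

Swap the sine for a different shaping function and essentially the same few lines give you wood rings, cloud layers or fire-like patterns.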
The work that Henrik Wann Jensen did on subsurface scattering with his photon mapper is interesting as well. He actually modeled the material procedurally (one function per material, which would translate to one shader per material), so there were all kinds of microscopic bumps and things in which the light would reflect and illuminate the surface. That's the sort of thing we may be doing on programmable hardware in the future.