Pavlos said:
This is not impossible. We already have compilers which translate RenderMan shaders into bytecode for interpretation, and there's nothing that prevents you from writing a compiler that targets the OpenGL2 API instead. I'm referring to OpenGL2 because it doesn't have any hardware limits (instruction limits, texture fetch limits, etc...), but it's certainly feasible with DX9 too. In fact, converting the bytecode of my renderer to OpenGL2 shaders is trivial for the majority of shaders, but I'm not sure if any hardware today supports the OpenGL shading language.
Whoa whoa whoa whoa! Major correction on OpenGL Shading Language needed.
There are virtual *and* physical limits to the OpenGL Shading Language. What we virtualized were the things that were difficult to count in a device-independent way -- temporaries, instructions, and texture fetch restrictions.
But there are very real physical limits. Some of the constraints are small and harsh. Just a few:
- Vertex attributes - 16 vec4s is the minimum maximum.
- Varying floats (interpolators) - 32 floats is the minimum maximum.
- Texture units - *2* is the minimum maximum, with *0* the minimum maximum texture units available to the vertex shader. (ATI's initial implementation has 16 texture units.)
Production shaders *will* (not may, *will*) exceed these limits - by large margins. So production shaders *will* have to be broken up ala Peercy/Olano/et al. (The example RenderMan shaders in the paper are *not* close to production shaders in size or scope. But the good news is some of these simple shaders from the paper can now be directly ported to OpenGL Shading Language. )
So, not quite so trivial for the majority of shaders, let alone production shaders.
On Mr. Blue's hardware pudding, I'd say we know where the ingredients are, we even probably know how to mix them, but we still have to get out the whisk, get everything into the saucepan and then chill to make the pudding. We don't yet know if we'll get pudding out of this. But if we do, since for some of the ingredients we've used substitutions (and even left a couple of pinches out), we still aren't quite sure how it will taste yet.
Mr. Blue already knows the software pudding tastes good. (So does anyone who saw Bunny or Ice Age.)
Finally, if history is any guide: the short Red's Dream was rendered in software for the opening and closing sequences, and in hardware (the Pixar Image Computer) for the dream sequence. All predating RenderMan btw. As far as I know, it's only distributed on the "Tiny Toy Stories" VHS, and in Quicktime on Pixar's web site. See for yourself.
-mr. bill