I would like to make a general observation.
Why would movie companies want to use bump-mapping? Because real displacement mapping costs too much. And why use displacement mapping at all? Just use more vertices! But why would you want vertices in the first place? Splines are much better!
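To make that chain concrete, here is a minimal sketch in plain Python (toy vectors, every name my own invention, not any renderer's actual API) of the difference: bump mapping only tilts the normal used for lighting, while displacement mapping actually moves the surface point.

```python
# Toy sketch: bump mapping fakes detail by perturbing the normal used for
# lighting; displacement mapping actually moves the geometry. All numbers
# and names here are illustrative, not any particular renderer's API.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    """Simple diffuse term: brightness depends only on the normal."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

def bump_shade(position, normal, height_gradient, light_dir):
    # Geometry is untouched: only the shading normal is tilted by the toy
    # height-map gradient, so silhouettes stay those of the flat surface.
    perturbed = tuple(n - g for n, g in zip(normal, height_gradient + (0.0,)))
    return position, lambert(perturbed, light_dir)

def displace_shade(position, normal, height, light_dir):
    # Geometry really moves: the point is pushed along its normal, which is
    # more faithful but means tessellating (and shading) far more vertices.
    n = normalize(normal)
    moved = tuple(p + height * c for p, c in zip(position, n))
    return moved, lambert(normal, light_dir)

if __name__ == "__main__":
    p, n, light = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.3, 0.2, 1.0)
    print(bump_shade(p, n, (0.4, -0.1), light))   # same point, tilted shading
    print(displace_shade(p, n, 0.05, light))      # moved point, same shading
```

That is exactly why bump mapping is the cheap option: the geometry never changes, so silhouettes and shadows eventually give the trick away.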
To make your models look their best, you want as many polygons as it takes (yes, effectively infinitely many), made out of splines. Rendered exactly as specified.
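As a rough illustration of "polygons made out of splines", here is a toy tessellator (plain Python, all names mine): it samples a cubic Bézier curve at a chosen number of parameter values. The more segments you ask for, the closer the facets get to the true curve, which is exactly why you would ideally render the spline itself.

```python
# Toy sketch: turning a spline into straight segments / polygons by sampling.
# The exact shape is the Bezier itself; any finite tessellation only
# approximates it, which is the whole argument for rendering splines directly.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def tessellate(p0, p1, p2, p3, segments):
    """Approximate the curve with `segments` straight pieces."""
    return [cubic_bezier(p0, p1, p2, p3, i / segments) for i in range(segments + 1)]

if __name__ == "__main__":
    ctrl = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
    coarse = tessellate(*ctrl, segments=4)    # visibly faceted
    fine = tessellate(*ctrl, segments=64)     # much closer to the true curve
    print(len(coarse), len(fine))
```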
Ok. That takes care of the geometry. Now we want it to look nice. Do we want to 'paint' it? We might. It is easy, and it follows the way we see things. But that only works because we only ever use the outer surface. And things like eyes should be handled separately anyway.
It would be much better to generate those objects 'inside-out': create generic objects, like an eye, that carry the way they look inside themselves.
Hm. Are we talking molecules? Atoms? REAL simulations? Yes.
Can we do that? No.
We need shortcuts. Since we only ever see the outer layer, we can throw away everything inside. And so we need textures. Paint.
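A minimal sketch of that shortcut, assuming nothing beyond a made-up 2x2 'texture' in plain Python: the renderer never asks what the material is, it just looks up what the surface should look like at a given (u, v) spot.

```python
# Toy sketch: a texture lookup replaces any knowledge of what is "inside"
# the object. The surface point only needs (u, v) coordinates and a painted
# image; the interior has been thrown away entirely.

# A made-up 2x2 RGB "texture": a checkerboard of red and green.
TEXTURE = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 255, 0), (255, 0, 0)],
]

def sample_nearest(texture, u, v):
    """Nearest-neighbour lookup: map (u, v) in [0, 1] to a texel."""
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

if __name__ == "__main__":
    # The renderer asks "what colour is the surface here?" and gets paint
    # back, not physics.
    print(sample_nearest(TEXTURE, 0.1, 0.1))  # (255, 0, 0)
    print(sample_nearest(TEXTURE, 0.9, 0.1))  # (0, 255, 0)
```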
So, any way you look at it, even the best movie CG is *NOT* 'real'. Hell, nobody even knows how to render the most basic thing: a convincingly realistic human.
We all try. Hard. The hardware people. The artists. The game people. The movie people. But it isn't really 'real' yet. It might look absolutely stunning, *for CG graphics*, at least if you're an insider and know what it takes.
Does that mean that game graphics are bad? No. Au contraire. They're sublime compared to older games. Does that mean that movie CG images are bad? Yes. Absolutely. Anyone can tell, just by looking, that it's not a 'real' camera-captured scene.
But everyone is out to change all that. And succeeding remarkably well. And remarkably fast. But we're not there yet.
Does that mean that game developers are doing a worse job than the movie people? No. Or a better one? No.
8)
We need a benchmark. Not of speed, but of the 'realism' of the graphics. And we (more or less) chose Toy Story, since it was the first milestone.
Can a current, top-of-the-line graphics card render Toy Story at the same quality all by itself? The recent uproar about nVidia's drivers cheating comes to mind. Does it matter? If you are sitting there, looking at the big screen, and you cannot tell whether it is being rendered in real time by a graphics card or was prerendered, it does not matter.
Can that current, bleeding-edge graphics card render it a lot faster than the render farm that was used? No. Can the same number of graphics cards as there were computers in that farm do it? Absolutely. Without breaking a sweat. Any way you look at it.
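To make the parallelism point concrete, here is a back-of-envelope sketch in plain Python. Every number in it is an illustrative placeholder, not a measured figure for Toy Story or for any particular card; the point is only that film frames render independently, so wall-clock time divides almost linearly by the number of nodes.

```python
# Back-of-envelope sketch of why the comparison parallelises so well: film
# frames render independently, so wall-clock time is just total work divided
# by node count. Every number below is an illustrative placeholder, not a
# measured figure for Toy Story or for any particular graphics card.

def wall_clock_days(total_frames, hours_per_frame, nodes):
    """Embarrassingly parallel: each node grabs the next unrendered frame."""
    total_hours = total_frames * hours_per_frame
    return total_hours / nodes / 24.0

if __name__ == "__main__":
    frames = 114_000          # roughly 80 minutes of film at 24 fps
    farm_hours_per_frame = 7  # placeholder average for a mid-90s farm node
    card_hours_per_frame = 1  # placeholder: assume one card is several times faster
    print(wall_clock_days(frames, farm_hours_per_frame, nodes=1))    # one node alone: decades
    print(wall_clock_days(frames, card_hours_per_frame, nodes=117))  # a farm-sized pile of cards: weeks
```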