After seeing real-time refraction of underwater textures and geometry (and warping of reflections on the water surface) in games like WR. Blue Storm, Super Mario Sunshine and ICO, I've been wondering a bit about how it is done technically.
I mean, how does the CPU "know" how to "warp" the texture at the correct places, at the right angle between the viewpoint and the waves?
The only solution I can think of is some kind of raytracing algorithm, but wouldn't that be way too resource-demanding on today's hardware?
A not-too-technical answer would be appreciated, or a link to one.
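In case it helps frame the question: my rough guess (and this is just a guess, not how any of those games actually do it) is that the rendered frame gets re-sampled with texture coordinates nudged around by some wave function, rather than anything being raytraced. A toy sketch of that idea in Python, with made-up names and a simple sine wave standing in for real wave data:

```python
import math

def refract_sample(screen, u, v, time, amplitude=0.01, frequency=40.0):
    """Sample a 'screen' texture at wave-perturbed coordinates.

    screen: a function (u, v) -> color, standing in for the rendered frame.
    amplitude/frequency: hypothetical wave parameters, not from any game.
    Each point looks up a slightly shifted pixel, which is what makes
    geometry seen through the water appear to ripple.
    """
    du = amplitude * math.sin(frequency * v + time)
    dv = amplitude * math.sin(frequency * u + time)
    # Clamp so the perturbed coordinates stay inside the texture.
    pu = min(max(u + du, 0.0), 1.0)
    pv = min(max(v + dv, 0.0), 1.0)
    return screen(pu, pv)

# Toy usage: a "screen" that just returns its own coordinates,
# so we can see how far the lookup was displaced.
screen = lambda u, v: (u, v)
print(refract_sample(screen, 0.5, 0.5, time=0.0))
```

So no per-ray intersection tests, just a cheap per-pixel offset when sampling, which is why (if this guess is right) it runs in real time.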