Raytracing

You can throw rays at the scene, or you can throw the scene at the rays ... triangles, voxels ... it doesn't matter.
Yes indeed, which is why I'm confused that they're emphasizing the ray casting side of things rather than LOD, which is the real benefit of a (topologically limited/free) voxel representation.
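To be concrete about what I mean by LOD falling out of a voxel/octree representation almost for free: you just stop descending the tree once a node's projected footprint shrinks below a pixel and shade from the prefiltered data stored at that level. Rough sketch below, with every name and structure made up by me rather than taken from anything id or AMD has actually shown:

```cpp
// Illustrative only: stop octree descent once a node's projected size is
// sub-pixel, and shade from the prefiltered color stored at that node.
struct OctreeNode {
    float       color[3];     // prefiltered average color of the subtree
    float       halfSize;     // half the edge length of this node's cube
    OctreeNode* children[8];  // nullptr where the subtree is empty
};

// True if descending further cannot change the pixel's value noticeably.
bool smallEnough(const OctreeNode& n, float distance, float pixelAngle)
{
    // Approximate projected size: node extent over distance, compared
    // against the angular size of one pixel.
    return (2.0f * n.halfSize) / distance < pixelAngle;
}

const OctreeNode* selectLod(const OctreeNode* node, float distance, float pixelAngle)
{
    while (node && !smallEnough(*node, distance, pixelAngle)) {
        // A real traversal would pick the child the ray actually enters;
        // this loop just illustrates the descent/stop criterion.
        const OctreeNode* next = nullptr;
        for (auto* c : node->children)
            if (c) { next = c; break; }
        if (!next) break;
        node = next;
    }
    return node;
}
```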

But whatever, coherence/locality is swiftly becoming the only thing that matters in rendering and parallel computing, so I don't see rasterization-like techniques going away any time soon.
 
From what I gathered, everything in the video is rendered and lit in realtime.
Perhaps, or perhaps they bake in the lighting for the entire animation sequence (not in a pure color sense, but with say spherical harmonics). Can't rightly say at the moment.
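For what it's worth, this is the kind of thing I mean by baking with spherical harmonics: store a handful of SH coefficients per sample point (per frame of the sequence if the lighting animates) and reconstruct irradiance from the surface normal at runtime. Just a sketch of the general idea, not a claim about what the demo actually does:

```cpp
// Illustrative sketch: order-2 spherical harmonics (9 coefficients per
// color channel) baked per sample point, evaluated against the normal.
struct SH9 { float c[9]; };  // one color channel's baked coefficients

// Standard real SH basis for bands 0..2, evaluated at a unit normal.
void shBasis(const float n[3], float b[9])
{
    const float x = n[0], y = n[1], z = n[2];
    b[0] = 0.282095f;
    b[1] = 0.488603f * y;
    b[2] = 0.488603f * z;
    b[3] = 0.488603f * x;
    b[4] = 1.092548f * x * y;
    b[5] = 1.092548f * y * z;
    b[6] = 0.315392f * (3.0f * z * z - 1.0f);
    b[7] = 1.092548f * x * z;
    b[8] = 0.546274f * (x * x - y * y);
}

// Reconstruct one channel of baked irradiance for a given normal.
float shEvaluate(const SH9& sh, const float normal[3])
{
    float b[9];
    shBasis(normal, b);
    float sum = 0.0f;
    for (int i = 0; i < 9; ++i)
        sum += sh.c[i] * b[i];
    return sum > 0.0f ? sum : 0.0f;  // clamp negative ringing
}
```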
 

In one of the videos, Jules says they can pop a spotlight in at will and that each of the 100 lights in the building is a real light source. I'm just trying to find the truth behind all the glitz. I guess we'll know more when/if they release the new Ruby demo, and also when id talks about the voxel rendering at SIGGRAPH.

P.S. Not sure if I missed this, but it seems like good info: http://anteru.net/2008/07/25/242/
 
"At will" is fine, but is it interactive and real-time?

Here's a nice video on the topic at hand.

http://www.youtube.com/watch?v=iwTcvk5IuB4&feature=related

"no baked lighting, nothing's precomputed" minute 1:14
"100's of lights in the building all being rendered in real time, calculated in real time"

minutes 2:25 he puts a spotlight at the camera location and the characters shadow is on the opposing building


Thanks for the link also, I'm reading it right now :)
 