Ray Tracing on Programmable Graphics Hardware (RSX)?

Even for high-quality direct lighting the cost is pretty obscene (if you want AA and soft shadows of decent quality). I do think hardware that can more efficiently support ray tracing and other tree-traversal algorithms would be nice to have, though...
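Just to put a number on "obscene", here's a back-of-the-envelope sketch in C++; the resolution, light count, and sample counts are purely illustrative assumptions:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // All numbers below are illustrative assumptions, not measured figures.
    const std::uint64_t width = 1280, height = 720; // target resolution
    const std::uint64_t aaSamples = 4;      // primary rays per pixel (AA)
    const std::uint64_t lights = 4;         // light sources in the scene
    const std::uint64_t shadowSamples = 16; // area-light samples (soft shadows)

    // One primary ray per AA sample, then one shadow ray per light sample.
    const std::uint64_t primary = width * height * aaSamples;
    const std::uint64_t shadow  = primary * lights * shadowSamples;

    std::printf("primary rays/frame: %llu\n", (unsigned long long)primary);
    std::printf("shadow rays/frame:  %llu\n", (unsigned long long)shadow);
    std::printf("at 60fps: ~%.1f billion rays/s\n",
                60.0 * double(primary + shadow) / 1e9);
    return 0;
}
```

Even with these modest settings that's roughly 240 million rays per frame, before any bounces, reflection, or GI.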
 
Just something to think about:

Much of the high quality CGI stuff in movies isn't even raytraced, because it isn't worth the time/power for the small increase in realism (in action-type scenes most people can't really see the small bits anyway).

I recently had a chance to talk to one of the people who runs a company (with friends) that does a lot of the CGI in movies today.

Ray Tracing is overrated -- it friggin takes all the atoms in the universe to render things properly, and it barely looks better than stuff that uses tricks instead.
 
vember said:
Fox5 said:
I have to ask that too, what can we do with ray tracing now that we can't do with shaders?
Correct reflections/refractions of the surrounding world on complex geometry. We can only do that on planes nowadays (or approximate it with cubemaps).

Physically correct reflection/refraction is one of the last things the average viewer would notice on an image. You can get away with cubemaps 95% of the time, IMHO.
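For what it's worth, here's a minimal C++ sketch of the cubemap trick being discussed. The vectors and the face-picking "fetch" are toy stand-ins for a real renderer's texture sampling:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// r = i - 2(n.i)n : mirror reflection of the view direction about the normal.
static Vec3 reflect(Vec3 i, Vec3 n) { return sub(i, scale(n, 2.0f * dot(n, i))); }

// Toy cubemap "fetch": pick a face by the dominant axis of the direction.
// A real renderer samples a texture here; this just returns a face index.
static int cubemapFace(Vec3 d) {
    float ax = std::fabs(d.x), ay = std::fabs(d.y), az = std::fabs(d.z);
    if (ax >= ay && ax >= az) return d.x > 0 ? 0 : 1; // +X / -X
    if (ay >= az)             return d.y > 0 ? 2 : 3; // +Y / -Y
    return d.z > 0 ? 4 : 5;                           // +Z / -Z
}

int main() {
    Vec3 view   = {0.0f, -0.7071f, 0.7071f}; // incoming view direction
    Vec3 normal = {0.0f, 1.0f, 0.0f};        // surface normal
    Vec3 r = reflect(view, normal);
    // The lookup uses only the direction r, never the surface position, so
    // it's only correct for an environment at infinity -- nearby objects and
    // self-reflection come out wrong, which is the limitation discussed above.
    std::printf("reflected dir (%.3f, %.3f, %.3f) -> face %d\n",
                r.x, r.y, r.z, cubemapFace(r));
    return 0;
}
```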
 
Inane_Dork said:
I would take a real-time radiosity engine over a real-time ray tracing engine any day, but that's just me.

Most radiosity and global illumination solutions require some sort of raytracing...
 
Bobbler said:
Much of the high quality CGI stuff in movies isn't even raytraced, because it isn't worth the time/power for the small increase in realism (in action-type scenes most people can't really see the small bits anyway).

Most studios use raytracing nowadays for the following:

- Ambient occlusion pass.
Looks better than baked maps because it calculates interaction between the moving elements. Still, it's usually faked as much as possible by using low-res proxy geometry, disabling displacement, etc.
This pass is multiplied over the image, to darken areas that receive very soft shadowing.

- True subsurface scattering pass.
Faked again; you usually don't need the environment to affect the object at all.
This pass is usually composited into the diffuse pass of an object, which will get multiplied with the texture color, and reflection/specularity is then added on top.

- Reflection occlusion pass.
This is a grayscale image that's darkened at the areas where the objects would reflect themselves.
This pass is multiplied with the reflection pass, which is usually just a (cube or spherical) environment-mapped reflection (see the compositing sketch at the end of this post).

Some studios sometimes do full global illumination lighting too (like a certain one in Scotland... :p), but it's not too common. Reflection and refraction might require tracing in special cases, too.
But the common way is to scanline render as much as possible - no wonder the originally raytracer-based Mental Ray has also gained a scanline mode in recent years...
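Laid out as code, the compositing described above might look roughly like the following per-pixel sketch in C++. The ordering is one plausible reading of the post, not any particular studio's actual recipe, and the passes are simplified to single floats for brevity:

```cpp
#include <algorithm>

// One pixel of each rendered pass, in linear color, one channel per pass
// for brevity. Names follow the post above; a real pipeline would use RGB.
struct Passes {
    float diffuse;     // scanline-rendered diffuse lighting
    float texture;     // surface (albedo) color
    float sss;         // raytraced subsurface scattering pass
    float ao;          // raytraced ambient occlusion (1 = fully open)
    float reflection;  // cube/spherical environment-mapped reflection
    float reflOcc;     // raytraced reflection occlusion (1 = unoccluded)
    float specular;    // specular highlights
};

float composite(const Passes& p) {
    float lighting = p.diffuse + p.sss;    // SSS composited into the diffuse pass
    float base = lighting * p.texture;     // ...which is multiplied with the texture color
    base *= p.ao;                          // AO darkens the softly shadowed areas
    float refl = p.reflection * p.reflOcc; // reflection occlusion gates the env-map reflection
    return std::min(base + refl + p.specular, 1.0f); // reflection/specularity added on top
}
```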
 
nAo said:
Slow-ass bandwidth?! What are you talking about?
By the way, once we have full MIMD pixel pipelines (or ALUs...) RT will run much faster on GPUs than it does today.
The slow-ass bandwidth refers to the link between the CPU/main memory and the GPU (not bandwidth on the graphics card itself, if that wasn't clear). Obviously, I'm referring to it on PCs. I don't really know about practical bandwidth on PS3 or 360 -- though I doubt anyone would ever bother in console-land, so it's kind of a moot point in that arena anyway.

I still basically have no faith in the idea that GPUs will ever be programmable enough to handle everything you could throw at it. There will be far too much even about raytracing alone that just lies outside of stream computing, and that's something GPUs will never get away from.

On the contrary, I would much prefer the idea that people just keep advancing what GPU shaders were made for in the first place, which is to apply more visual tricks on scanline rendered polygons... rather than try to kid themselves that GPUs will one day be Turing complete.
 
ShootMyMonkey said:
I still basically have no faith in the idea that GPUs will ever be programmable enough to handle everything you could throw at it.
I agree with you, but I'm not talking about adding more programmability; I'm talking about an improved GPU architecture of some kind.
There will be far too much even about raytracing alone that just lies outside of stream computing, and that's something GPUs will never get away from.
Stream computing is a very broad concept...
On the contrary, I would much prefer the idea that people just keep advancing what GPU shaders were made for in the first place, which is to apply more visual tricks on scanline rendered polygons... rather than try to kid themselves that GPUs will one day be Turing complete.
I think you missed my point; anyway, have a look at this:

RPU: A Programmable Ray Processing Unit for Realtime Ray Tracing
This paper describes the architecture and a prototype implementation of a single chip, fully programmable Ray Processing Unit (RPU). It combines the flexibility of general purpose CPUs with the efficiency of current GPUs for data parallel computations. This design allows for realtime ray tracing of dynamic scenes with programmable material, geometry, and illumination shaders.

A GPU with a specialized "traversal unit" and an improved dynamic branching mechanism would be quite similar to this RPU proposal.
Even if I don't think we're going to see custom database-traversal hardware on GPUs anytime soon, I believe in the next year we'll have GPUs that are much better than current GPUs at dynamic branching (AFAIK Xenos should already be much more efficient than NV40 in this regard, from what I've heard...)
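For a feel of what such a "traversal unit" would be doing all day, here's a minimal C++ sketch of a kd-tree traversal loop of the kind the RPU paper describes. The node layout is made up for illustration (it is not the RPU's actual format), and triangle intersection is stubbed out:

```cpp
#include <cmath>
#include <vector>

// Illustrative kd-tree node; the RPU's real node format differs.
struct KdNode {
    bool  leaf;
    int   axis;               // split axis: 0=x, 1=y, 2=z (inner nodes)
    float split;              // split-plane position (inner nodes)
    int   left, right;        // child indices (inner nodes)
    int   firstTri, triCount; // triangle range (leaves)
};

struct Ray { float org[3], dir[3], tMin, tMax; };

// Walk the tree front-to-back with an explicit stack. Every iteration
// branches on per-ray data -- exactly the incoherent control flow that
// SIMD pixel pipelines handle poorly and that better dynamic branching
// (or a dedicated traversal unit) would help with.
int traverse(const std::vector<KdNode>& nodes, const Ray& ray) {
    struct Entry { int node; float tMin, tMax; };
    std::vector<Entry> stack;
    stack.push_back({0, ray.tMin, ray.tMax});

    while (!stack.empty()) {
        Entry e = stack.back(); stack.pop_back();
        const KdNode& n = nodes[e.node];
        if (n.leaf) {
            if (n.triCount > 0) return n.firstTri; // real code intersects triangles here
            continue;
        }
        float d = ray.dir[n.axis];
        float t = (std::fabs(d) > 1e-8f) ? (n.split - ray.org[n.axis]) / d
                                         : INFINITY;
        int nearChild = (ray.org[n.axis] < n.split) ? n.left : n.right;
        int farChild  = (nearChild == n.left) ? n.right : n.left;
        if (t > e.tMax || t < 0.0f) {
            stack.push_back({nearChild, e.tMin, e.tMax}); // plane behind/beyond: near only
        } else if (t < e.tMin) {
            stack.push_back({farChild, e.tMin, e.tMax});  // plane before segment: far only
        } else {
            stack.push_back({farChild,  t, e.tMax});      // far child second
            stack.push_back({nearChild, e.tMin, t});      // near child first
        }
    }
    return -1; // no hit
}
```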
 
nAo, thanks for the cool link :) .
nAo said:
..., I believe in the next year we'll have GPUs that are much better than current GPUs at dynamic branching (AFAIK Xenos should already be much more efficient than NV40 in this regard, from what I've heard...)

Unfortunately "much more efficient than NV40" can still be "really quite bad"...
 
It'll be interesting to see what Sony & Nvidia can come up with in 6 years or so, as they work on their joint roadmap that Ken mentioned, obviously first and foremost for PS4.
 
sklaar said:
Cell can do realtime raytracing

http://www.golem.de/0507/39524.html

I'm from Germany and will try to translate the important bit:

Saarland University has done realtime raytracing on Cell and will show it at Siggraph 2005

Thanks!

Interesting article, although I can't make it all out. The only bits I could really take from it were that it was developed in conjunction with IBM Germany in two weeks, and that "already with one Cell, realtime rates are possible at full resolution". There seem to be no other technical details, though...

These guys are due to present "RPU: A Programmable Ray Processing Unit for Realtime Ray Tracing" at Siggraph, but I see no mention of Cell on their site, which is a little confusing :?

Anyway, thanks again - looking forward to your own translation :)

edit - found the press release from the university themselves: http://www.uni-saarland.de/de/medien/2005/07/1122474975 Pretty much the same as the article you linked, though.
 
I've known of the SaarCOR raytracer for a while. It seems they've ported their algorithms to Cell. Siggraph's at the end of this month, so hopefully we'll get some news on Cell raytracing then!
 
Shifty Geezer said:
I've known of the SaarCOR raytracer for a while. It seems they've ported their algorithms to Cell. Siggraph's at the end of this month, so hopefully we'll get some news on Cell raytracing then!

Indeed, I didn't realise it was so soon.

Anyway, I'm not getting too excited just yet. Looking at their video that's marked for Siggraph 05, their idea of "realtime" may differ from the norm ;) The scenes they were showing ranged anywhere from ~4fps to ~20fps. And that was on their own chip... who knows how Cell fares (I'll be conservative and guess it's not a whole lot better).
 
I got the gist from that German article that frame updates were within a second. Far from realtime gaming, but the raytracers of the world are going to be jumping backflips over the moon naked if they can preview their scenes in just a few seconds, or even minutes! In the history of raytracing, processors have never kept ahead of modelling and rendering technologies, so a jump in processing power still resulted in the same hours-long render times. If we can see our scene in a minute or less, the productivity of graphics studios will jump dramatically. :D
 
IT HAS BEGUN! :oops: The long, long road to full-scene raytracing & GI and all of that other high-end stuff that is currently impossible.

I'd imagine that PlayStation5 will be a full-blown ray tracing monster. PS3 will be the first baby step forward, PS4 will be the halfway point with lots of ray tracing but not 100%, and PS5 will bring us the rest of the way there.


I'd imagine that SaarCOR will be bought by either Sony or Nvidia, or a new partnership will be announced. Whereas Cell was a CPU partnership between Sony-IBM-Toshiba, a new GPU partnership could be formed between Sony-Nvidia-SaarCOR, to produce a new Ray Tracing Processor. The first major implementation will be for PS4.
(say hello to the 'Reality Ray Tracing Synthesizer'), which will be helped out by Cell2.

The PS5 will use a massively improved RRTS and Cell3.



ok, just me having some fun :)
 
"for the first time on the new Cell processor of IBM, Sony and Toshiba present. With real time Raytracing shade, refractions of light, reflections and indirect lighting effects are photo-realistically shown."

"In a very short development time of only two weeks the Raytracing algorithms for the new high speed processor in co-operation with IBM Germany were completely revised."


I always felt Cell could do it... :D
 
Most radiosity and global illumination solutions require some sort of raytracing...

That is true, but for gaming purposes you should always be able to get away with instant radiosity. Instant radiosity means that you place about 100+ point lights in the scene based on the result of a simple photon/light tracer.

On current hardware you get something like 1 frame/s with nice (enough) global illumination and soft shadows. Regarding reflection/refraction, cube maps should really do the job for games.

Caustics afaik are still impossible without a real photon map or something similar, though.
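A minimal, self-contained C++ sketch of that instant-radiosity idea, with a toy box scene and made-up counts, just to show the mechanics (light paths bounce around, each bounce deposits a point light, and shading then sums ordinary point-light terms):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <vector>

struct Vec3 { float x, y, z; };
struct VPL  { Vec3 pos; float power; }; // virtual point light

static float frand() { return std::rand() / (float)RAND_MAX; }

// Distance along one axis to the face of the unit box [0,1]^3 we head toward.
static float axisT(float o, float d) {
    if (std::fabs(d) < 1e-6f) return 1e30f;
    return ((d > 0.0f ? 1.0f : 0.0f) - o) / d;
}

// Intersect a ray starting inside the unit box with its walls.
static Vec3 hitBox(Vec3 o, Vec3 d) {
    float t = std::min({axisT(o.x, d.x), axisT(o.y, d.y), axisT(o.z, d.z)});
    return {o.x + d.x * t, o.y + d.y * t, o.z + d.z * t};
}

int main() {
    const int paths = 32, bounces = 4; // 32*4 = 128 VPLs, the "100+ lights"
    std::vector<VPL> vpls;
    const Vec3 lightPos = {0.5f, 0.99f, 0.5f}; // light just under the ceiling

    for (int p = 0; p < paths; ++p) {
        Vec3 pos = lightPos;
        float power = 1.0f / paths; // split the light's energy across paths
        for (int b = 0; b < bounces; ++b) {
            // Uniform random direction; a real tracer would sample the
            // hemisphere around the surface normal instead.
            Vec3 dir = {frand() - 0.5f, frand() - 0.5f, frand() - 0.5f};
            float len = std::sqrt(dir.x*dir.x + dir.y*dir.y + dir.z*dir.z);
            if (len < 1e-6f) continue;
            dir = {dir.x / len, dir.y / len, dir.z / len};
            pos = hitBox(pos, dir);
            power *= 0.5f;                // crude surface absorption
            vpls.push_back({pos, power}); // each bounce becomes a light
        }
    }

    // Shading treats every VPL as a plain point light (power / distance^2),
    // which is exactly the kind of math GPUs already do fast.
    const Vec3 px = {0.5f, 0.0f, 0.5f}; // a point on the floor
    float indirect = 0.0f;
    for (const VPL& v : vpls) {
        float dx = v.pos.x - px.x, dy = v.pos.y - px.y, dz = v.pos.z - px.z;
        indirect += v.power / (dx*dx + dy*dy + dz*dz + 1e-3f);
    }
    std::printf("%zu VPLs, indirect term at floor: %f\n", vpls.size(), indirect);
    return 0;
}
```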
 
project sites in english:

www.saarcor.de
www.openrt.de

Megadrive1988 said:
I'd imagine that SaarCOR will be bought by either Sony or Nvidia, or a new partnership will be announced. Whereas Cell was a CPU partnership between Sony-IBM-Toshiba, a new GPU partnership could be formed between Sony-Nvidia-SaarCOR, to produce a new Ray Tracing Processor. The first major implementation will be for PS4.
(say hello to the 'Reality Ray Tracing Synthesizer'), which will be helped out by Cell2.

First step is the RPU? ;)
http://graphics.cs.uni-sb.de/~woop/rpu/rpu.html
 