D. Kirk and Prof. Slusallek discuss real-time raytracing

Simon F said:
GameCat said:
When it comes to asymptotic complexity raytracing wins (it's O(p·log(n)) in pixels and triangles,
Err....... How did you build the acceleration structure in sub-linear time? Magic?

Well, assuming some kind of spatial hierarchy already exists :) Which I think is a pretty fair assumption since most applications have most of the geometry in some kind of hierarchy that is generated offline. Don't get me wrong here, I really don't think ray tracing is some kind of panacea, but *asymptotically* the complexity is low with regard to the number of triangles. That isn't necessarily very interesting though.

A hierarchical z-buffer with geometry in a spatial hierarchy probably performs similarly to a ray tracer, for example, but that doesn't mean the hierarchical z-buffer is the be-all and end-all of visibility computations.
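
As a rough sketch of where that per-ray log(n) comes from - assuming, as above, that the spatial hierarchy (here a bounding-volume hierarchy) already exists - per-ray traversal looks roughly like this. The node layout and the hitBox/hitTriangle helpers are purely illustrative, not any particular system's API:

    // Per-ray traversal of a prebuilt BVH. Each ray descends only the
    // subtrees whose bounding boxes it actually touches, so a reasonably
    // balanced hierarchy costs roughly O(log n) node visits per ray.
    struct Ray { float orig[3]; float dir[3]; };
    struct Hit { float t; int tri; };
    struct BVHNode {
        float bmin[3], bmax[3];   // bounding box of this subtree
        int   left, right;        // child node indices, -1 if this is a leaf
        int   firstTri, triCount; // triangle range, used only by leaves
    };

    // Assumed helpers: a ray/box slab test and a ray/triangle test
    // (e.g. Moller-Trumbore) that updates the closest hit found so far.
    bool hitBox(const Ray& r, const float bmin[3], const float bmax[3]);
    bool hitTriangle(const Ray& r, int tri, Hit& closest);

    bool traverse(const BVHNode* nodes, int idx, const Ray& r, Hit& closest)
    {
        const BVHNode& n = nodes[idx];
        if (!hitBox(r, n.bmin, n.bmax))
            return false;                     // one box test culls the whole subtree
        if (n.left < 0) {                     // leaf: test its handful of triangles
            bool hit = false;
            for (int i = 0; i < n.triCount; ++i)
                hit |= hitTriangle(r, n.firstTri + i, closest);
            return hit;
        }
        bool hitLeft  = traverse(nodes, n.left,  r, closest);
        bool hitRight = traverse(nodes, n.right, r, closest);
        return hitLeft || hitRight;
    }

Building that hierarchy in the first place is the part that isn't sub-linear, which is exactly Simon F's objection.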
 
MDolenc said:
I am not quite sure why would we need specialized "DirectX 7 style" ray tracing hardware?
You don't. GPUs already seem to be moving toward architectures that can handle raytracing more efficiently (check out the papers on raytracing on stream processors). In any case, special hardware functionality for raytracing would make the hardware no more "DX7" level than fixed-function texture filtering makes current hardware.

A professional renderer that uses raytracing is Mental Ray (which I actually believe is a hybrid approach of sorts). It was used in Matrix Reloaded, Star Wars EP2, The Hulk, T3, Fight Club, and more... PRMan recently incorporated some raytracing functionality as well.
 
Waltz King said:
rwolf said:

Yes, this real-time raytracing engine - developed by a student - made the "GameStar" magazine investigate further and arrange this discussion in the first place.

Wrong.
The raytracing engine was not developed by him. He's using OpenRT.

It's really impressive, I have seen some nice in-game videos.

A Video and more games at http://graphics.cs.uni-sb.de/RTGames/

But: it's running on a supercomputer with 48 Athlon MP CPUs and 12 Gigabytes RAM...

It's a simple cluster, not a supercomputer.

OpenRT and Saarcor are both projects at the CG Lab at the University of Saarland.

Objects can be moved and animated. The animation solution is still not optimal, but it works. (The fish in the movie are still static, but with a swarm engine that will change.)

Special hardware (SaarCOR) doesn't mean fixed shader functions. It is simply optimized for each step (ray-triangle test, etc.). The shader unit will be fully programmable.

Thanks
Chris
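
For reference, the ray-triangle test mentioned above as one of the steps SaarCOR optimizes in hardware is usually something like the Möller-Trumbore algorithm. A minimal sketch in plain C++ follows; the vector type and the epsilon are chosen here purely for illustration and are not taken from SaarCOR:

    // Minimal Moller-Trumbore ray/triangle intersection sketch.
    // Returns true (and the distance t) if the ray hits triangle (v0, v1, v2).
    #include <cmath>

    struct Vec3 {
        float x, y, z;
        Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    };
    static Vec3  cross(const Vec3& a, const Vec3& b) {
        return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
    }
    static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    bool rayTriangle(const Vec3& orig, const Vec3& dir,
                     const Vec3& v0, const Vec3& v1, const Vec3& v2, float& t)
    {
        const float EPS = 1e-7f;
        Vec3 e1 = v1 - v0, e2 = v2 - v0;
        Vec3 p  = cross(dir, e2);
        float det = dot(e1, p);
        if (std::fabs(det) < EPS) return false;   // ray is parallel to the triangle plane
        float inv = 1.0f / det;
        Vec3 s = orig - v0;
        float u = dot(s, p) * inv;
        if (u < 0.0f || u > 1.0f) return false;   // outside the first barycentric bound
        Vec3 q = cross(s, e1);
        float v = dot(dir, q) * inv;
        if (v < 0.0f || u + v > 1.0f) return false;
        t = dot(e2, q) * inv;
        return t > EPS;                           // hit must be in front of the ray origin
    }

Hard-wiring a test like this doesn't constrain shading, which is the split Chris describes: fixed traversal and intersection, fully programmable shader unit.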
 
I know at Saarbrücken University there are a lot of talented people (a couple of years ago I was there to attend Graphics Hardware 2002) and they did (and are still doing..) great work to improve realtime raytracing, but.. (you knew it was coming..) imho their custom hw research is a waste of time. GPU power is scaling very fast; they will never compete, even if their custom solution is much more efficient at raytracing than a modern GPU. I'd spend that money on other research..
 
Nexiss said:
A professional renderer that uses raytracing is Mental Ray (which I actually believe is a hybrid approach of sorts). It was used in Matrix Reloaded, Star Wars EP2, The Hulk, T3, Fight Club, and more... PRMan recently incorporated some raytracing functionality as well.

MR has been used in various ways in the movies you've mentioned.
Fight Club had full raytracing and global illumination in the waste basket scene. It took ages to render, although it was back in 1999.
ILM used MR mostly for ambient occlusion and reflection occlusion (masking out blocked areas from the reflection map) passes; they've also used Entropy, and now the raytracing functions in PRMan.
ESC (the Matrix team) was using MR as a full-blown tool for the CG doubles and their full CG scenes (superbrawl, burly brawl, highway chase etc.). Some of these scenes had raytraced raindrops, global illumination and such. AFAIK Tippett Studio used PRMan for their scenes (the battle in Zion and such).

I also have to mention that Shrek 2 used raytracing and GI as well; although, as I understand it, they used simplified placeholder geometry for these effects, like replacing a wall with a simple colored plane to quickly generate bounced lighting. I wonder how problematic it would be to implement such solutions in realtime... I'd expect that the cost of using another version of the scene for such effects would be far higher in realtime than in offline rendering.
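
To make the "colored plane" trick a bit more concrete, here is a minimal sketch of estimating one bounce of indirect light from a proxy plane standing in for a real wall. The names, the constant outgoing radiance stored on the plane, and the single-sample estimate are all illustrative assumptions, not how Shrek 2 actually implemented it:

    // One-sample bounce from a proxy plane: instead of tracing against the
    // real wall geometry, intersect the bounce ray with an infinite colored
    // plane and treat its (precomputed) outgoing radiance as indirect light.
    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b)    { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3  scale(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }

    struct ProxyPlane { Vec3 point, normal, radiance; }; // the "colored plane" stand-in

    // Indirect light picked up along one bounce direction from the shading point.
    Vec3 bounceFromProxy(Vec3 shadePos, Vec3 shadeNormal, Vec3 bounceDir, const ProxyPlane& wall)
    {
        float denom = dot(bounceDir, wall.normal);
        if (denom >= 0.0f) return {0, 0, 0};      // bounce ray points away from the wall's front face
        float t = dot(sub(wall.point, shadePos), wall.normal) / denom;
        if (t <= 0.0f) return {0, 0, 0};          // wall is behind the shading point
        float cosTheta = dot(bounceDir, shadeNormal);
        if (cosTheta <= 0.0f) return {0, 0, 0};   // direction below the receiving surface
        return scale(wall.radiance, cosTheta);    // crude one-sample, cosine-weighted estimate
    }

The extra cost in realtime would presumably be maintaining this second, simplified version of the scene alongside the real one.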
 
nAo said:
I know at Saarbrücken University there are a lot of talented people (a couple of years ago I was there to attend Graphics Hardware 2002) and they did (and are still doing..) great work to improve realtime raytracing, but.. (you knew it was coming..) imho their custom hw research is a waste of time. GPU power is scaling very fast; they will never compete, even if their custom solution is much more efficient at raytracing than a modern GPU. I'd spend that money on other research..

I wouldn't call it a waste of time at all. First of all, if RT turns out to be a good solution for games, they are doing very important groundwork for developing not only the software but also the hardware. This will benefit end users and customers a lot, because less money is wasted on "beta" hardware testing when the big IHVs get into the game. I hope they succeed with it and we get something really new in the 3D scene.

Btw, 350 million triangles at 2-3 fps is a huge achievement in my opinion when you think about the hardware used - a plain dual Opteron PC. Dedicated hardware for the RT task could be 10x faster, like gfx cards are compared to CPUs.
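
For a sense of scale, a back-of-the-envelope primary-ray throughput for that figure; the 640x480 resolution below is an assumption purely for illustration, only the 2-3 fps comes from the post:

    // Rough primary-ray throughput for "2-3 fps" at an assumed 640x480.
    #include <cstdio>

    int main()
    {
        const double pixels  = 640.0 * 480.0; // assumed resolution, one primary ray per pixel
        const double fpsLow  = 2.0, fpsHigh = 3.0;
        std::printf("primary rays/s: %.1f - %.1f million\n",
                    pixels * fpsLow / 1e6, pixels * fpsHigh / 1e6);
        std::printf("with a 10x dedicated-hardware speedup: %.0f - %.0f million\n",
                    pixels * fpsLow * 10.0 / 1e6, pixels * fpsHigh * 10.0 / 1e6);
        return 0;
    }

That works out to well under a million primary rays per second on the dual Opteron under this assumption, and still under ten million with the hypothetical 10x dedicated hardware.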
 
paju said:
I wouldn't call it a waste of time at all. First of all, if RT turns out to be a good solution for games, they are doing very important groundwork for developing not only the software but also the hardware.
Too many IFs. RT is not needed most of the time in offline rendering.. it wouldn't be needed in realtime rendering either.
There are corner cases where RT can 'win'.. but by the time they're 'ready', modern GPUs will be light years ahead.
Obviously, they can't really compete on the hw side.
 
nAo said:
I know at Saarbrücken University there are a lot of talented people (a couple of years ago I was there to attend Graphics Hardware 2002) and they did (and are still doing..) great work to improve realtime raytracing, but.. (you knew it was coming..) imho their custom hw research is a waste of time. GPU power is scaling very fast; they will never compete, even if their custom solution is much more efficient at raytracing than a modern GPU. I'd spend that money on other research..

show me any gpu that beats the performance of saarcor at raytracing..

and SaarCOR is just one pipeline at 90 MHz. If that gets into the hands of some "real hw vendors", this gets hellishly fast.
 
paju said:
Btw, 350 million triangles at 2-3 fps is a huge achievement in my opinion when you think about the hardware used - a plain dual Opteron PC. Dedicated hardware for the RT task could be 10x faster, like gfx cards are compared to CPUs.

Don't forget, it is aliased to hell and back...
 
davepermen said:
show me any gpu that beats the performance of saarcor at raytracing...
Don't be blind, look to the future.
Moreover, there's already custom hw designed to perform raytracing, and it isn't going to bring any revolution to realtime 3D rendering anytime soon.. not in this universe.
 
Laa-Yosh said:
I also have to mention that Shrek 2 used raytracing and GI as well; although, as I understand it, they used simplified placeholder geometry for these effects, like replacing a wall with a simple colored plane to quickly generate bounced lighting. I wonder how problematic it would be to implement such solutions in realtime... I'd expect that the cost of using another version of the scene for such effects would be far higher in realtime than in offline rendering.
I don't think simplified placeholder geometry would be a problem at all. Games use that sort of thing all the time for things like physics, or even just for dropping LOD when things are far away.
 
nAo said:
I know at Saarbrücken University there are a lot of talented people (a couple of years ago I was there to attend Graphics Hardware 2002) and they did (and are still doing..) great work to improve realtime raytracing, but.. (you knew it was coming..) imho their custom hw research is a waste of time. GPU power is scaling very fast; they will never compete, even if their custom solution is much more efficient at raytracing than a modern GPU. I'd spend that money on other research..
But that is the point of doing research projects at academic institutions - you can get away from only doing stuff that will pay for itself within a very short time-frame. It doesn't necessarily have to pay for itself ever - that's why it's called research and not product development. The purpose of projects such as this is to learn; some of what is learned will hopefully be useful, some will probably turn out not to be.

Once upon a time, we made our tools out of rocks, but while we were all chipping away, a few guys dabbled with metals - soft, rare, pretty useless but shiny.... We need people who can search for new ways of doing stuff without being tied to corporate profitability.

If you want to criticize money spent on academic research, go pick on economics and social science. ;)
 
Simon F said:
Entropy said:
Once upon a time, we made our tools out of rocks
We still do... quartz.
:p

Never thought I'd resort to emoticon-only communication - a sure sign I should move on.
Anyways, we should be happy there are some crazy Germans around devoting their time to this stuff so that others don't need to and can make healthy money instead. Regard it as their service to the community rather than dismissing their work as "No commercial potential".
 
IMO ray-tracing is like the Amiga.

It's a neat idea that's been beaten in the marketplace by a brute-force hack. Nevertheless fans of RT/Amiga are still very vocal (for their number) in insisting that it is The Next Big Thing(TM), and it will answer all our computing problems (along with curing cancer, AIDS and world hunger/poverty into the bargain), if only folks would stop being cattle and buying the brute-force hack.

When it dominates the non-realtime market, there may be a future for it in games. Until then, it's an Amiga.
 
Entropy said:
But that is the point of doing research projects at academic institutions - you can get away from only doing stuff that will pay for itself within a very short time-frame. It doesn't necessarily have to pay for itself ever - that's why it's called research and not product development. The purpose of projects such as this is to learn; some of what is learned will hopefully be useful, some will probably turn out not to be.
I agree with you. Nevertheless academic research, imho, should pursue projects that can increase our knowledge. To be fair, I can't see how 'that' project can do the magic. Some years ago I was part of a team (among many other teams) of physicists involved in a CERN project for the building of the new LHC accelerator. Well.. now I know how much money is spent on completely useless research.. :)
(don't get me wrong, I think the LHC project is a wonder..)
Academics should pursue interesting research, not projects that you know from day one are doomed, even from an academic point of view, imho.
We need people who can search for new ways of doing stuff without being tied to corporate profitability.
I agree with you again; that's why I would like to see that money spent on more interesting topics. I don't think that research will give us any new insight into computer graphics :) Obviously I could be wrong.. but that's my opinion.
If you want to criticize money spent on academic research, go pick on economics and social science. ;)
LOL :)
 