Radiosity on ATI hardware?

Shadowmage

Newcomer
Someone I know claims that the ATI X1800 won't be able to do Radiosity, while NVIDIA's 7xxx series can, in upcoming games. Looking at NVIDIA's website, they list Radiosity as one of the features of 7800GTX, while it's not listed for ATI's X1xxx series.

Can someone elaborate on this? How come the X1xxx can't perform Radiosity in upcoming games?

Thanks!
 
Old, I've seen it before. It's a hack, and even then, it won't run in realtime on real scenes (the Cornell box will make for exciting games!)
 
This is in no way related to GPUs, but realtime ray tracing and global illumination are possible. Just look at the pictures and videos at the link:
http://www.openrt.de/gallery.php

The only problem is that it currently takes 24 dual-CPU PCs to render ~40M triangles with GI at a near-playable framerate*. Hopefully a HW accelerator for ray tracing gets into mass production within a few years :)

*) 40M triangles is way beyond any current game or tech demo. In one Xbox 360 demo they were proud that one of their scenes had 1M triangles visible.
 
I saw a reference to 10M triangles @ 4 fps (ray tracing), and I read the GI paper, which stated a 10,000-triangle scene @ 640x480 and 1.7 fps. Nowhere did I see a reference to 40 million triangles rendered anywhere near playable rates with radiosity on the cluster.
 
DemoCoder said:
I saw a reference to 10M triangles @ 4 fps (ray tracing), and I read the GI paper, which stated a 10,000-triangle scene @ 640x480 and 1.7 fps. Nowhere did I see a reference to 40 million triangles rendered anywhere near playable rates with radiosity on the cluster.
Obviously you didn't see the videos and didn't check the links. I suggest watching this video (55.9 MiB). It shows four power-plant models (~50M triangles in total) rendered on 44 CPUs (2.6 GHz P4s, IIRC) at around 2 fps. Sure, the speed is far from current games, but on the other hand it is way faster than most people expect from such a massive scene. The same scene without GI runs at ~10 fps on only four CPUs.
For some truly massive scenes rendered with ray tracing (no GI), see this video. It shows a scene with 1 billion triangles. For more info see this. There is no LOD used whatsoever.


Realtime RT/GI won't be available on average PCs in the near future. With lower quality it could be done on a couple of quad-core CPUs (Opterons) running at around 3 GHz.
 
Don't confuse raycasting, raytracing, and radiosity. Radiosity scales far, far worse than either of the other two. There's no way a 50M-triangle scene is going to run radiosity on any realistic HW setup at desired resolutions (1280x1024) and playable (i.e. 60 fps) framerates for the foreseeable future. It can't even achieve that on a 44-node cluster.
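To make that scaling concrete: classical radiosity couples every patch with every other patch through a form-factor matrix, so memory and per-iteration work grow quadratically with patch count. Here's a toy sketch of the classic iterative solve (all values invented for illustration; real form factors come from scene geometry):

```python
import numpy as np

# Toy classical radiosity solve for n patches. The form-factor matrix F
# is n x n, so storage and per-iteration cost grow quadratically with
# patch count -- which is why the technique scales so badly.

def solve_radiosity(emission, reflectance, F, iterations=50):
    """Jacobi-iterate B = E + rho * (F @ B) until it settles."""
    B = emission.copy()
    for _ in range(iterations):
        B = emission + reflectance * (F @ B)
    return B

n = 4  # four patches; a real scene would need tens of thousands
rng = np.random.default_rng(0)
F = rng.random((n, n))
np.fill_diagonal(F, 0.0)              # a patch doesn't see itself
F /= F.sum(axis=1, keepdims=True)     # rows sum to 1 (closed environment)
emission = np.array([1.0, 0.0, 0.0, 0.0])   # one light-source patch
reflectance = np.full(n, 0.5)

B = solve_radiosity(emission, reflectance, F)  # per-patch radiosity
```

With reflectance below 1 the iteration converges, but each pass is an n x n matrix-vector product, so doubling the patch count quadruples the work.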
 
DemoCoder said:
Don't confuse raycasting, raytracing, and radiosity. Radiosity scales far, far worse than either of the other two.
These images don't agree with what you are saying*. To me it seems like GI scales linearly with processing power. The only thing is that it takes more processing power than simple ray tracing.
*) taken from here
DemoCoder said:
No way a 50M-triangle scene is going to run radiosity on any realistic HW setup at desired resolutions (1280x1024) and playable (i.e. 60 fps) framerates for the foreseeable future.
And when will there be a GPU/CPU system available that is capable of such triangle counts :smile:

Active research on realtime RT/GI began only a few years ago. Considering how long traditional rendering has been researched and how much money is put into it, I think realtime RT/GI will be a viable alternative to traditional methods sooner than most people think.

I would like to point out something on the prototype ray tracer chip (RPU):


A working prototype of this hardware architecture has been developed based on FPGA technology. The ray tracing performance of the FPGA prototype running at 66 MHz is comparable to the OpenRT ray tracing performance of a Pentium 4 clocked at 2.6 GHz, even though the memory bandwidth available to our RPU prototype is only about 350 MB/s. These numbers show the efficiency of the design, and one might estimate the performance levels reachable with today's high-end ASIC technology. High-end graphics cards from NVIDIA provide 23 times more programmable floating-point performance and 100 times more memory bandwidth than our prototype. The prototype can be parallelized to several FPGAs, each holding a copy of the scene. A setup with two FPGAs delivering twice the performance of a single FPGA is running in our lab. Scalability to up to 4 FPGAs has been tested.

I think they compared the performance with a 6800 Ultra.
Currently the biggest problem with ray tracing is dynamic scenes. Rebuilding the KD-tree is not easy, and current implementations are not very parallelizable.
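To see why the rebuild hurts, here's a minimal median-split KD-tree build (this is just an illustrative sketch over a handful of points, not the OpenRT implementation): construction touches every primitive, sorting at each level, and the recursion is awkward to parallelize near the root.

```python
# Minimal median-split KD-tree over 3D points (e.g. triangle
# centroids). A static scene builds this once; a dynamic scene would
# have to redo this O(n log^2 n) work every frame.

def build_kdtree(points, depth=0):
    if len(points) <= 2:
        return {"leaf": True, "points": points}
    axis = depth % 3                              # cycle through x, y, z
    pts = sorted(points, key=lambda p: p[axis])   # O(n log n) per level
    mid = len(pts) // 2
    return {
        "leaf": False,
        "axis": axis,
        "split": pts[mid][axis],
        "left": build_kdtree(pts[:mid], depth + 1),
        "right": build_kdtree(pts[mid:], depth + 1),
    }

tree = build_kdtree([(0, 0, 0), (1, 2, 3), (4, 1, 2), (2, 5, 1), (3, 3, 3)])
```

Traversal of the finished tree is cheap and embarrassingly parallel per ray; it's the per-frame rebuild for moving geometry that was the open problem at the time.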
 
Ho Ho: on openrt.de and all the projects there, they _never_ talk about real radiosity solutions, only about possible ways to work towards global illumination solutions. but it all has nothing to do with radiosity; more with photon mapping, and such.

radiosity done the ordinary way is nowhere close (and never will be close) to realtime. but radiosity is not what is needed for gi.

but a lot of people somehow think radiosity == gi, so that if you do gi with the help of a raytracer, you are doing raytracing and radiosity. but this is wrong.

radiosity is the thing that fits lightmap generation great, but about nothing else really..
 
One thing you have to understand is that "Radiosity" is a very specific implementation of global illumination. Radiosity lighting scales just horribly with scene complexity, and it is simply impossible to run in realtime (at least with silicon-based hardware). The scaling is just that bad.

Now, there are many other techniques of global illumination that scale vastly better than radiosity. Precomputed radiance transfer is one of those methods. None of these methods should ever be called "realtime radiosity." A better term would be "realtime global illumination."
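Precomputed radiance transfer illustrates why these methods are so much cheaper at runtime: all the expensive transport (occlusion, interreflection) is baked offline into per-vertex spherical-harmonic coefficients, and shading collapses to a dot product against the light's SH projection. A sketch under those assumptions (the coefficient values here are invented for illustration):

```python
# PRT runtime shading sketch: per-vertex transfer coefficients, baked
# offline, dotted with the environment light projected into the same
# spherical-harmonic basis. Runtime cost no longer depends on scene
# complexity -- but the baked coefficients assume rigid geometry,
# which is exactly the limitation discussed in this thread.

NUM_SH_COEFFS = 9  # 3rd-order SH, a common choice in PRT implementations

def shade_vertex(transfer, light):
    """Exit radiance = dot(transfer coefficients, light coefficients)."""
    return sum(t * l for t, l in zip(transfer, light))

# Hypothetical baked data for one vertex and one environment light:
transfer = [0.8, 0.1, -0.05, 0.0, 0.02, 0.0, -0.01, 0.0, 0.03]
light    = [1.0, 0.2,  0.0,  0.1, 0.0,  0.0,  0.0,  0.0, 0.0]

radiance = shade_vertex(transfer, light)
```

Relighting under a new environment only changes the `light` vector; the expensive `transfer` bake stays fixed, which is why the geometry must stay rigid.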
 
DemoCoder said:
PRT is not a total solution however, since it only works on rigid objects.
Right. It does have the benefit that it is very easy to do in realtime, however. I'm really not sure I like the idea of using it at all because of this very definite limitation, unless you can get another technique working on the moving objects that modifies the scene in a way that isn't glaringly different. I haven't heard of any such technique yet, but that doesn't mean one doesn't exist.
 
Shadowmage said:
So how does this work out?

Precomputed Radiance Transfer (aka "Realtime Radiosity") support, allowing for Real-Time Subsurface Scattering and Soft Shadowing.

http://www.artificialstudios.com/features.php

Also, Unreal 3 is said to use Radiosity.

If it's not real radiosity, then what is it?
Did you notice the "precomputed" part in PRT? That means the radiance is precomputed. It's similar to what qrad was doing with Q2 maps and lightmaps, only it takes far more processing power and looks better.

You may watch PRT in action in the Reality Engine tech demo called Backyard PRT demo here. After seeing it, please tell me how many moving objects you saw. I saw none. They did say that the Reality lighting model scales to moving and deformable objects, but I doubt they meant PRT when they said that.
If moving objects are possible with their PRT implementation then why don't they show any ;)
 
Ho_Ho said:
If moving objects are possible with their PRT implementation then why don't they show any ;)

There are many situations where PRT techniques can be used successfully in dynamic scenes. The thing to keep in mind is that PRT is an approximation, but that's ok because all real time lighting techniques are approximations :)

We used PRT and Irradiance Volume techniques for diffuse lighting on all the characters in the Ruby2 demo (Ruby: Dangerous Curves).

http://www.ati.com/developer/demos/rx850.html

If you run the demo, there are modes for viewing the various lighting terms. Unfortunately these additional modes aren't in the Mpg or Quicktime movies.

If you're interested, there's presentation on some of these methods here:

http://www.ati.com/developer/gdc/GDC2005_PracticalPRT.pdf

--Chris
 
There is a certain level of ambiguity in this term which causes confusion. "Radiosity" can refer to a very specific rendering technique, but it also refers to a physical characteristic of an object. The original paper that introduces the term "radiosity" uses the latter definition; when referring to the actual algorithm as a whole, it is called the "radiosity method". When people say realtime radiosity, they are talking about a realtime method for approximating the radiosity of surfaces (or relying on preprocessed data) and using this information to render. Since all radiosity methods involve some level of approximation, I think these "realtime" methods should still be considered radiosity, and we are not just throwing the term around as jargon. The biggest reason many people make a distinction between these new realtime methods and traditional ones is that they do things very differently and the look is different. The non-realtime methods are better approximations of the real-life phenomenon.
 