GPU Global Illumination Renderer

It won't work on my 9700; it calculates everything, then pops up a window with a "0 ????????" error and a garbled screen.
 
That goes for the 9800Pro in this machine too.

The final rendering window shows the rendered environment (or a box...?) and black objects.

With regards
Kjetil
 
kenneth9265_3 said:
What is a Global Illumination Renderer, if you guys don't mind me asking? :?
Indirect lighting. Light bounces off objects and reflects onto other objects. This bouncing light is global illumination. It's especially noticeable with shiny objects.
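
If it helps to picture it, here's a rough sketch in Python of the core idea (nothing to do with this particular demo's actual shaders): at each surface point you shoot random rays over the hemisphere and average whatever light they pick up from the rest of the scene. The trace_radiance helper is hypothetical.

import math, random

def sample_hemisphere():
    # Uniform direction on the hemisphere above the surface; for brevity the
    # surface normal is assumed to be +z here.
    z = random.random()                      # cos(theta), uniform in [0, 1]
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def indirect_light(point, trace_radiance, samples=256):
    # One bounce of "global illumination" at a surface point: average the
    # radiance arriving from random directions, weighted by cos(theta).
    # trace_radiance(origin, direction) is a hypothetical helper that returns
    # the light coming from whatever surface the ray hits.
    total = 0.0
    for _ in range(samples):
        d = sample_hemisphere()
        total += trace_radiance(point, d) * d[2]   # d[2] = cos(theta)
    # 2*pi/N is the Monte Carlo weight for uniform hemisphere sampling.
    return total * (2.0 * math.pi / samples)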

The program works with my 9700 and Windows XP. It's impossible to tell if the graphics chip is actually doing any of the work though.
 
davepermen said:
it's the way real lighting works, and thus, the 'final destination' of rendering.

You mean, if we can simulate how real lighting works, there is no more improvement needed in GPU development? That CG images (or even real-time rendering) will look EXACTLY like their real-world counterparts?
 
Of course, I'll be impressed if they get enough computational power to do a 100% accurate simulation of indirect diffuse lighting interacting with multiple objects and a non-trivial environment within 10 years.

Unless something clever is invented, even Moore's Law won't give us enough juice. My guess is they'll find some approach that still requires a lot of power, but not nearly as much as a full simulation (while looking nearly as good).
 
ChronoReverse said:
Of course, I'll be impressed if they get enough computational power to do a 100% accurate simulation of indirect diffuse lighting interacting with multiple objects and a non-trivial environment within 10 years.

If you take a really simplistic view, it would take 12-13 years before that demo runs in real time (~60 fps).
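
Just to show where a figure like that can come from (the convergence time and the doubling period below are my own guesses, purely for illustration):

import math

# Toy version of the "how many years until real time" estimate. Both inputs
# are assumptions, not measurements of the demo.
seconds_per_converged_image = 60.0        # assume ~1 minute to converge today
target_seconds_per_frame = 1.0 / 60.0     # real time at ~60 fps
doubling_period_years = 1.5               # "Moore's Law"-style doubling

speedup_needed = seconds_per_converged_image / target_seconds_per_frame
doublings = math.log2(speedup_needed)
years = doublings * doubling_period_years
print(f"need ~{speedup_needed:,.0f}x -> {doublings:.1f} doublings -> ~{years:.0f} years")
# With a 1-minute render and 18-month doublings this lands around 18 years;
# assume 12-month doublings instead and it drops to roughly 12 years.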

Very pretty, though. It shall be interesting to see how GI develops and what shortcuts they come up with.
 
3dcgi said:
The program works with my 9700 and Windows XP. It's impossible to tell if the graphics chip is actually doing any of the work though.

Easy. :) Just underclock the GPU and see if it takes longer to render the image...
 
lol, it produces a whole bunch of shaders, so I'm sure it's real. Not everything is done on the GPU, but a lot of it is.

Does it stop doing passes, or does it just run an infinite number of passes?
 
embargiel said:
davepermen said:
it's the way real lighting works, and thus, the 'final destination' of rendering.

You mean, if we can simulate how real lighting works, there is no more improvement needed in GPU development? That CG images (or even real-time rendering) will look EXACTLY like their real-world counterparts?

the rest is just art :D

well, you can always go further and further, but not much further than this; we're at the end of what we know about lighting :D (and the calculations will be a hell of a lot more complicated :D).
 
Guden Oden said:
3dcgi said:
The program works with my 9700 and Windows XP. It's impossible to tell if the graphics chip is actually doing any of the work though.

Easy. :) Just underclock the GPU and see if it takes longer to render the image...
Good point. My OEM 9700 seems to be clock-locked, though, so I can't test it. That, and it's too much effort. ;)

bloodbob said:
lol, it produces a whole bunch of shaders, so I'm sure it's real. Not everything is done on the GPU, but a lot of it is.

Does it stop doing passes, or does it just run an infinite number of passes?
Infinite. It doesn't stop until you save the image or cancel the render. Despite using graphics hardware, it still takes a while for the lighting to converge to a good result.
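
That never-finishing behaviour is basically progressive refinement: every pass adds another noisy estimate and the displayed image is the running average, so it keeps getting cleaner without ever being "done". A toy sketch of the idea in Python (not the demo's actual shaders):

import random

def render_pass(num_pixels):
    # Stand-in for one GPU pass: returns one noisy estimate per pixel.
    # Here the "true" value of every pixel is 0.5 plus random noise.
    return [0.5 + random.uniform(-0.2, 0.2) for _ in range(num_pixels)]

accumulated = [0.0] * 4
passes = 0
while passes < 1000:                       # the real renderer never stops
    sample = render_pass(len(accumulated))
    passes += 1
    # Running average: after N passes each pixel is the mean of N samples,
    # so the image slowly converges instead of ever being "finished".
    accumulated = [a + (s - a) / passes for a, s in zip(accumulated, sample)]
print(passes, [round(p, 3) for p in accumulated])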
 
DaveBaumann said:
So, who can hack it and make a benchmark out of it? ;)

Well, you could benchmark it with some degree of error. Sit and wait for it to start raycasting. Once it hits that point, let it run for either XXX seconds or XXX raycasts, then repeat on another card to compare. This sucker is probably memory-speed limited because of the heavy use of FP textures.
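
Roughly what that would look like, assuming you had some hook into the renderer's pass loop (you don't without hacking it, which I guess is the point): count how many passes complete in a fixed window and compare cards.

import time

def benchmark(do_one_pass, warmup_passes=5, window_seconds=60.0):
    # do_one_pass is a hypothetical hook that runs one rendering pass; the
    # real program offers no such hook, so this is only the shape of the
    # measurement, not something you can run against the demo as-is.
    for _ in range(warmup_passes):          # wait for it to "get raycasting"
        do_one_pass()
    passes = 0
    start = time.perf_counter()
    while time.perf_counter() - start < window_seconds:
        do_one_pass()
        passes += 1
    return passes / window_seconds          # passes per second on this card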
 