What is subsurface scattering?

The ability to do raytracing does not automatically mean that it's worth doing.

Today's hardware rendering architectures are built on a few principles, like streaming the scene through the rendering pipeline and skipping unseen parts of the scene.
But raytracing means that when rendering a single pixel, your reflected/refracted ray may pass into any part of the scene, even parts outside the camera's view. Bumpy surfaces throw rays all around the scene, which means you'll suddenly need to access a different texture, evaluate a completely different shader, tessellate another object that hasn't been processed yet, and so on. Memory accesses become completely random instead of a predictable dataflow. Rendering times will change dramatically between neighbouring pixels and between successive frames. And all this for visual improvements that 99% of the audience probably won't see, or care about...
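A toy sketch of that last point: with bounced rays, the data a pixel needs is only discovered while tracing. Everything here (the fake scene list, the `trace` function) is made up purely for illustration, not taken from any real renderer:

```python
import random

# Toy illustration: the object a secondary ray hits - and therefore the
# texture/shader/mesh data needed next - is only known once the ray has
# been traced, so the access pattern can't be streamed or prefetched the
# way a rasterization pipeline's can.
random.seed(1)
scene = [{"id": i, "shader": f"shader_{i % 8}"} for i in range(1000)]

def trace(depth=0, max_depth=3):
    """Follow a ray through a few bounces, recording which shader each
    bounce would need. random.choice stands in for a real ray/scene
    intersection, whose result is just as unpredictable to the hardware."""
    if depth == max_depth:
        return []
    hit = random.choice(scene)
    return [hit["shader"]] + trace(depth + 1, max_depth)

# Two neighbouring pixels can end up touching completely different data:
pixel_a = trace()
pixel_b = trace()
```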
 
nintenho said:
The only demo I saw of raytracing was a single screenshot so I was skeptical of the framerate, could anyone point me to a video of raytracing in realtime on ps3?

You mean the landscape demo? It was shown at the conference. But I don't get where the ray tracing is in that demo; there wasn't anything special there about the light. Meh. :smile:

Also, there's that Quake 3 demo with ray tracing, which they did with something like 30 computers with Athlon CPUs. But that wasn't quite the real thing. I'd guess Cell could rival those. (I think they were XP 1700+'s or something.)

Didn't Lair use that effect with the light going through the skin?

edit: man i'm writing like crap, better go sleep :p
 
weaksauce said:
You mean the landscape demo? It was shown at the conference. But I don't get where the ray tracing is in that demo; there wasn't anything special there about the light. Meh. :smile:
Yeah, I think. All I saw was a screenshot of a mountain.

Edit: A mountain that had raytracing, apparently.
 
That was a rather limited case of raytracing, called raycasting AFAIK - rays weren't bounced, just traced until the first hit, which made memory accesses predictable.
For example, the original Wolfenstein 3D used raycasting as well. But it only tested one ray per screen column rather than per pixel, so it was very fast, but there could be no variation in floor height. Should we then say that the original Wolfenstein also used raytracing? I don't think so ;)
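For the curious, that column-based raycasting can be sketched in a few lines. This is a hypothetical toy with a hard-coded map; the real Wolfenstein engine used a fixed-point DDA over a grid, not this naive stepping:

```python
import math

# Minimal Wolfenstein-style raycaster sketch (assumption: tiny hard-coded
# grid map; names like cast_column are illustrative, not from any engine).
MAP = [
    "#####",
    "#...#",
    "#...#",
    "#####",
]

def cast_column(px, py, angle, max_dist=20.0, step=0.01):
    """March one ray from (px, py) until it hits a wall cell ('#').

    Wolfenstein casts one such ray per screen *column*, not per pixel,
    which is why it was so fast but could not vary floor height.
    Returns the distance to the first hit, or None."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == '#':
            return dist
        dist += step
    return None

# One ray per column of a 320-wide screen, fanned across a 60-degree FOV.
fov = math.radians(60)
columns = [cast_column(2.0, 2.0, -fov / 2 + fov * i / 319) for i in range(320)]
# Wall height on screen is then just proportional to 1 / distance.
```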
 
Laa-Yosh said:
That was a rather limited case of raytracing, called raycasting AFAIK - rays weren't bounced, just traced until the first hit, which made memory accesses predictable.
For example, the original Wolfenstein 3D used raycasting as well. But it only tested one ray per screen column rather than per pixel, so it was very fast, but there could be no variation in floor height. Should we then say that the original Wolfenstein also used raytracing? I don't think so ;)
O I C.....
 
Morgoth the Dark Enemy said:
The fact that Sony have been talking about raytracing is not necessarily relevant, IMHO. They're free to talk about anything, but I think we're fairly far away from actually doing realtime raytracing (if we're ever going to, as some alternative solutions for realistic rendering exist). 3D graphics are mostly about smoke and mirrors, creating an illusion; if you can fake something well, then by all means do it :)

I'm curious though as to how the UE3 shader looks. The effect they achieve is quite good, IMO.

I sat through a UE3 tech presentation recently and it's just a collection of well-understood hacks; nothing revolutionary, just a lot of stuff that has been seen before in smaller quantities.
 
nintenho said:
The only demo I saw of raytracing was a single screenshot so I was skeptical of the framerate, could anyone point me to a video of raytracing in realtime on ps3?
http://www-306.ibm.com/chips/techlib/techlib.nsf/techdocs/05CB9A9C5794A5A8872570AB005C801F
The demo runs at 30 fps at 720p on a 3.2 GHz Cell. The demo wasn't so much about raytracing as, IMO, a visual demo that makes the most of Cell, showing its relative strengths in a way that looks good in a presentation. The terrain was procedurally generated in realtime, shaded and lit, all on Cell. The use of raytracing makes sense in this case as the algorithm is simple and fits nicely onto an SPU, with the very regular data access patterns that make prefetching into the local stores easy. In a 'real-world' case (on PS3 or a workstation) you'd likely farm the rendering off to the GPU, with Cell just producing the 3D data and letting the GPU shaders deal with texturing and lighting.

Realtime raytracing of a scene is a very long way off IMO. Work on RTRT processors hasn't produced much that scanline rasterizing can't do. RT handles complex scenes better, and is a simple, straightforward process, but raytracing has some severe bottlenecks. The things it uniquely provides, like GI, SSS, true soft shadows and reflection/refraction, are just too slow to brute-force into realtime. The CG industry uses hacks and fakes as much as possible to speed things up, and they have massive amounts of processing power in their farms. No single chip is going to improve upon that!

Talk of Cell doing raytracing isn't necessarily all about raytracing scenes, though. The method of casting rays and following their passage through a scene may find its way into other uses. Though as Laa-Yosh points out, you're hitting random and unpredictable memory access patterns, which is very bad for the SPU local stores.
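A rough sketch of what per-pixel heightfield raycasting looks like, and why its access pattern is so regular. This is an assumed shape only (IBM never published the demo's source), and `height` is a stand-in for the procedural terrain:

```python
import math

def height(x, z):
    # Stand-in procedural terrain function; the real demo generated its
    # heightfield on the SPUs in realtime.
    return 0.4 * math.sin(x * 0.7) * math.cos(z * 0.5)

def trace_ray(ox, oy, oz, dx, dy, dz, max_t=50.0, dt=0.05):
    """Step along the ray until it dips below the terrain surface.

    Each step only samples terrain just ahead of the previous sample, so
    the access pattern is regular and easy to prefetch into a local
    store - exactly why this kind of algorithm suits the SPUs."""
    t = dt
    while t < max_t:
        x, y, z = ox + dx * t, oy + dy * t, oz + dz * t
        if y < height(x, z):
            return t  # first hit: shade and light this point
        t += dt
    return None  # ray escaped to the sky
```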
 
Shifty Geezer said:
http://www-306.ibm.com/chips/techlib/techlib.nsf/techdocs/05CB9A9C5794A5A8872570AB005C801F
The demo runs at 30 fps at 720p on a 3.2 GHz Cell. The demo wasn't so much about raytracing as, IMO, a visual demo that makes the most of Cell, showing its relative strengths in a way that looks good in a presentation. The terrain was procedurally generated in realtime, shaded and lit, all on Cell. The use of raytracing makes sense in this case as the algorithm is simple and fits nicely onto an SPU, with the very regular data access patterns that make prefetching into the local stores easy. In a 'real-world' case (on PS3 or a workstation) you'd likely farm the rendering off to the GPU, with Cell just producing the 3D data and letting the GPU shaders deal with texturing and lighting.

Realtime raytracing of a scene is a very long way off IMO. Work on RTRT processors hasn't produced much that scanline rasterizing can't do. RT handles complex scenes better, and is a simple, straightforward process, but raytracing has some severe bottlenecks. The things it uniquely provides, like GI, SSS, true soft shadows and reflection/refraction, are just too slow to brute-force into realtime. The CG industry uses hacks and fakes as much as possible to speed things up, and they have massive amounts of processing power in their farms. No single chip is going to improve upon that!

Talk of Cell doing raytracing isn't necessarily all about raytracing scenes, though. The method of casting rays and following their passage through a scene may find its way into other uses. Though as Laa-Yosh points out, you're hitting random and unpredictable memory access patterns, which is very bad for the SPU local stores.
Thank you very much, duders. :D
 
Laa-Yosh said:
That was a rather limited case of raytracing, called raycasting AFAIK - rays weren't bounced, just traced until the first hit, which made memory accesses predictable.
For example, the original Wolfenstein 3D used raycasting as well. But it only tested one ray per screen column rather than per pixel, so it was very fast, but there could be no variation in floor height. Should we then say that the original Wolfenstein also used raytracing? I don't think so ;)

Yes, just raycasting to render a heightfield, but still very impressive given the quality of the output from a single chip (with MSAA, texture filtering, bump mapping, etc.).
 
!eVo!-X Ant UK said:
Dude, the 360 just hasn't got the power to render a game like BIA:3 with real SSS; it's a faked effect, like the SSS demo that Nvidia showed at E3 with RSX.

Faked doesn't mean rubbish though ;)
How do you fake SSS?
 
nintenho said:
Coooooooooooool!

This is the way Nvidia and ATI fake SSS with PC cards; we don't know what tech is behind the SSS on Xenos.

But every digital effect is a "fake", an approximation of the real thing. Even the CGI one is fake, of course, even if it uses ray calculations ;)
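To answer the "how do you fake SSS?" question concretely: one common GPU-era trick is "wrap" lighting, which lets the diffuse term bleed past the terminator the way light does through skin. This is a sketch only, not Nvidia's or ATI's actual shader code (those weren't published), and the `wrap` parameter is a made-up tunable:

```python
def wrap_diffuse(n_dot_l, wrap=0.5):
    """Fake-SSS diffuse term ("wrap lighting").

    Plain Lambert is max(N.L, 0), which cuts light off sharply at the
    terminator. Wrapping shifts and rescales N.L so surfaces facing
    slightly away from the light still receive some of it, mimicking
    light scattered through the surface. wrap=0 gives plain Lambert."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# At the terminator (N.L = 0) Lambert gives 0, but with wrap = 0.5 the
# wrapped term still lets a third of the light through:
lambert = max(0.0, 0.0)
wrapped = wrap_diffuse(0.0)
```

In a real shader this one-liner replaces the `max(dot(N, L), 0.0)` term, often tinted reddish to suggest blood under skin.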
 
RedBlackDevil said:
This is the way Nvidia and ATI fake SSS with PC cards; we don't know what tech is behind the SSS on Xenos.

But every digital effect is a "fake", an approximation of the real thing. Even the CGI one is fake, of course, even if it uses ray calculations ;)
I'd imagine ATI wouldn't hold back *chortle*
 