[GDC 2011] Unreal Technology Demo

Ears require back scatter to look right - light coming through from behind them. I'm not sure how it's implemented in offline renderers either, but I can see how it's problematic in real-time renderers.
 
On slide 19, it says the screen-space subsurface scattering takes depth and normal input into account. So why doesn't it work with ears?
It would also need information about the back faces of the ears.
Currently the depth information covers only the first intersection of camera-facing polygons.

If one also rendered a depth buffer for back-facing polygons, it would become easier.
Checking the volume between front- and back-facing polygons should give enough information for an SSS approximation.
Basically an inverted SSAO should do OK, especially if combined with the stuff in the Frostbite paper. ;)
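To illustrate the idea above: with a front-face and a back-face depth pass, the per-pixel thickness is just their difference, and a Beer-Lambert style falloff turns that into how much backlight bleeds through. This is only a hedged sketch of the approximation being described - the function names, the depth values, and the extinction coefficient `sigma_t` are all made up for illustration, not taken from the Unreal or Frostbite implementations.

```python
import math

# Assumption: both depth buffers hold linear view-space depth for the same pixel.
def estimate_thickness(front_depth, back_depth):
    """Thickness of the object between the front- and back-facing surfaces."""
    return max(back_depth - front_depth, 0.0)

def transmittance(thickness, sigma_t):
    """Beer-Lambert falloff: thin regions (ear lobes) let more light through."""
    return math.exp(-sigma_t * thickness)

# Thin ear lobe vs. thicker head geometry, in arbitrary scene units.
thin = estimate_thickness(front_depth=1.00, back_depth=1.02)   # 0.02 units
thick = estimate_thickness(front_depth=1.00, back_depth=1.20)  # 0.20 units

print(transmittance(thin, sigma_t=8.0))   # high -> strong backscatter visible
print(transmittance(thick, sigma_t=8.0))  # low  -> light mostly absorbed
```

In a real shader this would run per pixel on the two depth targets; the thickness term is what the single camera-facing depth buffer can't provide, which is why ears fail without the extra back-face pass.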
 
They make so much fuss about it being real time and then don't release a real time demo executable. Sigh. I've seen videos before. There is nothing impressive about a video. I can make a video. My mother can make a video. Come on, just "leak" it if you're afraid of having to support it.
 
Unreal Engine tech demos have never been made public AFAIK so I don't see why this one would be different.
 
It looks pretty good; it would be nice if some of those features made it into the next Batman. Hopefully nVidia throws in some more marketing $$$ to finance PC-exclusive (or nVidia-exclusive :p) tech.
 