Can realtime Global Illumination be accomplished with 100TFLOPS of processing power?

Discussion in 'Architecture and Products' started by Capeta, Dec 24, 2006.

  1. Laa-Yosh

The real question here is: what do we want to render in real time? A simple sphere? Low-res, low-complexity game scenes? Movie-level CGI? Even the latter is a moving target; things are a lot more complex nowadays than in Jurassic Park...

    There will be a meeting point sometime in the future. Where it is, we can't tell now.
A good guess is that Davy Jones is probably the detail level to aim for; he's one of the few CGI characters who has managed to trick even seasoned CG artists into believing it was make-up, or at least a CG-augmented real actor. So I'd say this level of realism will probably be sufficient, and we'll have to work on non-rendering-related issues from there on.

However, Davy Jones is a single character in a live-action environment, and even that took a whole lot of rendering time. So you can probably extrapolate from this to some degree...
     
  2. Panajev2001a

We might not get to Pirates of the Caribbean 2's level of CG any time soon as far as real-time rendering is concerned, but there is something appealing to me about dynamic and global lighting+shadowing systems, even if you restrict the polygonal detail of characters and scenes to just a bit above PlayStation 2 levels: if I hold a flashlight, I want any object that occludes it to cast a shadow. Whether it is photo-realistic or not is a secondary concern (it comes after the priority of full light source-object interaction).
     
  3. zgemboandislic

    You mean like Doom 3? :D
     
  4. Panajev2001a

That too, but it can be improved (the flashlight you hold does not cast shadows, AFAICT) without going to insane amounts of polygons in the scene and uber-accurate algorithms for shadows and lights. I even like the cut-down version on Xbox.
     
  5. Dee.cz

Before that meeting point with movies, games are going to improve illumination quality in many small steps. The interesting question is what the next steps will be.
Existing games have hard shadows without indirect illumination (Doom 3) or with a single precomputed channel of indirect illumination that doesn't respond to dynamic lights (all other games). Shadows are hard or blurred-hard, never correct soft shadows from area lights; that's just marketing.

The next milestones for games are:
2007: indirect illumination precomputed for many configurations of lights, not just one; first realtime soft shadows.
2008: realtime-computed indirect illumination without specular reflections.
2009: realtime-computed indirect illumination with some specular reflections.
     
  6. tongue_of_colicab

I don't think it's such a strange thought that we'll get CGI-movie-like gfx in 5-6 years with the PS4, X720 and PCs. Sure, it won't be as good from a technical point of view, but if you compare what you see on the screen, I don't think there will be such a big difference. The difference will probably be small enough that a decent number of people won't notice it.
     
  7. Laa-Yosh

Many upcoming games are using ambient cube maps for lighting, plus only a single direct light. It creates rather good-looking soft lighting, without the harsh contrast and total blacks of a Doom 3-like system. It's also a lot faster than adding another 2-5 lights (which is actually pretty expensive in pixel shader calculations).

The cube map stuff works well with normal maps, and you can also use the same in-engine-generated HDRI cubemap for simple reflections too. The direct light source will account for shadows and, if needed, for specular highlights. Additional dynamic lights can be rendered into the cube map.

An example is Halo 3: the Brute video shows untextured, grey-shaded models, and they clearly have some image-based lighting going on.
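As an aside, a minimal CPU sketch of one way such ambient lighting can be evaluated, assuming the six-color "ambient cube" formulation Valve described for Half-Life 2 (the engines discussed above may well use full cube map textures instead; the cube colors below are made up):

```python
import math

def ambient_cube(normal, cube):
    """Ambient light from six directional colours (Valve-style 'ambient cube').
    `cube` maps '+x','-x','+y','-y','+z','-z' to RGB tuples; the weights are
    the squared components of the unit normal, so they always sum to 1."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / length, ny / length, nz / length
    faces = [
        (nx * nx, cube['+x'] if nx >= 0.0 else cube['-x']),
        (ny * ny, cube['+y'] if ny >= 0.0 else cube['-y']),
        (nz * nz, cube['+z'] if nz >= 0.0 else cube['-z']),
    ]
    return tuple(sum(w * c[i] for w, c in faces) for i in range(3))

# A surface facing straight up picks up only the '+y' (sky) colour.
sky_cube = {'+x': (0.2, 0.2, 0.2), '-x': (0.2, 0.2, 0.2),
            '+y': (0.5, 0.6, 0.9), '-y': (0.1, 0.1, 0.1),
            '+z': (0.2, 0.2, 0.2), '-z': (0.2, 0.2, 0.2)}
print(ambient_cube((0.0, 1.0, 0.0), sky_cube))  # → (0.5, 0.6, 0.9)
```

In a shader this is a handful of multiply-adds per pixel, which is why it is so much cheaper than evaluating several extra dynamic lights.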
     
  8. santyhammer

To get raytraced real soft shadows/SSS/etc. we would need something like this implemented in HW:

- Load the mesh using vertex buffers, BUT store a kd-tree/octree/BSP alongside the vertices. Something like what AGEIA does.

- Add an instruction to the pixel shader to test whether a ray hits a mesh in the scene. Something like a bool closestHit ( rayPos, rayDir, rayDistance, out float3 triangleBaryCoords ). This would iterate over all the hit primitives' triangles and give us the closest triangle that our ray collides with.
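A CPU sketch of what such an instruction would evaluate per triangle, using the standard Möller-Trumbore intersection test (the function name and example geometry are illustrative, not any real API; the proposed closestHit would run this over all candidate triangles and keep the smallest t):

```python
def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray/triangle test: returns (t, u, v) with barycentric
    coordinates (u, v) of the hit point, or None on a miss."""
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                 # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return (t, u, v) if t > eps else None

# Ray straight down onto the unit triangle in the z=0 plane:
hit = ray_triangle((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                   (0, 0, 0), (1, 0, 0), (0, 1, 0))
print(hit)  # → (1.0, 0.25, 0.25)
```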

I think that could be an easy milestone. After all, the AGEIA PhysX is doing that in HW at amazing speed (see the shadow/terrain raycast example in its SDK). And NVIDIA/ATI are claiming physics now (well, basic raycasting is physics really, so...). With that we could start doing some basic raytracing in the pixel shader...

The problem is that with that basic raytracing we can't really achieve GI. GI is much more complicated: it requires casting photons from mesh-distributed lights, calculating all the photon collisions, etc.

... So probably we will see the raytracing instruction in the pixel shader first... and then, when we have that running at 1600x1200 at 80fps, we could start calculating photon things at 4 or 5 FPS :p

Considering we are almost reaching the limits of electronic integration and the MHz can't go much higher (and a 384-core CPU could use something like 9000W), we will need some kind of electronic advancement (nanotubes or optronics) before we reach that stage. Who knows! :roll:
     
  9. Andrew Lauritzen

    The last thing we need is a data structure hard-coded into hardware!

You can already do this in software on current GPUs fairly efficiently, so why implement it in hardware?

GPUs are already fairly efficient at raytracing, even with deep acceleration structure traversal. However, as noted, GI is a fair step beyond just being able to trace a ray.
     
  10. santyhammer

    Because I wanna use it inside a pixel shader :p

With CUDA/CTM, perhaps! But I'm not sure if they are going to support pointers to allow decent recursion. Meanwhile, the uniform/non-uniform grid approaches aren't bad... but still far from fast enough, and too complicated and hacky.
     
  11. Andrew Lauritzen

    Totally possible with something like Sh or RapidMind. Internally it can be split into multiple passes as necessary (similar to what the GPU would have to do with divergent control flow anyways).

    You don't actually need recursion or a stack to descend into the data structure. There are alternative ways that don't need extra state.

    kd-trees are quite viable to do on the GPU :)
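To illustrate stack-free descent, here is a toy "kd-restart"-style traversal, one of the known stackless schemes: the ray is clipped to the near side at each split, and when a leaf produces no hit, the ray interval is advanced past that leaf and traversal restarts from the root instead of popping a far child off a stack. The node encoding, sphere primitives, and scene are invented for the example:

```python
import math

def hit_sphere(orig, direction, center, radius):
    """Nearest positive ray/sphere intersection (unit direction), or None."""
    oc = tuple(orig[i] - center[i] for i in range(3))
    b = sum(direction[i] * oc[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0.0 else None

def kd_restart(orig, direction, root, t_max=1e30, eps=1e-4):
    """Stackless kd-tree traversal. Inner node: ('split', axis, pos, l, r);
    leaf: ('leaf', [(center, radius), ...])."""
    t_enter = 0.0
    while t_enter < t_max:
        node, t_far = root, t_max
        while node[0] == 'split':
            _, axis, pos, left, right = node
            near, far = (left, right) if orig[axis] < pos else (right, left)
            if abs(direction[axis]) < 1e-12:
                node = near                     # parallel: stays on one side
                continue
            t_split = (pos - orig[axis]) / direction[axis]
            if t_split <= 0.0 or t_split >= t_far:
                node = near                     # plane behind or beyond us
            elif t_split <= t_enter:
                node = far                      # interval fully past plane
            else:
                node, t_far = near, t_split     # near first; restart does far
        best = None
        for center, radius in node[1]:
            t = hit_sphere(orig, direction, center, radius)
            if t is not None and t_enter - eps <= t <= t_far + eps:
                best = t if best is None else min(best, t)
        if best is not None:
            return best
        t_enter = t_far + eps                   # skip past this leaf, restart
    return None

# Two spheres split by the plane x = 0 (toy scene):
tree = ('split', 0, 0.0,
        ('leaf', [((-2.0, 0.0, 0.0), 1.0)]),
        ('leaf', [((2.0, 3.0, 0.0), 1.0)]))
print(kd_restart((-5.0, 0.0, 0.0), (1.0, 0.0, 0.0), tree))  # → 2.0
print(kd_restart((-5.0, 3.0, 0.0), (1.0, 0.0, 0.0), tree))  # → 6.0
```

The price of statelessness is re-descending from the root after each failed leaf; schemes like kd-backtrack trade a little extra per-node data for shorter restarts.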
     
    #31 Andrew Lauritzen, Jan 15, 2007
  12. DudeMiester

    Imho, the future of graphics lies with pre-computing anything that can be, and real-time computing the minimum. That is, full on ray-tracing and photon mapping will be used, but only when absolutely necessary, like with large-scale, in-focus and dynamic objects. Again, static objects would have their GI contributions calculated ahead of time, but during the real-time rendering, they will simply be composited into the final image. In this way, usage of the hardware's capabilities can be appropriately tuned.

However, I think this will require an advancement not only of graphics hardware and engines, but of the development software itself, since this demands more detailed LOD and usage specifications. In other words, the tools need to be more expressive, so that the programmer/designer/modeller can specify exactly what characteristics their objects have and the contexts they will exist in. From this, the development tools can determine the optimal balance of graphics effort, as per each object's expected visibility, which can be further tweaked by the developer just to be sure. Essentially, this is an expert system that supplements the developer, much as a C++ compiler does.

Now, I don't believe in only having one good reason to do something, and indeed this kind of advancement has secondary benefits. Not only is the quality of the displayed graphics improved, but performance scalability is greater and development is streamlined. Of course, the economic cost of this must also be considered, but in this era of HD graphics, I think this optimization has matured.
     
  13. Dee.cz

Last time I measured it, Lightsprint Collider was 2-7x faster than AGEIA PhysX, running on the same computer with an Ageia card. Collider does ray-mesh intersections on the CPU [I wrote its major parts]. Do you still think PhysX does it on dedicated HW?

On the GPU front, what are the latest results? How fast are GPU raytracers?
(The latest news I have is from 2004, when my friend wrote the Inferno GPU raytracer. It was still slower than a CPU raytracer, but he predicted the GPU raytracing era would come within a few years.)
     
  14. Jawed

  15. zgemboandislic

  16. Dee.cz

Not necessarily raytracing; we can give it different names. But for indirect illumination effects, you usually need lots of ray-scene intersections, which is also the basis of raytracing.

Indirect illumination is nicely visible in screenshots.
I don't know the technical details, so please correct me if I'm wrong.
The sun looks static in those images, so the indirect illumination is probably precomputed. The indirect shadow of the chair in the left image is too sharp; it is probably faked by one weak point light positioned by a level designer.

This approach will be replaced by engines that do secondary effects in realtime, so it will be possible to move the light source etc.

Alternatively, "Instant radiosity"-based techniques use no rays, only rasterization, but the artifacts are visible, and as I know game developers, they prefer good-looking over physically-correct-with-artifacts.
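For context, instant radiosity approximates indirect light by depositing "virtual point lights" (VPLs) where the primary light hits surfaces, then lighting the scene from them. A rough sketch of the gathering step, with an invented scene and no occlusion testing (which is exactly where the visible artifacts come from):

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def indirect_from_vpls(point, normal, vpls):
    """One-bounce indirect light: each VPL is (position, surface normal,
    RGB flux), deposited where the primary light hit a surface. We sum
    their unshadowed diffuse contributions at the shading point."""
    total = [0.0, 0.0, 0.0]
    for vpl_pos, vpl_n, flux in vpls:
        to_vpl = tuple(vpl_pos[i] - point[i] for i in range(3))
        d2 = sum(x * x for x in to_vpl)          # squared distance falloff
        w = normalize(to_vpl)
        cos_recv = max(0.0, sum(normal[i] * w[i] for i in range(3)))
        cos_emit = max(0.0, sum(vpl_n[i] * -w[i] for i in range(3)))
        for i in range(3):
            total[i] += flux[i] * cos_recv * cos_emit / d2
    return tuple(total)

# Invented scene: VPLs deposited on a red wall at x = 2 bounce reddish
# light onto a point on the floor that faces straight up.
wall_vpls = [((2.0, y, 0.0), (-1.0, 0.0, 0.0), (1.0, 0.1, 0.1))
             for y in (0.5, 1.0, 1.5)]
print(indirect_from_vpls((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), wall_vpls))
```

In the real technique each VPL is rendered like an ordinary point light (often with a shadow map), which is why it maps so well onto rasterization hardware.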
     
  17. Dee.cz

Correction - sorry, there are many other global illumination techniques that need no ray-scene intersections.
I remember several demos with problematic quality. If Crytek has developed something realtime of this quality, it would be a nice surprise.
     
  18. tabs

    'Sun looks static on those images, so indirect illumination is probably precomputed.'

    The CryEngine 2 tech demo video showed a mobile light source apparently casting indirect illumination using a technique they described as 'Real-time Ambient Maps'.
     
  19. Razor1

Most likely it's an optimized ambient occlusion mapping technique. CryEngine 2 isn't going to be licensed till Crysis is out the door, or close to it, so I'm guessing, but that's what it sounds like to me from my talks with Crytek. It is definitely texture-based, that's for sure.
     
  20. Dee.cz

    I've seen that one, very nice.

    I don't know how they do it, but this is one possibility:

Back in 2000, in the Realtime Radiosity 2 demo, we made freely moving lights with global illumination by precomputing GI for many light positions in space and interpolating between the closest GI solutions as the light moved. It was meant as a joke, but many believed it was computed in realtime :)

If you want to copy our joke today: do shadows in realtime, because it's finally cheap, and precompute only the indirect illumination ("ambient maps") for many light positions. Storage is cheap; it's a low-frequency, low-res map. Then interpolate the precomputed ambient maps in the pixel shader and call it "realtime ambient maps". Many will believe it's realtime :)
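The trick described above might look roughly like this in code; the light track, map sizes, and values are invented for illustration:

```python
# "Ambient maps" precomputed for several light positions, blended at
# runtime. Here the light moves along one axis and each precomputed GI
# solution is a tiny flat list of texel values.

def lerp_maps(a, b, w):
    """Elementwise blend between two precomputed maps, weight w in [0,1]."""
    return [(1.0 - w) * x + w * y for x, y in zip(a, b)]

def ambient_at(light_x, positions, maps):
    """positions: sorted light x-coords with a precomputed map each.
    Interpolates between the two precomputed solutions nearest light_x."""
    if light_x <= positions[0]:
        return maps[0]
    for i in range(len(positions) - 1):
        x0, x1 = positions[i], positions[i + 1]
        if light_x <= x1:
            w = (light_x - x0) / (x1 - x0)
            return lerp_maps(maps[i], maps[i + 1], w)
    return maps[-1]

# Two precomputed solutions at light_x = 0 and light_x = 4; halfway
# between them the blended map is the elementwise average of the two.
positions = [0.0, 4.0]
maps = [[0.2, 0.4, 0.6, 0.8], [0.6, 0.8, 1.0, 0.4]]
print(ambient_at(2.0, positions, maps))
```

In a shader the blend would be a per-pixel lerp between two ambient map texture fetches, which is exactly why viewers can mistake it for realtime GI.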
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.