Real-Time Ray Tracing : Holy Grail or Fools’ Errand? *Partial Reconstruction*

Discussion in 'Rendering Technology and APIs' started by TheAlSpark, Oct 18, 2007.

  1. MfA

    MfA Legend

    Interactive games are fundamentally different because the viewpoint tends to be arbitrary.
     
  2. Scali

    Scali Regular

    Exactly. I read a nice paper from DreamWorks a few years ago about the lighting used in Shrek, and how they like to control the lighting precisely with simple point lights or spotlights, rather than going for GI/'physically accurate' solutions.
     
  3. Arnold Beckenbauer

    Arnold Beckenbauer Veteran Subscriber

  4. MfA

    MfA Legend

    Really limited ray tracing on an extremely inflexible architecture ... what a throwback.

    14,400 FP multipliers in a 150-million-gate ASIC is impressive, though, but the amount of memory available on chip is incredibly small ... I just don't see how you are ever going to implement, say, GI algorithms on that thing.
     
    Last edited by a moderator: Jul 4, 2009
  5. rpg.314

    rpg.314 Veteran

    Obviously, it is aimed at solving a very specific problem.
     
  6. TEXAN*

    TEXAN* Banned

    I can see such a technology making its way down to a $20,000 arcade game box in 3-5 years' time.

    Imagine a Virtua Fighter 6 with the following effects in real time at 1080p/60 FPS -

    http://www.youtube.com/watch?v=5-Vq...9FB15750&playnext=1&playnext_from=PL&index=16

    You'll never be able to approximate that with rasterization. Not now, not in 50 years.

    That's why the move to ray-tracing is a must.
     
    Last edited by a moderator: Jul 5, 2009
  7. Scali

    Scali Regular

    Why not? Unless I'm missing something, the average Pixar movie looks way better than this.
     
  8. nutball

    nutball Veteran Subscriber

    Virtua Fighter 6: Attack Of The Chrome Spheres

    ??
     
  9. TEXAN*

    TEXAN* Banned

    Oh, I'm sure you can get things to look as good or better.

    I'm talking about the behaviour of the lighting, shadows and self-shadowing. It is exactly as it is in real life; you can't code that in. The best you can do is exactly what you see in today's games.

    Manmade approximations cannot rival reality.
     
  10. TEXAN*

    TEXAN* Banned


    My fault; I meant all the ray-tracing-related effects that you see in the video.
     
  11. Scali

    Scali Regular

    So you're saying the lighting, shadows and self-shadowing in Pixar movies are not as good as this? Then we disagree.
    I think we also disagree on raytracing being reality. In my opinion, raytracing is a manmade approximation as well. Just a less efficient one.
     
  12. T.B.

    T.B. Newcomer

    That's a pretty important point. Raytracing is not a full physical simulation. Think about how you would render a prism in a raytracer. Or atmospheric scattering. Or fog. In each case you have two options: Throw in a fantastic number of rays, or fake it.

    Just so that we're clear, full physical simulation - the famous "Once we do that, all problems in graphics are solved!" point - is a pipe dream. It's right up there with every kid's grand game design: "Let's simulate a whole world!"

    Faking things is good. It is the Right Thing(tm) to do. Why would I simulate the path of a photon bouncing around inside the sun for a thousand years when I can simply approximate the sun as a white ball of light (or even a yellow one and fake the scattering as well)?

    /rant
     
  13. Scali

    Scali Regular

    Yeah... A few years ago, a fellow student and I wrote a photon-mapping raytracer based on H.W. Jensen's work...
    Photon mapping seems to be THE way to handle that sort of effect in a raytracer... Thing is, it's not raytracing in itself. Instead of the classic Whitted method of tracing rays of light from the eye back to the source, you are tracing photons from the source to ... whatever your criteria are for storing their information in the photonmaps.

    I wouldn't really call photonmapping raytracing in the first place. It's related, but not quite the same. Besides, you only use the traced photons to create photonmaps, and these photonmaps don't necessarily have to be evaluated by a Whitted raytracer; you could just as easily evaluate them from within a rasterizer. After all, in essence a photonmap is just a 2D or 3D texture. That is, the photons are stored in texture space; you just don't evaluate it as a bitmap. It's more like a procedural texture.

    But there are already two obvious approximations going on there:
    1) You generally won't base the number of photons you trace on the actual number of photons that would theoretically be emitted from your light source. You just take a smaller subset and give each photon a certain amount of energy to compensate. So they're not really photons in a physical sense.
    2) During filtering there's another approximation going on: you estimate the photon density and luminance in an area based on the number of photons you've simulated.
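
    In code, those two approximations might look something like this - a minimal 2D sketch with illustrative names, not Jensen's actual implementation, assuming photons scattered uniformly over a unit square and irradiance estimated by disc density estimation:

```python
import math
import random

def emit_photons(light_power, num_photons):
    """Approximation 1: trace only a small subset of photons, giving each
    light_power / num_photons energy instead of a physical photon's energy."""
    energy = light_power / num_photons
    photons = []
    for _ in range(num_photons):
        # Illustrative: scatter photons uniformly over a unit-square "floor".
        x, y = random.random(), random.random()
        photons.append((x, y, energy))
    return photons

def estimate_irradiance(photons, px, py, radius):
    """Approximation 2: density estimation -- sum the energy of the photons
    within `radius` of the query point and divide by the disc area."""
    total = sum(e for (x, y, e) in photons
                if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2)
    return total / (math.pi * radius ** 2)
```

    With a uniform 100 W light over the unit square, the estimate comes out at roughly 100 W/m² anywhere away from the edges; a real photonmap would use a kd-tree and k-nearest-neighbour lookups rather than a fixed search radius.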

    And this solution is far more efficient and delivers far better quality than solutions based on conventional eye-ray tracing with Monte Carlo-based path tracing and all that.

    Oh well...
     
  14. MfA

    MfA Legend

    It seems to me such a massive misinvestment of capital and expertise. Why not an 800 TFLOP accelerator for, say, progressive photon mapping ... that would be so much more interesting.
     
  15. MfA

    MfA Legend

    A tidbit from the progressive photon mapping paper to accentuate this:
     
  16. TimothyFarrar

    TimothyFarrar Regular

    Scali and MfA, yeah progressive photonmapping is the way to go!

    Anyone want to toss in a guess at current GPUs' effective TFLOPs when you add in both the FLOPs from the shader ALUs and the dedicated FF hardware (TEX, ROP, raster)?

    We are comparing 88 TFLOPs per chip of raytracing (9 chips = 792) to ~1-2 TFLOPs + FF TFLOPs per chip on top-end GPUs?

    Sure does clue you in to how many more ALU TFLOPs will be needed to software-raytrace in real time at 1080p on current GPUs (including LRB), however...
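
    As rough arithmetic, using only the numbers quoted above (guesses, not measured figures):

```python
# Figures quoted in the thread -- guesses, not measurements.
rt_tflops_per_chip = 88        # dedicated raytracing ASIC, per chip
rt_chips = 9
gpu_shader_tflops = 1.5        # ~1-2 TFLOPs of shader ALU on a top-end GPU

rt_total = rt_tflops_per_chip * rt_chips   # 792 TFLOPs across 9 chips
gap = rt_total / gpu_shader_tflops         # raw ALU shortfall, ignoring FF hardware
print(rt_total, round(gap))                # 792 528
```

    So the dedicated box has on the order of 500x the raw ALU throughput of a single GPU's shaders, before counting any fixed-function FLOPs on the GPU side.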
     
  17. MfA

    MfA Legend

    Most of their FLOPs are going to go into intersection testing, though. Bézier clipping is only efficient with a very loose definition of the word ...
     
  18. T.B.

    T.B. Newcomer

    Yup, photon mapping surely improves on classical raytracing, but it still fakes a lot of things and takes many shortcuts. Which is good.
    Not sure anyone ever tried to mix in wave behaviour, which is the cause of most interesting optical effects. :)
     
  19. MfA

    MfA Legend

    Interesting perhaps, but not common ... interference, just like, say, spectral dispersion, isn't a big priority for rendering.
     
  20. T.B.

    T.B. Newcomer

    Kinda depends on how you look at it. Refraction is a pure wave effect, based on the change of the speed of light in a medium and Huygens' principle(*). Of course, you would have to be mad to implement it that way.

    Now that I think about it - and after checking my office - that's not a good counter example, as visible refraction is really not that common in real life. But neither are chrome spheres. ;)

    Yes, I'm splitting hairs. Sorry about that. I think it's safe to say that we agree. :)

    (* If I remember my physics lectures correctly)
     