Can Pixar's Rendering techniques be brought to the PC?

Discussion in 'Architecture and Products' started by Techno+, Dec 3, 2006.

  1. Techno+

    Regular

    Joined:
    Sep 22, 2006
    Messages:
    284
    Likes Received:
    4
    Greetings,

    I know this might sound like a stupid, pointless question, but I wanted to ask you all: can Pixar's rendering techniques be brought to the PC? As far as I know, Pixar uses CPUs in render farms to render its movies, so would this be applicable to, let's say, a 64-core CPU? What advantage does Pixar find in using CPUs for rendering rather than GPUs (GPUs already offer more FLOPS)? Is it programmability?

    Thanks
     
  2. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,528
    Likes Received:
    107
    Take a look at the algorithms they're using for rendering, the resolutions they're targeting, and the performance (which they don't much care about, as long as it looks the way they want it to). The answer is there.
     
  3. Techno+

    Regular

    Joined:
    Sep 22, 2006
    Messages:
    284
    Likes Received:
    4
    OK, I got it, thanks. I also think one other reason is cost: although GPUs might prove easier to program, GPUs specialized for CAD would be expensive, so they chose the low-cost, harder-to-program route. If developers chose to program game graphics for multi-core CPUs, it would prove too expensive as well.
     
  4. Farid

    Farid Artist formerly known as Vysez
    Veteran Subscriber

    Joined:
    Mar 22, 2004
    Messages:
    3,844
    Likes Received:
    108
    Location:
    Paris, France
    A REYES thread?

    I don't know of a better way to summon Uttar.
     
  5. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    21,519
    Likes Received:
    5,168
    Yay! The bi-monthly realtime CGI thread is finally here! :grin:

    I think that, whatever architecture is used by Pixar and all the other CGI studios, and whatever architecture is used in realtime gaming... the CGI studios will always have time on their side. No matter the architecture or algorithm, they will always have a LOT more time than the roughly 33 milliseconds our PCs get to do the whole rendering so that it can be playable at 30fps.

    That's the sad truth. We can only wait until realtime graphics are indistinguishable from reality, but not even CGI is there yet, so we're gonna have to wait a long, long time.
     
  6. Killer-Kris

    Regular

    Joined:
    May 20, 2003
    Messages:
    540
    Likes Received:
    4
    The answer I've always heard is that it is all about reliably identical results. Could you imagine the results if half the render farm rounded its arithmetic one way and the other half rounded a different way, or if they used different seeds for their noise functions, etc.?

    A software approach on IEEE-compliant CPUs allows them not to worry about those problems.
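
    As a minimal illustration (just standard IEEE-754 behaviour shown in Python, nothing Pixar-specific) of why evaluation order and noise seeds matter if every node on the farm has to produce bit-identical frames:

    [code]
    # Floating-point addition is not associative, so two machines that merely
    # reorder the same shading math can disagree in the result.
    a, b, c = 1e16, 1.0, 1.0
    print((a + b) + c)   # 1e+16                  -- each 1.0 is absorbed separately
    print(a + (b + c))   # 1.0000000000000002e+16 -- the combined 2.0 survives

    import random

    # Two nodes that seed their procedural noise differently will also disagree.
    random.seed(42); noise_a = [random.random() for _ in range(3)]
    random.seed(7);  noise_b = [random.random() for _ in range(3)]
    print(noise_a == noise_b)  # False -- visibly different grain between frames
    [/code]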
     
  7. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Hi Techno+, the essence of this thread looks to be the same as your last thread: http://www.beyond3d.com/forum/showthread.php?t=35940

    There's a difference between the Pixar method and the GPU method: one does a lot of work with a lot of coding, not necessarily in real time, while the other uses GPU-assisted functions at a fraction of the cost of the previous setup, and the results reflect that difference in cost.
     
  8. The_Wolf_Who_Cried_Boy

    Newcomer

    Joined:
    Feb 18, 2005
    Messages:
    172
    Likes Received:
    9
    Location:
    Floating face down in the stagnant pond of life.
    Isn't IEEE compliance part of the DX10 specification? I'm sure I've seen a mention of FP rounding error limits somewhere.
     
  9. pascal

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,830
    Likes Received:
    49
    Location:
    Brasil
    Exactly, and my guess is that somewhere in the near future they will start to use a farm of GPGPUs to do the rendering. Imagine one hundred G80s working for a minute to produce one frame.
     
  10. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    17,686
    Likes Received:
    1,204
    Location:
    Maastricht, The Netherlands
    Maybe the question should be: at which point in time can they start porting their OLD work to current tech? Would they ever consider opening a branch that designs real-time animations for computer games, and would it be interesting to, say, recreate one of their first movies with a freely movable camera?
     
  11. zgemboandislic

    Newcomer

    Joined:
    Sep 15, 2005
    Messages:
    135
    Likes Received:
    0

    Now that would be something, wouldn't it? Watching Shrek 10 years from now and having full control of the camera!
     
  12. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    Computer games have already exceeded the abilities of traditional rendering. Take a look at World of Warcraft and compare that with, say, South Park. :wink:

    The real question is when they will produce movies at 100fps.
     
    #12 rwolf, Dec 4, 2006
    Last edited by a moderator: Dec 6, 2006
  13. JHoxley

    Regular

    Joined:
    Oct 18, 2004
    Messages:
    391
    Likes Received:
    35
    Location:
    South Coast, England
    Mathematical accuracy/precision is much more strictly defined in D3D10, but it's not quite 100% IEEE-compliant; there are a few deviations. From the looks of things, none of them are hugely significant except, possibly, to a few hardcore GPGPU types :smile:

    Cheers,
    Jack
     
  14. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,452
    Location:
    Budapest, Hungary
    Don't concentrate on the CPU part - a good renderfarm needs a high-speed network between the render nodes, and some very fast disk systems to store data. A large farm (100+ machines) may require serious redundancy in the central storage, as it can read data faster than any HDD could provide it. CG movies have massive scenes that won't fit into 2 GB of memory, so data is constantly moving in and out of the system, not to mention (software-based) caching and a lot of precomputing. For example, it's pretty common to render out all the shadow maps, for every light in every single frame of a sequence, to disk once, and then re-use them for iterative test renders and final renders, as animation and light positions tend to remain constant. This can easily mean up to several gigabytes of data for a single sequence.

    So, just because you have a GPU that's 10 or 100 times as fast as current CPUs doesn't automatically mean that you can now render CGI quality in real time. It may easily happen that the rest of the computer won't be able to provide data to the GPU fast enough.
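
    A toy sketch of the kind of shadow-map precomputation described above: render each light's map once per frame, write it to shared storage, and let later test and final renders reuse it instead of recomputing it. The paths and function names here are hypothetical, not any studio's actual pipeline:

    [code]
    import os

    CACHE_DIR = "/mnt/farm/shadow_cache"  # hypothetical shared storage location

    def shadow_map_path(shot, frame, light):
        return os.path.join(CACHE_DIR, f"{shot}_L{light}_f{frame:04d}.shd")

    def get_shadow_map(shot, frame, light, render_fn):
        """Return a cached shadow map, rendering and storing it only on a miss."""
        path = shadow_map_path(shot, frame, light)
        if os.path.exists(path):              # animation and lights unchanged:
            with open(path, "rb") as f:       # reuse the precomputed map
                return f.read()
        data = render_fn(shot, frame, light)  # expensive: rasterize depth from the light
        with open(path, "wb") as f:
            f.write(data)
        return data
    [/code]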
     
    pascal likes this.
  15. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    I can't expect one to improve while the other stagnates, so my answer would be a definite "no".
     
  16. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,644
    Likes Received:
    155

    Ah yes, this is true, technically. Absolutely. Realtime playable graphics will NEVER be as good as current or recent pre-rendered / offline graphics / CGI.

    However, realtime graphics can and do surpass very old, low-end CGI.

    Example: most of the CGI used in early-to-mid-90s games on IBM PC, 3DO, Saturn, etc., and even early PS1, has been surpassed by current realtime graphics on PCs and consoles. But we are talking about pathetic-looking, very low-end, low-budget CGI.

    Computing systems/graphics systems/consoles/machines will need to be fast enough in every area, the entire system and not just the CPU/GPU, for "CGI-like graphics in realtime" to happen. Maybe mid-range CGI, like what we see as intros/cut-scenes in current videogames (below film-grade CGI), could be achieved in realtime late next decade. Maaaaaaybe. If all the cards line up.
     
    #16 Megadrive1988, Dec 6, 2006
    Last edited by a moderator: Dec 6, 2006
  17. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,528
    Likes Received:
    107
    Looking at it differently: getting StarCraft-CGI-quality graphics is not quite possible yet, even with the latest and greatest, and that should give some perspective on the gap. Keep in mind that even though Blizzard is quite adept at CGI, they don't do it for a living, and their goals aren't the same as Pixar's.
     
  18. Sc4freak

    Newcomer

    Joined:
    Dec 28, 2004
    Messages:
    233
    Likes Received:
    2
    Location:
    Melbourne, Australia
    No, I think a modern 8800GTX would be able to do a realtime render of Starcraft's CGI scenes. They'd need to be optimised for real-time rendering, but I'd say that it could be done. They were at, what, 320x240 resolution?
     
  19. Sonic

    Sonic Senior Member
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,879
    Likes Received:
    85
    Location:
    San Francisco, CA
    Regardless of the resolution Starcraft's CG was displayed at, the question is at what resolution the Starcraft CG was rendered. Pixar's movies are rendered beyond DVD resolution, yet are still displayed at DVD resolution when played from a DVD. The Starcraft CG could have been rendered at a far greater resolution, but at the time the majority of hardware out there might not even have been able to play the Starcraft CG back in video format at 640x480, thus the need for 320x240.

    Realtime graphics may begin to approach Toy Story-level CG at some point in the future, but it will probably be done through other methods that make the image appear as good, not the same methods Pixar used.

    From what I remember, doesn't RenderMan actually use REYES when the scene is being rendered? I'm unsure of this right now. Laa-Yosh, can you clarify this for us?
     
  20. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,528
    Likes Received:
    107
    I too seem to recall that it uses REYES, but I may be wrong on that. And I think an 8800GTX could do a pretty darned good INTERPRETATION of StarCraft CGI, with some rough edges, but certainly not a 1:1 carbon copy, or an interpretation you can't distinguish from the original.
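
    For anyone wondering what REYES means in practice: the renderer bounds each primitive in screen space, splits anything too big, dices what's left into grids of micropolygons roughly a pixel in size, shades the grid vertices, and then samples the micropolygons into the framebuffer. A very rough conceptual sketch of that control flow, with hypothetical helper names and nothing resembling PRMan's actual code:

    [code]
    # Conceptual REYES loop: bound -> split -> dice -> shade -> sample.
    MAX_GRID_SIZE = 16  # keep micropolygon grids small (roughly pixel-sized quads)

    def render(primitives, camera, framebuffer):
        work = list(primitives)
        while work:
            prim = work.pop()
            bbox = prim.bound(camera)               # screen-space bounding box
            if not bbox.overlaps(framebuffer.viewport):
                continue                            # cull: never visible this frame
            if prim.estimated_grid_size(camera) > MAX_GRID_SIZE:
                work.extend(prim.split())           # too big on screen: split and retry
            else:
                grid = prim.dice(MAX_GRID_SIZE)     # dice into a micropolygon grid
                grid.shade()                        # run displacement/surface shaders per vertex
                framebuffer.sample(grid)            # hide and sample micropolygons into pixels
    [/code]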
     