Can Pixar's rendering techniques be brought to the PC?

Techno+

Greetings,

I know this might sound like a stupid, pointless question, but I wanted to ask you all: can Pixar's rendering techniques be brought to the PC? As far as I know, Pixar uses CPUs in render farms to render its movies, so would this be applicable to, let's say, a 64-core CPU? What advantage does Pixar find in using CPUs for rendering rather than GPUs (GPUs already offer more FLOPS)? Is it programmability?

Thanks
 
Take a look at the algorithms they're using for rendering, the resolutions they're targeting, and the performance (which they don't much care about, as long as it looks the way they want it to). The answer is there.
 
OK, I got it, thanks. I also think one other reason is cost: although GPUs might prove easier to program, using GPUs specialized for CAD would be expensive, so they chose the low-cost, harder-to-program route. If developers chose to program game graphics for multicore CPUs, then it would prove too expensive.
 
A REYES thread?

I don't know of a better way to summon Uttar.
 
Yay! The bi-monthly realtime CGI thread is finally here! :D

I think that, whatever architecture is used by Pixar and all the other CGI studios, and whatever architecture is used in realtime gaming... the CGI studios will always have time on their side. No matter the architecture or algorithm, they will always have a LOT more time than the ~0.03 seconds we give our PCs to do the whole rendering, so that it can be playable at 30fps.

That's the sad truth. We can only wait until realtime graphics is indistinguishable from reality, but not even CGI is there yet so we're gonna have to wait a long long time.
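To put a rough number on that time advantage: at 30fps a game gets 1/30 ≈ 0.033 s per frame, whereas a film-quality frame is commonly quoted at several hours per render node. Taking 6 hours (21,600 s) as a purely illustrative ballpark, that's roughly 650,000 times the realtime budget.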
 
The answer I've always heard is that it is all about reliably identical results. Could you imagine the results if half the render farm rounded their arithmetic one way, and the other half of the render farm rounded a different way, or if they used different seeds for their noise functions, etc...

A software approach with IEEE-compliant CPUs allows them not to worry about those problems.
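To make that concrete, here's a toy Python sketch (my own illustration, not anything a studio actually runs) showing both halves of the problem: the same float32 numbers summed in a different order drifting apart, and a deterministic per-frame/per-tile seed keeping procedural noise identical across nodes:

```python
# Toy illustration of why render farms care about bit-identical arithmetic
# and deterministic seeds. Not production code from anyone's pipeline.
import random
import numpy as np

vals = np.random.default_rng(0).random(100_000).astype(np.float32)

# Summing the same numbers in a different order gives a (slightly) different
# float32 result, so nodes that merely reorder work can disagree on the bits.
fwd = np.float32(0.0)
rev = np.float32(0.0)
for v in vals:
    fwd = np.float32(fwd + v)
for v in vals[::-1]:
    rev = np.float32(rev + v)
print(fwd == rev, fwd, rev)  # almost certainly False

# Deriving the noise seed from (frame, tile) keeps procedural noise identical
# no matter which node happens to render which tile.
def tile_noise(frame: int, tile: int, n: int = 4) -> list[float]:
    rng = random.Random(frame * 1_000_003 + tile)  # deterministic integer seed
    return [rng.random() for _ in range(n)]

print(tile_noise(42, 7) == tile_noise(42, 7))  # True on every machine
```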
 
Hi Techno+, the essence of this thread looks to be the same as your last thread: http://www.beyond3d.com/forum/showthread.php?t=35940

There's a difference between the Pixar method and the GPU method: one does a lot of work, with a lot of coding, not necessarily in real time, while the other uses GPU-assisted functions at a fraction of the cost of the former setup, and the results reflect that difference in cost.
 
I think that, whatever architecture is used by Pixar and all the other CGI studios, and whatever architecture is used in realtime gaming... the CGI studios will always have time on their side. No matter the architecture or algorithm, they will always have a LOT more time than the ~0.03 seconds we give our PCs to do the whole rendering, so that it can be playable at 30fps.
Exactly, and my guess is that somewhere in the near future they will start to use a farm of GPGPUs to do the rendering. Imagine one hundred G80s working for 1 minute to produce one frame.
 
Maybe the question should be: at which point in time can they start porting their OLD work to current tech? Would they ever consider opening a branch that designs real-time animations for computer games, and would it be interesting to, say, recreate one of their first movies with a freely moveable camera?
 
Maybe the question should be: at which point in time can they start porting their OLD work to current tech? Would they ever consider opening a branch that designs real-time animations for computer games, and would it be interesting to, say, recreate one of their first movies with a freely moveable camera?


Now that would be something, wouldn't it? Watching Shrek 10 years from now and having full control of the camera!
 
Computer games have already exceeded the abilities of traditional rendering. Take a look at World of Warcraft and compare that with, say, South Park. ;)

The real question is when they will produce movies that run at 100fps.
 
Isn't IEEE compliance part of the DX10 specification? I'm sure I've seen a mention of FP rounding error limits somewhere.
Mathematical accuracy/precision is much more strictly defined in D3D10, but it's not quite 100% IEEE compliant - there are a few deviations. From the looks of things, it isn't anything hugely significant except, possibly, to a few hardcore GPGPU types :smile:

Cheers,
Jack
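For what it's worth, precision requirements of this sort are usually discussed in units in the last place (ULPs) rather than absolute error. Here's a minimal sketch of how one could compare a GPU readback against a CPU reference in those terms (the buffers here are made-up stand-ins; only the comparison logic is the point):

```python
# Minimal ULP-distance checker for float32 results, e.g. for comparing a GPU
# readback buffer against a CPU reference. The sample buffers are placeholders.
import numpy as np

def ulp_distance(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Distance between float32 arrays measured in units in the last place."""
    ai = np.asarray(a, dtype=np.float32).view(np.int32).astype(np.int64)
    bi = np.asarray(b, dtype=np.float32).view(np.int32).astype(np.int64)
    # Remap the sign-magnitude float encoding onto one monotonic number line.
    ai = np.where(ai < 0, -(2**31) - ai, ai)
    bi = np.where(bi < 0, -(2**31) - bi, bi)
    return np.abs(ai - bi)

cpu_ref = np.linspace(1.0, 2.0, 8, dtype=np.float32)
gpu_out = cpu_ref + np.float32(1e-7)   # tiny perturbation standing in for GPU rounding
print(ulp_distance(cpu_ref, gpu_out))  # small integers: differences in last-place units
```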
 
Don't concentrate on the CPU part - a good renderfarm needs a high-speed network between the render nodes, and some very fast disk systems to store data. A large farm (100+ machines) may require serious redundancy in the central storage, as it can read data faster than any HDD could provide it. CG movies have massive scenes that won't fit into 2 GB of memory, so data is constantly moving in and out of the system, not to mention (software-based) caching and a lot of precomputing. For example, it's pretty common to render out all the shadow maps, for every light in every single frame of a sequence, to disk once, and then re-use them for iterative test renders and final renders, as animation and light positions tend to remain constant. This can easily mean up to several gigabytes of data for a single sequence.

So, just because you have a GPU that's 10 or 100 times as fast as the current CPUs, it won't automatically mean that you can now render CGI quality in real time. It may easily happen that the rest of the computer won't be able to provide data to the GPU fast enough.
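As a toy sketch of the shadow-map caching described above (the names and the render_shadow_map callback are hypothetical placeholders, not any studio's actual pipeline code):

```python
# Toy sketch of a disk cache for precomputed shadow maps, keyed by light and
# frame. render_shadow_map() is a hypothetical stand-in for the expensive pass.
import os
import pickle

CACHE_DIR = "shadow_cache"

def shadow_map_path(light_name: str, frame: int) -> str:
    return os.path.join(CACHE_DIR, f"{light_name}.{frame:04d}.smap")

def get_shadow_map(light_name: str, frame: int, render_shadow_map):
    """Return a cached shadow map if one exists, otherwise render and store it.

    Because animation and light positions stay fixed between iterative test
    renders, the expensive pass runs once and later renders just read the file.
    """
    path = shadow_map_path(light_name, frame)
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)

    os.makedirs(CACHE_DIR, exist_ok=True)
    smap = render_shadow_map(light_name, frame)   # the expensive part
    with open(path, "wb") as f:
        pickle.dump(smap, f)
    return smap

# Usage: a fake "shadow map" so the sketch is runnable end to end.
fake_render = lambda light, frame: {"light": light, "frame": frame, "depth": [0.0] * 16}
m1 = get_shadow_map("key_light", 101, fake_render)   # rendered and written to disk
m2 = get_shadow_map("key_light", 101, fake_render)   # read back from the cache
print(m1 == m2)
```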
 
Yay! The bi-monthly realtime CGI thread is finally here! :D

I think that, whatever architecture is used by Pixar and all the other CGI studios, and whatever architecture is used in realtime gaming... the CGI studios will always have time on their side. No matter the architecture or algorithm, they will always have a LOT more time than the ~0.03 seconds we give our PCs to do the whole rendering, so that it can be playable at 30fps.

That's the sad truth. We can only wait until realtime graphics is indistinguishable from reality, but not even CGI is there yet so we're gonna have to wait a long long time.


Ah yes, this is true, technically. Absolutely. Realtime playable graphics will NEVER be as good as current or recent pre-rendered / offline graphics / CGI.

However, realtime graphics can and do surpass very old, low-end CGI.

Example: most of the CGI used in early-to-mid-90s games on IBM PC, 3DO, Saturn, etc., and even early PS1, has been surpassed by current realtime graphics on PCs and consoles. But we are talking about pathetic-looking, very low-end, low-budget CGI.

Computing systems / graphics systems / consoles / machines will need to be fast enough in every area - the entire system, not just the CPU/GPU - for "CGI-like graphics in realtime" to happen. Maybe midrange CGI, like what we see as intros/cut-scenes in current videogames (below film-grade CGI), could be achieved in realtime late next decade. Maaaaaaybe. If all the cards line up.
 
Looking at it differently, getting StarCraft CGI-quality graphics is not quite here yet, even with the latest and greatest, and that should give some perspective on the gap, considering that even though Blizzard is quite adept at CGI, they don't do it for a living and their goals aren't the same as Pixar's.
 
No, I think a modern 8800GTX would be able to do a realtime render of Starcraft's CGI scenes. They'd need to be optimised for real-time rendering, but I'd say that it could be done. They were at, what, 320x240 resolution?
 
Regardless of the resolution at which Starcraft's CG was displayed at the time, the question is: at what resolution was it rendered? Pixar's movies are rendered beyond DVD resolution, yet are still displayed at DVD resolution when played from a DVD. The Starcraft CG could have been rendered at a far greater resolution, but at the time the majority of hardware out there might not even have been able to play it back in video format at 640*480, thus the need for 320*240.

Realtime graphics may begin to approach Toy Story-level CG at some point in the future, but it will probably be done through other methods that make the image appear as good, rather than the same methods Pixar used.

From what I remember, doesn't RenderMan actually use REYES when the scene is being rendered? I'm unsure of this right now. Laa Yosh, can you clarify this for us?
 
I too seem to recall it uses REYES, but I may be wrong on that. And I think an 8800GTX could do a pretty darned good INTERPRETATION of StarCraft CGI, with some rough edges, but certainly not a 1:1 carbon copy, or an interpretation you can't distinguish from the original.
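For anyone curious what REYES refers to here: the classic Reyes pipeline bounds each primitive, splits it until it's small on screen, dices it into a grid of micropolygons (roughly pixel-sized quads), shades the grid vertices, and then samples them into pixels. Below is a toy Python sketch of just the dice-and-shade idea (my own simplification with made-up helper names, nothing like PRMan's real implementation):

```python
# Toy dice-and-shade step in the spirit of Reyes: tessellate a parametric patch
# into roughly pixel-sized micropolygons, then shade each grid vertex.
import math

def patch(u: float, v: float):
    """A hypothetical parametric surface: a gently curved sheet in screen-ish coords."""
    return (u * 100.0, v * 100.0, 10.0 * math.sin(u * math.pi))

def dice(patch_fn, screen_size_px: float, shading_rate: float = 1.0):
    """Pick a grid resolution so each micropolygon covers ~shading_rate pixels."""
    n = max(2, int(math.ceil(screen_size_px / shading_rate)))
    return [[patch_fn(i / n, j / n) for i in range(n + 1)] for j in range(n + 1)]

def shade(grid, light_dir=(0.0, 0.0, 1.0)):
    """Fake Lambert shading per micropolygon using finite-difference normals."""
    colors = []
    for j in range(len(grid) - 1):
        row = []
        for i in range(len(grid[0]) - 1):
            p, pu, pv = grid[j][i], grid[j][i + 1], grid[j + 1][i]
            du = tuple(a - b for a, b in zip(pu, p))   # tangent along u
            dv = tuple(a - b for a, b in zip(pv, p))   # tangent along v
            nx = du[1] * dv[2] - du[2] * dv[1]
            ny = du[2] * dv[0] - du[0] * dv[2]
            nz = du[0] * dv[1] - du[1] * dv[0]
            length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
            ndotl = max(0.0, (nx * light_dir[0] + ny * light_dir[1] + nz * light_dir[2]) / length)
            row.append(ndotl)
        colors.append(row)
    return colors

grid = dice(patch, screen_size_px=100.0)   # ~1 micropolygon per pixel
colors = shade(grid)
print(len(colors), "x", len(colors[0]), "shaded micropolygons")
```

The point relevant to this thread is that the amount of shading work is tied to screen-space size and shading rate, which is part of why the offline resolutions and quality targets mentioned earlier drive the cost up so sharply.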
 