Real-time rendering during Avatar production?

wco81

James Cameron gave an interview on NPR in which he described how, during shooting, the actors were in a motion-capture space. He vaguely referred to two different computers processing the capture data, with the result that one of the monitors Cameron watched showed the actors in Pandora as they were being filmed/captured.

He said the resolution wasn't the same as the final film output but he could "preview" scenes of the film that way.

He obviously didn't get technical about it on NPR, but is this a standard setup, or could it be one of the many things they developed with this movie's huge budget?

Was it just compositing over a prerendered backdrop or could they have been rendering Pandora and movement in real time?
 
It wouldn't surprise me; Weta certainly have the computing power to do this. From the worldwide TOP500 supercomputer list:

Australia and New Zealand == 9 computers,
7 of those are at Weta in Wellington!!
 
Weta is some CGI effects house and they have supercomputers?

Not just rendering farms like other effects houses?

I thought Cameron had his own effects company.

In any event, it's not something that a consumer is going to see any time soon?
 
How would you define the difference between a rendering farm and a supercomputer (I've no clue personally)?

Yeah, Cameron has his own effects company, Digital Domain IIRC. Maybe (assuming Avatar was shot in NZ) it was just easier to use a local company, and I guess Weta's work on rendering skin/mocap in LOTR/KK didn't do them any harm either.
 
In the old days it would be COTS vs. custom ... but the TOP500 is awash with COTS component clusters. A rendering farm has the same components as some supercomputers, but they might spend a little less on the interconnect architecture (flatter, less hierarchy, fewer switches/routers, slower perhaps, although I assume most new rendering farms choose InfiniBand just like HPC COTS clusters).

So basically not a lot of difference ... unless one simply says "if it's on the TOP500 it's a supercomputer, if not, not". Not all CGI houses will have an interest in running Linpack benchmarks to get onto the TOP500 though ... what's the point?

That said, compositing the images from a stereoscopic camera with some 3D proxy representation of the (imaginary) set for display on a stereoscopic monitor is not going to take a supercomputer.
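To put it concretely, here's a rough sketch (Python/NumPy, purely illustrative: the camera pose, focal length, and proxy geometry are all made up, this isn't anything Weta or Cameron actually used) of what projecting a crude proxy of the virtual set from a tracked camera, once per eye, amounts to. It's a handful of matrix multiplies per frame, nowhere near supercomputer territory.

```python
# Project a crude proxy of the virtual set from a tracked camera pose,
# once per eye, for a low-res stereo preview. All values are placeholders.
import numpy as np

def project_points(points_world, cam_pos, cam_rot, focal, width, height):
    """Pinhole projection of Nx3 world points into pixel coordinates.
    cam_rot is the world-to-camera rotation matrix."""
    cam_space = (points_world - cam_pos) @ cam_rot.T   # world -> camera space
    cam_space = cam_space[cam_space[:, 2] > 0.1]       # drop points behind the camera
    x = focal * cam_space[:, 0] / cam_space[:, 2] + width / 2
    y = focal * cam_space[:, 1] / cam_space[:, 2] + height / 2
    return np.stack([x, y], axis=1)

# Hypothetical proxy set: a flat grid of points standing in for terrain.
gx, gz = np.meshgrid(np.linspace(-10, 10, 40), np.linspace(2, 40, 40))
proxy_set = np.stack([gx.ravel(), np.zeros(gx.size), gz.ravel()], axis=1)

# Pose as it might come from the camera tracking system (made-up values).
cam_pos = np.array([0.0, 1.7, 0.0])
cam_rot = np.eye(3)                        # looking straight down +Z
eye_offset = np.array([0.032, 0.0, 0.0])   # ~64 mm interocular, halved

left = project_points(proxy_set, cam_pos - eye_offset, cam_rot, focal=800, width=1280, height=720)
right = project_points(proxy_set, cam_pos + eye_offset, cam_rot, focal=800, width=1280, height=720)
print(f"{len(left)} proxy points visible in the left eye, {len(right)} in the right")
```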
 
That said, compositing the images from a stereoscopic camera with some 3D proxy representation of the (imaginary) set for display on a stereoscopic monitor is not going to take a supercomputer.

It's just the usual virtual shooting room, like the ones that were all the rage a few years ago for news broadcasts and such. Angle/zoom sensors on the cameras and a bit of chroma keying. I'd assume a Radeon 9600 can run that in real time for the typical shitty Quake 1-level virtual room.
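For the chroma-key half of it, something like this is conceptually all that's going on: every "green enough" pixel in the live feed gets replaced by the backdrop rendered from the tracked camera angle. This is a NumPy sketch with made-up frames and thresholds, not a real keyer.

```python
# Replace green-screen pixels in a live frame with a rendered backdrop.
# Threshold and test frames are placeholders for illustration only.
import numpy as np

def chroma_key_composite(live_frame, backdrop, green_thresh=60):
    """Composite live_frame over backdrop wherever the pixel is 'green enough'.
    Both inputs are HxWx3 uint8 arrays of the same size."""
    live = live_frame.astype(np.int16)          # avoid uint8 underflow
    # A pixel counts as green screen if green clearly dominates red and blue.
    is_green = (live[:, :, 1] - live[:, :, 0] > green_thresh) & \
               (live[:, :, 1] - live[:, :, 2] > green_thresh)
    out = live_frame.copy()
    out[is_green] = backdrop[is_green]
    return out

# Made-up frames: a uniformly green "studio" frame and a noise backdrop.
h, w = 720, 1280
live = np.zeros((h, w, 3), dtype=np.uint8)
live[:, :, 1] = 200                              # pure green screen
backdrop = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)
composite = chroma_key_composite(live, backdrop)
print("pixels replaced by the backdrop:", (composite != live).any(axis=2).sum())
```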
 