Are Realtime LOTR demo shots online?

NinjaBiker1

Newcomer
After hearing Nvidia's claim of real-time CG quality (brushing aside Toy Story in favour of Final Fantasy level), it was really nice to see the screenshots of the actual demo. ATI is now claiming to render Lord of the Rings in real time, and I'm wondering: is any of that online at all?

-Ninja
 
I don't have screenshots, but I'll describe it. The clip is about 5 seconds long and shows tons of orcs walking across a hillside. It was being run in Massive, a crowd-simulation program developed for Lord of the Rings.
 
Leaving out the "Rendering time", I have my doubts as to whether Massive can even generate the RIB files for the scene "in real time".

IIRC, Massive can simulate hundreds or thousands of CGI actors on the screen at once, each one having its own movement, combat, etc. That's got to be a massive load on the CPU doing all the physics and generating the scene files.
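Just to illustrate the scaling, here is a toy per-agent update loop (nothing to do with Massive's real internals; the agent count, states and constants are invented). The cost grows linearly with the crowd before a single byte of scene description is even written out:

/* Toy illustration only; not Massive's real internals.  Each agent gets its
   own little "brain" update every frame, so CPU cost grows linearly with the
   crowd size before any scene data is generated.  The agent count, states
   and constants are invented. */
#include <math.h>
#include <stdio.h>

#define AGENTS 50000                 /* ballpark for a big battle shot */

typedef struct {
    float x, y;                      /* position on the hillside */
    float vx, vy;                    /* current velocity         */
    int   state;                     /* 0 = walk, 1 = fight      */
} Agent;

static Agent orcs[AGENTS];

static void update_agent(Agent *a, float dt)
{
    /* stand-in for the real per-agent steering and decision logic */
    float speed   = (a->state == 0) ? 1.4f : 0.3f;
    float heading = atan2f(a->vy, a->vx);
    a->vx = speed * cosf(heading);
    a->vy = speed * sinf(heading);
    a->x += a->vx * dt;
    a->y += a->vy * dt;
}

int main(void)
{
    const float dt = 1.0f / 24.0f;   /* one film frame */
    for (int i = 0; i < AGENTS; i++) {
        orcs[i].vx    = 1.0f;        /* everyone shuffles across the hill */
        orcs[i].state = i % 2;
    }
    for (int i = 0; i < AGENTS; i++)
        update_agent(&orcs[i], dt);
    printf("updated %d agents for one frame\n", AGENTS);
    return 0;
}

Now swap each of those tiny updates for real AI, collision handling and motion blending, and it's easy to see where the CPU time goes.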
 
Democoder, you may doubt it, but they did in fact do it. That is why Massive did this demo: they are promoting their technology for other movies.
 
I have no doubt that they did the demo. What I question is whether the level of DETAIL in the simulation was the same as in the REAL LOTR scene, or whether it was CUT DOWN for the demo the same way NVidia CUT DOWN the resolution of the geometry and textures in the Final Fantasy demo.


LOTR was rendered on 200+ dual-CPU Linux boxes using 12 terabytes of network-attached storage. The average scene database was over 2GB, IIRC. That means, without even thinking about rendering, the CPU would have to compute and generate over 2GB of scene data per frame, and the data rates would far outstrip what the CPU can push at real-time frame rates.

If I simply wrote a one-line C program that wrote out a stream of bytes over and over, I wouldn't be able to push that much data.
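To put a number on it: 2GB of scene data per frame at 24 frames per second works out to roughly 48GB/s, before a single agent is simulated or a single pixel is shaded. Here is a rough sketch of the kind of throwaway test I mean (the buffer size, output sink and timing method are arbitrary choices, nothing from the actual demo):

/* Rough sketch only: stream one frame's worth of dummy bytes (2GB) and
   report the rate.  The buffer size, output sink and timing method are
   arbitrary choices.  Run with output redirected, e.g. ./stream > /dev/null */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    const size_t    chunk = 1 << 20;        /* 1MB of dummy "scene data" */
    const long long total = 2LL << 30;      /* 2GB, one frame's worth    */
    char *buf = malloc(chunk);
    if (buf == NULL)
        return 1;
    memset(buf, 0xAB, chunk);

    clock_t start = clock();
    for (long long written = 0; written < total; written += (long long)chunk)
        fwrite(buf, 1, chunk, stdout);
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    if (secs < 1e-6)
        secs = 1e-6;                        /* avoid dividing by zero */

    fprintf(stderr, "pushed %.1f GB in %.2f s (%.2f GB/s)\n",
            (double)total / 1e9, secs, (double)total / 1e9 / secs);
    fprintf(stderr, "real time means the whole 2 GB in under 1/24 s\n");
    free(buf);
    return 0;
}

Even doing nothing but copying bytes, a box of that era manages a few GB/s at best, nowhere near the roughly 48GB/s that the full-detail scene data alone would demand.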

But, if you want to believe that Massive was running the true LOTR scene data, in full detail, at real time frame rates, go right ahead.

Looks like those 200+ dual-cpu linux boxes were a big waste of money then, and someone at WETA Digital is in trouble.
 
of course the level of detail was cut down... the r300 is a monster.. but not that big of a monster

massive really did a sweet job with its software. i remember back a few years ago, a guy from massive came into the 3d irc channel i hang out in, talking about the software they were developing for the LOTR trilogy (back then i had no clue they were even making one).. we all flamed him till they put up a webpage talking about their software, then we ate our words.. the first movie of the trilogy didn't show crap of what their stuff is capable of... the next 2 are gonna be sweet (especially the 3rd)
 
I thought the LOTR demo was rendered on that four-chip beast, the FireGL X1, and not on a vanilla R300? There are more details here.
 
DemoCoder said:
Leaving out the "Rendering time", I have my doubts as to whether Massive can even generate the RIB files for the scene "in real time".

Massive was not outputting RIBs, as PRMan would have been unable to handle the load. They had to write their own renderer to push all those polygons out.

I have no clue about the realtime demo though.
 
Basic said:
That "four chip beast" isn't a FireGL X1, but made by CAE and has four R200 chips.

ATI introduced a new workstation card based on its latest Radeon 9700 GPU, called the FireGL X1. The board is essentially identical to upcoming consumer-focused Radeon 9700-based offerings, although the memory load-out will be 256MB of DDR memory. This part is due out later this fall. Although ATI didn't discuss pricing, the larger amount of memory plus the usual markup seen for workstation 3D products will likely have this offering priced somewhere in the $500-700 range.

That was the four-chip card I was talking about.

edit
This card was previewed at Siggraph and was used to render the LOTR demo.
 
Ah, I'm getting well confused. The blurb I saw had a picture of a four-chip board with it, but I guess that must be the CAE jobbie?
 