nVidia's Final Fantasy real-time demo

BoardBonobo

My hat is white(ish)!
Veteran
I've heard about this but I've never seen it, and now the Inquirer *pinch of salt* has seen it too.

Has anyone here seen it, and if so, does anyone know where you can get a copy? I'd love to see it running, mainly because I want to see how much they managed to include. The required machine specs are pretty high, so I guess a lot of the work is being done by the processor.

And can anyone think of a good reason why it couldn't be ported over to ATi hardware?
 
Well, more than likely the demo uses OpenGL (at least, I'd be surprised if it was DX), and if that's the case then it would be using nVIDIA's shader extensions - the demo uses vertex/pixel shaders to replicate detail that was done through highly detailed geometry in the movie.
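To illustrate the idea (this is not the demo's actual shader code, and all the numbers are made up): a pixel shader can perturb the surface normal per pixel, so a flat polygon shades as if it were bumpy, which is how you fake geometric detail without the triangles. A rough Python sketch of that per-pixel diffuse trick:

```python
import math

def normalize(v):
    # Scale a 3-vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def diffuse(normal, light_dir):
    # Lambertian term: brightness follows the angle between normal and light.
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.3, 0.5, 1.0)  # arbitrary light direction for illustration

# A flat polygon has a single geometric normal everywhere...
flat = diffuse((0.0, 0.0, 1.0), light)

# ...but a normal map lets each pixel pretend the surface is bumpy,
# so the shading varies as if the geometry were far more detailed.
bumped = diffuse((0.2, -0.1, 0.97), light)

print(f"flat pixel: {flat:.3f}, normal-mapped pixel: {bumped:.3f}")
```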
 
I have seen it twice, once at GDC Europe last year and once at GDC US this year. It is indeed OpenGL, and it's highly tweaked and hacked to work in realtime. It's a test case: can we approach the quality if we "really" want to? The scene data is quite tweaked; for example, there is an old guy in the scene, but you only ever see the upper part of his body and head - there is nothing else of him in the scene. So if they enable free camera mode (which tends to crash the app at least once during the demonstrations) you can see his body floating in mid air with no legs :)

But it's quite impressive to see. I don't think they can distribute the demo due to legal reasons.

K~
 
I read a claim somewhere that the only difference from the movie was resolution. Do you hold to this at all Kristof?
 
I really find that hard to swallow. Did you folks read about the detail that went into that movie? They had a server farm of over 1,000 Linux P3 boxes do the core of the rendering, and they said it took about 2 years of total render time. Granted it was a full-length movie, but 60,000 strands of hair just for Aki alone makes you wonder about the power needed.


Edit:

found the info:

http://arstechnica.com/wankerdesk/01q3/ff-interview/ff-interview-1.html


Here are the stats in case you don't have time to read:

Number of Sequences = 36
Number of Shots = 1,336
Number of Layers = 24,606
Number of Final Renders (Assuming that everything was rendered once) = 2,989,318
Number of Frames in the Movie = 149,246
Average number of shots per sequence = 37.11
Average number of rendered layers per shot = 18.42
Average number of frames per shot = 111.71
Estimated average number of render revisions = 5
Estimated average render time per frame = 90 min
Shot with the most layers = (498 layers)
Shot with the most frames = (1899 frames)
Shot with the most renders [layers * frames] = (60160 renders)
Sequence with the most shots = (162 shots)
Sequence with the most layers = AIR (4353 layers)
Sequence with the most frames = (13576 frames)
Using the raw data (not the averages) it all adds up to 934,162 days of render time on one processor. Keep in mind that we had a render farm with approximately 1,200 procs.

Troy: ...and that's just final renders. Including test renders, revisions, and reviews, it's much more. SQB (the render farm software) ran ~ 300,000 jobs, w/ an average of 50-100 frames/job (depending on the type of job). For storage, we have about 4TB online (and pretty full, most of the time...).

Which works out to roughly 0.000002fps on a single processor
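For anyone who wants to check the arithmetic, here's a quick back-of-envelope in Python using only figures from the quote above (none of it independently verified). Nicely, the "estimated 90 min per frame × 5 revisions" line reproduces the 934,162-day total almost exactly:

```python
# Figures quoted from the Ars Technica interview above.
frames         = 149_246
render_days_1p = 934_162    # total render time on one processor, in days
procs          = 1_200      # approximate render farm size
renders        = 2_989_318  # final renders (layers * frames)
revisions      = 5          # estimated average render revisions
mins_per_frame = 90         # estimated average render time per frame

# Consistency check: renders * revisions * 90 min should land near 934,162 days.
est_days = renders * revisions * mins_per_frame / (60 * 24)

seconds_1p   = render_days_1p * 86_400
fps_one_proc = frames / seconds_1p             # ~0.000002 fps on one processor
fps_farm     = fps_one_proc * procs            # ~0.002 fps across the farm

print(f"estimated days: {est_days:,.0f}")
print(f"{fps_one_proc:.7f} fps on one proc, {fps_farm:.4f} fps on the farm")
```

So even with the whole 1,200-processor farm grinding away, you get on the order of a frame every ten minutes.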

They also talk about shadows, lighting, and other neat stuff. Hard to believe that could run in real time today. Tomorrow, maybe :)
 
Aki's hair in the demo was MUCH reduced in quality...

The demo was of a scene that only had Aki and the old doctor guy, her mentor. They did look damn good, but the detail was significantly reduced. The lighting was moderately good, but the rest of the scene was pretty much empty.

Anyone that tries to say it looked just like the movie didn't see it for real. Still overall it was an impressive demo.
 
I can't believe there are people out there who think it's possible, today, to render a scene out of Final Fantasy the movie in real time with the same level of detail. Impossible, on any hardware. :rolleyes:

I doubt it will be done within 5 years. Remember, I am talking about the exact same amount of detail as seen in the movie. IMHO, it's not going to happen for at least another 8-10 years. I hope I am wrong.
 
Randell said:
I read a claim somewhere that the only difference from the movie was resolution. Do you hold to this at all Kristof?

No, as said, they tried their best to make it look like the movie, but obviously they could not match it perfectly, and that's pretty much the conclusion as well. This demo was done by Square to see how far they could go. But as said, quite a few shortcuts had to be taken.

Which does not take away that some things are quite stunning; the old guy's beard in particular is quite impressive when they go into wireframe mode.

You might be able to find the GDC presentation, it was given by one of the guys from Square (so it was no NVIDIA hardware rulez presentation).

K-
 
Kristof,

Wouldn't it make Sony angry at Square if they're not using Sony hardware (EE and GS)? And also because it might be construed as positive PR for Microsoft's Xbox?
 
bbot said:
Kristof,

Wouldn't it make Sony angry at Square if they're not using Sony hardware (EE and GS)? And also because it might be construed as positive PR for Microsoft's Xbox?

I believe this project was done before the XBOX.

K-
 
The demo is impressive for today's consumer-level 3D graphics hardware. Although it is obvious that a GeForce 4 Ti can't compete with offline rendering software, I thought I would post a comparison of Aki to drive the point home for some.

nVidia GeForce 4 Ti 4600

aki_leaningaa5.jpg

Sorry for the picture size.



RenderMan

img08.jpg



Using Square's and Pixar's technology, Aki looks pretty damn fine to me. ;)


Cheers
 
As long as I can count the number of tris which make up her sleeves, it ain't quite there yet :)

Personally, I think that as far as computations go, the quality could be approached in the near future in a game with a free-floating camera. The bigger problems are storage, aliasing, and the lack of tools which make it possible for artists to actually create content in reasonable time frames. For storage we will need something revolutionary; 1T DRAM made with optical lithography won't do, and neither will DVDs. For aliasing in non-realtime they can just use hideous supersampling ratios and simple LOD schemes, because they control the camera. View-dependent multi-resolution tessellation with the right multi-resolution geometry representation could lessen the need for supersampling a bit (also for non-realtime, but it's easier said than done... it's just much more essential for realtime use).
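To put a rough number on the storage problem: take the ~4TB of online storage the production quoted earlier in the thread as a proxy for movie-quality assets, and compare it against the media of the day. The DVD and DRAM sizes below are my own ballpark assumptions, not figures from the demo:

```python
# Back-of-envelope storage comparison; sizes are assumptions for illustration.
TB = 1024**4
GB = 1024**3

movie_assets = 4 * TB        # the ~4 TB online storage quoted in the thread
dvd          = 4.7 * 10**9   # single-layer DVD, ~4.7 GB (decimal gigabytes)
dram_1tbit   = TB / 8        # a hypothetical 1 Tbit DRAM part = 128 GiB

print(f"DVDs needed to hold the assets:      {movie_assets / dvd:.0f}")
print(f"1Tbit DRAM chips to hold the assets: {movie_assets / dram_1tbit:.0f}")
```

Hundreds of DVDs or dozens of the biggest imaginable DRAM parts, which is the point: neither gets you there for shipping a game.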
 
I can't tell if Rachel is smiling or about to cry, that expression looks weird. :eek:

I still think it's pretty impressive, considering how close it does come to the original version, and it can run in real-time as opposed to stills put together to make a movie.
 
Another thing to remember about the Final Fantasy stills is that they're most often composites of multiple renders which have been touched up after RenderMan did its job. If you have the DVD on hand, there is a featurette giving some good examples of how the different layers are rendered and composited to give the final result you see in the movie.

If we übertweak the rendering (like nVidia has done) on a scene-by-scene basis, we can probably recreate one of the layers pretty well in realtime today, but the compositing and multiple layers? Nope, sorry...
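For anyone unfamiliar with what "compositing the layers" actually means: each rendered layer is stacked onto the frame with the Porter-Duff "over" operator. A minimal Python sketch, using premultiplied-alpha (r, g, b, a) pixel values I made up for illustration (a real compositor does this per pixel across whole images):

```python
# Minimal sketch of Porter-Duff "over" compositing, the operation used to
# stack rendered layers into a final frame. Values are premultiplied alpha.
def over(fg, bg):
    # Premultiplied-alpha over: out = fg + (1 - fg.alpha) * bg
    fg_alpha = fg[3]
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg, bg))

# One example pixel from each of three hypothetical layers.
background = (0.10, 0.20, 0.40, 1.00)  # opaque backdrop layer
character  = (0.30, 0.25, 0.20, 0.50)  # half-transparent character layer
effects    = (0.20, 0.20, 0.20, 0.25)  # translucent effects pass

frame = background
for layer in (character, effects):     # composite back to front
    frame = over(layer, frame)

print(tuple(round(c, 4) for c in frame))
```

Trivial for one pixel; the problem is doing it for dozens of full-resolution layers per shot (the stats above average 18.42 layers per shot) on top of rendering each layer in the first place.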
 
I liked the Rachel demo, but she looked a lot better with FSAA on, though then it slowed down too much. It was her teeth that looked weird to me. Kind of like a shark's :D

The GF FF looks pretty cool; that's the kind of thing I wanted to see. You can see how much they had to leave out in the quality of the textures, etc. It also seems to make Aki look much younger!!

Thanks for digging the piccies up.

Didn't Matrox start to investigate realistic facial imagery with their HeadCasting technology (or maybe it wasn't theirs but licensed)?
 