nVidia's Final Fantasy real-time demo

Bogotron said:
Another thing to remember about the Final Fantasy stills is that they're most often composites of multiple renders which have been touched up after RenderMan did its job. If you have the DVD on hand there is a featurette giving some good examples of how the different layers are rendered and composited to give the final result you see in the movie.

If we übertweak the rendering (like nVidia has done) on a scene-by-scene basis we can probably recreate one of the layers pretty well in realtime today, but the compositing and multiple layers? Nope, sorry...

Of course, but that's not quite the point behind the whole idea. I'm a 3D artist, and realtime graphics are only just beginning to approach a level of quality and complexity that I was producing over 5 years ago. Many techniques (like compositing scenes from multiple layers to have the most control over the final output) are probably never going to make sense in the realtime arena, but like I said, that's not the idea behind this "Toy Story quality in realtime" goal. The purpose is to reach a *comparable* level of visual quality in terms of surface complexity, lighting and shading capabilities, and most of all to display it at acceptable framerates.
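
For readers who haven't seen how layer compositing works, here's a rough sketch (Python/NumPy, with the layer names made up purely for illustration) of the standard premultiplied-alpha "over" operator that compositing packages apply when stacking separately rendered layers into a final frame:

```python
import numpy as np

def over(fg, bg):
    """Composite a premultiplied-alpha RGBA layer 'fg' over 'bg'.
    Both are float arrays of shape (H, W, 4) with values in [0, 1]."""
    alpha = fg[..., 3:4]              # foreground coverage
    return fg + bg * (1.0 - alpha)    # standard 'over' operator

# Hypothetical layers an offline pipeline might render separately:
H, W = 480, 720                       # roughly DVD resolution
background = np.zeros((H, W, 4)); background[:] = [0.1, 0.1, 0.2, 1.0]
character  = np.zeros((H, W, 4))      # character pass (mostly empty)
character[100:380, 200:520] = [0.4, 0.3, 0.25, 1.0]
hair_pass  = np.zeros((H, W, 4))      # separate hair render, semi-transparent
hair_pass[100:200, 250:470] = [0.15, 0.1, 0.05, 0.5]

# Stack the layers back-to-front into the final frame.
frame = background
for layer in (character, hair_pass):
    frame = over(layer, frame)
```

In a realtime engine you'd have to get an equivalent result out of a handful of render-target passes and blend states instead, which is part of why the layered offline workflow doesn't map across 1:1.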

The techniques used in creating a movie like Final Fantasy are vastly different from those used in making a 3D game, so expecting a company (in this case NVidia) to take the data and convert it 1:1 into realtime is a bit naive. Of course there have to be tweaks and optimizations; realtime 3D art as a whole depends on tricks, cheats and workarounds to achieve a desired effect, especially on today's hardware - but if the end result looks stunning and at first glance almost identical to film output, then that is exactly the desired outcome.

Over the next few years, as hardware becomes more powerful, artists will have to cut fewer and fewer corners to achieve results comparable to the movies of today. The time when you can just plug your Maya/Lightwave/MAX/XSI scene into a realtime engine and get the exact same output as from your software renderer isn't coming anytime soon, nor is it the desired goal right now IMHO...
 
Wow, first time I've seen the nVidia realtime-rendered FF screens - that is, other than the thumbnail-sized shots at nVidia's website. It may not touch the movie's quality, but that's amazing stuff. It'd be great if the demos for their upcoming hardware would show things like this; eye candy is always fun to look at.
 
Gollum:
The techniques used in creating a movie like Final Fantasy are vastly different from those used in making a 3D game, so expecting a company (in this case NVidia) to take the data and convert it 1:1 into realtime is a bit naive.

You know this. Most other people don't, probably including some readers of this forum. I just thought I'd point out that it takes a bit more than hitting "Render" for an image like a FF movie frame to pop out. I didn't know how much work went into each scene, how many layers they used, and what impact they had on the final result before I saw the featurette on the DVD.

Basically, like you say, realtime and off-line rendering are two different ways of working, each with different types of tools and feature sets/limitations. Getting realtime to emulate the off-line renderers, even if the results are imperfect when you study them closely, is a feat in itself.

I wonder if ATI or Nvidia have something similar planned for this year's SIGGRAPH.... :)
 
But they do that with most feature films, regardless of whether they're CG or not. It's probably standard procedure.

But in the future, I'm pretty sure the hardware will be able to do that kind of compositing in real time. Maybe done between different passes.
 
The most unrealistic thing about FF the movie was the fact that all the characters had brown eyes. Maybe next time they'll add a little variation. There's more to realism than just image quality. ;)
 
I should mention that the screenshots were taken on a Quadro DCC if I remember correctly (the professional version of the GeForce3), not a GeForce4 Ti4600 :)
 
What polygon counts did the GeForce4 Ti4600 actually achieve on the Final Fantasy 'cut-down' demo?

The stats for FF the rendered movie are impressive; I wonder what the equivalent stats are for FF the demo?
 
Sorry for bringing up such an old post, but the difference is clear :LOL:
The hair rendering is completely different, the lighting too, there's no background, she looks skinnier, and there are many missing details (look at her bend):
http://www.pcrave.com/images/reviews/gf3ti500/aki_leaningaa5.jpg
(attached image: Aki_73.jpg)


From a 2-CD DivX rip I made a year ago; I lent my DVD to a friend, will get it back for a better screenshot next time...
 
Jaysis, man!

Don't post images directly IN THE THREAD please, it makes the width of the page explode. Post LINKS to the images instead... Thank you.

Anyway, the thread's over a year old, do we really need to bring it back up from the grave? :)
 
I sincerely don't see why FF couldn't be done on the NV40 or R420, at around 24 FPS at least. I'm not saying it would be IDENTICAL, but it would be remarkably near, with a very small IQ difference, barely noticeable.
Of course, the "hacks" Kristof explained might have to be taken to the next level then.

If the GF4 can do that at 300MHz as a 4x2 part with 2 VS units, at 24 FPS... I don't see why, with a 550MHz/8x2/4 VS unit architecture, you couldn't do much, much better.
Remember, that's about 3-4x the raw power in just about every single aspect. And tons more precision; FX16/FP16/FP32 is much better than FX9, of course!
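
As a quick sanity check on that 3-4x figure, here's the arithmetic spelled out (using the hypothetical 550MHz/8x2/4 VS part from above, not any announced spec):

```python
# GeForce4 Ti4600 as described above: 300 MHz, 4 pixel pipes x 2 TMUs, 2 vertex shader units.
gf4_clock, gf4_pipes, gf4_tmus, gf4_vs = 300e6, 4, 2, 2
# Hypothetical next-gen part from the post: 550 MHz, 8 pipes x 2 TMUs, 4 VS units.
ng_clock, ng_pipes, ng_tmus, ng_vs = 550e6, 8, 2, 4

pixel_fill_ratio  = (ng_clock * ng_pipes) / (gf4_clock * gf4_pipes)                       # ~3.7x
texel_fill_ratio  = (ng_clock * ng_pipes * ng_tmus) / (gf4_clock * gf4_pipes * gf4_tmus)  # ~3.7x
vertex_rate_ratio = (ng_clock * ng_vs) / (gf4_clock * gf4_vs)                             # ~3.7x

print(f"pixel fill:  {pixel_fill_ratio:.1f}x")
print(f"texel fill:  {texel_fill_ratio:.1f}x")
print(f"vertex rate: {vertex_rate_ratio:.1f}x")
```

All three ratios land around 3.7x, which is where the "about 3-4x the raw power" estimate comes from.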

1:1 without hacks is 100% impossible for several years to come. But if your focus is speed while maintaining very good IQ, and not just having "perfect" IQ as in most films, I sincerely don't see why not. Not that I find that particularly interesting either; it doesn't show us what current hardware can do in real games, nor does it show us whether it's sufficient for non-realtime film production...


Uttar
 
Hmm

What about the LOTR demo that was showcased after the debut of the Radeon 9700? If I'm not mistaken it was done on a Linux box. Does anyone have any screenshots of that, by any chance?

Raystream
 
Uttar said:
I sincerely don't see why FF couldn't be done on the NV40 or R420, at around 24 FPS at least. I'm not saying it would be IDENTICAL, but it would be remarkably near, with a very small IQ difference, barely noticeable.
You would also have to limit the scene to something very specific. There's no way it could do the whole movie in anywhere close to realtime, in any reasonable approximation.

And then you'd also probably want to preprocess a number of different things, such as shadows and whatnot, and the resolution would be vastly lower than the resolution the movie was rendered at for the big screen (though it would probably be comparable to DVD resolution).

There would also be no motion blur, and the shaders themselves would be far less complex.

But yeah, it'd be a lot closer than what could be done previously.
 
Could anyone compare this to the ATI LOTR real-time demo I heard about a while ago? Were there ever any screenshots posted of that?

Edit: Never mind, didn't realize there were two pages to this and that someone had already beaten me to it.
 
The clothing doesn't even have bump mapping for the threads. I'd guess the movie used displacement mapping, which means the whole thing would have been subdivided into sub-pixel triangles in a lot of renders. Anyone think that current cards can do that in near real time?
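
For a rough sense of scale (my own illustrative numbers, not anything from the film's actual pipeline): a REYES-style renderer dices visible surfaces into roughly pixel-sized or smaller micropolygons, so the triangle budget scales with resolution times depth complexity:

```python
# Rough micropolygon budget for sub-pixel dicing (illustrative numbers only).
width, height     = 1920, 1080   # assumed film-ish output resolution
micropolys_per_px = 4            # dice to roughly half-pixel micropolygons
depth_complexity  = 4            # guessed average layers of surface per pixel

tris_per_frame  = width * height * micropolys_per_px * depth_complexity
fps             = 24
tris_per_second = tris_per_frame * fps

print(f"~{tris_per_frame / 1e6:.0f} M triangles per frame")
print(f"~{tris_per_second / 1e9:.2f} B triangles per second at {fps} fps")
```

Even before any shading, that works out to tens of millions of triangles per frame and on the order of a billion per second, well beyond what the cards discussed in this thread could set up, let alone shade.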
 