How much graphics power is needed to reach Final Fantasy IQ in games?

artisan7

Newcomer
Let's say the NV30 and Radeon 9700 are released now.

What kind of graphics power and PC hardware will we need in our home PCs in the future to match the true quality of the Final Fantasy movie in games, in real time, at a minimum of 60 frames per second in a multiplayer level?


I'm not talking about the GameBoy-level version of Final Fantasy played on a GeForce4 at E3. People say it looked impressive, and maybe it was, but it certainly wasn't anywhere near big-screen quality, where the characters' hair and faces looked lifelike, and where real reflections and weather effects like water looked very impressive.


Is it possible to estimate how many years it will take to get from today's best video cards (R300/NV30 plus a P4 2.5 GHz) to this kind of graphics power, assuming desktop PC performance keeps growing at the same rate as today?

As a side note, I've heard that the Final Fantasy characters have more than 1 million polys, rendered at huge resolutions of up to 4000x4000.

thanks in advance for the info..
 
At 2x2 resolution with no AA I could render it with no HW acceleration whatsoever, on just my Duron :D
If your home PC only needs to display 640x480, then we'll be there pretty soon.
But you need 6.25 times more pixels rendered for 1600x1200 compared to 640x480. Throw in 16x supersampled FSAA... throw in 8x temporal supersampling for motion blur... and you are in dire need of at least a 256-chip NV30 cluster :D
... and I believe some pros on this board have said before that real movies are rendered at about 10x higher resolutions and with much more advanced methods of antialiasing...
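The multipliers above are easy to check: the sample count per frame scales linearly with resolution, FSAA factor, and temporal supersampling factor. A quick sketch (my own arithmetic, not from any benchmark):

```python
# Samples that must be shaded for one displayed frame, as a product of
# resolution, supersampled FSAA factor, and temporal supersampling factor.

def samples_per_frame(width, height, fsaa=1, temporal=1):
    """Total shaded samples for one displayed frame."""
    return width * height * fsaa * temporal

base = samples_per_frame(640, 480)

print(samples_per_frame(1600, 1200) / base)                    # 6.25
print(samples_per_frame(1600, 1200, fsaa=16, temporal=8) / base)  # 800.0
```

So at 1600x1200 with 16x FSAA and 8x temporal supersampling you are shading 800 times as many samples as plain 640x480, which is where the "256-chip cluster" joke comes from.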
 
Depends on how close you want to get. We're probably only a year or two away from seeing that sort of quality in basic rendering, but effects like motion blur are going to be problematic for realtime renderers for a while yet, I reckon.

4000x4000 isn't really necessary on anything but the biggest monitors.

Nvidia's Final Fantasy demo was a good example of what you can do already. It was a drastic simplification of the original scene, but it looked very close unless you could do a side-by-side comparison.

A better question might be 'when is the content going to catch up?'
 
Hi there,
What kind of graphics power and PC hardware will we need in our home PCs in the future to match the true quality of the Final Fantasy movie in games, in real time, at a minimum of 60 frames per second in a multiplayer level?
As Final Fantasy: The Spirits Within uses huge amounts of compositing and post-work, I'd say:

not even with God's own hardware will this be possible, ever. ;)

ta,
-Sascha.rb
 
nggalai said:
not even with God's own hardware will this be possible, ever. ;)
NEVER say never ;)
what prevents one from building AI compositor robot with storyline branch-prediction ... er well

Actually, IQ is a subjective thing, so maybe NV's PR people have such low quality standards that the GF3 demo mentioned already looked like the movie to them...
There was a miles-long discussion about measuring IQ on these boards when some benchmark (was it 3DMark?) tried to do IQ comparisons with a simple XOR... I think the general consensus was that IQ IS subjective and cannot be measured... only argued/flamed over :p
So the question is kind of flawed.
 
As Final Fantasy: The Spirits Within uses huge amounts of compositing and post-work

This is what many people don't realise: movies have soooooo much post-work and compositing done on top of the final renders. Final Fantasy in real time will not happen any time soon.

Any company that tells you they will be doing Final Fantasy in real time within the next year is talking out of their arse, simple as that.

I am no expert in this field, but I would guess that we are still about 5-7 years away, maybe more... I hope I am wrong and we see it much sooner.

Fuz
 
Well, I can only assume that FF the movie was raytraced (though I haven't actually seen it), and nobody is going to do that in real time until quantum computers come our way.

Dave
 
The FF movie wasn't raytraced. They never would have been able to finish the movie in time if that were the case. Honestly, there's no need for ray tracing on everything.
 
Hi there,
no_way said:
NEVER say never ;)
what prevents one from building AI compositor robot with storyline branch-prediction ... er well

LOL :D

SGI has a bit online about FF:SW at http://www.sgi.com/features/2001/july/fantasy . Maya was used for modeling / animation, RenderMan (on Linux) for final renders.

Regarding the post-work/compositing bits: I strongly encourage people to go and rent the FF double DVD over the weekend or so. There's lots of information about the production on the second DVD, and some funny outtakes, too. ;)

ta,
-Sascha.rb
 
Dave B(TotalVR) said:
Well, I can only assume that FF the movie was raytraced (though I haven't actually seen it), and nobody is going to do that in real time until quantum computers come our way.

Dave

Actually, I think raytracing may be possible in realtime on silicon processors...provided that you set a fairly limited number of reflections/refractions for each ray (like 4-8 or so).

It's Radiosity that I don't believe we'll see anytime soon.
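The depth cap mentioned above is what keeps recursive raytracing affordable: with both a reflection and a refraction ray spawned at each hit, an uncapped tracer does exponential work per pixel. A toy sketch of the ray counts (my own illustration, not a real renderer):

```python
# With a branching factor of 2 (one reflection + one refraction ray per
# hit), the total rays traced for a single primary ray is 2^(d+1) - 1,
# so a hard depth cap d is essential for bounded per-pixel cost.

def rays_spawned(max_depth, branches=2):
    """Total rays traced for one primary ray under a hard depth cap."""
    if max_depth == 0:
        return 1                      # just the primary ray, no bounces
    # the current ray plus the sub-tree under each secondary ray
    return 1 + branches * rays_spawned(max_depth - 1, branches)

for d in (4, 8):
    print(d, rays_spawned(d))         # 4 -> 31 rays, 8 -> 511 rays
```

Even the suggested 4-8 bounce limit keeps the worst case to a few hundred rays per pixel, which is why a bounded raytracer looks far more plausible in realtime than radiosity.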
 
People keep mentioning that FF was rendered in many separate layers for compositing, and that this will never happen in real time. Well, that is true, but it's actually not the point IMO!

The question is whether or not we can achieve the quality level of FF, not whether we can render the movie FF as it was produced, scene by scene, on hardware. By quality level I mean: can hardware process that amount of vertex and polygon data in real time, can the same rendering accuracy be achieved, can we have the same or similarly good-looking shading on our characters and game world (for both lighting and, most importantly, several texture layers), and can real-time shadows of similar quality be rendered?

Today, if you stick to the statistics of FF or similar movies, the answer would still be no, and even the next generation won't change that: the computational power and bandwidth for such massive scenes is just not there yet (the polygon count of the more complex scenes, including the background world and characters, is beyond AGP throughput and hardware specs). However, the color precision and surface-shading possibilities should arrive with this DX9 generation, and geometry power, while not quite there yet, should be vastly sufficient for game purposes (a scene with 2 million polygons can look almost as good as one with 20 million, as long as the detail is spent in the right places). Shadow calculations are still a bit questionable, but I guess many would be surprised how often shadow maps or other "hacks" are used in the CGI industry in order not to constantly require full raytracing or even radiosity (which is basically still not usable for animations; where it is used, it's in very limited cases, with radiosity caching).

So, do I think the movie FF will be rendered in real time anytime soon? No.
Do I think that 3D hardware will soon be capable of a quality level very similar to the FF movie? Yes.

Real-time and CGI are indeed moving ever closer together quality-wise. The first big step was last year, the next one will be this generation, and the following one will bring us closer still. Nit-pickers can scream all they want that the CGI movies of yesterday and tomorrow won't be rendered 1:1 in real time anytime soon. Nvidia and ATI can both be accused of "ridiculous claims" if people argue that way (though neither Nvidia nor ATI ever actually made such a claim, IIRC). It's true that it's definitely not going to happen in real time anytime soon, but it's entirely beside the point IMO.

Even DX8 hardware can already produce some stunning visuals: not movie quality yet, but a major step beyond what was possible before, as some Xbox games clearly demonstrate (e.g. the upcoming Team Ninja games). This fall's hardware (at least the R300 and NV30) will be capable of some excellent graphical quality and appears to be way beyond the current generation computationally (for shading and raw geometry power). I have little doubt that, if exploited correctly, it will be able to get very close to past CGI movies: not identical, but very close.

The sad thing is, it will take a couple of years before developers begin to touch this power. No matter how hard I look, Doom 3 is still the only game I know of that really goes down the complete real-time per-pixel-shading route, and even Doom 3 is not fully exploiting what the next crop of hardware could do; we'll probably have to wait for JC's next engine for that...
 
Gollum said:
Today, if you stick to the statistics of FF or similar movies, the answer would still be no, and even the next generation won't change that: the computational power and bandwidth for such massive scenes is just not there yet (the polygon count of the more complex scenes, including the background world and characters, is beyond AGP throughput and hardware specs).

It should be possible to reach those polycounts through the use of HOS and storing geometry in video memory. But, yes, the geometry power isn't quite there yet.

The sad thing is, it will take a couple of years before developers begin to touch this power. No matter how hard I look, Doom 3 is still the only game I know of that really goes down the complete real-time per-pixel-shading route, and even Doom 3 is not fully exploiting what the next crop of hardware could do; we'll probably have to wait for JC's next engine for that...

What's exciting is that Doom 3 is "based on what was made possible with the GeForce1." If that's the kind of image quality the GeForce1 allows, just imagine having an NV30 as the minimum spec.
 
Chalnoth said:
What's exciting is that Doom 3 is "based on what was made possible with the GeForce1." If that's the kind of image quality the GeForce1 allows, just imagine having an NV30 as the minimum spec.

That's all well and good up to a point, but is that in terms of feature capability or performance capability? I.e. stencil shadows, cube maps and DOT3 are all there, but JC said he was aiming for 30fps on a GF3 a while back, didn't he?

What the NV30 will allow him to develop vs. what hardware you would want to run it on at just '30fps' (assuming 10x7, no AA, no AF) are two different things IMO. Of course, we would expect no-performance-impact AA/AF at 10x7 by then :)

Back to FF:SW: didn't somebody say it had 500+ texture layers?
 
Randell said:
What the NV30 will allow him to develop vs. what hardware you would want to run it on at just '30fps' (assuming 10x7, no AA, no AF) are two different things IMO. Of course, we would expect no-performance-impact AA/AF at 10x7 by then :)

Back to FF:SW: didn't somebody say it had 500+ texture layers?

Yes... I'm excited to see what kinds of technologies developers like JC can come up with that will run at "only 30 fps" with most effects turned down on an NV30 :)

Of course, by the time such a game launches, most everybody on this board will have a video card with capabilities well in excess of the NV30's.
 