About Adrianne

Is that smoke demo impressive for its physics or its graphics? I'm assuming it's about the physics, because I didn't think that smoke looked real at all. Adrianne looks totally animated, too. (Well, obviously it is.) I like the facial expression, though.

There was a thread about Rachel/Wanda here, but I can't locate it now. I think Rachel and Wanda look 10 times better than Adrianne. Instead of looking cheap, cheesy, and plastic, they had that special aura around them, which would inevitably get to me even when I wasn't looking at them.

P.S. When can we expect Hollywood-movie-quality graphics in real time?
 
Never... next question? :D

Actually, one thing I'm just itchin' like crazy to see is something like a clip of Toy Story running in realtime as a tech demo, with nearly the same level of fidelity the offline rendering gives (e.g. a pixel-by-pixel compare of the images would show differences of less than 1%), excluding some REALLY impractical stuff like the raytraced reflections RenderMan does. I definitely think we have enough graphics power now to quite literally do TS in realtime.
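As a rough sketch of the kind of comparison being described (the function name, toy frame data, and the 1% threshold here are all made up for illustration), the "differences of less than 1%" test could be expressed as a per-pixel diff between an offline render and a real-time render of the same frame:

```python
# Hypothetical sketch: measure the fraction of pixel values that
# differ between two equally sized frames, given as flat lists of
# 8-bit channel values. All data below is a toy example.

def percent_difference(frame_a, frame_b):
    """Return the fraction of differing values between two frames."""
    assert len(frame_a) == len(frame_b)
    differing = sum(1 for a, b in zip(frame_a, frame_b) if a != b)
    return differing / len(frame_a)

# Toy "frames" of four values each, differing in one value.
offline = [255, 128, 64, 0]
realtime = [255, 128, 65, 0]
print(percent_difference(offline, realtime))  # 0.25, i.e. 25%
```

In practice you would probably compare against a small per-channel tolerance rather than exact equality, since dithering and rounding alone would make almost every pixel "differ".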
 
John Carmack once said (during the final stages of Doom 3's development, I think) that we should have Shrek-like graphics by 2010. At the time 2010 seemed very far away, but now that it's only 3-4 years off, I don't think he really guessed all that close.

He has been really quiet since Doom 3 came out, probably because he suffered a real blow from Unreal Engine 3. I wonder what his next engine will be like.
 
Bahh, I would take that JC quote with a grain of salt. Remember some Sweeney quotes/forecasts that were outlandish? Well, I don't think JC is necessarily the one to foresee the future completely. Remember how he was saying that Doom 3 approached the quality of the first Pixar renders? Load up the game and check that one out.

The thing is, doing something realtime is a helluva lot different from doing separate scenes that are then patched together. There's a lot of hand-tweaking in offline rendering, and exact problem solving rather than a general approach. A game has to look good in a variety of scenarios, and has to run acceptably well... so no, I don't think Hollywood quality is achievable, unless we're talking about Hollywood quality from the past. Subtract 10-15 years (maybe less), and yes, those years' quality may be achievable.

Check out even the advanced tech demos: they're not there, and never have been, even though they're a pretty clear, reproducible, tweakable scenario and they're written by the dudes who know the hardware best.
 
The thing is, doing something realtime is a helluva lot different from doing separate scenes that are then patched together. There's a lot of hand-tweaking in offline rendering, and exact problem solving rather than a general approach. A game has to look good in a variety of scenarios, and has to run acceptably well... so no, I don't think Hollywood quality is achievable, unless we're talking about Hollywood quality from the past. Subtract 10-15 years (maybe less), and yes, those years' quality may be achievable.
That's exactly it - both real-time and offline renders are continually improving, so it's a moving target. The holy grail is always seen as "Hollywood FX", but that's a totally different ballpark from 15 years ago.

The Last Starfighter? Sure, that can be done relatively soon; heck, it can almost be done now, aside from the aliasing.

LOTR next gen? Er...no.
 
The Last Starfighter? Sure, that can be done relatively soon; heck, it can almost be done now, aside from the aliasing.
Now, admittedly, I haven't seen the movie, but for 1984 it had better be damn good if it's only just barely possible to run in real time 22 years later. I mean, they were running Luxo Jr. (1986) in realtime at the GeForce 3 launch.
 
It was an interpretation of Luxo Jr.... they were also running an interpretation of Spirits Within on a Ti 500; does that mean FF graphics are at hand? ;)
 
Well, it's going to be an "interpretation" regardless; you can't expect the way the images are rendered to exactly mirror how they were originally done, can you? The question is how close it has to get before we can't tell the difference and none of that matters. Although Luxo Jr. might not have been that close, I don't know.
 
Now, admittedly, I haven't seen the movie, but for 1984 it had better be damn good if it's only just barely possible to run in real time 22 years later. I mean, they were running Luxo Jr. (1986) in realtime at the GeForce 3 launch.

Last Starfighter was rendered at a very high resolution (IIRC something like 8K), so even today most gaming cards would struggle with that. I believe it was done that way to make up for the lack of good AA solutions at the time.

You could probably simulate something very similar with today's tricks (such as edge AA), but it wouldn't be quite the same.
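That "render high-res to make up for missing AA" idea amounts to supersampling: average blocks of a higher-resolution image down to the target resolution so hard edges come out softened. A minimal sketch, where the 4x4 grayscale image and the 2x factor are made-up illustrations:

```python
# Minimal supersampling sketch (assumptions: grayscale image as a
# list of rows, resolution divisible by the factor). Each output
# pixel is the box-filtered average of a factor x factor block.

def downsample(image, factor):
    """Average factor x factor blocks of `image` into single pixels."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal black/white edge at 4x4...
hi_res = [
    [0,   0,   0,   255],
    [0,   0,   255, 255],
    [0,   255, 255, 255],
    [255, 255, 255, 255],
]
# ...becomes a 2x2 image with intermediate gray values along the edge.
print(downsample(hi_res, 2))  # [[0.0, 191.25], [191.25, 255.0]]
```

Edge-AA tricks approximate this by only spending the extra samples near geometry edges, which is why the result isn't quite the same as true full-frame supersampling.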
 
My point is that I don't think convergence between real-time-targeted interpretations and the original offline-rendered thing is that plausible, because both evolve, and the offline side has the advantage of additional tweaking and not having to bother much with performance... in real-time it's likely you'll cut something out in order to get playable performance. All IMHO, by the way; I'm not preaching gospel here.
 
There's never going to be a convergence because both points are moving, but we can set a fixed point. We could set a goal like, "How long until we can approximate Toy Story at interactive framerates?" I think our technology today could manage a decent approximation over a very limited scene area. But how long until we can expand those tech demos into an entire gaming world, not just a tiny cube or spheroid "region" with a wrap-around texture as a backdrop? That I don't know yet. I'd presume, with some of the focus coming back to geometry, that today's architectures will expand and evolve around that, and that performance will allow for it within another ten years, but again, it's hard to say.
 
Well, the issue is quite complex really... once you start using the amount of geometry that would be adequate for a close enough approximation, you're going to hit other limitations, for example the width of the communication bus between the GPU and the system. It's quite hard to forecast, actually, but it's not as close as I've seen some people touting.
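To make the bus concern concrete, here is a back-of-envelope estimate in which every number is an assumption picked purely for illustration, not a measurement of any real scene or card:

```python
# Rough arithmetic: suppose a film-quality scene streams 10 million
# fresh vertices per frame over the bus, each carrying 32 bytes
# (position, normal, texture coordinates), at 60 frames per second.
# All three figures are assumptions for the sake of the estimate.

vertices_per_frame = 10_000_000
bytes_per_vertex = 32
frames_per_second = 60

bytes_per_second = vertices_per_frame * bytes_per_vertex * frames_per_second
gb_per_second = bytes_per_second / 1e9
print(gb_per_second)  # 19.2 (GB/s of vertex traffic alone)
```

For comparison, a first-generation PCI Express x16 link tops out around 4 GB/s each way, so under these assumptions the geometry stream alone would swamp the bus several times over, before textures or anything else.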
 
Yeah, some people seem to think it's already here, or it was already here years ago. They don't appear to understand that there are logistical nightmares that still stand in the way of it.
 