NVIDIA Fermi: Architecture discussion

Was just watching Stardust on cable last night and was laughing at DeNiro during that scene. Cute movie, the actress who played Dunstan's mother was hot, hot, hot.
 
http://www.electronista.com/articles/09/10/30/nvidia.fermi.details.escape/

1. Whoa.

2. On second thought, they're probably using stuff like BRDF measurements and such, which are pretty far from viable in computer games. Not to mention most of the data is usually static.
www.debevec.org
But it sure looks pretty cool.

Are those the same renders they showed back at GTC, which were actually rendered on SLI'd GTX285s? (Not the faces; those come from a different video and could be done on your mother's GTS250.)

And about their mobile parts, I hope "electronista" knows that the 3XXM parts are based on GT21x.
 
Those are fake; they're stills from the Playgrounds 2009 titles trailer with nVidia logos photochopped onto them:
http://onesize.nl/projects/playgrounds-titles-2009
http://vimeo.com/6947473
http://motionographer.com/theater/on…ground-titles/
 
Heh, fooled me.
But as I've mentioned, it shouldn't be impossible to do something like that; it actually looks quite similar to the Alfred Molina CG head from Spider-Man 2 that was demoed on the PS3 a few years ago.
 
it seems to me that in the past few years Nvidia has been an enabler more than anything else, proprietary standards aside.

Actually, it seems to me that lately Nvidia has been more of a disabler than anything else (disabled Batman AA for ATi cards, disabled PhysX in the presence of an ATi card). ;)
 
Possible or not, the fact remains that those images are a proven fake, which makes any speculation or discussion of them irrelevant to the topic of this thread.
A better example than yours is the Killzone 2 pre-rendered movie fiasco, where Sony tried to pass footage off as being rendered in real time on a PS3.
 
Yeah, I got it. What I was trying to say is that even if it had been done in real time, it would in no way be representative of what's actually realistic in a game project (except maybe the Fight Night series).

(Ironically, I was one of the few who tried to convince people that the KZ2 movie was CG ;)
 
Of course. I mean, even the Unigine engine benchmark shows a 40% drop in performance with tessellation enabled, while being devoid of AI, game characters, etc.
 
It's more about Debevec's techniques: they require a large and expensive light stage and measuring sessions, storage of a lot of extra data beyond the usual color/normal/spec maps, and of course real live actors to 'scan'. That's all fine for a few digital doubles of famous actors at a feature-film VFX studio, but very impractical and expensive for a game.

Not to mention the difficulty of finding real live aliens, fantasy monsters, and stylized-looking people ;)
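
To put a rough number on the storage point: a single measured BRDF in the MERL database layout (90x90x180 samples over the Rusinkiewicz half/difference angles, RGB per sample) already weighs in at tens of megabytes, before any light-stage relighting data on top. A back-of-the-envelope sketch, assuming float precision and a nearest-neighbour lookup (both simplifications; MERL actually stores doubles and a nonlinear theta_h mapping):

```cpp
// Rough cost of one MERL-style measured BRDF, for scale. The 90x90x180
// grid matches the MERL database layout; float precision and the
// nearest-neighbour lookup are simplifications for illustration.
#include <algorithm>
#include <cstdio>
#include <vector>

constexpr float kPi = 3.14159265f;
constexpr int kThetaH = 90, kThetaD = 90, kPhiD = 180;

struct MeasuredBrdf {
    std::vector<float> rgb;  // kThetaH * kThetaD * kPhiD RGB samples
    MeasuredBrdf() : rgb(size_t(kThetaH) * kThetaD * kPhiD * 3, 0.0f) {}

    // Nearest-neighbour lookup; a real renderer would interpolate.
    const float* lookup(float thetaH, float thetaD, float phiD) const {
        int ih = std::min(kThetaH - 1, int(thetaH / (kPi / 2) * kThetaH));
        int id = std::min(kThetaD - 1, int(thetaD / (kPi / 2) * kThetaD));
        int ip = std::min(kPhiD - 1, int(phiD / kPi * kPhiD));
        return &rgb[((size_t(ih) * kThetaD + id) * kPhiD + ip) * 3];
    }
};

int main() {
    MeasuredBrdf brdf;
    std::printf("one material: %.1f MB\n",
                brdf.rgb.size() * sizeof(float) / (1024.0 * 1024.0));
    // Prints ~16.7 MB -- per material, uncompressed. A typical game
    // material is a few block-compressed textures shared across assets.
}
```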
 
Of course. I mean, even the Unigine engine benchmark shows a 40% drop in performance with tessellation enabled, while being devoid of AI, game characters, etc.

Unigine also goes way overboard with tessellation at times, applying absurdly high tessellation levels to areas that definitely don't need it. With a better adaptive tessellation algorithm they'd probably get it to run at least 10-20% faster.
 
Unigine really isn't a useful benchmark for me personally. One reason is that, as you stated, they aren't applying tessellation in a practical fashion (tessellating a slope to make stairs?). For me, a useful tessellation benchmark is one that has the option to render the same high-poly scene either using pre-tessellated meshes sent from the CPU or doing adaptive tessellation on the GPU from a smaller base mesh. That's where we'll see the benefits.

Of course, it's still nice to have it the current way, with just a regular scene and a higher-quality tessellated version, but that only makes sense if the tessellation is used properly and the regular scene isn't gimped.
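
A minimal sketch of the adaptive heuristic the last couple of posts are pointing at: derive each patch edge's tessellation factor from its projected screen-space length, so distant or small patches stop getting subdivided. In practice this would live in a D3D11 hull shader's patch-constant function; the function names and the 8 pixels-per-edge target below are illustrative assumptions, not Unigine's actual algorithm:

```cpp
// Screen-space adaptive tessellation heuristic: subdivide an edge in
// proportion to how large it appears on screen, clamped to the D3D11
// hardware factor range [1, 64]. All names and the 8 px/edge target
// are illustrative; this is not Unigine's code.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

float Length(const Vec3& a, const Vec3& b) {
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Approximate on-screen length (in pixels) of a world-space edge,
// using the average view-space depth of its endpoints.
float EdgeScreenLength(const Vec3& a, const Vec3& b,
                       float viewDepthA, float viewDepthB,
                       float viewportHeightPx, float cotHalfFovY) {
    float depth = 0.5f * (viewDepthA + viewDepthB);
    return Length(a, b) * cotHalfFovY / depth * (viewportHeightPx * 0.5f);
}

float EdgeTessFactor(float screenLenPx, float pxPerTriEdge = 8.0f) {
    return std::clamp(screenLenPx / pxPerTriEdge, 1.0f, 64.0f);
}
```

Sizing triangles to the screen like this is presumably where the 10-20% guessed at above would come from: far-away geometry collapses to a handful of triangles instead of thousands.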
 
Unigine is a benchmark to compare different hardware configurations, not different techniques...
 