Nvidia may well be planning to deliver early cards for such non-gaming customers.
What, play games on it? Who, me? No, never.
Can I have my Fermi card now, Nvidia?
I haven't been following this thread much over the last few weeks; is the card out yet?
digisan frowns and goes back in his closet wearing a golden sequin jacket and carrying his bucket of glitter.
Get back in your closet; we'll call you to have the glitter ready when the time is right.
http://www.electronista.com/articles/09/10/30/nvidia.fermi.details.escape/
1. Whoa.
2. On second thought, they're probably using things like measured BRDF data, which is pretty far from viable in any computer game (see the sketch below). Not to mention most of the data is usually static.
www.debevec.org
But it sure looks pretty cool.
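For anyone unfamiliar with the term: a BRDF just describes how much light a surface reflects from a given incoming direction toward a given outgoing direction. Below is a minimal, illustrative Python sketch of an analytic (Blinn-Phong style) BRDF evaluation; measured BRDFs like the ones Debevec's group captures replace this closed-form function with large lookup tables indexed by direction pairs, which is part of why they're impractical in games. All names and numbers here are made up for illustration, not taken from the demo.

```python
# Minimal sketch: evaluating an analytic BRDF (Blinn-Phong) at a surface point.
# Measured BRDFs swap this closed-form function for a big per-material lookup
# table, which is far heavier for real-time use. Illustrative only.

import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_brdf(wi, wo, normal, kd=0.7, ks=0.3, shininess=64.0):
    """Return the BRDF value f(wi, wo) for a simple diffuse + specular model.

    wi: unit vector toward the light, wo: unit vector toward the viewer.
    """
    half = normalize(tuple(a + b for a, b in zip(wi, wo)))
    diffuse = kd / math.pi
    specular = ks * (shininess + 8.0) / (8.0 * math.pi) * max(dot(normal, half), 0.0) ** shininess
    return diffuse + specular

if __name__ == "__main__":
    n = (0.0, 0.0, 1.0)
    wi = normalize((0.3, 0.0, 1.0))   # direction toward the light
    wo = normalize((-0.3, 0.0, 1.0))  # direction toward the viewer
    f = blinn_phong_brdf(wi, wo, n)
    # Reflected radiance contribution is f(wi, wo) * cos(theta_i) * incoming radiance.
    print("BRDF value:", f, "cosine-weighted:", f * max(dot(n, wi), 0.0))
```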
Are those the same renders they showed back at GTC, which were actually rendered by SLI'd GTX 285s? (Not the faces; those come from a different video and could be done on your mother's GTS 250.)
And about their mobile parts, I hope "electronista" knows that the 3XXM parts are based on GT21x.
It seems to me that in the past few years Nvidia has been an enabler more than anything else, proprietary standards aside.
Heh, fooled me.
But as I've mentioned, it shouldn't be impossible to do something like that; it actually looks quite similar to the Alfred Molina CG head from Spider-Man 2 that was demoed on the PS3 a few years ago.
Yeah, I got it. What I was trying to say is that even if it was done in real time, it would in no way be representative of what's actually realistic in a game project (except maybe the Fight Night series).
(Ironically, I was one of the few who tried to convince people that the KZ2 movie was CG.)
Of course. I mean, even that Unigine engine benchmark takes a 40% drop in performance with tessellation enabled, while being devoid of AI, game characters, etc.
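Rough intuition for why tessellation costs so much: with a uniform tessellation level L, each patch gets subdivided into on the order of L*L triangles, so the primitive count and the per-vertex/domain work grow roughly quadratically. The sketch below is a back-of-the-envelope illustration of that scaling; the patch counts and levels are hypothetical, not Unigine's actual figures.

```python
# Back-of-the-envelope sketch: triangle count growth under uniform tessellation.
# A patch tessellated at level L produces roughly L*L small triangles, so the
# GPU's primitive workload scales about quadratically with the level.
# All numbers below are illustrative assumptions.

def tessellated_triangles(base_patches, level):
    """Approximate triangle count after uniform tessellation at 'level'."""
    return base_patches * level * level

if __name__ == "__main__":
    patches = 50_000  # hypothetical scene patch count
    for level in (1, 4, 8, 16):
        tris = tessellated_triangles(patches, level)
        print(f"level {level:2d}: ~{tris:,} triangles")
```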