Accurate human rendering in game [2014-2016]

From the Hellblade documentary:

[GIF: the screen behind the actress is real-time, in-game rendering]
 
Oh wow, absolutely amazing work, Ninja Theory! The facial expression is possibly the best I've seen thus far, not to mention the incredible lighting and fire effects. I got that instant CGI feel right off the bat, all thanks to the combination of convincing facial animation, lighting and effects. Though how closely the gameplay graphics will resemble the teaser is still up in the air; with such a small team I have my doubts. But if they keep the scope tight and concentrate on just the characters and small-to-medium environments, it may be possible.
 
The facial animation looks weird imo. The upper lip doesn't feel right.

Might also be the actress... she looks weird in this gif too and seems to 'over-act' imo.
 
Well, she's not an actress; I think she's a Ninja Theory employee.

https://twitter.com/juergensmelina

She's their video editor, apparently. But even if the animation looks a bit weird and needs some tweaking, it's a huge step in the right direction.
 
She did a live performance of Senua at GDC 2016 [tracking > full engine rendering]. Hopefully we will get footage of that.
 
Dude what? You talk about it, and 10 minutes later Epic uploads this!? :LOL:

They are also using anisotropic reflections and light transmission in their hair shader for Paragon.
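
For anyone wondering what "anisotropic" means for hair: the specular highlight is computed against the strand tangent rather than a surface normal, so it stretches along the hair. Here's a minimal sketch of the classic Kajiya-Kay term plus a crude wrapped-diffuse stand-in for transmission, in Python - illustrative only, the names are made up and Epic's actual Paragon shader is far more involved:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def kajiya_kay_specular(tangent, view_dir, light_dir, exponent=80.0):
    """Kajiya-Kay anisotropic specular: the highlight depends on the
    strand tangent T instead of a surface normal."""
    h = normalize(view_dir + light_dir)            # half vector
    t_dot_h = np.clip(np.dot(tangent, h), -1.0, 1.0)
    sin_th = np.sqrt(1.0 - t_dot_h * t_dot_h)      # sine of angle between T and H
    return sin_th ** exponent

def wrap_transmission(normal, light_dir, wrap=0.5):
    """Crude stand-in for light transmission: wrapped diffuse lets light
    'bleed' through thin strands lit from behind."""
    n_dot_l = np.dot(normal, light_dir)
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# Toy usage with made-up unit vectors.
T = normalize(np.array([1.0, 0.2, 0.0]))   # strand tangent
N = normalize(np.array([0.0, 1.0, 0.0]))   # approximate shading normal
V = normalize(np.array([0.0, 0.5, 1.0]))   # direction to viewer
L = normalize(np.array([0.3, 0.8, -0.5]))  # direction to light
print(kajiya_kay_specular(T, V, L), wrap_transmission(N, L))
```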
 
The most impressive part is at 9:25, where she's laughing and thanking the audience - the in-game character is actually believable! You can still see that it needs some tweaking, but this could save a LOT of time.
 
Oookay, so the accuracy of the facial solver in the live demo wasn't as impressive. I wonder why.
Also, the constant movement of both the camera and the character was too much and sickening. Stay still, damnit :)

Still, the technology is amazing. I wonder how far they can go in reducing the lag - interaction in VR would be quite disturbing at this level.
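
For context on what that "solver" is doing each frame: live facial capture is commonly modelled as fitting blendshape weights so the rig best reproduces the tracked landmark positions. Here's a toy sketch of that idea in Python - purely illustrative, with invented names and a simple linear least-squares fit; the actual Hellblade pipeline is far more sophisticated:

```python
import numpy as np
from scipy.optimize import nnls

def solve_blendshape_weights(tracked, neutral, blendshapes):
    """Fit non-negative weights w so that
    neutral + sum_i w[i] * blendshapes[i] ~= tracked landmarks.
    tracked/neutral: (L, 3) landmark positions; blendshapes: (S, L, 3) deltas."""
    num_shapes = blendshapes.shape[0]
    # Flatten the landmark deltas into one linear system A @ w = b.
    A = blendshapes.reshape(num_shapes, -1).T   # (3L, S)
    b = (tracked - neutral).ravel()             # (3L,)
    weights, _residual = nnls(A, b)             # non-negative least squares
    return np.clip(weights, 0.0, 1.0)           # rig weights usually live in [0, 1]

# Toy usage: 4 landmarks, 2 blendshapes, known ground-truth weights.
rng = np.random.default_rng(0)
neutral = rng.standard_normal((4, 3))
shapes = rng.standard_normal((2, 4, 3))
tracked = neutral + np.tensordot([0.3, 0.7], shapes, axes=1)
print(solve_blendshape_weights(tracked, neutral, shapes))  # ~[0.3, 0.7]
```

Running a fit like this every frame, on top of tracking and rendering, is exactly where the lag comes from, and why VR's much tighter latency budget would make it so noticeable.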
 
From the presentation on the Twitch stream; sorry for the watermark, you have to blame Nvidia for that...
[12 screenshots of the presentation slides]
 
Yeah, so their live capture and 'proper' capture aren't the same; there are obviously a lot of differences, probably in quality as well. That means the realtime stuff is more about previz, and they'll have to re-do the performance for the actual content. Still impressive though :)
 
That's not what he said.
I was in a rush to get to work, so I skimmed through the page and missed all the context :). The teaser obviously had proper tweaks and whatnot, so if that's what will be reflected in the game, then happy days. But yeah, the tech could be a really nice previsualization tool that very closely resembles the artist's vision.
 
Care to elaborate?

I think what ultragpu means is that with this technique it's very easy to set up lighting and cameras in-game and get a very-close-to-final result in minutes, even if it needs some tweaking afterwards.
 