Accurate human rendering in game [2014-2016]

Status
Not open for further replies.
So they do all this extra work and such to get a more realistic body... and keep the fake boobs. Right.

I see nothing wrong with the breasts on this woman. I guess unless they are A cups, firm breasts are now considered a bad thing in this weird age where men pretend to appreciate the looks of potato-shaped landwhales on Tumblr.
 
I like firm breasts, but when they are natural, they have that beautiful, slightly conical shape, with a bit of downward curvature on the top half. I mean, a tit shape.
Those from the vid-doc look like perfect half-spheres. They look like plastic surgery. Nothing attractive about them, and neither do they convey female-feral-warrior. It just looks weird.
 

It's what they are starting to look like when you have well-trained pectoral muscles and little to no body fat, though. Maybe a little bit smaller, but once the fat is pretty much gone, all you have left is basically the very much circular-shaped mammary glands sitting directly on top of the pectoral muscles. Heck, they scanned the body of an athlete here, and since they're going to cover her up entirely anyway, I doubt they're deliberately going for a fake look.
 
Yes, I've gotta agree those breasts look natural. Believe it or not, people in the real world have breasts that look like that (even larger) naturally.
 
Don't know, but these doll-shader-coated DOA babes just don't appeal to me any more; I much prefer a more realistically textured and modeled character these days :). Say, Izzy from The Order, who's nowhere near as perfectly curved in comparison, yet the more down-to-earth proportions and gritty texture catch my attention so much more. Yeah, I'm definitely getting old.
 

Very good, but there's a little uncanny-valley effect in the facial animation.
 
Well they're talking about some interesting stuff, even if only superficially - but they sure as hell are not talking about any pipeline in detail :)
Then again I'd agree with anyone in that defining what a pipeline in 3D production is can be difficult, and certainly not easy to describe to a layman... Still, let me try to come up with an explanation...

So, basically, a production pipeline is both the various stages of operations performed and the data flow between those stages.

For a character's creation, it could look like this:

Code:
concept design -----------------------------.              .-- texturing -- shading --.
                                            +-- modeling --+                          +-- asset compiling and export
real-world data acquisition and processing -'              '-- rigging ---------------'

I hope the formatting works...

So, in operations, you could do concepts and scanning in parallel, but then the two should be both completed for modeling; then you could take the final mesh and do the textures and shaders in parallel with rigging.
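That ordering logic can be expressed as a dependency graph. Here's a minimal sketch using Python's stdlib scheduler to work out which stages can run in parallel; the stage names mirror the diagram above, but the graph itself is just illustrative:

```python
# Compute parallel "waves" of pipeline stages from their dependencies,
# using the standard-library topological sorter (Python 3.9+).
from graphlib import TopologicalSorter

# stage -> set of stages it depends on (illustrative, not from any real tool)
pipeline = {
    "concept design": set(),
    "scanning": set(),                    # real-world data acquisition
    "modeling": {"concept design", "scanning"},
    "rigging": {"modeling"},
    "texturing": {"modeling"},
    "shading": {"texturing"},
    "export": {"rigging", "shading"},     # asset compiling and export
}

ts = TopologicalSorter(pipeline)
ts.prepare()
waves = []
while ts.is_active():
    ready = list(ts.get_ready())          # stages whose inputs are all done
    waves.append(sorted(ready))
    ts.done(*ready)

for i, wave in enumerate(waves, 1):
    print(f"wave {i}: {', '.join(wave)}")
```

With these dependencies, concepting and scanning land in the first wave, and rigging runs alongside texturing in the third, exactly the parallelism described above.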

In terms of data flow, all stages would create lots of different data; scans would give you lots of multi-million poly meshes and textures, design would create images; then modeling would give you a final mesh which rigging would use to build bones, create controls and rig mechanics while texture and shader data are created; then all of it would compile to an engine specific description of meshes, skin weights, bones, textures etc.

There are also a lot of smaller stages in the pipeline, concerned with how the data is actually handled and transferred between the operation stages. Things like how you export the scan data and in what format; how it's imported into modeling; or how the rigging department gets the mesh.
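As a toy illustration of those glue stages: one department writes its data out in an agreed interchange format, and the next validates and reads it back. The file name and schema here are made up; real pipelines use formats like Alembic, FBX, or USD:

```python
# A hypothetical mesh handoff between two pipeline stages via a JSON
# interchange file. Schema and names are illustrative only.
import json
import os
import tempfile

def export_mesh(path, vertices, faces):
    """Write a mesh in a simple (made-up) interchange schema."""
    with open(path, "w") as f:
        json.dump({"schema": "mesh/v1", "vertices": vertices, "faces": faces}, f)

def import_mesh(path):
    """Read the mesh back, validating the schema version first."""
    with open(path) as f:
        data = json.load(f)
    assert data["schema"] == "mesh/v1", "unexpected interchange format"
    return data["vertices"], data["faces"]

# hand a single triangle from "scanning" to "modeling"
tmp = os.path.join(tempfile.mkdtemp(), "scan_head.json")
export_mesh(tmp, [[0, 0, 0], [1, 0, 0], [0, 1, 0]], [[0, 1, 2]])
verts, faces = import_mesh(tmp)
print(len(verts), "vertices,", len(faces), "faces")
```

The schema check is the important bit: the importer refuses data it doesn't understand instead of silently producing garbage downstream.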

Then the more advanced stuff comes in :)

A lot of this stuff can be automated by the scripting languages in most content creation tools; and studios can also write custom code to even modify this data or generate new data from the inputs. For example, if you have like 50-100 scanned expressions, the output from that is a lot of high-res meshes with auto UVs for their textures - you can write a tool to take the final model of the character head and do a first pass fit to those expression scans and calculate the final textures for its UVs from the source data. These tools and their application are also considered to be a part of the pipeline.
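A very crude stand-in for that expression-scan tool: for each scan (treated as a bare point cloud), snap every vertex of the base head to its nearest scan point, producing one fitted mesh per expression. The fitting method and all names are illustrative; real tools use far better solvers (non-rigid ICP, wrap deformers, etc.):

```python
# Batch first-pass fit of a base mesh to many expression scans.
# Purely illustrative: brute-force nearest-point snapping.
import math

def nearest(point, cloud):
    """Return the scan point closest to the given vertex."""
    return min(cloud, key=lambda p: math.dist(point, p))

def fit_base_to_scan(base_vertices, scan_cloud):
    """First-pass fit: move each base vertex onto its closest scan point."""
    return [nearest(v, scan_cloud) for v in base_vertices]

# a tiny "base head" and two "scanned expressions"
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scans = {
    "smile": [(0.1, 0.0, 0.0), (1.1, 0.2, 0.0)],
    "frown": [(-0.1, 0.0, 0.0), (0.9, -0.2, 0.0)],
}
fitted = {name: fit_base_to_scan(base, cloud) for name, cloud in scans.items()}
```

The point is the batch structure, not the math: once the fit is a function, running it over 50-100 scans is one dictionary comprehension, which is exactly the kind of automation the post describes.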

Then, one of the most common problems with pipelines is their inherently linear nature, creating dependencies, delays in iterations, and so on. So many studios are driven to introduce even more parallelism, usually adding more stages and more data but gaining flexibility and sometimes also saving lots of work.
For example, we've seen a lot of automated tools released in the past few years - automatic retopology tools building rough models from scan data (or rough ZBrush sculpts), automatic UV mapping, transferring skeleton skin weights, even generating texture maps and shaders. The most common use of these tools nowadays is to really quickly create prototype assets, in order to speed up iterations for look development; then a lot of the intermediate data can also be re-used for the eventual asset. But more and more often the automatic tools can actually create final assets as well.
But in terms of the pipeline in general, this means that you can hand off an automatically created crude model to rigging well before the final model is completed, so they can actually start working in parallel with modeling. Or, you can give the shading guys rough textures to start with, well before the texture painters complete their work.
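One simple way to sketch that proxy-then-final handoff: downstream departments ask a resolver for "the best available" version of an asset, and transparently pick up the final when it lands. The registry, tags, and file names here are entirely hypothetical:

```python
# Resolve the best currently-published version of an asset.
# Tags and file names are made up for illustration.
PRIORITY = ["final", "auto_retopo", "blockout"]   # best first

def resolve(asset_versions):
    """Return (tag, path) for the best version that exists right now."""
    for tag in PRIORITY:
        if tag in asset_versions:
            return tag, asset_versions[tag]
    raise KeyError("no version published yet")

hero_head = {
    "blockout": "hero_head_v001.obj",
    "auto_retopo": "hero_head_auto_v003.obj",
}

print(resolve(hero_head))   # rigging can start on the auto-retopo mesh
hero_head["final"] = "hero_head_final_v012.obj"
print(resolve(hero_head))   # later runs pick up the final mesh instead
```

Because consumers never hard-code a file path, rigging and shading can start against the crude version and re-run unchanged once the final asset is published.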

And going further on, you can also deliver preliminary data to other departments, in order to get them started in parallel with the asset creation as well. Animators can get a rough character rig to start building motion clips; cinematics directors can also start with a rough character, and so on.

Oh and of course all departments can have their own individual pipelines - not just asset creation, but animation, or in games, level design, cinematics, and so on.

Finally, I'm not sure if there is an "overall pipeline" of sorts in games; but there sure is one in creating animated movies.
It goes like, script - storyboard - animatics - asset creation - scene building and layout - animation - effects - lighting - rendering - compositing - editing... At least in general. Then it also gets more complicated, with parallelism, iterations and so on... Oh and you also have to include stuff like concept design, look development, respond to feedback and such.
Or maybe you're creating VFX for live action movies, which can add some extra stages like location surveys, previz and post-viz, camera tracking, rotoscoping...

Then you end up a bit sleepless, mentally exhausted, get some grey hairs and/or hair loss, and all other kinds of fun stuff.
And still, at the end of the day, you wouldn't want to do anything else :)
 
One reason there is constant iteration is the computational constraints on the final asset, which are not known in advance: the context the asset plays in is either still on the storyboard or varies to a large degree, and you have to optimize it for the "hardest" shot. For movies you can just do whatever you need, to some degree.
 
Doubt it.

I like my PS4, but this is much better than the Until Dawn characters. The only thing I've seen not so far from this is UC4, but the UC4 characters are more stylized. It is difficult to compare to the Star Citizen characters: it is a true PC game with a very big budget and a great art team.
 
They said that blend shapes and bones together are just approaching the 1000 mark in controllers; they don't differentiate between blend shapes and bones. They have also got texture-driven performance: 44 areas on the face that can wrinkle, diffuse, compress skin, deform the map, etc. The characters will look nearly like this (which is still WIP and will look better in the end) during gameplay. Characters in gameplay often look significantly worse than in pre-canned realtime cutscenes with close shots and little or no player or camera control.

In addition, Squadron 42 will be an open-world sandbox game with many characters and kilometer-long space stations where each room should be accessible. They also have about 30 characters for which they aim to reach that level of quality. When making a singleplayer-only game, you can do a lot that you can't do when you are making an MMO. Everything in Star Citizen (like characters and ships) has to be an item.

There will be more "deep" dives into characters.
 