Yes I did, sorry
Well, it's not that clear IMHO.
Some of our clients asked us for our assets years ago, for their nextgen R&D teams, so that they could study them. Funnily enough, this is why a new game's lead character has my hands - as in, it's the geometry that I modeled, and it's also based on my own hands.
On the other hand, 343 has built a workflow and tools for their facial animation in Halo 4 that are far more complex than ours. They've programmed their own performance capture solver for the face, they've scanned real actors for every character, and the pipeline can easily be scaled up for the Xbox One; Halo 4 is effectively a lowres version of it.
As for Quantic, this demo is clearly using an evolution of their existing pipeline. They've invested a lot of R&D into the software side and a lot of money into their own Vicon mocap system, and now they're trying to scale it up for the PS4. Unfortunately I believe it's a dead end: using straight translation data to drive bones is not enough to capture the subtleties of how the face deforms.
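To illustrate what I mean (a toy sketch in numpy, with made-up numbers - this is not anyone's actual pipeline): a bone driven by a raw marker translation moves the whole skin patch rigidly, so it can match the marker's position but not a non-uniform stretch of the surrounding skin, while sculpted per-vertex blendshape deltas can.

```python
import numpy as np

# Neutral pose: three vertices of a small skin patch around a mocap marker.
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [2.0, 0.0, 0.0]])

# What a scan of the actor's expression shows: the skin stretches
# non-rigidly (the middle vertex moves more than the corners).
expression_scan = neutral + np.array([[0.0, 0.1, 0.0],
                                      [0.0, 0.4, 0.0],
                                      [0.0, 0.1, 0.0]])

# Approach 1: drive a bone with the raw marker translation.
# Every vertex rigidly follows the marker, so the offset is uniform.
marker_delta = np.array([0.0, 0.4, 0.0])   # what the marker measured
bone_result = neutral + marker_delta

# Approach 2: a sculpted blendshape (a delta per vertex) weighted by a
# solved activation reproduces the non-rigid stretch.
shape_deltas = expression_scan - neutral   # per-vertex deltas from the scan
weight = 1.0                               # hypothetical solver output
blend_result = neutral + weight * shape_deltas

print(np.abs(bone_result - expression_scan).max())   # rigid bone misses the corners
print(np.abs(blend_result - expression_scan).max())  # blendshape matches exactly
```

The rigid bone leaves a residual error at the corner vertices no matter how you place it, which is exactly the kind of subtlety that gets lost.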
Actually I can't, as I would not be content with stuff that looks like it can be done in realtime
But I think you're wrong here, because our main characters look vastly superior to what the engine can do even on the PS4. Aiden, though, is very closely based on his ingame version, so the differences are somewhat subtle. Have you noticed the tiny hairs on his sweater, for example? Or that the knitted material is actually displaced, instead of just using a normal map? Or that he has actual hair, facial hair, and eyelashes instead of textured polygons? These are small but important differences that still go a long way. Not to mention the subtleties in the raytraced bounce lighting and area shadows, or that all the clothing is properly simulated, at a much higher fidelity than what the engine can do. And then there are the polygon counts, but I'd rather check the actual data at the office tomorrow.