This is not strictly related to game development, but I believe it is still interesting for various reasons - here's a behind-the-scenes look at our work on the Mass Effect 3 cinematic trailer.
http://www.fxguide.com/featured/cinematics-case-study-mass-effect-3/
Here are some of my thoughts about this stuff and how it relates to realtime graphics:
- There's a significant amount of simulation behind most of the effects work, and it's all processing- and storage-intensive. Note the disk cache sizes for the volumetric fluid (fire/smoke) work in particular. This kind of work on Wrath of the Titans required more than a hundred terabytes (!) of disk space, so we're only scratching the surface here. While there's R&D into utilizing GPUs for the calculations, it's still not good enough most of the time... and even then, caching is still needed to cut down iteration times when refining the look.
Games are still using 2D sprites for these effects, and the jump to 3D volumetrics, even at a lower quality level, looks pretty damn expensive (there's a small sketch of why after this item).
There's also cloth and hair, not discussed here because it's considered to be more or less a standard feature - games are making progress, but unlike film work they never get the chance to cheat and fix problems like intersections frame by frame, only for what the camera sees.
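To give a rough feel for the cost gap, here's a minimal sketch (my own illustration, not anything from the article): ray marching a 3D density field takes dozens of samples per pixel where a 2D sprite is a single texture fetch, and caching a dense simulation grid to disk adds up quickly. The grid resolution, step counts and density field below are made-up numbers.

```python
# Minimal sketch: why 3D volumetrics cost so much more than 2D sprites.
# The density field and all numbers are made up, for illustration only.
import math

def density(x, y, z):
    # Stand-in for a simulated smoke field: a soft spherical puff.
    d = math.sqrt(x * x + y * y + z * z)
    return max(0.0, 1.0 - d)

def march_pixel(ox, oy, oz, dx, dy, dz, steps=64, step_len=0.05):
    # Accumulate opacity along the view ray: 'steps' density samples
    # per pixel, versus a single texture fetch for a 2D sprite.
    transmittance = 1.0
    for i in range(steps):
        t = i * step_len
        sigma = density(ox + dx * t, oy + dy * t, oz + dz * t)
        transmittance *= math.exp(-sigma * step_len)
    return 1.0 - transmittance  # accumulated opacity for this pixel

# One pixel's ray, starting in front of the puff and marching through it:
print("opacity:", round(march_pixel(0.0, 0.0, -2.0, 0.0, 0.0, 1.0), 3))

# Rough cache-size arithmetic for a dense simulation grid on disk
# (resolution, channel count and precision are made-up examples):
res, channels, bytes_per = 512, 4, 4   # e.g. density + velocity, float32
frame_bytes = res ** 3 * channels * bytes_per
print("one cached frame:", round(frame_bytes / 2 ** 30, 1), "GiB")
print("10 s shot at 24 fps:", round(frame_bytes * 240 / 2 ** 40, 2), "TiB")
```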
- Matte paintings of all kinds are a very cost-effective cheat to create grand vistas, even with real camera movement. Simple 2D images are just the start: as you can see, it's also possible to project them back onto 3D geometry and cards to create a sense of depth that even works in stereo (there's a small projection sketch after this item). Interactive cameras limit how much games can use this technique, which significantly increases production costs. Don't be surprised to see newer games taking away camera control more and more often...
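Here's a tiny sketch of the projection idea behind these 2.5D matte paintings - again my own illustration of the general technique, with placeholder camera numbers: the matte is painted from a fixed camera, and that camera's view-projection transform is reused to generate UVs on simple stand-in geometry (cards, rough terrain), so nearby camera moves pick up correct parallax.

```python
# Minimal sketch of camera projection mapping ("2.5D" matte painting).
# Camera pose, FOV and the sample point are placeholders for illustration.
import math

def look_at_z(distance):
    # View matrix for a painting camera at (0, 0, distance) looking down -Z.
    return [[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, -distance],
            [0, 0, 0, 1]]

def perspective(fov_deg, aspect, near, far):
    f = 1.0 / math.tan(math.radians(fov_deg) * 0.5)
    return [[f / aspect, 0, 0, 0],
            [0, f, 0, 0],
            [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
            [0, 0, -1, 0]]

def matmul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
            for r in range(4)]

def mul_point(m, p):
    x, y, z = p
    v = [x, y, z, 1.0]
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project_to_uv(view_proj, world_point):
    # Clip space -> NDC -> [0, 1] UVs into the painted image.
    x, y, z, w = mul_point(view_proj, world_point)
    return (x / w * 0.5 + 0.5, y / w * 0.5 + 0.5)

view_proj = matmul(perspective(45.0, 16 / 9, 0.1, 1000.0), look_at_z(10.0))
# A vertex on a background card gets its matte-painting UVs like so:
print(project_to_uv(view_proj, (2.0, 1.0, -30.0)))
```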
- The HDR lighting, reference image shooting and tone mapping workflow, however, is something that practically any game can make use of, particularly on the next generation of hardware, where I expect frame buffer depths, gamma curves and other hardware features to be far better optimized for real-world requirements (a small tone mapping sketch follows this item).
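As a taste of the tone mapping side, here's a minimal sketch of taking an HDR scene value down to display range - the operator below is the classic Reinhard curve plus a gamma encode, just one common choice and not necessarily what was used on the trailer; the exposure value is a made-up parameter.

```python
# Minimal sketch: exposing and tone mapping an HDR value for display.
# Reinhard + gamma is one common operator, shown purely for illustration.

def tonemap(hdr_value, exposure=1.0, gamma=2.2):
    # Scale by exposure, compress with Reinhard's x / (1 + x),
    # then gamma-encode the [0, 1] result for an 8-bit display.
    exposed = hdr_value * exposure
    compressed = exposed / (1.0 + exposed)
    return compressed ** (1.0 / gamma)

# Bright HDR inputs compress smoothly instead of clipping to white:
for scene_luminance in (0.05, 0.5, 2.0, 16.0):
    print(scene_luminance, round(tonemap(scene_luminance, exposure=1.5), 3))
```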
- It's also worth mentioning that there's enough convergence on the content creation side to allow us to make use of game development related materials of all kinds in our work, and I'm sure the movie VFX guys should keep their eyes open too.
We've already settled on the Right Way to do things in various aspects of CG, and as games adopt these, the limitations of realtime execution and hardware features force engineers to be more creative and effective in their implementations, thinking of things an offline CG programmer would never bother to look into.
So, I hope the article was interesting.