Well, they're talking about some interesting stuff, even if only superficially - but they sure as hell are not talking about any pipeline in detail.
Then again, I'd agree that defining what a pipeline in 3D production actually is can be difficult, and it's certainly not easy to describe to a layman... Still, let me try to come up with an explanation...
So, basically, a production pipeline is both the various stages of operations performed and the data flow between those stages.
For a character's creation, it could look like this:
Code:
concept design ──────────────────────────────┐              ┌──> texturing ──> shading ──┐
                                             ├──> modeling ─┤                            ├──> asset compiling and export
real-world data acquisition and processing ──┘              └──> rigging ────────────────┘
I hope the formatting works...
So, in terms of operations, you could do concepts and scanning in parallel, but both need to be completed before modeling; then you can take the final mesh and do the textures and shaders in parallel with rigging.
In terms of data flow, all the stages create lots of different data: scanning gives you lots of multi-million-poly meshes and textures, concept design produces images; modeling then delivers a final mesh, which rigging uses to build bones, create controls and rig mechanics while the texture and shader data are created; finally all of it gets compiled into an engine-specific description of meshes, skin weights, bones, textures etc.
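If it helps, here's roughly how you could describe those dependencies to a script - just a toy sketch in Python, with the stage names taken from the diagram above; a real pipeline tool would obviously track actual files and versions rather than bare strings:
Code:
# Toy sketch: the character pipeline above expressed as a dependency graph.
from graphlib import TopologicalSorter  # Python 3.9+

stages = {
    "concept design": set(),
    "scan acquisition and processing": set(),
    "modeling": {"concept design", "scan acquisition and processing"},
    "texturing": {"modeling"},
    "shading": {"texturing"},
    "rigging": {"modeling"},
    "asset compiling and export": {"shading", "rigging"},
}

ts = TopologicalSorter(stages)
ts.prepare()
while ts.is_active():
    ready = ts.get_ready()            # stages whose inputs are all done
    print("can run in parallel:", ", ".join(ready))
    for stage in ready:
        ts.done(stage)                # pretend the work got finished
Running it prints the same order as the diagram: concepts and scans first, then modeling, then texturing and rigging together, and so on.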
There are also a lot of smaller stages in the pipeline, concerned with how the data is actually handled and transferred between the operation stages - things like how you export the scan data and in what format, how it's imported into modeling, or how rigging gets the mesh.
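To give a trivial, completely made-up example of one of those smaller stages: imagine each stage writes a little manifest next to whatever it exports, so the next stage knows exactly what it's picking up. The paths and fields here are just illustrative - every studio rolls its own version of this.
Code:
# Hypothetical "handoff" step: publish a mesh with a sidecar manifest
# so the next stage knows what it is importing.
import json, hashlib
from pathlib import Path

def publish(mesh_path, stage, notes=""):
    mesh = Path(mesh_path)
    manifest = {
        "file": mesh.name,
        "stage": stage,                                    # e.g. "scan_processing"
        "sha1": hashlib.sha1(mesh.read_bytes()).hexdigest(),
        "notes": notes,
    }
    sidecar = mesh.with_name(mesh.name + ".manifest.json")
    sidecar.write_text(json.dumps(manifest, indent=2))
    return sidecar

# publish("head_scan_decimated.obj", "scan_processing", "decimated to 500k tris")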
Then the more advanced stuff comes in
A lot of this can be automated through the scripting languages in most content creation tools, and studios can also write custom code to modify this data or generate new data from the inputs. For example, if you have, say, 50-100 scanned expressions, the output is a lot of high-res meshes with auto-generated UVs for their textures - you can write a tool that takes the final model of the character's head, does a first pass fit to those expression scans, and calculates the final textures for its own UVs from the source data. These tools and their application are also considered part of the pipeline.
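Just to make the "first pass fit" idea a bit more concrete: the crudest possible version is to snap every vertex of the final head mesh onto the nearest point of an expression scan. Real tools do a proper non-rigid registration pass instead, so take this purely as a sketch with made-up data:
Code:
# Crude sketch of a first pass fit: nearest-point snap of the final head
# mesh onto a high-res expression scan.
import numpy as np
from scipy.spatial import cKDTree

def first_pass_fit(base_verts, scan_verts):
    # base_verts: (N, 3) final head mesh, scan_verts: (M, 3) expression scan
    tree = cKDTree(scan_verts)
    _, nearest = tree.query(base_verts)   # closest scan vertex for each base vertex
    return scan_verts[nearest]            # fitted positions, same vertex order as base

# Fake data just to show the call; real meshes would come out of the DCC app.
base = np.random.rand(1000, 3)
scan = np.random.rand(50000, 3)
fitted = first_pass_fit(base, scan)       # (1000, 3)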
Then, one of the most common problems with pipelines is their inherently linear nature, creating dependencies, delays in iterations, and so on. So, many studios are driven to introduce even more parallelism, usually adding more stages and more data, but gaining flexibility and sometimes also saving a lot of work.
For example, we've seen a lot of automated tools released in the past few years - automatic retopology tools building rough models from scan data (or rough ZBrush sculpts), automatic UV mapping, transferring skeleton skin weights, even generating texture maps and shaders. The most common use of these tools nowadays is to very quickly create prototype assets in order to speed up iterations for look development; a lot of the intermediate data can then also be re-used for the eventual asset. But more and more often the automatic tools can actually create final assets as well.
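The skin weight transfer bit is easy to illustrate, too: the dumbest version just copies each final-mesh vertex's weights from the nearest vertex of the already-skinned proxy. Production tools interpolate over the surface and clean up afterwards, but as a sketch:
Code:
# Simplest possible skin weight transfer: nearest-vertex copy from a skinned
# proxy mesh to the final mesh. Brute-force distances are fine for a sketch;
# use a KD-tree for real mesh sizes.
import numpy as np

def transfer_weights(proxy_verts, proxy_weights, final_verts):
    # proxy_verts: (P, 3), proxy_weights: (P, num_joints), final_verts: (F, 3)
    dists = np.linalg.norm(final_verts[:, None, :] - proxy_verts[None, :, :], axis=-1)
    nearest = np.argmin(dists, axis=1)       # closest proxy vertex for each final vertex
    return proxy_weights[nearest]            # (F, num_joints), rows still sum to 1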
But in terms of the pipeline in general, this means that you can hand off an automatically created crude model to rigging well before the final model is completed, so they can actually start working in parallel with modeling. Or, you can give the shading guys rough textures to start with, well before the texture painters complete their work.
Going even further, you can also deliver preliminary data to other departments to get them started in parallel with asset creation as well. Animators can get a rough character rig to start building motion clips; cinematics directors can also start with a rough character, and so on.
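The way that handoff often works in practice is through versioned publishes: downstream departments reference an asset by name and just pick up the newest published version, whether that's the crude auto-generated proxy or the final approved model. Something like this, with a completely made-up folder layout:
Code:
# Hypothetical layout: <asset_root>/<asset>/v001, v002, ...
from pathlib import Path

def resolve_latest(asset_root, asset):
    versions = sorted(Path(asset_root, asset).glob("v[0-9][0-9][0-9]"))
    return versions[-1] if versions else None

# resolve_latest("/projects/show/assets", "hero_character")
# -> v001 might be the auto-retopo proxy, v004 the final model;
#    animation just re-resolves and gets whatever is newest.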
Oh and of course all departments can have their own individual pipelines - not just asset creation, but animation, or in games, level design, cinematics, and so on.
Finally, I'm not sure if there is an "overall pipeline" of sorts in games; but there sure is one in creating animated movies.
It goes something like: script - storyboard - animatics - asset creation - scene building and layout - animation - effects - lighting - rendering - compositing - editing... At least in general. Then it gets more complicated, with parallelism, iterations and so on... Oh, and you also have to include stuff like concept design, look development, responding to feedback and such.
Or maybe you're creating VFX for live action movies, which can add some extra stages like location surveys, previz and postviz, camera tracking, rotoscoping...
Then you end up a bit sleepless, mentally exhausted, get some grey hairs and/or hair loss, and all other kinds of fun stuff.
And still, at the end of the day, you wouldn't want to do anything else