Er, I don't think that's how the Substance pipeline works, even at ND.
For a start, the software is built around bitmaps, not procedurals: either hand-painted, extracted from the high-res source model (things like normals, cavity, AO, direction, edge detection, etc.), or based on photographs.
Edit: thinking about it a bit more, there may be some default fractal / Perlin noise functions in the package as well, but they're probably only used to add variation to the masks generated from the high-res model. And yeah, maybe you can call the cavity and edge maps procedural as well; after all, they're generated from the normals and maybe the height maps.
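Just to illustrate the idea (this is a rough sketch, not how Substance actually implements it): an "edge" mask can be derived from a baked height map by taking the gradient magnitude, and a bit of noise breaks it up so the wear doesn't look uniform. The function names and parameters here are made up for the example.

```python
import numpy as np

def edge_mask(height, strength=4.0):
    # Gradient magnitude of the height map: strong gradients ~ edges/creases.
    gy, gx = np.gradient(height.astype(np.float32))
    mask = np.sqrt(gx * gx + gy * gy) * strength
    return np.clip(mask, 0.0, 1.0)

def add_noise_variation(mask, amount=0.3, seed=0):
    # Modulate the mask with random noise (a stand-in for Perlin/fractal noise)
    # so the generated wear/dirt mask isn't perfectly even.
    rng = np.random.default_rng(seed)
    noise = rng.random(mask.shape).astype(np.float32)
    return np.clip(mask * (1.0 - amount + amount * noise), 0.0, 1.0)

# Toy height map: a raised square whose borders should light up in the mask.
h = np.zeros((64, 64), dtype=np.float32)
h[16:48, 16:48] = 1.0
m = add_noise_variation(edge_mask(h))
```

The point being: the mask is still just another bitmap baked from model data, with noise only adding variation on top.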
They might do some of the complex texture map combinations in real time, although it's probably still better (i.e. faster) to just bake everything out into bitmaps. But Substance is, AFAIK, a content creation tool and not runtime middleware.
By the way, there was a GDC presentation from last year about the Halo 2 Anniversary pipeline that already covered a lot of the possibilities and workflows in Substance, combining it with ZBrush, etc. Not to take any credit away from Naughty Dog - especially since they seem to have had a LOT of input on how the software was developed - but this approach to asset creation isn't such big news.
Also, I think Ready at Dawn had a pretty similar pipeline for The Order, but without using Substance, developing a completely new in-house tool instead.