I was thinking more of realtime content. AFAIK the method you mentioned pre-creates textures algorithmically on startup and then uses them exactly like normal textures. I thought SpeedTree did this by varying the trees procedurally as they are generated when they come into view.
My inspiration for considering this was being outside in the evening and seeing the tiny shadows cast by the detail on a brick wall. I got thinking about how you could procedurally texture a surface as you get closer. You'd have your large brick textures from afar, and as you approach the wall, detail is added algorithmically for realism. To keep it consistent you'd need a 'seed' texture or value, perhaps per vertex, so the procedural material doesn't vary between frames or when you zoom out and zoom back in again.

For things like random noise, procedural texturing should add considerably to realism, eliminating repeating textures, without costing much to implement — though it depends on how you get the texture info to the GPU. It may be better to do the maths in pixel shaders rather than on the CPU. It'd be great on things like walls, trees, dirt and such.
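To make the idea concrete, here's a minimal CPU-side sketch (all names and constants are my own invention; a real version would live in a pixel shader). The key points from above: the noise is seeded and purely a function of position, so it never shimmers between frames or changes when you zoom out and back in, and the detail is faded in by distance so you only pay for it up close.

```python
import math

def hash2(x: int, y: int, seed: int) -> float:
    """Deterministic pseudo-random value in [0, 1) for an integer
    lattice point. Same inputs always give the same output, which is
    what keeps the detail stable across frames and zoom levels."""
    h = (x * 374761393 + y * 668265263 + seed * 2654435761) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    h ^= h >> 16
    return h / 0x100000000

def value_noise(u: float, v: float, seed: int) -> float:
    """Bilinearly interpolated value noise at coords (u, v)."""
    x0, y0 = math.floor(u), math.floor(v)
    fx, fy = u - x0, v - y0
    # smoothstep fade so the interpolation has no visible grid creases
    sx = fx * fx * (3.0 - 2.0 * fx)
    sy = fy * fy * (3.0 - 2.0 * fy)
    a = hash2(x0,     y0,     seed)
    b = hash2(x0 + 1, y0,     seed)
    c = hash2(x0,     y0 + 1, seed)
    d = hash2(x0 + 1, y0 + 1, seed)
    top = a + (b - a) * sx
    bot = c + (d - c) * sx
    return top + (bot - top) * sy

def shade(base_colour: float, u: float, v: float, distance: float,
          seed: int, detail_scale: float = 64.0,
          near: float = 1.0, far: float = 10.0) -> float:
    """Add procedural detail to a base texture sample, faded by
    distance: full detail at `near`, none at or beyond `far`."""
    t = max(0.0, min(1.0, (far - distance) / (far - near)))
    # centre the noise on zero so it perturbs rather than brightens
    detail = value_noise(u * detail_scale, v * detail_scale, seed) - 0.5
    return base_colour + 0.2 * t * detail
```

From afar, `shade` returns the base texture unchanged; as the camera approaches, the seeded noise is blended in at a finer scale than the base texture, which is exactly the brick-wall effect — the coarse texture carries the bricks, the noise carries the surface grain.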