We could always figure out a runtime Substance Designer! Or hell, runtime Substance Designer plus shader graph stuff for 99% procedural art.
I've often thought about this, but is simple procedural Worley and Perlin noise still good enough? And layering that already has a cost, so you would want to precompute on asset load. But then why a fast SSD? Pure stream-and-display would no longer work.
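To make the layering cost concrete, here is a minimal sketch of fractal noise (fBm) built from a simple hash-based value noise; the function names and the hash constants are my own illustration, not from any particular engine. The point is the octave loop: every extra layer of detail is another full noise evaluation per pixel, so cost grows linearly with octave count.

```python
import math

def hash2(x, y):
    # Deterministic integer hash to [0, 1) -- a stand-in for a
    # proper gradient/permutation table (constants are arbitrary).
    n = x * 374761393 + y * 668265263
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 2**32

def smooth(t):
    # Smoothstep interpolant, as used in classic value/Perlin noise.
    return t * t * (3 - 2 * t)

def value_noise(x, y):
    # Bilinear blend of the four corner hashes around (x, y).
    xi, yi = math.floor(x), math.floor(y)
    u, v = smooth(x - xi), smooth(y - yi)
    top = hash2(xi, yi) * (1 - u) + hash2(xi + 1, yi) * u
    bot = hash2(xi, yi + 1) * (1 - u) + hash2(xi + 1, yi + 1) * u
    return top * (1 - v) + bot * v

def fbm(x, y, octaves=5):
    # Each octave doubles frequency and halves amplitude. Each
    # octave is one more noise evaluation per sample -- this loop
    # is exactly the per-pixel cost that layering adds.
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        norm += amp
        amp *= 0.5
        freq *= 2.0
    return total / norm  # normalized to [0, 1]
```

In a shader this loop runs per fragment every frame, which is why precomputing into textures on load (or offline) starts to look attractive once you stack enough octaves and blend layers.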
It's much more attractive to do it offline, so the results are really good, and just compress them as usual. I think even Substance Designer is not enough for the purpose anymore. I'm considering utilizing fluid simulation and curvature fields to guide material growth and placement. This could give good quality, but realtime is out of reach.
For artists, the workflow of browsing Quixel assets and building your stuff from those also seems quite restricted. You can't model every bit with ZBrush and add all those details. So a leap in procedural geometry tools is also necessary; mere placement of scanned building blocks is not enough either.
Now we could make those tools, but then we'd get too much unique geometry. Not enough instancing, not enough storage.
What's the solution? Using less detail, so we rely less on instancing and have more freedom to create more unique worlds?
Technically it would not work, because beyond some distance there is no difference between modeling at centimeter or millimeter level. Past that distance it all looks the same, and the memory requirements don't shrink just because we author less detail: if the LOD switch only happens at sub-pixel level, memory will always be full of tiny triangles.
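The distance argument is just perspective projection. Here's a small sketch (my own numbers for FOV and resolution, purely illustrative) that computes roughly how many pixels a feature of a given size covers at a given distance, using the fact that a frustum with horizontal FOV f spans 2·d·tan(f/2) world units at distance d:

```python
import math

def projected_pixels(feature_size_m, distance_m, fov_deg=60.0, screen_px=1920):
    # Width of the view frustum at this distance, in meters.
    frustum_width = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    # Fraction of the screen the feature covers, converted to pixels.
    return feature_size_m / frustum_width * screen_px

# A 1 cm feature vs a 1 mm feature, both at 50 m:
cm = projected_pixels(0.01, 50.0)   # already well under one pixel
mm = projected_pixels(0.001, 50.0)  # ten times smaller still
```

At 50 m both the centimeter and the millimeter feature land below one pixel, so the extra authored detail is invisible; yet without a finer-grained LOD scheme, the renderer still has to keep and rasterize those sub-pixel triangles.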
So that's the missing part: how do they handle LOD if sub-pixel detail is not possible, and discrete switches become visible above that threshold?
I'm curious about this, also for mobile and other low-power platforms.