But I mean right now. When we look at a port of a PC title to PS5 or Xbox, are we guaranteed they don't just load textures into memory the old way? And are we guaranteed they flush textures that aren't currently needed, since those can be loaded back in so quickly?
To be honest, it doesn't seem all that likely that this optimization is guaranteed for all titles on the next-gen consoles.
Not all games want or need texture streaming. Something like Twelve Minutes, for example, has no need to stream textures in. Not the greatest example, since its graphics aren't out of this world, but they could be if the developer had the budget for it. Heck, entire genres (fighting games, for instance) don't really need texture streaming.
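For illustration, here's a minimal sketch of the preload model a game like that can get away with: read everything the level needs once, keep it resident until the level ends. All type and function names here are hypothetical, not any real engine's API.

```cpp
#include <cstddef>
#include <string>
#include <unordered_map>
#include <vector>

// Purely illustrative; names are hypothetical, not a real engine API.
struct Texture { std::vector<std::byte> pixels; };

// Stand-in for a blocking read from storage.
Texture LoadTextureBlocking(const std::string& path) { return {}; }

// Preload model: every texture a level needs is read once at load time
// and stays resident until the level ends. Simple and predictable, but
// the level's whole working set must fit in memory at once.
class PreloadedLevel {
public:
    void Load(const std::vector<std::string>& texturePaths) {
        for (const std::string& path : texturePaths)
            textures_.emplace(path, LoadTextureBlocking(path));
    }
    const Texture& Get(const std::string& path) const {
        return textures_.at(path);
    }
private:
    std::unordered_map<std::string, Texture> textures_;
};
```

A fighting game stage plus two character models fits this model fine: nothing ever has to be swapped out mid-match.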
What will be interesting to see this generation is whether larger open-world games that rely on streaming (HZD/Spider-Man-type games) can approach the graphics quality of preloaded-level-type games (with smallish levels, like GoW 2016).
Assuming a studio has a large enough budget for the required asset creation, streaming games should have a large advantage with regard to asset variety within any given scene.
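To make that concrete, here's a hypothetical streaming residency cache: a fixed memory budget with least-recently-used eviction. The names are made up; the point is the structure, where fast I/O makes eviction cheap because anything flushed can simply be re-read a moment later.

```cpp
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>
#include <vector>

// Hypothetical asset and loader; names are for illustration only.
struct Asset { std::vector<std::byte> data; };

// Stand-in for a fast SSD read.
Asset ReadAssetFromSsd(const std::string& path) { return {}; }

class StreamingCache {
public:
    explicit StreamingCache(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Returns the asset, loading it on a miss and then evicting the
    // least-recently-used entries until we're back under budget.
    const Asset& Acquire(const std::string& path) {
        auto it = entries_.find(path);
        if (it != entries_.end()) {
            // Cache hit: move to the front of the recency list.
            lru_.splice(lru_.begin(), lru_, it->second.lruPos);
            return it->second.asset;
        }
        Asset loaded = ReadAssetFromSsd(path);
        used_ += loaded.data.size();
        lru_.push_front(path);
        Entry& entry = entries_[path];
        entry.asset = std::move(loaded);
        entry.lruPos = lru_.begin();
        EvictUntilUnderBudget();
        return entry.asset;
    }

private:
    struct Entry {
        Asset asset;
        std::list<std::string>::iterator lruPos;
    };

    void EvictUntilUnderBudget() {
        // Never evict the entry just touched (the front of the list).
        while (used_ > budget_ && lru_.size() > 1) {
            const std::string& victim = lru_.back();
            used_ -= entries_[victim].asset.data.size();
            entries_.erase(victim);
            lru_.pop_back();
        }
    }

    std::size_t budget_ = 0;
    std::size_t used_ = 0;
    std::list<std::string> lru_;                      // front = most recent
    std::unordered_map<std::string, Entry> entries_;  // keyed by asset path
};
```

The payoff is that a scene can reference far more unique assets than the budget holds at any one instant, as long as the drive can keep up with the churn.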
But here we hit the biggest limitation most development studios face: time and budget. To fully realize the potential of streaming with the new I/O, studios are, IMO, going to need a relatively large bump in time and budget compared to the previous generation in order to create all the assets for much more detailed and diverse worlds that can really show what the new I/O can do. Alternatively, perhaps some type of AI-driven automatic or semi-automatic asset creation could help (though, again, that is likely out of reach for all but the largest dev studios).
Although, counterintuitively, to fully realize the potential of streaming I/O, developers may have to adopt new techniques that do less work per pixel, so that memory and GPU time aren't dominated by per-pixel work and more assets can be flushed and loaded in per second. Variable rate shading is one example. I'm sure developers will come up with more techniques to reduce per-pixel work over the course of the generation.
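As one concrete point of reference, here's roughly what opting into variable rate shading looks like in Direct3D 12 at Tier 1, where a single coarse rate applies to subsequent draws. This is only a fragment: it assumes an already-initialized device and an open ID3D12GraphicsCommandList5, and the 2x2 rate is just an example value.

```cpp
#include <d3d12.h>

// Fragment only: assumes an initialized D3D12 device and an open
// ID3D12GraphicsCommandList5 (Windows 10 1903+ headers).
bool TryEnableCoarseShading(ID3D12Device* device,
                            ID3D12GraphicsCommandList5* cmdList)
{
    // Query whether the adapter supports variable rate shading at all.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))) ||
        options6.VariableShadingRateTier ==
            D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
    {
        return false;  // fall back to full-rate shading
    }

    // Shade once per 2x2 pixel quad on subsequent draws. PASSTHROUGH
    // combiners mean the per-primitive rate and the screen-space rate
    // image (Tier 2 features) are ignored; only the base rate applies.
    D3D12_SHADING_RATE_COMBINER combiners[D3D12_RS_SET_SHADING_RATE_COMBINER_COUNT] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    return true;
}
```

At 2x2, the pixel shader runs once per four pixels on those draws, which is exactly the kind of per-pixel saving described above.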
So, in short, some developers will be able to exploit the benefits of the increased I/O, but I'm not sure whether any but the largest development houses will be able to show significant improvements in graphics quality from it. Studios with a limited budget may be just as well off with preload systems as with streaming systems. Heck, some might be better off with preload systems, as they are inherently less complex than streaming.
Regards,
SB