Next-Generation NVMe SSD and I/O Technology [PC, PS5, XBSX|S]

During the portal transitions? It absolutely does. From your own linked video:

View attachment 9401

These are not simply duplicate frames that you get in cutscene transitions to establish the TAA; you can see there are multiple stutters, and you can feel them. The Native 4K mode already starts out at a much lower fps, so of course it's not going to exhibit the same drops in absolute framerate terms if the overhead of the I/O is the same as in the performance mode, as it would be.

There are two segments further on in the game that have a similar jump between several worlds in a short span, and they stutter just as much, if not more. Significant? Not really. Better than a high-end PC? Perhaps, they certainly stutter less than on my PC (12400f, 3060). But, they do stutter on PS5, and it's not just a single frame.

Those might be the kind of points where you're allocating memory and then initialising for the map and various objects on the other side of the portal transition. Given that a lot is changing during the portal transition this is where I'd expect to see the biggest hit from something like this. Naturally, you'd expect the cost of something like memory allocation to be greater on PC.
 

Didn't know this hadn't gone through yet, and now it's scrapped.
Though after the deal was announced, it seems that many newer SSDs use Phison's controllers instead.
 
(massive post of blabbing incoming)

Just thinking out loud here on the topic of the thread..

The Consoles, especially with this current generation, are all about maximizing available memory for what is immediately around the vicinity of the player. As Mark Cerny put it, the idea is to have "most of the RAM working on the game's behalf" instead of being filled with things that may or may not be used any time soon. And in such a system with unified memory, that makes 100% perfect sense.. the large throughput afforded by the SSD to system memory allows them to pull things into memory just before it's needed, and allows the CPU/GPU to access the data as quickly as possible.. thus you can more effectively utilize your memory.

On PC however, things are a bit different.

We have fast SSDs, fast buses between all the components... however games are still designed around pulling data from storage into RAM as it is needed. Now, on the surface, that's fine.. but on a Windows PC there are CPU costs to pulling that data into system RAM and "checking it in"... There are costs to decompressing that data on the CPU, and costs to copying it over the PCIe bus to VRAM.. where it can FINALLY be used.

When you really think about it... what SSD storage is to the consoles.. RAM should be to PCs.. because there's that extra step of getting the data from RAM to VRAM in a GPU-ready format. Meaning that for games on PC to properly utilize the architecture's strengths... that data needs to ALREADY be in RAM, either already decompressed by the CPU (no DirectStorage) or still compressed (DirectStorage) so it can be called and decompressed rapidly by the GPU.
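The "still compressed in RAM" flow can be sketched in a few lines. This is a hypothetical illustration (the class and names are made up, and zlib stands in for whatever GPU-friendly codec a real engine would use): the asset's RAM footprint is its compressed size, and the full expansion only happens at the point of use.

```python
import zlib

class CompressedStore:
    """Hypothetical sketch: keep assets compressed in system RAM and only
    expand them when they're actually needed, instead of paying the full
    decompressed footprint up front."""

    def __init__(self):
        self.blobs = {}  # asset name -> compressed bytes held in RAM

    def stage(self, name, raw):
        # RAM cost is the *compressed* size, not the full asset size
        self.blobs[name] = zlib.compress(raw)

    def fetch(self, name):
        # Expansion happens at the point of use (on a real system this
        # step would be the GPU decompressor, not the CPU)
        return zlib.decompress(self.blobs[name])
```

For repetitive game data (tiles, geometry, texture blocks), the compressed-resident footprint can be a fraction of the expanded size, which is exactly the RAM saving the paragraph above is after.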

So, why am I saying this? Well, games need to better allocate and utilize available memory ahead of time! On console, you have an I/O block which can decompress the data you need and have it in memory in a ready format automatically, so you can call it just before it's needed. On PC, due to the various overheads.. it takes more time. You need to reduce that overhead as much as possible.. and the best way to do that is to already have that data in memory long before it's needed. The point of doing it much further ahead of time is that it can then be done in a more granular fashion, so the CPU is never massively taxed getting it into RAM. Once that data is in RAM, the GPU can quickly decompress it as needed, and THEN a developer can more tightly load things into VRAM as they are needed.
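The "granular, spread out over time" idea can be made concrete with a toy per-frame budget. This is a sketch, not any real engine's streamer: the class and the byte counts are invented, and "loading" here is just bookkeeping, but it shows how a large load gets amortized over many frames instead of landing as one spike.

```python
from collections import deque

class GranularPrefetcher:
    """Toy streamer: each frame, load at most `budget` units from the
    pending queue, so no single frame eats the whole I/O/decompress cost."""

    def __init__(self, budget_per_frame):
        self.budget = budget_per_frame
        self.pending = deque()   # (asset_name, bytes_remaining)
        self.resident = {}       # asset_name -> bytes now in RAM

    def request(self, name, size):
        self.pending.append((name, size))

    def tick(self):
        """Called once per frame; returns how much was loaded this frame."""
        spent = 0
        while self.pending and spent < self.budget:
            name, remaining = self.pending[0]
            take = min(remaining, self.budget - spent)
            spent += take
            self.resident[name] = self.resident.get(name, 0) + take
            if take == remaining:
                self.pending.popleft()          # asset fully resident
            else:
                self.pending[0] = (name, remaining - take)  # continue next frame
        return spent
```

With a budget of 16 units per frame, a 100-unit asset becomes resident over 7 frames instead of stalling one frame for the whole load; that's the "don't stress the CPU as much" trade in miniature.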

And that's the strength of the PC architecture. We (can/do) have system RAM in abundance. Let's think of Ratchet's Portal sequence for a second.. On console, you could have each transition pull data in from the SSD into memory ready to go.. but on PC you could have that entire sequence, all the textures, all the animations, sounds, effects...everything, loaded into memory way ahead of time in a more granular fashion as you play and not stress the CPU/SSD nearly as much. DirectStorage (both CPU and GPU based) already reduces a lot of these bottlenecks.. but the most important thing is getting that data into RAM first and foremost.

Sure, games these days allocate a lot of RAM... but do we REALLY know what is going on there? I could see a game allocate 20GBs of RAM.. but that doesn't necessarily mean that it's loading things in a smart way. That could actually mean that the game is potentially failing to properly garbage collect and that they've simply allowed the game to bloat up until a forced collection happens at some point.. likely causing a stall. We need games to start pre-loading data in a smarter way for the platform. As I said, PCs have incredibly fast CPUs, SSDs, and buses... so when loading initially and going full out with DirectStorage, you should be able to fill up massive amounts of RAM very quickly at Gen 4/5 speeds.

Which leads me to another point. We as the PC gaming community.. need to change our outlook and reactions toward games utilizing high amounts of resources! We should expect our games to fully utilize the resources we have at our disposal! If a game comes out and is utilizing 28GB of RAM on a 32GB machine.. despite not providing an obvious reason as to why.. people will lose their minds and slam the developers for their terrible optimization and blame them every which way. We have to cut that out until we know more about what it's doing.

The idea should be that we want games to use as many resources as possible, and that we judge the merits of the performance on how well it performs as you scale down! If the game is utilizing 26GB of 32GB of RAM on my PC... but also runs just fine on a system with 16GB at the same settings... why would that be bad? Perhaps the game loads a bit faster, or textures don't pop in, or it stutters less often or not at all? We shouldn't automatically assume that high memory utilization is bad. That goes for RAM, VRAM, and even GPU and CPU resources.

I've seen people complain that their CPU utilization is too high... claiming the game was needlessly taxing their system.. How do they know what the game was doing? Perhaps it was elegantly compiling shaders in the background or doing some other task? I think developers may be apprehensive about allowing games to utilize heavy amounts of resources because they fear getting blamed for releasing an unoptimized game.. when in reality, we should encourage games to utilize the resources we can offer.

I see that Baldur's Gate 3 (a game with its own set of performance problems) has a setting called "Slow HDD mode" which has the game load more data off storage into RAM ahead of time: fewer storage accesses at the cost of higher RAM utilization. If you have the RAM.. why not?? And that's what I think we need more games to do. I'd be down for more games to think like that and even expose it as a setting to activate.
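The trade that kind of setting makes can be shown with a toy read-through cache. Everything here is hypothetical (the class, the stand-in "disk" dict, the file names): with preloading off, each unique asset costs a storage read mid-gameplay; with preloading on, the RAM cost is paid up front and gameplay-time disk reads drop to zero.

```python
class PreloadCache:
    """Toy model of a 'slow HDD mode' style option: trade RAM for fewer
    storage reads. `disk` is a stand-in dict of name -> bytes."""

    def __init__(self, disk, preload=False):
        self.disk = disk
        self.cache = {}
        self.disk_reads = 0
        if preload:
            # Pay the RAM cost up front, before gameplay starts
            self.cache.update(disk)

    def read(self, name):
        if name not in self.cache:
            self.disk_reads += 1          # slow path: hit storage mid-gameplay
            self.cache[name] = self.disk[name]
        return self.cache[name]
```

On a machine with spare RAM the preloaded path is strictly better during play, which is the whole argument for making it an opt-in setting.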
 
When you really think about it... what SSD storage is to the consoles.. RAM should be to PCs.. because there's that extra step of getting the data from RAM to VRAM in a GPU-ready format. Meaning that for games on PC to properly utilize the architecture's strengths... that data needs to ALREADY be in RAM
The more graphically dense the scene, the more VRAM you need. You want more and more layers, higher resolutions, highly detailed textures, and all of that has to be in VRAM at the moment it's required to be rendered. Even at the theoretical best, where you only load in exactly what you need to see, there is always a point where, as you scale higher, more VRAM is required.

So consoles suffer from the exact same issue; their limitation is roughly 14GB of VRAM. That's their limit, and no amount of high-speed data transfer will resolve it. Assuming unlimited compute power, the size of VRAM and the speed of VRAM will dictate what you can render, and how many fps you can render due to bandwidth limits. Fortunately our developers are smart, and streaming technologies are headed towards micro-tiling and micro-geometry, dramatically reducing what is required in VRAM to render.
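Back-of-the-envelope numbers show why tiling helps so much. These figures are illustrative only (not from any particular game or engine): a 16k x 16k virtual texture at 4 bytes per texel needs a full 1 GiB if it's entirely resident, but if only a couple of thousand 128 x 128 tiles are actually visible, the resident cost collapses.

```python
def texture_bytes(width, height, bytes_per_texel):
    """Raw memory footprint of a fully resident texture."""
    return width * height * bytes_per_texel

# Illustrative numbers, chosen for round arithmetic:
full_resident = texture_bytes(16384, 16384, 4)   # 1 GiB if fully in VRAM
tile_cost = texture_bytes(128, 128, 4)           # 64 KiB per 128x128 tile
visible_tiles = 2000                             # assumed number of tiles on screen
sparse_resident = visible_tiles * tile_cost      # ~125 MiB actually needed
```

Under these assumptions the tiled residency is roughly an eighth of the full footprint, which is exactly the "dramatically reducing what is required in VRAM" effect, though as the next post notes, denser scenes and longer draw distances push the visible-tile count (and therefore the budget) back up.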

But that can blow up with more density and further draw distances.
 
Wow, I didn't realize Immortals of Aveum uses UE5's World Partition system. I think this is the first game to use all three signature features of UE5: Lumen, Nanite, and World Partition. This isn't a game for me, but it makes me curious about my suspicion that UE5's World Partition scales with I/O performance in a significant and measurable way. Maybe there is a console advantage to be seen here.


And then there’s World Partition, Unreal Engine 5’s method of loading segments of the environment as you move through it. That allows for some truly enormous play areas. “Our levels are huge,” Mark says. “We have a big variety of level sizes—some that are large and open-ended and others that are more focused. A couple of our levels are the equivalent of 25 or 30 kilometers in length! That’s just the playable space! To be clear, we’re not an open-world game though. We love to play those games, but it doesn't fit the vision we have or the story we want to tell.”

“So what World Partition does,” says Dave, “is lets us split up those huge levels into smaller sections that can be streamed in and out of memory as the player moves through them. This allows us to maximize the visual fidelity surrounding the player at all times—and also lets us build levels of a size that previously wasn’t possible without doing loads that interrupt gameplay.”

“And unlike open-world games you’re used to,” Mark says, “all our levels are meticulously designed with all this bespoke art, which is made possible by Lumen and Nanite. So covering that ground would actually be a lovely walk! And you’ll find all kinds of environmental storytelling that adds to the feeling that this is a lived-in world.”

And it’s a world you’ll be able to manipulate yourself. On your travels through Aveum you’ll encounter structures and statues that reach into the sky, and they’re not just for show; with the power of your sigil, you’ll be able to reform the world around you to create new paths to traverse and new ways to explore. Giant statues made of ancient stone will move with life under the guidance of your hand. Overgrowth that wrapped itself around mountains eons ago will untangle and reach out at your will.
 
but the most important thing is getting that data into RAM first and foremost.

Sure, games these days allocate a lot of RAM... but do we REALLY know what is going on there? I could see a game allocate 20GBs of RAM.. but that doesn't necessarily mean that it's loading things in a smart way. That could actually mean that the game is potentially failing to properly garbage collect and that they've simply allowed the game to bloat up until a forced collection happens at some point.. likely causing a stall. We need games to start pre-loading data in a smarter way for the platform. As I said, PCs have incredibly fast CPUs, SSDs, and buses... so when loading initially and going full out with DirectStorage, you should be able to fill up massive amounts of RAM very quickly at Gen 4/5 speeds.

Which leads me to another point. We as the PC gaming community.. need to change our outlooks and reactions upon games utilizing high amounts of resources! We should expect our games to fully utilize the resources we have at our disposal! If a game comes out and is utilizing 28GB of RAM on a 32GB machine.. despite not providing an obvious reason as to why.. people will lose their minds and slam the developers for their terrible optimization and blame them every which way. We have to cut that out until we know more about what it's doing. The idea should be that we want games to use as much resources as possible, and that we judge the merits of the performance on how well it performs as you scale down! If the game is utilizing 26GB of 32GB of RAM on my PC... but also runs just fine on a system with 16GB at the same settings... why would that be bad? Perhaps the game loads a bit faster, or textures don't pop in, or it stutters less often or not at all? We shouldn't automatically assume that high memory utilization is bad.
So this was something people were thinking about early on already - would PCs need lots of system RAM to handle XSX/PS5-based next-gen titles? If consoles can use memory much more efficiently, then PCs would need a lot more RAM instead, since they couldn't just get data from the storage drive to be used immediately. DirectStorage was the hopeful answer to this, but if it somehow can't work well enough, then yes, perhaps large amounts of RAM are needed.

Except "needed" is the key word. This wouldn't be about 'smarter' usage; it would simply become a raw minimum-requirement increase. And some stutter or whatever if you're under-spec'd is the least of the problems - you're talking about potentially unstable software. And then you face the problem of idiot PC gamers, which is most of them, thinking that the game is unoptimized because it needs 32GB or even 48GB of RAM, and then trashing the game and the developers.

I think we'd really better just hope that DirectStorage itself, and developers' implementations of it, will simply get better in time, and that we're still just facing natural teething issues. Developers would certainly prefer to be able to optimize memory management on PC in a way more similar to consoles than to maintain some completely different strategy. That's a lot more work overall, and devs are already stretched too hard in terms of the amount of work they have to do.
 
I did a rudimentary test in this post here. Even a small 4GB cache was enough to capture 90% of the disk reads playing through the first mission in R&C.
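A test like that can be approximated by replaying a block-read trace against a simulated LRU cache. This is a generic sketch of the method, not the poster's actual tooling; the trace values here are made up, but the same function could be fed real block offsets captured from a playthrough.

```python
from collections import OrderedDict

def lru_hit_rate(trace, cache_slots):
    """Replay a sequence of block reads against an LRU cache holding
    `cache_slots` blocks; return the fraction of reads served from cache."""
    cache = OrderedDict()
    hits = 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)          # mark as most recently used
        else:
            if len(cache) >= cache_slots:
                cache.popitem(last=False)     # evict least recently used
            cache[block] = True
    return hits / len(trace)
```

Game streaming tends to re-read the same regions as the player moves around, so even a modest cache can capture a large share of reads, consistent with the ~90% figure from the R&C test above; conversely, a working set even slightly larger than the cache can thrash it down to near-zero hits.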

This is just one game though.
 