The need for sustained high-throughput loading of data in games? *spawn*

When people say the SSD isn't going to do much for gaming outside of fewer loading screens and fewer headaches for devs (admittedly both good things), I can't help but think that's a lack of imagination talking rather than reality. How things have worked up to now blinds us to possibilities not yet thought of.

We already have early examples of games leveraging the tech, and it will only evolve from here as engines and devs get used to no longer being strictly tied to traditional methods.

Plenty of devs have talked about the great potential of this tech. It's just potential right now, but in the future it won't be.
 
If games start being designed around constant SSD throughput, potentially keeping only mere seconds of gameplay in memory, I wonder just how hot the SSD chips are going to get inside both consoles, especially the PS5. Surely it'll be attached to a massive heatsink and expected performance should hold up even under unfavorable scenarios, but hours of constant loading at that speed isn't typical. Then there's the issue of the add-in drives.
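
As a back-of-envelope sketch of what that kind of design would mean (the window length is my own assumption; 5.5 GB/s is Sony's quoted raw figure):

```cpp
#include <cstdio>

// Back-of-envelope sketch: what "only seconds of gameplay in memory"
// implies at PS5-class throughput. 5.5 GB/s is Sony's quoted raw figure;
// the 2-second window is an assumption for illustration.
int main() {
    const double rawGBs     = 5.5;  // quoted raw SSD throughput (GB/s)
    const double windowSecs = 2.0;  // hypothetical in-memory window
    std::printf("A %.0f s window holds ~%.0f GB of data\n",
                windowSecs, rawGBs * windowSecs);
    std::printf("Run flat out, that's ~%.1f TB read per hour - hence the thermal question\n",
                rawGBs * 3600.0 / 1024.0);
    return 0;
}
```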
 
NPC AI doesn't really sell games so it's not focused on. None of it needs better hardware. Every gen is the same because the bottleneck is dev time, not hardware.

I'm not talking about AI. I'm talking about the lack of variety among the main enemy NPCs. If streaming allows for varied characters in-game then I would be super hyped.
 
I'm not talking about AI. I'm talking about the lack of variety among the main enemy NPCs. If streaming allows for varied characters in-game then I would be super hyped.
Both are more bottlenecked by dev time than tech. Designing game interactions doesn't really have much to do with hardware speeds when you can just turn down graphics if you really want.
 
If games start being designed around constant SSD throughput, potentially keeping only mere seconds of gameplay in memory, I wonder just how hot the SSD chips are going to get inside both consoles, especially the PS5. Surely it'll be attached to a massive heatsink and expected performance should hold up even under unfavorable scenarios, but hours of constant loading at that speed isn't typical. Then there's the issue of the add-in drives.

Isn't that exactly what Sony appear to be doing? They've worked with Epic, for years apparently, who are designing an engine that makes full use of that throughput. Didn't someone from Epic recently state that they had to rewrite parts of the engine because it wasn't making full use of the SSD?

I find it baffling how some people want to throw cold water on the area that Sony have banked on.

If Epic have known about these possibilities for a long time, then imagine what studios like Naughty Dog have planned?
 
Isn't that exactly what Sony appear to be doing? They've worked with Epic, for years apparently, who are designing an engine that makes full use of that throughput. Didn't someone from Epic recently state that they had to rewrite parts of the engine because it wasn't making full use of the SSD?

I find it baffling how some people want to throw cold water on the area that Sony have banked on.

If Epic have known about these possibilities for a long time, then imagine what studios like Naughty Dog have planned?

Nope. Epic clearly said that it was latency which was more important, with more mundane throughput being perfectly OK. It's the I/O coprocessors and the substantial on-die SRAM, which eliminate the overhead of accessing data, that are being targeted.
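
For anyone wondering why latency rather than peak bandwidth would be the limiter for just-in-time streaming, a rough sketch (generic figures of my own, not measurements of any actual drive):

```cpp
#include <cstdio>
#include <initializer_list>

// Why latency can matter more than peak bandwidth for just-in-time
// streaming: every small read pays the access latency in full. The
// figures below are generic illustrative values, not measurements of
// any particular drive or console.
int main() {
    const double readMB       = 64.0 / 1024.0; // one small 64 KB request
    const double bandwidthMBs = 5500.0;        // peak sequential throughput
    for (double latencyUs : {10.0, 100.0, 1000.0}) {
        const double transferUs   = readMB / bandwidthMBs * 1e6;
        const double effectiveMBs = readMB / ((latencyUs + transferUs) / 1e6);
        std::printf("%5.0f us access latency -> ~%4.0f MB/s effective on 64 KB reads\n",
                    latencyUs, effectiveMBs);
    }
    return 0;
}
```

At a 1 ms access latency the same drive delivers only a few percent of its peak rate on small reads, which is why shaving the overhead matters more than raising the peak.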
 
Both are more bottlenecked by dev time than tech. Designing game interactions doesn't really have much to do with hardware speeds when you can just turn down graphics if you really want.
Isn't that exactly what devs currently do? They turn down model variety, with the result that multiple characters in one level get the same model (or the same clothing, or the same animation) repeated. With fast SSD I/O, they could probably grab and trash models on the fly, so they wouldn't need to allocate tons of RAM for them.
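
Something like an LRU cache keyed by model ID is one way "grab and trash on the fly" could look. Purely illustrative; every name here is made up rather than any real engine's API:

```cpp
#include <cstdint>
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical sketch of "grab and trash on the fly": an LRU cache of NPC
// models so only the variants on screen right now occupy memory. Every
// name here is invented for illustration; no real engine API is implied.
class ModelCache {
public:
    explicit ModelCache(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Returns a handle to the model, streaming it from SSD on a miss.
    std::uint64_t acquire(const std::string& modelId) {
        auto it = index_.find(modelId);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second); // mark most recent
            return it->second->handle;
        }
        Entry e{modelId, streamFromSsd(modelId), sizeOnDisk(modelId)};
        used_ += e.bytes;
        while (used_ > budget_ && !lru_.empty())
            evictOldest(); // "trash" the least-recently-seen variants
        lru_.push_front(e);
        index_[modelId] = lru_.begin();
        return e.handle;
    }

private:
    struct Entry { std::string id; std::uint64_t handle; std::size_t bytes; };

    void evictOldest() {
        Entry& victim = lru_.back();
        used_ -= victim.bytes;
        index_.erase(victim.id);
        lru_.pop_back(); // memory freed; a fast SSD makes reloading cheap
    }

    // Stand-ins for real streaming I/O. Low latency is what makes a cache
    // miss here a mid-frame hiccup instead of a multi-second hitch.
    std::uint64_t streamFromSsd(const std::string&) { return ++nextHandle_; }
    std::size_t   sizeOnDisk(const std::string&)    { return 64u << 20; } // ~64 MB

    std::size_t budget_ = 0, used_ = 0;
    std::uint64_t nextHandle_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};
```

With a budget sized to the scene rather than the whole roster, ten distinct models need cost no more resident memory than the handful actually visible at once.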
 
Isn't that exactly what devs currently do? They turn down model variety, with the result that multiple characters in one level get the same model (or the same clothing, or the same animation) repeated. With fast SSD I/O, they could probably grab and trash models on the fly, so they wouldn't need to allocate tons of RAM for them.
They can turn down models and still have different models doing different things. I don't see why getting better specs would suddenly make them focus on different models rather than just doing what they've always done with more specs, which is to increase the visuals to the point where they can't have more models. The PS4 didn't suddenly have 10x more variety than the PS3, and there is no reason to expect the PS5 will change that.
 
NPC AI doesn't really sell games so it's not focused on. None of it needs better hardware. Every gen is the same because the bottleneck is dev time, not hardware.
I remember Bethesda made a huge deal of their Radiant AI system for Skyrim's NPCs, and it got a lot of coverage at the time.
 
I remember Bethesda made a huge deal of their Radiant AI system for Skyrim's NPCs, and it got a lot of coverage at the time.
Which is funny, because Skyrim's NPC AI is probably really complicated under the hood, but in practice it isn't exactly what people picture when they think of good NPC AI. Although I'm sure Bethesda put a lot of effort into it, the end result seems no more than a small improvement over previous-gen games. The amount of effort needed to make really good AI just isn't worth it to most game devs or publishers.
 
Both are more bottlenecked by dev time than tech. Designing game interactions doesn't really have much to do with hardware speeds when you can just turn down graphics if you really want.

Let’s take State of Decay 2. There are dozens upon dozens of survivors with unique looks and talents that exist in the game. You can dress these guys in different gear while providing a loadout from an inventory that can consist of dozens of weapons (melee and ranged).

However, there are only 4 zombie types with 95% of the zombies consisting of just one type with 3 models (man, woman & obese man).

That reality isn’t an art-budget issue. It’s a technical one: it puts less pressure on VRAM if you have 8-10 NPCs in a scene represented by 1 or 2 models than if you have 8-10 NPCs represented by 8-10 models.

I would love it if the new streaming systems allowed devs greater latitude for variety among enemy NPCs: less scene data kept resident in VRAM, making room for more NPC models.
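
To put rough numbers on that pressure (illustrative only; the per-model footprint is my assumption, not a State of Decay 2 figure):

```cpp
#include <cstdio>

// Illustrative arithmetic only: per-model footprints vary wildly, and the
// 150 MB per unique character (mesh + textures + animations) is my own
// assumption, not a State of Decay 2 figure.
int main() {
    const double mbPerUniqueModel = 150.0;
    std::printf("10 NPCs from 2 shared models:  ~%4.0f MB resident\n", 2 * mbPerUniqueModel);
    std::printf("10 NPCs from 10 unique models: ~%4.0f MB resident\n", 10 * mbPerUniqueModel);
    // Instancing makes the shared case cheap; full variety costs 5x the
    // memory here, which is exactly the pressure streaming could relieve.
    return 0;
}
```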
 
Nope. Epic clearly said that it was latency which was more important, with more mundane throughput being perfectly OK. It's the I/O coprocessors and the substantial on-die SRAM, which eliminate the overhead of accessing data, that are being targeted.

Can you clarify: you're suggesting that Epic have *not* made an engine that makes full use of the throughput?
 
Let’s take State of Decay 2. There are dozens upon dozens of survivors with unique looks and talents that exist in the game. You can dress these guys in different gear while providing a loadout from an inventory that can consist of dozens of weapons (melee and ranged).

However, there are only 4 zombie types with 95% of the zombies consisting of just one type with 3 models (man, woman & obese man).

That reality isn’t an art-budget issue. It’s a technical one: it puts less pressure on VRAM if you have 8-10 NPCs in a scene represented by 1 or 2 models than if you have 8-10 NPCs represented by 8-10 models.

I would love it if the new streaming systems allowed devs greater latitude for variety among enemy NPCs: less scene data kept resident in VRAM, making room for more NPC models.
They could increase the number of zombie models or increase the number of hats you can use with the increase in power every gen; guess which one of those wins every gen? Do you think next gen they are going to think: "Okay, these hats are plentiful and shiny enough, let's focus on the zombie models"? I don't think so.
 
Isn't that exactly what Sony appear to be doing? They've worked with Epic, for years apparently, who are designing an engine that makes full use of that throughput. Didn't someone from Epic recently state that they had to rewrite parts of the engine because it wasn't making full use of the SSD?

I find it baffling how some people want to throw cold water on the area that Sony have banked on.

If Epic have known about these possibilities for a long time, then imagine what studios like Naughty Dog have planned?

Agreed.

The added utility of Sony’s SSD doesn’t need to be realized through constant streaming at full bandwidth, any more than 36-52 CUs need to churn out 10-12 TFLOPS every second to realize a significant performance gain over last-gen hardware.

Last-gen software would only need to utilize a fraction of 1% of the bandwidth offered by the PS5’s SSD, so there is no need to run the drive like a rented mule to see a significant performance increase.
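
A quick sanity check on that percentage (the HDD figure is my assumption; 5,500 MB/s is Sony's quoted raw number):

```cpp
#include <cstdio>

// Rough check on the claim above: ~50 MB/s is a commonly cited effective
// figure for a last-gen console HDD (an assumption on my part);
// 5,500 MB/s is Sony's quoted raw PS5 SSD throughput.
int main() {
    const double lastGenHddMBs = 50.0;
    const double ps5SsdMBs     = 5500.0;
    std::printf("Last-gen streaming load = %.1f%% of the PS5 drive's raw bandwidth\n",
                100.0 * lastGenHddMBs / ps5SsdMBs);
    return 0;
}
```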
 
They could increase the number of zombies models or increase the number of hats you can use with the increase in power in every gen, guess which one of those wins every gen? Do you think next gen they are going to think: "Okay, these hats are plentiful and shiny enough, lets focus on the zombie models"? I don't think so.

I would hope so. NPC variety (outside of using them as scene props) is probably one aspect of gaming that has evolved the least over the years.

The biggest reason it isn’t an immersion breaker is that we have become so used to it. But imagine watching a war film where every human soldier is played by just 4 actors. LOL.
 
Can you clarify: you're suggesting that Epic have *not* made an engine that makes full use of the throughput?

What do you mean by "full use of the throughput"? Epic are on record in EDGE as saying that the throughput demand is not out of the ordinary for SSDs while the latency demand is, which is what you should expect from a just-in-time system that minimizes "waste" (caching in DRAM).
 
What do you mean by "full use of the throughput"? Epic are on record in EDGE as saying that the throughput demand is not out of the ordinary for SSDs while the latency demand is, which is what you should expect from a just-in-time system that minimizes "waste" (caching in DRAM).

It makes sense that the fundamental principles of Nanite and Lumen wouldn't require throughput as high as the PS5's (or even the XSX's), as they need to be able to run on slower PC storage. That said, I'd expect the engine has the ability to use as much bandwidth as is on offer if a developer chooses to use it for an exclusive game.

In terms of latency, that's not a problem, since even on PCs with slow SATA SSDs you would use the system RAM as a cache for anything likely to require low-latency access that won't fit in the available VRAM. You'd just have to be smart about what you're pre-caching in system RAM to ensure it's available when needed. But going back to the original point of this thread: if you have a GPU with 8GB of VRAM coupled with 16GB of system RAM, of which say 12GB is usable for pre-caching, then that's 20GB total, or around a fifth of your entire game's content. Surely that should be enough to pre-cache at least the next several seconds of gameplay, if not in fact the next several minutes, with only minimal streaming from disk as the player moves through the game world.
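
Putting rough numbers on that (the 12GB is the figure above; the consumption rates are my own guesses at how fast a player moves into genuinely new data):

```cpp
#include <cstdio>
#include <initializer_list>

// Sketch of the pre-cache argument: 12 GB of usable system RAM is the
// figure from the post above; the content-consumption rates are my own
// guesses at how fast a player might move into genuinely new data.
int main() {
    const double precacheMB = 12.0 * 1024.0; // staging cache in system RAM
    for (double newContentMBs : {100.0, 500.0, 2000.0}) {
        std::printf("At %4.0f MB/s of new content, the cache covers ~%3.0f s of play\n",
                    newContentMBs, precacheMB / newContentMBs);
    }
    return 0;
}
```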
 
It makes sense that the fundamental principles of Nanite and Lumen wouldn't require throughput as high as the PS5's (or even the XSX's), as they need to be able to run on slower PC storage. That said, I'd expect the engine has the ability to use as much bandwidth as is on offer if a developer chooses to use it for an exclusive game.

In terms of latency, that's not a problem, since even on PCs with slow SATA SSDs you would use the system RAM as a cache for anything likely to require low-latency access that won't fit in the available VRAM. You'd just have to be smart about what you're pre-caching in system RAM to ensure it's available when needed. But going back to the original point of this thread: if you have a GPU with 8GB of VRAM coupled with 16GB of system RAM, of which say 12GB is usable for pre-caching, then that's 20GB total, or around a fifth of your entire game's content. Surely that should be enough to pre-cache at least the next several seconds of gameplay, if not in fact the next several minutes, with only minimal streaming from disk as the player moves through the game world.

Maybe by using the VRAM as a form of high-bandwidth cache? But that would still be a massive waste of DRAM, which the PS5 aims to rectify by using an I/O stack that acts as a DRAM multiplier.
 
Nope. Epic clearly said that it was latency which was more important, with more mundane throughput being perfectly OK. It's the I/O coprocessors and the substantial on-die SRAM, which eliminate the overhead of accessing data, that are being targeted.

Who told you it was those two things that had the biggest impact on the I/O, rather than the other improvements such as the on-chip decompression, which has already been said to have the biggest impact?
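
For what it's worth, the decompression point reduces to simple arithmetic using Sony's own quoted figures (5.5 GB/s raw, 8-9 GB/s typical):

```cpp
#include <cstdio>

// The decompression point as arithmetic: 5.5 GB/s raw and 8-9 GB/s
// "typical" are Sony's quoted figures, so the on-chip Kraken decompressor
// effectively multiplies bandwidth by the data's compression ratio
// (~1.6x on average, derived from those two quoted numbers).
int main() {
    const double rawGBs   = 5.5;
    const double avgRatio = 1.6; // average Kraken compression ratio
    std::printf("Effective throughput: ~%.1f GB/s\n", rawGBs * avgRatio);
    return 0;
}
```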
 