PlayStation 5 [PS5] [Release November 12, 2020]

Cerny said the SSD was designed to be able to load 4GB or so when you turn the camera 180° and reload everything that was behind you, "which sounds about right for next gen". That would be loading data at its max rate in peaks when you turn, but there's no hardware reason it couldn't stay at that peak constantly.

Logically it could stream all the required audio samples continuously without taking much RAM, if any, and stream animation without taking any RAM either. The multiple priority queues make a lot of sense here.
 
We could apply the same rules to any game: you'd have core assets that always remain in a pool of ~10GB of RAM (character models and textures, physics, the rendering engine, etc.), and then essentially everything else is somewhat variable within a set radius of the player.

It means detail would drastically increase.

These are scenarios where constant flushing of data would make the best use of the SSD's speed. Devs would then presumably need to consider which assets are static versus changeable and what the radius should be.
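To make that concrete, here's a minimal sketch of the core-pool-plus-radius idea. Everything in it is hypothetical (the `Asset` layout, the queue names); it's not any real PS5 API, just the shape of the logic:

```cpp
// Hypothetical sketch of radius-based streaming: core assets stay resident,
// everything else is loaded/evicted as it crosses a radius around the player.
#include <cmath>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Asset {
    std::string id;
    Vec3 position;
    bool coreAsset;  // lives in the persistent ~10GB pool, never evicted
    bool resident;   // currently in RAM
};

// Each frame: queue SSD loads for assets entering the radius, and evictions
// for streamed assets that fell outside it (the priority queues mentioned
// above would decide which loads the hardware services first).
void UpdateStreaming(std::vector<Asset>& assets, const Vec3& player,
                     float radius,
                     std::vector<std::string>& loadQueue,
                     std::vector<std::string>& evictQueue) {
    for (Asset& a : assets) {
        if (a.coreAsset) continue;  // the core pool is static
        const bool inRange = Distance(a.position, player) <= radius;
        if (inRange && !a.resident) {
            loadQueue.push_back(a.id);   // stream in from SSD
            a.resident = true;
        } else if (!inRange && a.resident) {
            evictQueue.push_back(a.id);  // free the RAM immediately
            a.resident = false;
        }
    }
}
```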



Is that right? I would have thought you'd be able to maintain those speeds, but I really don't have the technical knowledge to comment further.
You mean, to just stream absolutely everything off the SSD and never store it back into memory? You'll eventually hit a scene where you exceed your 9GB/s bandwidth. The VRAM in both systems provides bandwidth of 480+ GB/s.
 
Cerny said the SSD was designed to be able to load 4GB or so when you turn the camera 180° and reload everything that was behind you, "which sounds about right for next gen". That would be loading data at its max rate in peaks when you turn, but there's no hardware reason it couldn't stay at that peak constantly.

Logically it could stream all the required audio samples continuously without taking much RAM, if any, and stream animation without taking any RAM either. The multiple priority queues make a lot of sense here.
Right.
What you're really saying is that by pushing their SSD to the limit at all times, they'd obtain the equivalent of going from 16GB to 20GB of available memory, plus an extra 9GB/s of bandwidth.
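For what it's worth, the back-of-envelope math behind that equivalency looks like this (the figures are the publicly quoted ones; the "effective capacity" framing is just illustrative):

```cpp
// Rough numbers behind the "16GB behaves like 20GB" claim above.
#include <cstdio>

int main() {
    const double ramGB        = 16.0;  // physical GDDR6
    const double ssdGBs       = 9.0;   // compressed SSD throughput (peak)
    const double turnReloadGB = 4.0;   // Cerny's 180-degree-turn figure

    // Time to restock everything behind the player after a 180-degree turn:
    std::printf("4GB reload takes ~%.2f s at %.1f GB/s\n",
                turnReloadGB / ssdGBs, ssdGBs);  // ~0.44 s

    // If ~4GB of RAM no longer has to hold "what's behind you", RAM acts
    // as if it were roughly this much larger:
    std::printf("effective capacity ~%.0f GB\n", ramGB + turnReloadGB);
    return 0;
}
```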
 
It doesn’t always have to apply to static details in the game either. One of the Battlefield games had events take place in multiplayer where a storm would occur and you’d get waves and a crashing boat. That data could easily be flushed in and out of RAM super quickly.

Imagine Normandy Beach reimagined with next-gen SSD tech: constant events happening around the player, where everything from the sounds to the animations could be added and removed. I know sound doesn’t take much space, but once a piece of dialogue has played, that frees up some space (which I guess is within the speed of an HDD, but you follow the point).

So it’s not just spatial data but the temporal dimension too.

When we hear devs saying that SSDs are going to be a huge leap, you can really start to understand why when you think about it. It also makes sense why the RAM hasn't increased much this gen: it almost doesn't need to.
 
Right.
What you're really saying is that by pushing their SSD to the limit at all times, they'd obtain the equivalent of going from 16GB to 20GB of available memory, plus an extra 9GB/s of bandwidth.
The 4GB is just the data that changes while turning.

The saving grows even further when you move quickly in any direction. Add to that all the audio samples being streamed, plus animation, which can take a sizeable chunk of memory with the advanced rigging we expect next gen. So it's much more than 4GB; it's the totality of what can be needed based on events in the game, which is basically the entire install size if they want it to be.
 
I think you're mixing up capacity with total addressable memory.
The SSD and the memory are actually two separate pools of data. One has 16GB of capacity and 448GB/s of bandwidth. The other has up to 5.5GB/s of throughput (9GB/s if we go with the compressed numbers), with addressable capacity up to the game's install size.

That SSD is still locked to its 9GB/s; it doesn't matter that it can address 100GB of space. If you're using the SSD to feed the GPU directly, bypassing memory, then the 9GB/s is spent feeding the GPU. You can't also load in more data for the next set of scenes coming up, because the drive is already delivering the same 9GB/s the GPU needs to feed the screen.

All textures must be loaded into memory, and they stay there for processing on each consecutive frame until they're out of sight and can be dropped. So if you're somehow loading a full 9GB/s of new data into memory, that also means you're able to drop 9GB/s of texture data out of memory. And if you aren't, then the SSD is effectively feeding the GPU directly, which means your design is now limited by the SSD's speed.

I find both cases unlikely: unlikely that players would experience that much change per second in any game that's still playable, and unlikely that the SSD will ever fuel the GPU in that manner.

Every solution still leads to a streaming setup in which the SSD is properly fueling memory with a relative amount of buffer available, like we have today, just with a much tighter window to buffer in.
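To put numbers on that constraint, here's a sketch of the per-frame accounting (quoted peak figures; the budget bookkeeping itself is hypothetical):

```cpp
// When RAM is at capacity, every byte streamed in per frame must be matched
// by a byte evicted; this just makes that bookkeeping explicit.
#include <cassert>
#include <cstdio>

struct FrameStreamingBudget {
    double ssdGBs = 9.0;   // compressed SSD throughput (peak)
    double fps    = 60.0;

    double MaxMBPerFrame() const { return ssdGBs * 1024.0 / fps; }
};

int main() {
    const FrameStreamingBudget budget{};
    std::printf("max new data per frame: ~%.0f MB\n",
                budget.MaxMBPerFrame());  // ~154 MB at 9GB/s and 60fps

    // Hypothetical per-frame figures: ingest can't outpace eviction or the
    // drive's ceiling, or the design becomes SSD-limited.
    const double evictedMB = 120.0;  // texture data that left view
    const double ingestMB  = 120.0;  // new data staged for upcoming scenes
    assert(ingestMB <= evictedMB && ingestMB <= budget.MaxMBPerFrame());
    return 0;
}
```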
 
It's not feeding the GPU to bypass memory. It's the laundry list of things that can be loaded just in time, versus keeping everything in memory for all the possibilities, based on the events happening, including but not exclusively player movement and player turning.
 
It's not feeding the GPU to bypass memory. It's the laundry list of things that can be loaded just in time, versus keeping everything in memory for all the possibilities, based on the events happening, including but not exclusively player movement and player turning.

I think a really good candidate is animation. Gears 5 and Doom Eternal use Alembic animation caching, and I imagine there are other games as well. It's supported in Unity and Unreal. It seems like the perfect kind of data to be streamed off the SSD as needed.
 
It's not feeding the GPU to bypass memory. It's the laundry list of things that can be loaded just in time, versus keeping everything in memory for all the possibilities, based on the events happening, including but not exclusively player movement and player turning.
And I'm perfectly okay with that, because that's my expectation of real-world game code.
But if you're waiting around for things to happen, that isn't the same as realistically running the drive at 9GB/s every single second.
I think when a lot of things happen, it'll blip to 9GB/s, but I don't expect it to stay there as part of its average game-code usage. That's all I was saying.
 
I think a really good candidate is animation. Gears 5 and Doom Eternal use Alembic animation caching, and I imagine there are other games as well. It's supported in Unity and Unreal. It seems like the perfect kind of data to be streamed off the SSD as needed.
Yeah, that could be really nice. It's a case of loading only the current animation data for the characters and objects in view before every frame, and maybe animated textures too (baked stuff, fire, etc.). There's a massive difference between keeping all frames of all cycles in memory versus just one. It should allow much more detailed and varied animations with a comparatively negligible memory footprint.
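A sketch of what that per-frame animation streaming might look like; the cache layout and the `ReadFromSSD` stub are invented here, and no real Alembic or engine API is used:

```cpp
// Keep only the animation frames in flight, not whole baked cycles.
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

struct AnimCacheHandle {
    std::uint64_t fileOffset;    // where this clip's baked frames start
    std::uint32_t frameCount;
    std::uint32_t bytesPerFrame;
};

// Stand-in for whatever async file API the engine actually provides.
static void ReadFromSSD(std::uint64_t /*offset*/, std::uint32_t size,
                        void* destination) {
    std::memset(destination, 0, size);  // placeholder for an NVMe read
}

// Load exactly one baked frame for one character before it's needed.
void StreamAnimationFrame(const AnimCacheHandle& clip,
                          std::uint32_t frameIndex,
                          std::vector<std::byte>& frameBuffer) {
    frameBuffer.resize(clip.bytesPerFrame);
    const std::uint64_t offset =
        clip.fileOffset +
        static_cast<std::uint64_t>(frameIndex % clip.frameCount) *
            clip.bytesPerFrame;
    ReadFromSSD(offset, clip.bytesPerFrame, frameBuffer.data());
    // RAM cost: one frame per visible character, instead of every frame of
    // every cycle staying resident at all times.
}
```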
 
It doesn’t always have to apply to static details in the game either. One of the Battlefield games had events take place in multiplayer where a storm would occur and you’d get waves and a crashing boat. That data could easily be flushed in and out of RAM super quickly.

Yup. I think folks are becoming obsessed with the realtime possibilities rather than the more practical things. I bet there are tons of things game designers would like to do which just aren't possible because of RAM availability and slower I/O times. When you can suddenly shuffle almost half the console's memory to or from the SSD in a few seconds, that opens up immeasurable possibilities.
 
Regarding destruction in the game world: say you're moving through a large city blowing up buildings, and you've traveled far enough that the RAM data is being replaced for the new area. How is that handled? Is the destruction information only held in RAM, or is it written back to the drive to be streamed back later when you revisit the area? If the latter, can we expect to see a change here with SSDs, where destruction can be more detailed and extensive?
 
or is it written back to the drive to be streamed back later?
Correct: anything can be paged out of memory and back in as required. That's why games are given virtual memory to work with as an overflow buffer if they exceed the 16GB capacity. They can still store stuff away on the SSD as virtual memory if they need to load more into physical memory. I'm not sure how DICE does it, though.
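A minimal sketch of how that round trip could work; the delta format and file naming are made up for illustration:

```cpp
// When a city block streams out, persist its destruction delta; restore it
// when the player returns so the damage is still there.
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

struct DestructionDelta {
    std::vector<std::uint32_t> destroyedPieceIds;  // which chunks are gone
};

void SaveDeltaOnUnload(const std::string& regionFile,
                       const DestructionDelta& delta) {
    std::ofstream out(regionFile, std::ios::binary | std::ios::trunc);
    const auto count =
        static_cast<std::uint32_t>(delta.destroyedPieceIds.size());
    out.write(reinterpret_cast<const char*>(&count), sizeof(count));
    out.write(reinterpret_cast<const char*>(delta.destroyedPieceIds.data()),
              count * sizeof(std::uint32_t));
}

DestructionDelta LoadDeltaOnRevisit(const std::string& regionFile) {
    DestructionDelta delta;
    std::ifstream in(regionFile, std::ios::binary);
    std::uint32_t count = 0;
    if (in.read(reinterpret_cast<char*>(&count), sizeof(count))) {
        delta.destroyedPieceIds.resize(count);
        in.read(reinterpret_cast<char*>(delta.destroyedPieceIds.data()),
                count * sizeof(std::uint32_t));
    }
    return delta;
}
```

The delta is tiny relative to the assets themselves, so the NAND wear from writes like this should be minimal.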
 
Regarding destruction in the game world: say you're moving through a large city blowing up buildings, and you've traveled far enough that the RAM data is being replaced for the new area. How is that handled? Is the destruction information only held in RAM, or is it written back to the drive to be streamed back later when you revisit the area? If the latter, can we expect to see a change here with SSDs, where destruction can be more detailed and extensive?

I wouldn't have thought either Sony or Microsoft would be all that keen on having a game regularly or constantly writing back to the SSD; that'd be terrible for the life of the onboard NAND. I'm curious whether either of them has stipulations regarding this.

Would it be all that necessary to feed destruction data back to the SSD anyway? If I'm not mistaken, Microsoft have stated that 100GB of the SSD will be available per game. I imagine Sony are doing something similar.

So if each console grants developers up to ~100GB of addressable storage, might it make more sense to, for example, store several versions of each building in various states of destruction and stream them in as necessary? Would it require a sizeable memory footprint to keep track of the destruction state of each building/face?

It's not that dissimilar in principle to something like Minecraft, which could store, as well as render, pretty large worlds on the PS360 with their paltry 512MB of memory (well, 544MB for the X360 including eDRAM, I think). But rather than shoebox-sized cubes, the dynamic destruction would be in building-sized chunks.
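On the footprint question, a quick sketch suggests the bookkeeping is tiny if you pack a few bits per building (all numbers illustrative):

```cpp
// Track a damage state per building with 4 bits each: 16 pre-authored
// destruction variants, two buildings per byte.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

class DamageStates {
    std::vector<std::uint8_t> packed_;  // two 4-bit states per byte
public:
    explicit DamageStates(std::size_t buildings)
        : packed_((buildings + 1) / 2, 0) {}

    // state in [0, 15]: index into the pre-authored destruction variants
    void Set(std::size_t b, std::uint8_t state) {
        std::uint8_t& byte = packed_[b / 2];
        if (b % 2 == 0) byte = (byte & 0xF0) | (state & 0x0F);
        else            byte = (byte & 0x0F) | (state << 4);
    }
    std::uint8_t Get(std::size_t b) const {
        return (b % 2 == 0) ? (packed_[b / 2] & 0x0F)
                            : (packed_[b / 2] >> 4);
    }
    std::size_t Bytes() const { return packed_.size(); }
};

int main() {
    DamageStates city(10000);  // 10k destructible buildings
    city.Set(42, 7);           // building 42 is at damage variant 7
    std::printf("state=%u, footprint=%zu bytes\n",
                static_cast<unsigned>(city.Get(42)),
                city.Bytes());  // 5000 bytes for the whole city
    return 0;
}
```

Per-face tracking would multiply that, but even 16 faces per building only gets you to ~80KB here.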
 
Would it be all that necessary to feed destruction data back to the SSD anyway? If I'm not mistaken, Microsoft have stated that 100GB of the SSD will be available per game. I imagine Sony are doing something similar.

Yeah, but how much space will be left for us to install games, etc., then? 100GB reserved (I assume?) for virtual RAM, then space allocated or reserved for quick resume/save states for five games, the OS, recordings and all. In the PS5's case, we're already at 800GB; are we going to be left with close to 600GB effectively for installs? I see the emphasis on external storage now.
 
Would it be all that necessary to feed destruction data back to the SSD anyway? If I'm not mistaken, Microsoft have stated that 100GB of the SSD will be available per game. I imagine Sony are doing something similar.
I think we endure this anyway with our current SSDs on PC. If you don't have a platter drive, your virtual memory is going to be written to the SSD somewhere. I don't think this issue would be more pronounced on consoles than on PC.
 
The PS5 is not paging 100GB of memory to disk.

Memory mapping is the process of using a virtual address space, managed by the kernel, to remap accesses differently.

In this case, it's the installation data going through decompression automatically and being mapped into a virtual address space, so the installation data can be accessed transparently.
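For anyone who wants to see what memory mapping looks like in practice, here's a plain POSIX mmap example; this is generic OS behavior, not Sony's implementation, and `install.dat` is a made-up file name:

```cpp
// Map a file into the virtual address space and read it through a pointer;
// the kernel pages the data in on demand, with no explicit read() calls.
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <cstdio>

int main() {
    const int fd = open("install.dat", O_RDONLY);  // hypothetical data file
    if (fd < 0) return 1;

    struct stat st{};
    fstat(fd, &st);

    void* mapped = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (mapped == MAP_FAILED) { close(fd); return 1; }

    // The file's contents now simply "appear" at this address.
    const char* data = static_cast<const char*>(mapped);
    std::printf("first byte of mapped install data: %d\n", data[0]);

    munmap(mapped, st.st_size);
    close(fd);
    return 0;
}
```

On PS5 the twist would be the decompression hardware sitting between the flash and that pointer, so the mapped view is the uncompressed data.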
 
The PS5 is not paging 100GB of memory to disk.

Memory mapping is the process of using a virtual address space, managed by the kernel, to remap accesses differently.

In this case, it's the installation data going through decompression automatically and being mapped into a virtual address space, so the installation data can be accessed transparently.
But there will still be virtual memory; the amount is unknown, though. You think it's buried within the OS reserve? So somewhere in that 20-30GB that the OS takes up?
 
But there will still be virtual memory; the amount is unknown, though. You think it's buried within the OS reserve? So somewhere in that 20-30GB that the OS takes up?
I haven't seen any mention of swapping memory on PS5, anywhere.

The correct term here is swapping, or swap memory. There is none as far as we know; we only know of the memory-mapped installation data.

Virtual memory is a kernel abstraction which provides a lot of functionality; it doesn't mean that memory is disk-based. It allows segmentation (stack, heap, data, text, etc.); paging, which means splitting the physical address space into manageable pages that can be reordered to appear contiguous in the virtual address space; swapping, which is what you think is happening; and sharing pages across multiple applications, each mapping them into its own virtual space. It would also allow mapping the virtualized, uncompressed installation data, as I said previously.
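A small demonstration of that distinction (Linux/POSIX, purely illustrative): reserving virtual address space costs neither RAM nor swap until the pages are actually touched:

```cpp
// Virtual address space is not physical memory: reserve 1TB of it without
// committing a byte of RAM or swap.
#include <cstddef>
#include <cstdio>
#include <sys/mman.h>

int main() {
    const std::size_t oneTB = 1ULL << 40;  // far more than physical RAM

    // PROT_NONE, anonymous: pure address-space reservation; no page is
    // backed by RAM or swap until it's remapped and touched.
    void* reserved = mmap(nullptr, oneTB, PROT_NONE,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (reserved == MAP_FAILED) {
        std::perror("mmap");
        return 1;
    }
    std::printf("reserved 1TB of virtual space at %p using ~no RAM\n",
                reserved);
    munmap(reserved, oneTB);
    return 0;
}
```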
 
I haven't seen any mention of swapping memory on PS5, anywhere.

The correct term here is swapping, or swap memory. There is none as far as we know; we only know of the memory-mapped installation data.

Virtual memory is a kernel abstraction which provides a lot of functionality; it doesn't mean that memory is disk-based. It allows segmentation (stack, heap, data, text, etc.); paging, which means splitting the physical address space into manageable pages that can be reordered to appear contiguous in the virtual address space; swapping, which is what you think is happening; and sharing pages across multiple applications, each mapping them into its own virtual space. It would also allow mapping the virtualized, uncompressed installation data, as I said previously.
Well, yes, if you’re looking at virtual memory as the virtualization of addresses mapped to physical addresses so that programs don’t break each other, for security, etc., then you're right. But it is also through virtual memory that we can extend memory onto other storage, like disk, with all of it appearing to programs as one continuous bank of memory.
I'm talking about that expansion of memory onto disk; in another thread it was revealed that there is such an expansion on PS4, so I assumed this would carry forward: a simple, perhaps small, amount of swap space for additional memory.
 