PlayStation 5 [PS5] [Release November 12, 2020]

Yes, it needs to go through the main system RAM, because it's a cache, not a local store. A cache, by definition, cannot contain anything that isn't backed by an address in main memory.

And if Sony had made customisations allowing the use of L3 as an SPU-style local store, they definitely wouldn't have been quiet about it.

Could they not memory-map parts of the SSD? (Not saying it's a smart decision :p)
 
It's a relatively new product that combines 32 GB of Optane 3D XPoint with 1 TB of NAND storage.


XPoint is stupidly expensive HPC stuff; it's hundreds of dollars for 32/64 GB quantities. It should be pretty obvious that something called "1TB Optane" at $280 isn't going to be comparable to Optane SSDs that cost over a grand for <1 TB of capacity, and one should look into the specifics rather than just the brand name. ;)

Ah, there's the catch; I didn't know it was divided, actually. Stupidly expensive indeed, and stupidly fast too :p
 
That would result in more erase/writes, though, would it not? To maintain coherency?

I was thinking about it in relation to this part of an earlier post, which should avoid the coherency issue since the data would not require writes or be modified.

I specifically said the CPU or GPU could avoid making access requests to the RAM by using the SSD directly for code that is less latency-sensitive.
For example, a small 100 KB script that determines enemy behaviour runs on the CPU and fits in a small portion of the L3. Does it ever need to go through the main system RAM?
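On a general-purpose OS, the closest expression of the "use the SSD directly" idea is memory-mapping the file. A minimal Python sketch follows (the file name and sizes are made up); note the caveat in the comments: demand-paged data still transits system RAM via the page cache, which is exactly the point made in the reply above.

```python
import mmap
import os

def map_readonly(path):
    """Map a file read-only so the OS demand-pages it from storage,
    instead of the program explicitly copying it into a buffer.
    Caveat: the paged-in data still lives in RAM (the page cache);
    this avoids an extra copy, not the trip through main memory."""
    fd = os.open(path, os.O_RDONLY)
    try:
        size = os.fstat(fd).st_size
        # mmap duplicates the descriptor, so closing fd afterwards is safe.
        return mmap.mmap(fd, size, access=mmap.ACCESS_READ)
    finally:
        os.close(fd)

if __name__ == "__main__":
    # Stand-in for the hypothetical 100 KB behaviour script from the post.
    with open("behaviour.bin", "wb") as f:
        f.write(b"\x42" * 100 * 1024)
    view = map_readonly("behaviour.bin")
    print(len(view))   # 102400
    print(view[0])     # 66 (0x42)
    view.close()
    os.remove("behaviour.bin")
```

A console could of course expose lower-level primitives than this, but the RAM-backing caveat would apply to any cache-coherent CPU access.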
 
Controller overhead. File system overhead. Etc.
http://people.csail.mit.edu/ml/pubs/fast2016.pdf



AFAIK DirectStorage's minimum read/write block is 64K.
That will be faster than 4K.



I think people just don't optimize for it. Because HDDs were slow, there was no real difference.
The typical FS block is 4K, so that's what you get.
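To make the request-size point concrete, here is a rough sketch (Python, with a hypothetical file name) that times a sequential read issued in 4K versus 64K requests. As the comment notes, the OS page cache will mask most of the real difference unless the file is larger than RAM, so treat this as an illustration of the mechanism, not a benchmark.

```python
import os
import time

def read_all(path, block_size):
    """Read a file end-to-end with fixed-size requests; return elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:  # buffering=0: each read() is a real request
        while f.read(block_size):
            pass
    return time.perf_counter() - start

if __name__ == "__main__":
    path = "blob.bin"  # hypothetical test file
    with open(path, "wb") as f:
        f.write(os.urandom(8 * 1024 * 1024))
    # Caveat: the page cache absorbs much of the difference at this size;
    # a fair test needs a cold cache or a file larger than RAM.
    for bs in (4 * 1024, 64 * 1024):
        print(f"{bs // 1024}K requests: {read_all(path, bs):.4f}s")
    os.remove(path)
```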



I've read the patent.
They are much, much bigger. But again, that's not on the agenda. :)

The loading test in the Gamers Nexus piece is not useful. He only tests launching the game to the menu...
:D

https://www.gamersnexus.net/guides/1577-what-file-sizes-do-games-load-ssd-4k-random-relevant

That noted, I personally found the fact that these games are only loading ~70MB (or less) of data during initial launch to be somewhat shocking. I'd anticipated gigabytes of reads, given the size of the games, but I suppose it makes sense: Watch Dogs, Titanfall, and the others don't need all their texture files and level data immediately accessible at launch. They just need the menu and the input functions, maybe a background video. So loading small amounts of data is fine, then the rest is fetched when the user progresses to a playing state. Surprising efficiency in the I/O department for these games. This perhaps indicates that storage isn't the only load-time bottleneck going forward -- maybe the CPU is getting choked-up on tasks (inefficient task queuing or execution?) or something similar.

A menu certainly won't be representative of the assets on disc. Same for the six-minute gameplay test: what is he doing to see nearly no assets loaded? Maybe on PC the 16 GB of RAM was already filled with data...

I think the Gamers Nexus test is not very convincing...

EDIT: And the chart here shows the number of files, not the total file size. If you are clever, you can merge the reads of these little files when you need to stream many 4K files at the same time.

He never gives the total size of the files; the few percent of files larger than 64 KB are probably gigabytes of data.

Edit: I would go further: it shows how unoptimized menu loading is. The main menu files are linked, so put them inside one archive file, load it, and extract them from that file in RAM.
 
Why doesn't that surprise me ;)

Learn to read: he doesn't test anything beyond loading to the menu. It's logical to see only sub-64K files there, and I would say you could merge all the static menu files into one file and go much faster than this.

The truly interesting number would have been the total size of the files, not only their count (probably gigabytes of data in files larger than 64 KB during the six minutes of gameplay). Another interesting thing would have been how many 4K files need to be streamed at the same time. The PS5 SSD has no problem loading a few 4K files; it loses efficiency when you need to stream many 4K files at once. If such files are linked, you can write and read them together as one big file. It is called optimisation.

Take the menu here: why load it as many little files? Create one big archive file with all the little parts of the menu, load it in one go, and extract the files in RAM.

This is advice they give directly in one of the patents.
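As a rough illustration of the merge-into-one-archive idea (the layout here is invented for the sketch, not taken from the patent): many small files are packed behind a tiny index, so loading them all becomes one sequential read instead of many scattered 4K requests.

```python
import json
import os
import struct

# Hypothetical layout: [4-byte little-endian index length][JSON index][raw blobs].

def pack(paths, archive_path):
    """Merge many small files into one archive so they load with a single big read."""
    index, blobs, offset = {}, [], 0
    for p in paths:
        with open(p, "rb") as f:
            data = f.read()
        index[os.path.basename(p)] = [offset, len(data)]
        blobs.append(data)
        offset += len(data)
    header = json.dumps(index).encode("utf-8")
    with open(archive_path, "wb") as out:
        out.write(struct.pack("<I", len(header)))
        out.write(header)
        for blob in blobs:
            out.write(blob)

def unpack(archive_path):
    """One sequential read replaces many scattered small-file requests."""
    with open(archive_path, "rb") as f:
        header_len = struct.unpack("<I", f.read(4))[0]
        index = json.loads(f.read(header_len).decode("utf-8"))
        payload = f.read()  # single large read for all the blobs
    return {name: payload[off:off + size] for name, (off, size) in index.items()}
```

Real engines use formats like this (pak/archive files) for exactly the reason discussed: the drive sees one big request instead of thousands of tiny ones.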

Again, try to make an actual argument; maybe you don't understand the test.

Edit: I found the article because I did not understand the results; after reading it I have a better understanding of the test, and I find nothing interesting in it. Just things we already know, and that game devs need to optimize their data (the menu).
 
Perhaps things have changed a little bit since the test, "Published August 11, 2014" ?
 

And what changed? File sizes did not change. You can optimize your data; the patent was written in 2016 and they explain it. A game menu is a menu: if it is composed of 4K, 16K, and 32K files, merge them into one file.

Same for other linked little files, and it applies to both reading and writing. If the dev knows he will load many little files, merge them.
 
I just figured maybe things in games have progressed a little bit since the era of optimizing for PS3/X360 consoles and also releasing on PCs. If they're no longer targeting PS360 hardware, that means assets grew in size, etc.

Is it really unthinkable that in 5.5 years things might have changed?
 

If file sizes grow, it helps the PS5 reach its maximum speed.

The weak spot of the PS5 SSD is streaming many 4K* files; they give an example in one of the patents.

*Probably sub-64K files.
 
I'm not saying that now things will have shifted to make one side better than the other or anything like that. I'm just commenting that after 5.5 years I'm fairly confident in the statement "maybe things have changed". It would be nice to have more recent data points.
 

I bring it up because people think Mark Cerny, and I would add MS with DirectStorage, are doing useless things. If they do it, they know what they are doing.
 
In Gamers Nexus' analysis you can find the tool he is using. I decided to run the tool and play some Wolfenstein (2014) and Doom Eternal (2020).

Both games run off my HDD while my OS runs off the SSD, so I thought the readings would be mostly accurate; however, the tool is very unstable and crashes quite a lot, so the results may not be 100% reliable.

What is weird is that I'm getting a lot more 4K write operations compared to Gamer's Nexus results. Anyway, you can download the tool here if you want to perform some tests.

I think I got more 256K and 512K readings in Doom Eternal because in Wolfenstein the tool crashed in the middle of gameplay, so it only recorded the HDD activity while playing, not while loading the save file and the game itself.



FOR SCIENCE¡
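For anyone wanting to crunch similar logs themselves, here is a small sketch (bucket boundaries chosen for this thread's discussion, not taken from the tool) that turns a list of recorded request sizes into percentage shares like the ones in the posted results.

```python
from collections import Counter

def size_histogram(request_sizes):
    """Bucket recorded I/O request sizes (in bytes) into percentage shares,
    similar to the breakdown in the posted screenshots."""
    buckets = Counter()
    for size in request_sizes:
        if size <= 4 * 1024:
            buckets["<=4K"] += 1
        elif size <= 64 * 1024:
            buckets["4K-64K"] += 1
        else:
            buckets[">64K"] += 1
    total = sum(buckets.values())
    return {label: round(100 * count / total, 1) for label, count in buckets.items()}

# Example with made-up sizes: 1 request at 4K, 9 requests at 64K.
print(size_histogram([4096] + [65536] * 9))  # {'<=4K': 10.0, '4K-64K': 90.0}
```

Note this counts requests, not bytes; as pointed out earlier in the thread, weighting by total bytes per bucket would tell a different (and arguably more useful) story.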
 

This looks good: only the 4K reads (10.2%) would be a problem for PS5/XSX in Doom Eternal without optimisation, and this is a current-gen game made with HDDs in mind. I expect to see more 1024K and 2048K files in next-generation games.
 

You are going to have a lot of cross-gen titles for some years. Plus, not every gaming PC has a massive SSD; I still run all of my games off an HDD, with some exceptions, and I have an above-average rig.
 

I'm speaking about next-gen only; you will see some from first parties on Sony's side at launch, and a few from third parties in 2021.
 
Cerny stated that the controller has to arbitrate on top of the M.2 SSD's two priority levels, and this overhead reduces its peak speed.
The impact was described in terms of bandwidth, rather than latency. I think Sony should be profiling this, since there can be large disparities in latency between the best/average case and 99th percentile depending on the quality and fullness of the drive. Can the PS5 assert itself over the expansion drive's potentially inconsistent firmware and controller, or would this mean that Sony's latency promises are not as iron-clad as the bandwidth ones?
(edit: corrected wrong word for bandwidth)
 

Probably the reason the game loads so fast on an SSD.

 