Ratchet & Clank technical analysis *spawn

Obviously that scenario doesn't apply to the PS5, which is why the SSD in its case was essential to the experience.
Well precisely. Some really dumb 'tech' arguments out there. There must be any number of PS4 games that could be dumped to RAM and run 1000x faster than from HDD. PS5's IO tech is about getting performance into a $400 console irrespective of what a $limitless PC can do. The only interesting discussions here are:

1) What's the minimum level of PC needed to match PS5, notably with the same 16 GB of RAM as PS5?
2) What happens when PS5's IO is being thrashed? R&C is only pulling <2 GB/s. How much GPU is needed to match the tiny, low-power decompression block PS5 uses? (Rough numbers sketched after this list.)
3) Are any games ever even going to hit those limits, or does it turn out that, just as no PC was ever supposed to need more than 640 KB of RAM, no game will ever need more than 3 GB/s of IO?
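
To put some very rough numbers on question 2 (the ~1.5:1 compression ratio is an assumption for illustration, not a measured figure; 5.5 GB/s is the raw figure from Cerny's talk):

#include <cstdio>

int main() {
    const double compressed_gbps   = 2.0;  // roughly what R&C seems to pull at peak
    const double compression_ratio = 1.5;  // assumed average ratio, GDeflate/Kraken-ish
    const double ps5_raw_gbps      = 5.5;  // PS5 raw SSD bandwidth from Cerny's talk

    std::printf("R&C-like load: %.1f GB/s compressed in -> ~%.1f GB/s decompressed out\n",
                compressed_gbps, compressed_gbps * compression_ratio);
    std::printf("Saturated PS5: %.1f GB/s compressed in -> ~%.1f GB/s decompressed out\n",
                ps5_raw_gbps, ps5_raw_gbps * compression_ratio);
    // The open question is how much GPU compute a DirectStorage-style GDeflate
    // pass needs to sustain those output rates while also rendering a frame.
    return 0;
}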

Question 3 is especially interesting. We are looking at a future scenario where more RAM and IO isn't actually necessary, that there's a natural limit, and maybe 16 GB and 3 GB/s will be perfect for next gen too, especially as software streams more. 8K is the upper limit for displays, I'm guessing. We're on the doorstep of the first real limits in tech where we can't benefit from more, and over the coming decades there won't be improvements beyond this 'perfect' setup.
 
This was known. Even on the PS5, they tested the slowest NVMe drive available and it was barely slower than the internal PS5 drive. There is a bottleneck elsewhere.

Or maybe the game doesn't need the max level of throughput that the PS5 internal drive can provide.
 
Or maybe the game doesn't need the max level of throughput that the PS5 internal drive can provide.
That is also correct, but you'd think they could get loading times from 2.5s down to something like 0.5s. Otherwise, a maxed-out PC is also a bit slower than the PS5, so there could be a bottleneck there as well that's unrelated to the storage device's speed.
 
PS5's data transfer advantage is effectively larger than DF claims because Alex's setup constantly fails to load in the highest mip levels. The idea that Nixxes overlooked this is far-fetched, considering how data-management-heavy the game is during its signature moments. We have seen this texture behavior before with Spider-Man: Miles Morales. It's starting to look like there's only so much Nixxes can do to try to keep up with PS5 loading performance.

Screenshot (280).jpg

Screenshot (278).jpg
 
That is also correct, but you'd think they could get loading times from 2.5s down to something like 0.5s. Otherwise, a maxed-out PC is also a bit slower than the PS5, so there could be a bottleneck there as well that's unrelated to the storage device's speed.

Maybe they want to sync up the animation / spinning of Ratchet with moving through the portal? Transitions that were too fast might break animation sync with what happens after the exit, or cause the animation in the rift to fast-forward and look broken / crappy.
 
2) What happens when PS5's IO is being thrashed? R&C is only pulling <2 GB/s. How much GPU is needed to match the tiny, low-power decompression block PS5 uses?
That's the big one, right? Cerny stated in his slides that the future games they envisioned use all freely available memory only for the next 30 seconds of gameplay, so every camera pan will drive the whole I/O block into overdrive. The PS5 can take it, also heat-wise... I'd imagine some PC M.2 SSDs start to throttle under such an onslaught, at least when they're not properly cooled.
 
I wonder if one of those hybrid HDDs would run R&C as well as an SSD.
I never owned one, and they're outdated tech now, but they usually performed as well as SSDs. At least until their cached data ran out.
 
That's the big one, right? Cerny stated in his slides that the future games they envisioned use all freely available memory only for the next 30 seconds of gameplay, so every camera pan will drive the whole I/O block into overdrive. The PS5 can take it, also heat-wise... I'd imagine some PC M.2 SSDs start to throttle under such an onslaught, at least when they're not properly cooled.

Actually, he foresees all game memory being utilized for only the next 1 second of gameplay. It will be a real tech milestone.
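
For a sense of scale (the ~8 GB/s typical-compressed figure is from Cerny's talk; the ~13.5 GB game-usable pool and ~100 MB/s effective HDD rate are assumptions for illustration):

#include <cstdio>

int main() {
    const double game_ram_gb      = 13.5; // assumed game-usable slice of 16 GB
    const double hdd_gbps         = 0.1;  // assumed ~100 MB/s effective HDD streaming
    const double ps5_typical_gbps = 8.0;  // Cerny's ~8-9 GB/s typical after decompression

    // HDD era: with ~30 seconds of lead time, this much RAM sits around holding
    // data the player only *might* need soon.
    std::printf("HDD, 30 s lookahead: ~%.1f GB of RAM buffering the near future\n",
                hdd_gbps * 30.0);
    // PS5: with ~1 second of lead time, a big chunk of the pool can be turned
    // over almost on demand instead of being reserved as a lookahead buffer.
    std::printf("PS5, 1 s lookahead: ~%.1f GB deliverable per second vs a %.1f GB pool\n",
                ps5_typical_gbps, game_ram_gb);
    return 0;
}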
 
Are these people serious? The problem isn't getting it to work on an older HDD (which DF proves is possible), but rather the benefits that a fast SSD/NVMe and new IO tech can provide during given gameplay elements. FYI, that warping time is shit in that clip.
Sure it is, but that's not the point. The point is that the warp time, while not immediate, works on very old tech at a relatively decent speed. Think of it like the classic elevator scenes in beat 'em ups, where one elevator is much faster than the other, but imho that doesn't seem to affect actual gameplay (dunno how Ratchet & Clank works as a game, but from what I see you immediately switch dimensions or whatever).

It might be an interesting tech for games like a future Super Mario and so on.
 
For what games? PS5 exclusives? How many??

Primarily, yes. In the beginning. But Sony's innovations will carry gaming tech in this direction of how data is managed. Which leads me to your follow-up question, which is a bit misguided - "How many?" is the wrong question to ask. Just know that more and more games will be made around this design philosophy until it becomes the norm. But PlayStation is spearheading all of this.
 
On the whole "SSD - is it needed or is it not" debate: I think the argument is a little silly tbh. Of course it's needed to achieve the results seen on the PS5 within the other constraints of that console. Is it also needed on PC? The answer for the most part seems to be yes, although nothing like the speed of the PS5 drive is required.

Thanks for taking the hit to your psyche and watching the NxGamer video. Your sacrifice is appreciated. 🫡

Curiosity got the better of me and I nabbed a copy late last night to check it out; I especially wanted to see how it ran from a 5400 RPM HDD. The biggest performance saving is avoiding those large GDeflate textures: set the textures to Medium and it's shockingly playable from an HDD. I wouldn't have a problem at all playing this game from an HDD frankly, but again - Medium textures only. They actually look quite good the majority of the time; the biggest difference is the ground textures, but this is certainly nothing like the first release of TLOU - Medium textures scale quite well and will look good.

Frankly, from my short experience with this so far, aside from the RT issues and some frame pacing problems (mostly in cutscenes), the biggest disappointment has been... DirectStorage GPU decompression. Even on an SSD and with 12 GB of VRAM, enabling High/Very High textures causes a noticeable overall GPU performance hit: there are small stutters even with enough GPU headroom, and the GPU is clearly doing more work, so you can't really hit DLSS 4K Performance at 60/High, which is mostly obtainable with Medium textures and everything else set accordingly. We'll see if patches iron this out; being the first release using DirectStorage 1.2, I'm sure there's room for improvement. Using the High preset but with Medium textures, it's actually impressive how well this game holds 4K 60 fps with DLSS Performance (which looks very good) on my 3060, with headroom on some worlds to increase it dynamically - it's just those texture issues. If they can wrangle those under control and fix the RT issues, it will be a solid port. It doesn't seem to be overly demanding from a CPU/GPU perspective outside of that.
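
For anyone wondering what "GPU decompression" means in practice here, this is a minimal sketch of a DirectStorage GDeflate read into a D3D12 buffer (my own illustration, not Nixxes' actual code; the file name and sizes are placeholders). The relevant bit is that the runtime decompresses GDeflate with compute work on the same GPU that's rendering, which is where the performance hit comes from:

#include <cstdint>
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Minimal illustration: one GDeflate-compressed read that the DirectStorage
// runtime decompresses on the GPU straight into a D3D12 buffer.
void LoadAssetWithGpuDecompression(ID3D12Device* device,
                                   ID3D12Resource* destBuffer,
                                   uint32_t compressedSize,
                                   uint32_t uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;          // GPU decompression needs the D3D12 device

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"asset.bin", IID_PPV_ARGS(&file));   // placeholder asset file

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // decoded by a GPU compute pass
    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size   = compressedSize;
    request.UncompressedSize   = uncompressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();
    // A real engine would also enqueue a status/fence signal and wait on it
    // before using the asset; omitted here for brevity.
}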

(Note settings and system specs are in each video's description. Each video was done after a fresh reboot so nothing is cached.)

Opening rift sequence on HDD:


General loading performance, jumping between worlds, and streaming performance on worlds too (not sure if there is much streaming with Medium textures, mind you):

 
Frankly, from my short experience with this so far, aside from the RT issues and some frame pacing problems (mostly in cutscenes), the biggest disappointment has been... DirectStorage GPU decompression. Even on an SSD and with 12 GB of VRAM, enabling High/Very High textures causes a noticeable overall GPU performance hit: there are small stutters even with enough GPU headroom, and the GPU is clearly doing more work, so you can't really hit DLSS 4K Performance at 60/High, which is mostly obtainable with Medium textures and everything else set accordingly. We'll see if patches iron this out; being the first release using DirectStorage 1.2, I'm sure there's room for improvement.

Opening rift sequence on HDD:


I assume you're using a 3060 12GB? Interesting info about the GPU decompression. Assuming the higher-res textures aren't in themselves causing the frame rate drop (you shouldn't be VRAM limited, but I guess they could be sapping VRAM bandwidth?), that is certainly something that warrants further investigation.

It may be worth deleting the DirectStorage DLL from the game to force it off, like @HolySmoke did, to see if that frees up the GPU at all (but presumably loads up the CPU).
 
Some more interesting stuff. I figured that since DirectStorage bypasses Windows' file cache, why not set up a RAM cache in PrimoCache?

So I put a small 4 GB read cache on my SATA SSD and played through the entire first mission, up until you land in Nefarious City. It got a cache hit rate of 89.6%.

X5G0fWr.png
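
Rough illustration of why that hit rate matters (the RAM-cache and SATA read speeds below are assumed round numbers, not measurements from my setup):

#include <cstdio>

int main() {
    const double hit_rate  = 0.896; // PrimoCache hit rate from the run above
    const double ram_gbps  = 10.0;  // assumed effective RAM-cache read speed
    const double sata_gbps = 0.5;   // assumed SATA SSD sequential read speed

    // Time per GB is a weighted mix of hits (served from RAM) and misses
    // (served from the SSD), so effective bandwidth is the harmonic combination:
    const double effective = 1.0 / (hit_rate / ram_gbps + (1.0 - hit_rate) / sata_gbps);
    std::printf("Effective read bandwidth: ~%.2f GB/s\n", effective);
    return 0;
}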
 