Ratchet & Clank technical analysis *spawn

Well, it seems like the CPUs are fast enough anyway that there's more CPU headroom than GPU headroom in this title.

Absolutely. In my own testing, disabling DS sees a very modest CPU usage increase of maybe around 10% at Very High, for a pretty big overall performance increase. The actual decompression load doesn't seem that high at all, at least in the test areas I've seen. It makes me wonder whether there's something wrong with the NVIDIA implementation, given that changing the texture settings seems to have no impact at all on AMD.
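
For what it's worth, DirectStorage does expose an official knob for exactly this trade-off. A minimal sketch, assuming that however people are disabling DS in this game ends up on the CPU fallback path; `ForceCpuDecompression` is my name for illustration, not anything from the port:

```cpp
// DirectStorage 1.1+ lets an application opt out of GPU decompression
// before the factory is created, pushing GDeflate work onto CPU threads.
// If disabling DS in this game amounts to something like this, a modest
// CPU-usage bump with DS off is exactly what you'd expect.
#include <dstorage.h>

void ForceCpuDecompression()
{
    DSTORAGE_CONFIGURATION config = {};     // zero-initialized = defaults
    config.DisableGpuDecompression = TRUE;  // decompress on the CPU instead
    DStorageSetConfiguration(&config);      // must run before DStorageGetFactory
}
```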
 
Absolutely. In my own testing, disabling DS sees a very modest CPU usage increase of maybe around 10% at Very High, for a pretty big overall performance increase. The actual decompression load doesn't seem that high at all, at least in the test areas I've seen. It makes me wonder whether there's something wrong with the NVIDIA implementation, given that changing the texture settings seems to have no impact at all on AMD.

Did you ever upgrade your CPU?
 
It feels like most of the income Sony gets from ports is from people doing research to prove PC is better than PS :D
It feels like most of the reason Sony releases PC games in a suboptimal state is to convince people PS is better than PC ;)
 
Testing the Rift sequence at a resolution where I'm CPU bound with my 12400f and 3060.

1080p, DLSS Ultra Performance, High Preset, Very High Textures, no RT. In combat I'm usually around 110 fps with GPU utilization in the 80% range at these settings.

System was rebooted between each test.

DirectStorage Enabled:

[screenshot: benchmark results]

DirectStorage Disabled:

[screenshot: benchmark results]
 
BTW I regret to inform you that if you're tired of the VRAM debate, a 50-gallon drum of kerosene just got thrown onto the fire:

[image: 1080p VRAM benchmark chart]
 
I don't think there's much of a debate to be had. This is a PS5 port, originally working with 12.5GB of available VRAM. It stands to reason it would take as much VRAM as possible, so 8GB not being nearly enough is completely normal and expected. 8GB has been mid-range on PC since like 2016.

There was some theorizing that DS could assist with this (for reasons that weren't entirely clear to me), but I'm just pointing out it's yet more grist for the YouTube mill. What I found somewhat noteworthy about the above is that it's at 1080p, so even conforming to Nvidia's silly marketing of the 4060/Ti-class cards, it still comes up short.

But yes, 8GB being midrange on PC since 2016 is exactly why there's such a fervor about it.

Edit: To be clear, this isn't a dig at Nixxes at all; unlike TLOU at release, where this game excels is in its adaptability. While we have "yet another game that falters on 8GB cards" in terms of matching the console version pixel for pixel, you can at least drop the textures to High and keep the vast majority of the detail, unlike TLOU at launch, where medium textures turned into soup. Likely having to axe RT as well is a bitter pill for a $400 Nvidia card, but you can at least retain the majority of the game's presentation on 8GB cards and still take advantage of other superior aspects like DLSS.
 
BTW I regret to inform you that if you're tired of the VRAM debate....

[gif: Alfred Pennyworth reaction]

There was some theorizing that DS could assist with this (for reasons that weren't entirely clear to me), but I'm just pointing out it's yet more grist for the YouTube mill. What I found somewhat noteworthy about the above is that it's at 1080p, so even conforming to Nvidia's silly marketing of the 4060/Ti-class cards, it still comes up short.

DS assisting GPUs with lower amounts of VRAM came from the idea that fast, low-overhead, "just in time" loading of textures or texture tiles meant you could store less in VRAM at any one time, and therefore get away with less VRAM in total. This was at times tied in with MS's talk of their "Velocity Architecture" and the use of Sampler Feedback to give you certain knowledge of the first frame on which a higher texture LOD was needed, so you could load it in time for the next frame (or thereabouts).
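
To make that concrete, here's a minimal sketch of the "just in time" idea using the public DirectStorage API: a single mip gets enqueued only once feedback says it's about to be sampled. The function name and parameters are mine for illustration; nothing here is pulled from the game:

```cpp
// Sketch: stream one mip of a texture through DirectStorage so it only
// occupies VRAM once a sampler-feedback pass reports it's needed.
// Assumes an already-created IDStorageQueue, an open IDStorageFile, and a
// destination texture with storage allocated for the incoming mip.
#include <dstorage.h>

void StreamMipJustInTime(IDStorageQueue* queue, IDStorageFile* file,
                         ID3D12Resource* texture, UINT firstMip,
                         UINT64 fileOffset, UINT32 compressedSize,
                         UINT32 uncompressedSize)
{
    DSTORAGE_REQUEST req = {};
    req.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_MULTIPLE_SUBRESOURCES;
    req.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // GPU-decompressed
    req.Source.File.Source = file;
    req.Source.File.Offset = fileOffset;
    req.Source.File.Size   = compressedSize;
    req.UncompressedSize   = uncompressedSize;
    req.Destination.MultipleSubresources.Resource         = texture;
    req.Destination.MultipleSubresources.FirstSubresource = firstMip;

    queue->EnqueueRequest(&req);
    queue->Submit(); // I/O, decompression, and upload all happen off the game thread
}
```

The point of the scheme is that the CPU never touches the texel data: the request goes disk -> staging -> GPU decompression -> VRAM, so late-bound loads don't cost frametime the way a traditional load-and-memcpy path would.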

R&C doesn't seem to do anything quite this fine grained AFAICS. It appears to stream data (and I think whole texture LODs*) as you move, like Spiderman, but also has portals where it dumps a whole new chunk of map real fast.

And yeah, 8GB on the 4060 and especially the 4060Ti is a legendarily "cynical as fuck", market dominance enabled piece of product debasement. And customer debasement. And platform debasement.

*sometimes, at least on PC. Sometimes on PC it just forgets like "lol nm".
 
but then the game supports RT on Intel Arc GPUs and runs fine, so something is amiss here.
It's a problem with AMD drivers this time; the game works fine on their hardware without RT. Even the new Blender RT acceleration for Intel and AMD is less stable on AMD than on Intel Arc right now.

Intel’s RT implementation is more stable than AMD’s. It feels stable enough to rely on Arc right now, but since some projects failed to render with HIP-RT, we can’t say the same about AMD at the moment.
 
Damn, the game definitely has a camera microstutter problem, at least when locked to 60fps. It's sporadic and very small, but it definitely exists to varying degrees in most scenes. Not surprising people have missed this but going back and forth between it and the PS5 (and well, most PC games) confirms it. Rivatuner does not remedy this.


This greatly exaggerates the stutter; it's not as egregious as YouTube makes it out to be, but it does show where the skips occur. It's not something you're going to see with a mouse: you'll need a gamepad, be turning very slowly, and be in an area with good contrast to catch it.

Edit: Hmm forcing vsync from the Nvidia CP may fix it, will have to play with it more.

Yep, forcing vsync fixes it. Also fixes the far more egregious frame pacing issues during cutscenes.

Update: Crap, nah. Vsync helps the frequency of it, but the microstutter still happens.
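
One way to pin down whether this is a frame-pacing problem or (as it looks) a camera/animation-update problem is to log per-frame deltas yourself. A minimal sketch of the idea (PresentMon or the RTSS overlay are the more rigorous tools): if the on-screen skip doesn't line up with a frametime spike, frames are being delivered evenly and the hitch is in the game's simulation/camera step:

```cpp
// Call tick() once per rendered frame; flags frames that blow past a
// 60 fps budget by 50% or more. A visible hitch with no flagged frame
// points at the camera/animation update rather than frame delivery.
#include <chrono>
#include <cstdio>

struct FrametimeLogger {
    using clock = std::chrono::steady_clock;
    clock::time_point last = clock::now();

    void tick() {
        const auto now = clock::now();
        const double ms = std::chrono::duration<double, std::milli>(now - last).count();
        last = now;
        if (ms > 16.7 * 1.5)
            std::printf("spike: %.2f ms\n", ms);
    }
};
```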
 
If anything, Ratchet confirms to me that PC's current architecture *without DirectStorage* on a Gen3 drive... is already damn close to PS5's custom I/O, and can already handle what is likely the most demanding game from an I/O standpoint
Couldn't have said it better myself. Currently, DirectStorage 1.2 doesn't even improve loading times or portal times in the slightest; they are the same with DS on or off. So in reality, current PC systems are already running the game just fine under the old paradigm.

 
Damn, the game definitely has a camera microstutter problem, at least when locked to 60fps. It's sporadic and very small, but it definitely exists to varying degrees in most scenes. Not surprising people have missed this but going back and forth between it and the PS5 (and well, most PC games) confirms it. Rivatuner does not remedy this.


This greatly exaggerates the stutter; it's not as egregious as YouTube makes it out to be, but it does show where the skips occur. It's not something you're going to see with a mouse: you'll need a gamepad, be turning very slowly, and be in an area with good contrast to catch it.

Edit: Hmm forcing vsync from the Nvidia CP may fix it, will have to play with it more.

Yep, forcing vsync fixes it. Also fixes the far more egregious frame pacing issues during cutscenes.
Was going to say that I don't have that issue lol.
 
Anyone else's vsync not working right?

The frame rate always locks 2fps below the vsync limit.

So with 60Hz vsync, the frame rate locks to 58fps, and so on.
Do you have Reflex enabled? If it's on (or On + Boost), it will do that, I noticed. There's something bugged with it which causes frametime spikes and the framerate to never reach its target fps. Also, of course, if you have framegen on it will automatically force Reflex, so disable that too.

Sucks, but that's what has to be done until they fix it.
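
For context on why Reflex sits below the vsync limit at all: its built-in limiter deliberately holds the game a touch under the refresh interval so the render queue never backs up (that's the latency win). Here's a sketch of the NVAPI call the Reflex SDK wraps, assuming a standard integration (and that NvAPI_Initialize has already run); whether this port's bug is in the game or the driver, I can't say:

```cpp
// Reflex "Low Latency Mode" via NVAPI. With vsync on, the driver-side
// limiter targets slightly below refresh (e.g. ~58 fps at 60 Hz), which
// matches the behaviour described above.
#include <nvapi.h>

void EnableReflex(IUnknown* d3dDevice)
{
    NV_SET_SLEEP_MODE_PARAMS params = {};
    params.version           = NV_SET_SLEEP_MODE_PARAMS_VER;
    params.bLowLatencyMode   = true;   // Reflex "On"
    params.bLowLatencyBoost  = false;  // set true for "On + Boost"
    params.minimumIntervalUs = 0;      // 0 = let the driver choose the cap
    NvAPI_D3D_SetSleepMode(d3dDevice, &params);
}
```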
 
Do you have Reflex enabled? If it's on (or On + Boost), it will do that, I noticed. There's something bugged with it which causes frametime spikes and the framerate to never reach its target fps. Also, of course, if you have framegen on it will automatically force Reflex, so disable that too.

Sucks, but that's what has to be done until they fix it.

It is on, from memory, but FG isn't. I'll try turning it off, thanks.
 