Ratchet & Clank technical analysis *spawn

Especially since the other DF members always, ALWAYS, say that if a 40fps mode is available and stable, it is the best option - the best mix of visuals, framerate and responsiveness. Alex sits next to them nodding his head as if he has to.
What he really thinks is shown here...
And yeah, he said he doesn't have a capture card for such modes - ok.

But then, if scheduling a cooperation between Alex and the member with such a card (Rich) isn't possible, why not simply address it in the video?
Just a mention of the (at least visually) better PS5 mode and why it isn't shown.
And to pretend that it is all only about PC is such a bad-faith argument as well.
We all know that as soon as there's PS5 footage and PC footage, the secret video title is "Game xyz - PC vs PS5"
I think you are overreacting; it's just a PC-focused analysis, they choose what they want to put in it, and there are already several video comparisons of PS5 quality mode vs PC.
As long as all the facts presented in the video are correct, I see no problem with these videos.
For a PC-centric video, performance mode may be better, because PC players don't like playing under 60fps - it's their 30fps mode!

Everybody should just breathe and relax; these are just games after all, and tech analyses are just a nice way to compare tech and programming skills.
 
I'm not sure it's expected when RT is in use. And yeah, without knowing what the DRS bounds are, it's really hard to make performance comparisons. We would need some scenes with resolutions matched.
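Just to illustrate why that matters: if you normalize fps by rendered pixel count, "60fps" at a DRS'd 1080p is a very different GPU load from 60fps at a locked 1440p. A trivial sketch (made-up numbers, and fps doesn't scale perfectly linearly with resolution, so treat it as a sanity check rather than a methodology):

```cpp
#include <cstdio>

// Crude normalization: frames per second per megapixel rendered. Only
// meaningful as a sanity check, since fps doesn't scale linearly with
// resolution, but it shows how much a DRS drop skews a comparison.
double fpsPerMegapixel(double fps, int width, int height)
{
    return fps / (width * static_cast<double>(height) / 1e6);
}

int main()
{
    // Hypothetical: both machines show "60fps", but DRS has the console
    // at 1080p while the PC holds a fixed 1440p (1.78x the pixels).
    std::printf("PC  @ 2560x1440, 60fps -> %.1f fps/MP\n",
                fpsPerMegapixel(60.0, 2560, 1440));
    std::printf("DRS @ 1920x1080, 60fps -> %.1f fps/MP\n",
                fpsPerMegapixel(60.0, 1920, 1080));
    return 0;
}
```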

That's a good point, I'd forgotten about the inclusion of RT, and yes, you would usually expect Turing to perform relatively better in that case. I suspect what's happening here is that the RT implementation is very light. It's almost certainly a very similar implementation to the one in Spider-Man - which had reflections on virtually every building and was still pretty light judging by the AMD comparisons - and this is presumably a lot lighter than that. As I understand it, R&C doesn't use RT reflections on bodies of water, for example. It'll be interesting to see how AMD GPUs handle the PS5-equivalent RT settings, but I suspect that, like in other RT-light games, the comparison to Turing will be favourable.

On the other hand, if it were possible to ramp up the RT settings on the PS5 to the higher PC settings, then I'd expect you would see the Turing advantage come into play. However, as I've commented in the past, these comparisons are always optimal for the console, so that's not something we can test.
 
And again, that is a Sony problem. They should have been investing in more studios to handle PC ports.

Right now your post is just full of excuses for Sony shipping terrible ports to PC gamers while still wanting the full launch-day MSRP of the PS5 game years later. It's appalling that you are defending them charging $60 for a game that performs worse than one I can walk into Walmart and buy for $40 right now.

The Last of Us still has huge issues months after release, but you are grateful for that?
I'm not 'making excuses', I'm explaining that the situation we have right now is ultimately still pretty good for us as consumers. And again, calling this a 'terrible port' is just straight hyperbole. With such standards, you're unlikely to ever be happy with 95% of future demanding games. It's just not realistic, quite frankly.

I'm just a fan of having some perspective, that's all. I've got no special love for Sony or anything.
 
Plus, the DirectStorage API is still kind of new. Some pipe-cleaner games will be needed for devs & co. to figure out how best to use it, imo.
There's scope for improvement in basically all three areas - Microsoft's API, game developers, and potentially even Nvidia/AMD in terms of optimizing utilization. This is going to be the paradigm of the future, so it's inevitable that there'll be ways to make it better after the VERY FIRST game properly utilizing it.
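For anyone wondering what "using it" actually looks like, here's a rough C++ sketch of a single DirectStorage request, going off my reading of Microsoft's public samples - the file name, sizes and D3D12 plumbing are placeholders, so take it as illustrative rather than gospel:

```cpp
#include <dstorage.h>   // DirectStorage SDK (NuGet package / redistributable)
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: enqueue one compressed asset read that lands directly in a GPU
// buffer, with decompression handled by the runtime (GDeflate, DS 1.1+).
void LoadAssetViaDirectStorage(ID3D12Device* device,
                               ID3D12Resource* gpuBuffer,
                               UINT32 compressedSize,
                               UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.pak", IID_PPV_ARGS(&file)); // placeholder path

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = compressedSize;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();   // completion is normally tracked with an ID3D12Fence
}
```

The tuning space around this - batch sizes, queue priorities, when to fall back to CPU decompression - is presumably exactly what those pipe-cleaner games will be working out.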
 
shadows look really nice

[screenshot]
 
This is a rare example where we get to see a more accurate GPU comparison. The PS5 performance non-RT mode runs at native 1440p at 80-100fps with VRR, and it is obliterating this 3070 coupled with a Ryzen 3600.

 
This is a rare example where we get to see a more accurate GPU comparison. The PS5 performance non-RT mode runs at native 1440p at 80-100fps with VRR, and it is obliterating this 3070 coupled with a Ryzen 3600.


An accurate GPU comparison where a clear CPU bottleneck can be seen?

Also, DLAA is a far superior AA and can have quite a noticeable performance hit, so it's quite a poor comparison really and could be better.

And what matched settings with PS5 are being used?

And is there a video of this same scene on PS5 with an accurate frame rate counter available?
 
Yes, because the PS5 has a similar CPU - hence why I said accurate GPU comparison; the CPU is a standardized variable here.
A 6 core CPU is similar to an 8 core?
Lol, just stop.

I'm sorry that facts hurt your feelings.

PS5 uses DRS, which in performance non-RT mode has a lower bound of native 1080p, so it's not a static 1440p as you stated (performance RT mode also uses DRS and can drop to 1080p).

So unlike the PS5, the 3070 is running at a fixed 1440p.

Do your homework before posting next time, as it saves this thread getting clogged with useless posts.
 
A 6 core CPU is similar to an 8 core?


I'm sorry that facts hurt your feelings.

PS5 also uses DRS, which in performance non-RT mode has a lower bound of native 1080p, so it's not a static 1440p as you stated (performance RT also uses DRS and can drop to 1080p).

So unlike the PS5, the 3070 is running at a fixed 1440p.

Do your homework before posting next time, as it saves this thread getting clogged with useless posts.

The PS5 non-RT mode typically runs at 1440p. Stop trying to use DRS as an excuse, it's embarrassing. Do you see me trying to excuse the PS5 because the ultra textures that would blow the 3070's framebuffer to smithereens aren't being selected? The 3070 is being EASILY outperformed by the PS5. Full stop.

And the CPU isn't bottlenecking the 3070; you can see his framerate scale with adjustments to the non-RT presets. Even at 1440p medium settings the 3070 barely makes it into the 60s. Not to mention the PC's 3600 runs in excess of 4GHz, while a PS5 CPU core is reserved for the OS.

 
Especially since the other DF members always, ALWAYS, say that if a 40fps mode is available and stable, it is the best option


John Linneman said:
To a man, the DF team reckons that the performance RT option is the way to go

It's almost as if you're making it up as you go along....
 
The PS5 non-RT mode typically runs at 1440p. Stop trying to use DRS as an excuse, it's embarrassing.
PS5 uses DRS and can drop to 1080p.

That is a fact.
The 3070 is being EASILY outperformed by the PS5. Full stop.
To which you've provided no PS5 data to go along with anything.
And the CPU isn't bottlenecking the 3070; you can see his framerate scale with adjustments to the non-RT presets.
His GPU load is fluctuating; it's CPU-limited.

PS5 uses DRS and custom quality settings that can't be matched on PC, so any attempt to compare on an apples-to-apples basis is idiotic.
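FWIW, the "GPU load is fluctuating" observation is easy to sanity-check if the capture logs utilization alongside fps: sustained GPU usage well under ~90% while the game is below its fps target usually means the GPU is waiting on the CPU. A toy version of that heuristic (the threshold is my own guess, not any kind of standard):

```cpp
#include <cstdio>

// Heuristic only: a GPU that isn't near-fully utilized while fps is below
// target is usually waiting on the CPU (or on streaming), so the capture
// says little about raw GPU throughput.
bool looksCpuLimited(double gpuUtilPercent, double fps, double targetFps)
{
    return gpuUtilPercent < 90.0 && fps < targetFps;
}

int main()
{
    // Hypothetical sample from an overlay log:
    std::printf(looksCpuLimited(78.0, 54.0, 60.0)
                    ? "Looks CPU-limited\n"
                    : "Looks GPU-limited\n");
    return 0;
}
```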
 
What a ridiculous thing to say.
It's really not; DF's testing just confirmed it. An HDD with cached data in RAM (i.e. preloading necessary data, like many games do) breezes through the portal sequence just like the PS5. In other words, rearchitecting the game to take advantage of PC hardware makes the SSD + decompression block irrelevant. Again, it's already been done in Dishonored 2, Titanfall 2 and even Psychonauts 2, so this is nothing new. A regular SSD works even better, too.
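And to be concrete about what "cached data in RAM" means: nothing exotic, just reading assets off the slow drive once up front so that later "loads" are memory lookups. A minimal sketch (hypothetical class and paths, nothing game-specific):

```cpp
#include <fstream>
#include <string>
#include <unordered_map>
#include <vector>

// Minimal sketch: preload assets from a slow HDD into system RAM up front,
// so later "loads" are memcpy-speed lookups instead of seeks on spinning rust.
class AssetCache {
public:
    bool preload(const std::string& path) {
        std::ifstream in(path, std::ios::binary | std::ios::ate);
        if (!in) return false;
        std::vector<char> data(static_cast<size_t>(in.tellg()));
        in.seekg(0);
        in.read(data.data(), static_cast<std::streamsize>(data.size()));
        cache_[path] = std::move(data);
        return true;
    }
    // Returns nullptr if the asset was never preloaded.
    const std::vector<char>* get(const std::string& path) const {
        auto it = cache_.find(path);
        return it == cache_.end() ? nullptr : &it->second;
    }
private:
    std::unordered_map<std::string, std::vector<char>> cache_;
};
```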

No, 16GB is not enough if you don't also have the VRAM on your GPU.
32GB is now becoming the standard on PCs, which means they are more than ready to handle "data-intensive" games.

I don't think he was referencing PC hardware when describing capabilities that weren't possible before. That statement was made by a dev whose franchise has been more or less exclusive to PlayStation since its inception. Consoles don't have the option to cache to system RAM to deal with the bandwidth limitation presented by HDDs.
I wasn't targeting the developer with my previous post, I was targeting the people who claimed an SSD + decompression block is necessary on PC.
 
It's really not; DF's testing just confirmed it. An HDD with cached data in RAM (i.e. preloading necessary data, like many games do) breezes through the portal sequence just like the PS5. In other words, rearchitecting the game to take advantage of PC hardware makes the SSD + decompression block irrelevant. Again, it's already been done in Dishonored 2, Titanfall 2 and even Psychonauts 2, so this is nothing new. A regular SSD works even better, too.

So it's not really running off an HDD then, now is it?
 
Lol, just stop.

I mean, it's true. DLAA is superior to TAA and has a performance cost. And performance RT mode in R&C is dynamic 1440p, going down to 1080p. What is the average res when it's running unlocked?

Secondly, R&C on the PC has an issue with changing settings back and forth - the game's performance will plummet and stay there until the game is relaunched if you keep bouncing between settings. I don't know if that's the issue here, but that is simply not reflective of what that class of GPU delivers.

For example, here's 1440p High, DLAA, from that video. Maybe the 3070 is VRAM-limited at those settings - it would be with RT, and this is known - but at 1440p, no RT, and only High? Highly doubtful 8GB is a barrier here.

Regardless, they get a paltry 51fps:

[screenshot: 3070 at 1440p High + DLAA - 51fps]

Here's 1440p, DLAA High, no RT in the same area (actually showing more geometry here): 58fps. On my 3060.

[screenshot: 3060 at 1440p High + DLAA - 58fps]
Here's TAA, 69fps:
[screenshot: 3060 at 1440p High + TAA - 69fps]



And the CPU isn't bottlenecking the 3070; you can see his framerate scale with adjustments to the non-RT presets. Even at 1440p medium settings the 3070 barely makes it into the 60s. Not to mention the PC's 3600 runs in excess of 4GHz, while a PS5 CPU core is reserved for the OS.

Yes, which should have clued you in that something was up, if you actually care. At 1440p, TAA, Medium preset, I'm in the mid-70s in that area.

And the CPU isn't bottlenecking the 3070; you can see his framerate scale with adjustments to the non-RT presets. Even at 1440p medium settings the 3070 barely makes it into the 60s. Not to mention the PC's 3600 runs in excess of 4GHz, while a PS5 CPU core is reserved for the OS.

Dude, I own this game on PS5, and compared to my 3060 I think it's the superior platform to play it on atm. DLSS performance mode at 4K, though, when compared to R&C's non-RT performance mode (which tops out at 1800p and has a lower bound of 1080p, but is far more commonly near the upper bound according to DF - so the best-case scenario for resolution on the PS5 when aiming for 60fps) actually looks superior a good deal of the time despite the lower native res, with the exception of some characters' fur when combined with DoF. This really shouldn't be that surprising. I think DLSS has some flaws in some games, sure, but more often than not it deserves its reputation as the gold standard for reconstruction tech atm, for good reason.

So no, that video doesn't show us anything. A 3070 not managing 1440p/60 on High with no RT is simply not normal. You're excited about it because you're a fanboy whose only purpose on this forum is to constantly reiterate that. Go back to NeoGAF.
 
DLAA (not DLSS) is not superior to PS5 ITGI
DLSS renders at a lower resolution, then upscales the image using AI and applies AA using AI as well, to produce an image that is near, equal to, or better than native resolution + TAA, depending on the situation. DLSS is designed primarily to boost fps while minimizing any image-quality loss.

DLAA renders the game at native resolution, then applies AA using AI to boost the quality to near-supersampling levels, so it's universally better than native + TAA. DLAA is designed to boost image quality at the expense of fps. DLAA is the ultimate form of DLSS.

In R&C, even DLSS provides a superior image to ITGI, so naturally DLAA provides an even better image than DLSS, at the cost of some performance (typically around 10% of fps).
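And just so the modes are concrete: the difference between them is simply the internal render scale before the AI pass. A quick sketch using the commonly cited DLSS input scales (Quality 0.667, Balanced 0.58, Performance 0.5, DLAA 1.0 - these are the usual defaults as I understand them, and games can override them):

```cpp
#include <cstdio>

// Internal (pre-upscale) resolution for a given DLSS mode and output
// resolution. Scale factors are the commonly cited Nvidia defaults.
static void printInternalRes(const char* mode, double scale, int outW, int outH)
{
    std::printf("%-12s %4dx%-4d internal -> %4dx%-4d output\n",
                mode,
                static_cast<int>(outW * scale), static_cast<int>(outH * scale),
                outW, outH);
}

int main()
{
    printInternalRes("DLAA",        1.0,   2560, 1440); // native res + AI AA only
    printInternalRes("Quality",     0.667, 2560, 1440);
    printInternalRes("Balanced",    0.58,  2560, 1440);
    printInternalRes("Performance", 0.5,   3840, 2160); // 4K perf = 1080p internal
    return 0;
}
```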
 