Ratchet & Clank technical analysis *spawn

I see proprietary consoles being replaced with "console" form factor PCs. I see publishers developing only one version of a game, which scales with a wide range of hardware and can be deployed across all PC and cloud streaming devices.
You've just described the PC, and there are already compact PC options out there now and they cost predictably what PCs of those specifications should cost.

Consoles provide a performance profile at a cost that simply isn't achievable on PC because the economic model is selling the console cheap, at cost, or at a loss and being able to recover those costs over time as people buy games from the platform holder.

If somebody like Dell could sell a 12TF, 16GB PC like Series X for $400, then Microsoft could do the same for a fraction of the cost, and make games super-affordable with GamePass on top.
 
You've just described the PC, and there are already compact PC options out there now and they cost predictably what PCs of those specifications should cost.

Consoles provide a performance profile at a cost that simply isn't achievable on PC because the economic model is selling the console cheap, at cost, or at a loss and being able to recover those costs over time as people buy games from the platform holder.

If somebody like Dell could sell a 12TF, 16GB PC like Series X for $400, then Microsoft could do the same for a fraction of the cost, and make games super-affordable with GamePass on top.
Indeed.

Consoles provide a performance profile at a cost that isn't achievable on PC in part because games aren't developed with PC specifically in mind from the beginning. The prices Sony and Microsoft pay buying components in bulk aren't what consumers would pay for those same components at retail, but yes, I agree to an extent. You'd also save on R&D, as well as on development costs, by being able to focus on a single platform that scales, instead of that platform plus 2-3 different bespoke console SKUs on top of it. You'd be able to easily switch between the various component vendors and really have more choice and control over everything.

I mean, they're at the point where they're releasing mid-gen refreshes every 3 years or so already anyway... What's going to happen going forward if they decide they want to break free from some specific hardware and lose compatibility? When does it become more work than it's worth? They have to build up an install base every time.

All I'm saying is that eventually I see a world where these platform holders simply publish their games on their own client app platforms, which work natively on any PC-based device or through the cloud on streaming platforms, and which provide the networking, subscription, and account infrastructure for their players. 3rd parties will continue to make deals with all the various platform stores, and those stores will take a cut as usual. I just think it gets to a point where consumers are like, "I just want to play my games literally anywhere"... and there's already a general-purpose platform which allows them to do that, basically.
 
All I'm saying is that eventually I see a world where these platform holders simply publish their games on their own client app platforms, which work natively on any PC-based device or through the cloud on streaming platforms, and which provide the networking, subscription, and account infrastructure for their players.
What you describe feels like what is already the situation on PC, where major publishers insist on selling their wares through their own stores/platforms, and where even when games can be launched from other platforms (like Steam), they typically don't update unless you're running a great pile of platform/store clients, which eats a ton of resources.

That is not the ideal future of PC gaming from my perspective, and it still does not address the economics of making hardware more affordable/accessible mentioned previously, because - again - that only works when somebody controls the sales of all games, from which they take a cut in order to lower the cost of the base hardware.
 
just seen this

Performance Review from IGN

Comparing the RTX3080? with the PS5 and HDD tested


He's really outdone himself this time. Essentially concluding that the PS5 hands in a better performance than a 5800X3D + RTX 3080.

Mental gymnastics to achieve this impressive feat include:

  • Running the PC at a capped framerate in average frame rate comparisons (but he says it wouldn't impact the result that much so it doesn't matter - despite there being no reason to actually have the cap on in the first place; see the quick numeric sketch at the end of this post).
  • Using portal transition sequences (where admittedly we do see more stutter on PC) to measure highest frame times and then using that as an indicator of general relative performance because the PS5 is "twice as fast as the PC in the lows"
  • Acknowledging the PC can use higher settings and achieve better image quality but claiming this system isn't powerful enough to enjoy them using the example of setting everything to maximum again (as opposed to something far more optimised for this system)
  • Repeatedly claiming the 10GB on the 3080 is a limitation vs the PS5 and using settings (4k DLSS Q, Max) well in excess of the PS5's to demonstrate that.
  • Naturally running with DS enabled despite the widely available info showing big performance gains by disabling it
  • My favourite - Head to head performance comparisons with DRS and the same assumed frame rate target in use on both systems and then simply assuming they are running at the same resolution, despite the fact that with the same frame rate target (45fps), the more powerful card could consistently be running at a higher resolution (see example screenshot below)
[Attached screenshot: Screenshot 2023-08-05 163334.png]

Look at the detail in the circuitry in the foreground left. Much of it is obscured by the frame time graph on the PC, but the parts that aren't look clearly higher resolution on the PC to me.

Also, at one point he "praises" the PC's menu advantage in that you don't have to reboot the game after every settings change, unlike the PS5. While that would be lovely if true, it's not, and we've seen time and again that this game can lose big chunks of performance after a settings change if it isn't restarted. So is he doing that as well?
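On the capped-framerate point above, a quick numeric sketch (entirely made-up frame times, nothing measured from the video) of why averaging fps with a cap in place flatters the slower system: the cap clips away exactly the frames where the faster machine would pull ahead.

```python
# Toy illustration (synthetic numbers, not data from the video): how a frame
# cap hides the headroom of a faster system in an "average fps" comparison.

def average_fps(frame_times_ms):
    """Average fps = frames rendered / total time, from per-frame times."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def apply_cap(frame_times_ms, cap_fps):
    """A cap can't make slow frames faster; it only stops fast frames from
    finishing sooner than the cap interval."""
    floor_ms = 1000.0 / cap_fps
    return [max(t, floor_ms) for t in frame_times_ms]

# Pretend the faster PC would render these frames uncapped (times in ms)...
pc_uncapped = [8, 9, 10, 11, 25, 9, 8, 10]      # one 25 ms transition hitch
# ...while the console sits near its own target the whole time.
console = [16, 17, 16, 18, 20, 16, 17, 16]

print("PC uncapped:", round(average_fps(pc_uncapped), 1), "fps")                  # ~88.9
print("PC, 60 cap :", round(average_fps(apply_cap(pc_uncapped, 60)), 1), "fps")   # ~56.5
print("Console    :", round(average_fps(console), 1), "fps")                      # ~58.8
# With the cap on, the PC's fast frames are all clamped to 16.7 ms, so the
# "average fps" gap collapses even though the hardware gap hasn't.
```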
 
just seen this

Performance Review from IGN

Comparing the RTX3080? with the PS5 and HDD tested

Oof. It's pretty sad a 2070 can't get a good experience with RT on in this game despite having more RT performance. Just goes to show how much more efficient the console architecture really is.
 
Oof. It's pretty sad a 2070 can't get a good experience with RT on in this game despite having more RT performance. Just goes to show how much more efficient the console architecture really is.
With its 8GB frame buffer? No, you can’t. Need to go 1080p or set textures to like Medium.

Also, why are we posting NxGamer's videos? We know they're disingenuous.
 
Oof. It's pretty sad a 2070 can't get a good experience with RT on in this game despite having more RT performance. Just goes to show how much more efficient the console architecture really is.

Or, like the Spiderman games, the RT in this game is very lightweight and the RTX's RT advantage doesn't really come into play. We see that in the Spiderman games, where AMD is just as fast as NV with RT on, and arguably R&C uses even less RT than the Spiderman games.

Throw in the fact that the 2070 should be a bit slower than the PS5 anyway (unless RT capability is being stressed), plus it's constrained by the smaller VRAM, and this result doesn't require any significant efficiency benefits.
 
He's really outdone himself this time. Essentially concluding that the PS5 hands in a better performance than a 5800X3D + RTX 3080.

Mental gymnastics to achieve this impressive feat include:

  • Running the PC at a capped framerate in average frame rate comparisons (but he says it wouldn't impact the result that much so it doesn't matter - despite there being no reason to actually have the cap on in the first place).
  • Using portal transition sequences (where admittedly we do see more stutter on PC) to measure highest frame times and then using that as an indicator of general relative performance because the PS5 is "twice as fast as the PC in the lows"
  • Acknowledging the PC can use higher settings and achieve better image quality but claiming this system isn't powerful enough to enjoy them using the example of setting everything to maximum again (as opposed to something far more optimised for this system)
  • Repeatedly claiming the 10GB on the 3080 is a limitation vs the PS5 and using settings (4k DLSS Q, Max) well in excess of the PS5's to demonstrate that.
  • Naturally running with DS enabled despite the widely available info showing big performance gains by disabling it
  • My favourite - Head to head performance comparisons with DRS and the same assumed frame rate target in use on both systems and then simply assuming they are running at the same resolution, despite the fact that with the same frame rate target (45fps), the more powerful card could consistently be running at a higher resolution (see example screenshot below)


Look at the detail in the circuitry in the foreground left. Much of it is obscured by the frame time graph on the PC, but the parts that aren't look clearly higher resolution on the PC to me.

Also, at one point he "praises" the PC's menu advantage in that you don't have to reboot the game after every settings change, unlike the PS5. While that would be lovely if true, it's not, and we've seen time and again that this game can lose big chunks of performance after a settings change if it isn't restarted. So is he doing that as well?
Yeah, I don't think this comparison makes sense at all.

For one, DRS is inconsistent across different system configurations and assuming both systems are at the same resolution entirely defeats the purpose of DRS. Why would you assume that the PS5 is running the same resolution when the entire point of DRS is to enable a system to maintain a target frame rate...at the expense of resolution? I just tried this test on my 2080 Ti with a 45fps target with IGTI enabled and it performs very similarly to his 3080. Likely because it drops more res to maintain the target frame rate.
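To spell that out, here's a toy version of a DRS loop (made-up cost model and numbers, not Insomniac's actual logic): drive two GPUs of different speeds toward the same 45fps target and both report the same fps; only the resolution they settle at differs.

```python
# Toy dynamic-resolution loop (illustrative only, not the game's real code).
# Both GPUs get driven toward the same frame-time budget; the faster one
# simply settles at a higher internal resolution, not a higher fps.

TARGET_MS = 1000.0 / 45.0   # 45 fps target, as in the video comparison

def frame_time_ms(gpu_power, res_scale):
    """Pretend cost model: frame time scales with rendered pixels and
    inversely with GPU throughput. Numbers are invented."""
    base_cost_at_full_res = 40.0   # ms for a hypothetical baseline GPU at 100% res
    return base_cost_at_full_res * res_scale / gpu_power

def settle_drs(gpu_power, steps=200):
    res_scale = 1.0                # start at full resolution
    for _ in range(steps):
        t = frame_time_ms(gpu_power, res_scale)
        res_scale *= TARGET_MS / t                  # nudge res toward the budget
        res_scale = min(max(res_scale, 0.25), 1.0)  # keep within sane bounds
    return res_scale, frame_time_ms(gpu_power, res_scale)

for name, power in [("slower GPU", 1.0), ("faster GPU", 1.6)]:
    scale, t = settle_drs(power)
    print(f"{name}: settles at {scale:.0%} resolution, {1000.0 / t:.1f} fps")
# Both lines print ~45 fps; only the resolution differs. Comparing the fps
# numbers head to head says nothing about which system is doing more work.
```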

You're also correct that the resolution on the PC side is obviously sharper with that matched shot.

What a bizarre comparison.
 
Acknowledging the PC can use higher settings and achieve better image quality

Let me guess - DLSS doesn't factor into that at all, right?

Also this: while the video is running, the narration says "And this is why I recommend going below Playstation 5 equivalent settings on 10GB cards" - so why is the video clip showing "Quality Max" in this segment?

I'd say the odds are decently high that 10GB will not be enough for VH textures and perhaps even the PS5's RT settings, and that you may be getting more stutters even with that, so sure. But... show that, then. Don't present a sequence with RT features and raster sliders slammed to the right and present it like it's either 5fps or 'sub-PS5' quality. And as you mentioned, he's likely benching these after switching settings, and changing the DLSS setting in particular massively increases the resident VRAM, so I'm sure that plays a part.

[Attached screenshot: 1691261038101.png]
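On the resident VRAM point, a very rough back-of-envelope (the ~40 bytes-per-pixel figure is my own guess for illustration, not anything from the game) of how the per-pixel render targets alone grow when the internal resolution changes with the DLSS mode:

```python
# Very rough back-of-envelope (invented bytes-per-pixel figure, not the game's
# actual allocations): per-pixel render targets grow with internal resolution,
# which is one reason flipping the DLSS setting mid-session can bump resident VRAM.

def render_target_mb(width, height, bytes_per_pixel_total):
    """Memory for the per-pixel buffers at a given internal resolution."""
    return width * height * bytes_per_pixel_total / (1024 ** 2)

# Assume ~40 bytes/pixel across colour, depth, G-buffer and history buffers
# (a guess for illustration; real engines vary a lot).
BPP_TOTAL = 40

for label, (w, h) in {
    "1440p internal (DLSS Quality at 4K)": (2560, 1440),
    "4K internal (native)":                (3840, 2160),
}.items():
    print(f"{label}: ~{render_target_mb(w, h, BPP_TOTAL):.0f} MB of render targets")
    # ~141 MB vs ~316 MB with these made-up numbers

# The difference is modest on its own, but a mid-session settings change
# reallocates these buffers, and if the old allocations aren't fully released
# until a restart, resident VRAM creeps up, which would fit the "performance
# drops after changing settings without restarting" behaviour discussed above.
```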

Hey we actually get DLSS mentioned in an NxGamer video though, so there's that. 🙄

Also, yeah, I get why he keeps using a 2700X in these comparisons, but man - you can, you know, actually buy a PS5 with the same specs as the one on launch day. You literally cannot get a 2700X, or at least not without a huge markup, because they haven't been supplied for ages. There comes a point where you have to accept the reality of the PC market, you know?

Yep, resolution is most definitely higher on the 3080 all the while maintaining a much higher fps in some cases.

It's hard to compare because, from my own tests (and with Spiderman as well), IGTI on the PC does not resolve the same way as it does on the PS5. Those images look a little crisper on the PC, but the edges also have noticeably more stair-stepping. IGTI on the PS5 is extremely adept at smoothing out edges, perhaps sometimes to a fault where it can obscure texture detail, but it makes comparisons a little difficult, even when both are supposedly using the 'same' upscaling method.

But yeah, as @Metal Snake Eater says, DRS is also not great to use for performance comparisons, as it's far more 'fine-grained' on PC, to a fault. On the PS5, the jumps down in resolution are more severe, but as a result it keeps enough headroom that it's not constantly bouncing back and forth the way it does on the PC. With Spiderman on my system the headroom is enough that it works well - on R&C though, enabling DRS just causes significantly more stutters as it can't adjust fast enough.
 
Alex warned about this behavior of DRS not being the same back with the Spider-Man games, and again with Ratchet & Clank.

Yet here we are.
Direct from the horse's mouth: DRS on PS5 functions differently than it does on PC. DRS on PS5 works from a list of pre-configured resolutions to choose from, with limits on the top and bottom res and coarse adjustments (e.g. 1440p down to 1296p). PC DRS is free-floating and fine-grained. If you turn on IGTI with DRS set to 60, it will essentially max your GPU at the highest, most fine-grained res possible.

Directly comparing frames-per-second performance is pointless and also wrong, as the PC DRS is not good at stabilising FPS; it is about maximising GPU utilisation and image quality.
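A rough way to picture that difference (hypothetical resolution ladder and numbers, not either platform's actual implementation): the console-style scaler steps through a short pre-configured list and leaves headroom between rungs, while the PC-style scaler nudges a free-floating factor every frame.

```python
# Illustrative contrast between the two DRS styles described above
# (invented numbers; not the actual implementation on either platform).

TARGET_MS = 1000.0 / 60.0

# Console-style: choose from a short, pre-configured ladder of resolutions.
LADDER = [2160, 1872, 1584, 1440, 1296, 1080]   # hypothetical vertical resolutions

def console_drs(last_frame_ms, current_height):
    """Step down a rung when over budget, step back up only with clear headroom."""
    i = LADDER.index(current_height)
    if last_frame_ms > TARGET_MS and i < len(LADDER) - 1:
        return LADDER[i + 1]                     # coarse jump down (e.g. 1440p to 1296p)
    if last_frame_ms < TARGET_MS * 0.85 and i > 0:
        return LADDER[i - 1]
    return current_height

# PC-style: continuously adjust a free-floating scale factor every frame,
# chasing the budget exactly and so keeping the GPU as loaded as possible.
def pc_drs(last_frame_ms, res_scale):
    res_scale *= TARGET_MS / last_frame_ms       # fine-grained nudge
    return min(max(res_scale, 0.5), 1.0)

print(console_drs(last_frame_ms=18.0, current_height=1440))   # -> 1296 (coarse step)
print(round(pc_drs(last_frame_ms=18.0, res_scale=1.0), 3))    # -> 0.926 (fine nudge)
# The stepped version sits still between rungs; the free-floating version
# maximises GPU utilisation and image quality rather than stabilising fps.
```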
 
Oof. It's pretty sad a 2070 can't get a good experience with RT on in this game despite having more RT performance. Just goes to show how much more efficient the console architecture really is.
a 2070 can totally get a fine experience with RT on in this game - play at 1080p.

edit: double post hours apart ftw
 
a 2070 can totally get a fine experience with RT on in this game - play at 1080p.

edit: double post hours apart ftw
Honestly, 1080p with upscaling is not what I would call a good experience on this class of card at all. (Without upscaling, the frame drops are huge, as seen in IGN's performance video, and even if that were not the case, plain 1080p just looks bad today.) Remember, the PS5 is running between 1080p and 1440p and then upscales to 4K, which will look much better. The 2070 is supposed to be on par with the PS5, yet in this game it performs much worse. A behavior we've already seen with The Last of Us Part 1. It's safe to say the console punches way above its weight again.
 
Honestly, 1080p with upscaling is not what I would call a good experience on this class of card at all. (Without upscaling, the frame drops are huge, as seen in IGN's performance video.) Remember, the PS5 is running between 1080p and 1440p and then upscales to 4K, which will look much better. The 2070 is supposed to be on par with the PS5, yet in this game it performs much worse. A behavior we've already seen with The Last of Us Part 1. It's safe to say the console punches way above its weight again.
The PS5 has ~12.5GB to play with; the 2070 has 8GB. Also, as I recall, the PS5 was closer to a 2070S/2080. With that in mind, it's no surprise that the 2070 completely folds in a game which relies on a large variety of high-quality textures that quickly stream in and out.

TLOU Part I is also the same thing. The 2070S will suffer massive drops because of its tiny frame buffer.
 
Honestly, 1080p with upscaling is not what I would call a good experience on this class of card at all. (Without upscaling, the frame drops are huge, as seen in IGN's performance video, and even if that were not the case, plain 1080p just looks bad today.) Remember, the PS5 is running between 1080p and 1440p and then upscales to 4K, which will look much better. The 2070 is supposed to be on par with the PS5, yet in this game it performs much worse. A behavior we've already seen with The Last of Us Part 1. It's safe to say the console punches way above its weight again.

1. Said upscaling on the 2070 produces a much better final image than what the PS5 has.
2. The PS5 would have huge frame rate drops without its DRS system.

Here's a reminder of how god-damn awful upscaling looks on PS5 compared to DLSS.

[Attached screenshot: Screenshot 2023-07-29 011821.png]
 
Direct from the horse's mouth: DRS on PS5 functions differently than it does on PC. DRS on PS5 works from a list of pre-configured resolutions to choose from, with limits on the top and bottom res and coarse adjustments (e.g. 1440p down to 1296p). PC DRS is free-floating and fine-grained. If you turn on IGTI with DRS set to 60, it will essentially max your GPU at the highest, most fine-grained res possible.

Directly comparing frames-per-second performance is pointless and also wrong, as the PC DRS is not good at stabilising FPS; it is about maximising GPU utilisation and image quality.
When you say "straight from the horse’s mouth", I assume you mean Insomniac?
 