What level of GPU are the consoles equivalent to? *spawn

I disagree, they're typically at 2070 level with raster and 2060 (or lower) with RT enabled.
This is just not true. The 2070 performed about halfway between a 1080 and a 1080 Ti at raster. It's completely absurd to claim this is the most common performance bracket the consoles fit in. They only perform like a 2060 in terms of RT in the occasional Nvidia-backed game.
 
I disagree, they're typically at 2070 level with raster and 2060 (or lower) with RT enabled.
Going by the countless DF comparisons, it's been well established that they're roughly on the level of a 2070S/2080/RX 6700 in rasterization and closer to the RTX 2070 or 2060S when ray tracing is involved, depending on the extent of the implementation. There's almost no game where they're as slow as the 2070.

- In Frontiers of Pandora, Alex established that a 2070S is needed to match console performance, IQ, and settings. This of course includes ray tracing.

- In Watch_Dogs Legion, the PS5 performed on the level of a 2060 or perhaps a 2060S with ray tracing enabled.

- In A Plague Tale Requiem, DF showed that the RTX 2070S is 3% faster on average than the PS5.

- In a recent DF episode on Cyberpunk 2077 without RT, an RX 6700 downclocked to 2.24GHz (PS5 specs) performed a bit worse than the PS5. The RX 6700 typically boosts to around 2.45GHz in games, which means it would be around 10% faster at its normal game clocks. That's once again in line with the 2070S, which is around 3-5% slower than the 6700 (see the arithmetic sketch at the end of this post). Do keep in mind that the performance mode is capped to 60fps and that the PS5 averaged 59.9fps, so there is little doubt it would do slightly better were the frame rate uncapped.

- In Assassin's Creed Valhalla, the PS5 performed in line with an RTX 2080.

- In Death Stranding, the RTX 2080 was 96% of the PS5's performance. However, in some scenes, it could actually be a bit above. Same level overall.

- In The Last of Us Part I, the PS5's GPU is much faster than the 2070S. In fact, it's closest to the RTX 2080 Ti/RTX 3070/6750 XT.

- In Uncharted 4, the 6800 according to NXGamer was 24% faster at 4K and about 15% at 1440p but it ran into a CPU bottleneck on PC. This would put the PS5 on the level of a 2080 Ti/3070/6750 XT.

And the list goes on, but this is what I can remember off the top of my head. We're generally looking at 6650 XT/6700/2070S/2080 in rasterization. In first-party titles, it's closer to 6750 XT/2080 Ti/3070. Ray tracing varies quite a bit depending on the extent of the implementation. If you just have low-grade RT shadows, then the PS5 won't lose that much performance. If you throw copious amounts of ray tracing at it, then the PS5 can fall to the level of a 2070 or even a 2060S. It's not very different for the Xbox, but you do have games with a large performance disparity, such as Alan Wake 2 or A Plague Tale Requiem, which see the Series X outperform the PS5 by around 20-25%. The opposite can also happen.
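
As a footnote to the Cyberpunk 2077 point above, here is a minimal sketch of the clock-scaling arithmetic. The 2.24GHz/2.45GHz clocks and the 3-5% 2070S deficit come from the post; the assumption that performance scales roughly linearly with GPU clock is mine and only holds approximately.

```python
# Rough arithmetic behind the RX 6700 / Cyberpunk 2077 comparison above.
# Assumption (mine): performance scales roughly linearly with GPU clock,
# which is only approximate since memory bandwidth does not change.

ps5_clock_ghz = 2.24     # RX 6700 downclocked to PS5-like GPU clocks
stock_clock_ghz = 2.45   # typical in-game boost clock of a stock RX 6700

clock_uplift = stock_clock_ghz / ps5_clock_ghz - 1
print(f"Stock RX 6700 vs downclocked card: ~{clock_uplift:.1%} faster")  # ~9.4%

# The post places the 2070S about 3-5% behind a stock RX 6700, which is how
# the PS5 ends up in the 2070S/6700 bracket for rasterization.
for deficit in (0.03, 0.05):
    ratio_2070s_vs_ps5 = (1 + clock_uplift) * (1 - deficit)
    print(f"2070S vs PS5-level card (2070S {deficit:.0%} behind the 6700): "
          f"~{ratio_2070s_vs_ps5 - 1:+.1%}")
# Comes out to roughly +4% to +6% for the 2070S, consistent with the
# A Plague Tale Requiem result quoted above.
```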
 
I guess that's why Forza Motorsport needs a 6800 XT or an RTX 3080 to match the Series X's graphics level at a stable 60fps... :)
It doesn't. The game also suffered from severe CPU issues at launch and image reconstruction wasn't even working properly. I think FSR2 straight up didn't work.
 
I feel like there doesn't need to be any grand argument over this. They're not using some esoteric or highly custom architectures or anything. Their specs are basically directly relatable to what exists on PC (on AMD's side).

Trying to argue this by performance in multiplatform games is a fool's errand cuz it will vary heavily based on the game, all while exact console settings are often difficult or impossible to nail down on PC to do a proper comparison anyways. Picking and choosing some games does not tell you what the consoles are 'actually' equivalent to, especially when people are likely gonna be using a bunch of cross-gen titles and whatnot. And it doesn't tell you what the consoles are overall capable of in general, as devs always get more and more out of them as time goes on, not to mention the extra advantage of something like PlayStation 1st-party console exclusives having just one fixed spec to target and optimize for, along with 1st-party development resources and knowledge.
 
I feel like there doesn't need to be any grand argument over this. They're not using some esoteric or highly custom architectures or anything. Their specs are basically directly relatable to what exists on PC (on AMD's side).

Trying to argue this by performance in multiplatform games is a fool's errand cuz it will vary heavily based on the game, all while exact console settings are often difficult or impossible to nail down on PC to do a proper comparison anyways. Picking and choosing some games does not tell you what the consoles are 'actually' equivalent to, especially when people are likely gonna be using a bunch of cross-gen titles and whatnot. And it doesn't tell you what the consoles are overall capable of in general, as devs always get more and more out of them as time goes on, not to mention the extra advantage of something like PlayStation 1st-party console exclusives having just one fixed spec to target and optimize for, along with 1st-party development resources and knowledge.
The delta doesn't vary that much, honestly. It's generally within a 10% window, with some outliers. We have over 3 years' worth of games to compare and what I posted above is what the consoles have been doing in general. A PS5 won't start performing like a 6800 XT, but it won't drop to the level of a 6500 either.
 
Sometimes these thread moves mean the context of the conversation gets lost. Wasn't the original discussion about whether a new console gen is needed "sooner", in 2026, or not?

I think the issue is that regardless of how fast we feel the PS5 is relative to 2018 GPUs, if we look at relative gains, 2024 GPUs (assuming no delays) later in the year will likely be roughly x3 faster than 2018 GPUs per tier, maybe x3.5 optimistically. It stands to reason that a 2026 PS6 (or Xbox) targeting the same cost profile would likely not hit x4 faster and more likely only x3. Well, at least conventionally; RT performance and overall performance leveraging ML (or other specialized units) would see higher gains.
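
A minimal sketch of the gap arithmetic behind that expectation; the x3/x3.5 multipliers for 2024 GPUs over 2018 parts are the estimates above, and x4 is the hypothetical target being ruled out.

```python
# How much extra a 2026 console GPU would need on top of a 2024 PC tier to
# reach a x4 gain over 2018-class GPUs. The x3 / x3.5 figures are the post's
# estimates; x4 is the hypothetical target.

target_gain = 4.0

for gain_2024 in (3.0, 3.5):
    extra_needed = target_gain / gain_2024 - 1
    print(f"2024 tier at x{gain_2024}: a x{target_gain:.0f} console would need "
          f"~{extra_needed:.0%} more on top of that tier")
# ~33% and ~14% respectively - a big ask within a console cost and power
# budget only two years later, hence the "likely only x3" expectation.
```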
 
- In Assassin's Creed Valhalla, the PS5 performed in line with an RTX 2080.

- In Death Stranding, the RTX 2080 was 96% of the PS5's performance. However, in some scenes, it could actually be a bit above. Same level overall.

- In The Last of Us Part I, the PS5's GPU is much faster than the 2070S. In fact, it's closest to the RTX 2080 Ti/RTX 3070/6750 XT.

- In Uncharted 4, the 6800 according to NXGamer was 24% faster at 4K and about 15% at 1440p but it ran into a CPU bottleneck on PC. This would put the PS5 on the level of a 2080 Ti/3070/6750 XT.
In these games, the PC is simply paying the DX12 tax; ports without a proper DX12 implementation will perform worse than consoles on equivalent GPUs. It's the reason this variance in performance exists at all.
 
Valhalla is a broken port. It uses less than 350W on a 4090 at 4K. It's a great example of how an unoptimized software stack can hold back modern GPUs.
 
In these games, the PC is simply paying the DX12 tax; ports without a proper DX12 implementation will perform worse than consoles on equivalent GPUs. It's the reason this variance in performance exists at all.
There's definitely something going on with those Sony ports. Some argue that most third-party games simply aren't optimized on PlayStation though. This has some merit because Sony games look top-tier and also run amazingly well on the PS5.
 
This is the whole Digital Foundry analysis so far, minus Avatar: Frontiers of Pandora, which saw a +5% for the RTX 2070 Super (without mesh shading) over the PS5 (check the table below).

In Resident Evil Village, as they covered it, the Series X was performing about 10% better than the PS5.

I wouldn't mix NXG results with DF results though, as they differ by big margins.

For example:

1- In Deathloop, according to DF, their 2070S was performing 8% higher than the PS5, while in NXG's case the PS5 was performing 33% better than an OC 2070 (a ~39% difference between the two; see the sketch at the end of this post).

2- In Death Stranding, according to DF, the PS5 outperforms the 2070S by 10-12% and the 2080 by 3%, while in NXG's case it's 36-64% over an OC 2070, which led him to conclude you need a 3070 to match it.
DF's coverage used a 2080 Ti and it was always above the PS5, far from what NXG suggested (a 26-52% difference between the two).


3- In Ratchet & Clank, DF couldn't match settings exactly, as Nixxes told them DRS works differently on PC than it does on PS5. NXG, however, concluded the PS5 is right on the heels of a 3080 in a benchmark pass.


[attached image: summary table of DF PS5 vs PC GPU comparisons]
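
To show how the DF and NXG Deathloop figures translate into a single discrepancy number, here's a minimal sketch; the 8% and 33% inputs come from point 1 above, and treating DF's 2070S and NXG's overclocked 2070 as roughly equivalent cards is my simplification.

```python
# Converting the two outlets' Deathloop claims into PS5-relative numbers.
# Simplification (mine): treat DF's 2070S and NXG's OC 2070 as the same card.

df_2070s_over_ps5 = 1.08   # DF: 2070S ~8% faster than the PS5
nxg_ps5_over_2070 = 1.33   # NXG: PS5 ~33% faster than an OC 2070

ps5_per_df = 1 / df_2070s_over_ps5   # PS5 relative to the 2070-class card per DF
ps5_per_nxg = nxg_ps5_over_2070      # ...and per NXG

print(f"PS5 vs 2070-class card: DF ~{ps5_per_df:.2f}x, NXG ~{ps5_per_nxg:.2f}x")
print(f"Multiplicative gap between the two outlets: ~{ps5_per_nxg / ps5_per_df - 1:.0%}")
# ~44% multiplicatively, in the same ballpark as the ~39% quoted above when
# the two percentages are simply read against each other.
```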
 
This is the whole Digital Foundry analysis so far, minus Avatar: Frontiers of Pandora, which saw a +5% for the RTX 2070 Super (without mesh shading) over the PS5 (check the table below).

In Resident Evil Village, as they covered it, the Series X was performing about 10% better than the PS5.

I wouldn't mix NXG results with DF results though, as they differ by big margins.

For example:

1- In Deathloop, according to DF, their 2070S was performing 8% higher than the PS5, while in NXG's case the PS5 was performing 33% better than an OC 2070 (a ~39% difference between the two).

2- In Death Stranding, according to DF, the PS5 outperforms the 2070S by 10-12% and the 2080 by 3%, while in NXG's case it's 36-64% over an OC 2070, which led him to conclude you need a 3070 to match it.
DF's coverage used a 2080 Ti and it was always above the PS5, far from what NXG suggested (a 26-52% difference between the two).


3- In Ratchet & Clank, DF couldn't match settings exactly, as Nixxes told them DRS works differently on PC than it does on PS5. NXG, however, concluded the PS5 is right on the heels of a 3080 in a benchmark pass.


[attached image: summary table of DF PS5 vs PC GPU comparisons]

Great summary! Although I think we can all treat that NXG conclusion on Ratchet & Clank as fanciful at best.

I recall DF didn't do a direct comparison there, but what they did show had the 2070S struggling to reach 60fps at PS5 settings in heavier areas.
 
3- In Ratchet & Clank, DF couldn't match settings exactly, as Nixxes told them DRS works differently on PC than it does on PS5. NXG, however, concluded the PS5 is right on the heels of a 3080 in a benchmark pass.
About this, he conceded that the RTX 3080's average fps was higher by 25%, but that the lows were much lower than on the PS5, and he attributed this to the I/O and memory management subsystems. He might not be entirely wrong because, in my case, my 2080 Ti performs much better on my 13900K with DDR5 than it does on my 9900K with DDR4. Apparently, R&C really likes bandwidth.
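
As a generic illustration of how a card can post a clearly higher average fps while having much worse lows, here's a minimal sketch with made-up frame times; the traces are purely hypothetical and not R&C data.

```python
# Made-up frame-time traces (milliseconds) showing how a higher average fps
# can coexist with much worse lows - the RTX 3080 vs PS5 situation described
# above. The numbers are invented for illustration only.
import statistics

def summarize(frametimes_ms):
    avg_fps = 1000 / statistics.mean(frametimes_ms)
    # "1% low" style metric: fps implied by the slowest 1% of frames
    worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99):]
    low_fps = 1000 / statistics.mean(worst)
    return avg_fps, low_fps

steady = [16.7] * 1000                # locked ~60fps, no hitches
spiky  = [12.0] * 990 + [80.0] * 10   # ~83fps most of the time, 10 long stalls

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg, low = summarize(trace)
    print(f"{name:>6}: avg {avg:5.1f} fps, 1% low {low:5.1f} fps")
# The spiky trace wins on average fps but its 1% lows collapse, which is how
# hitching tied to I/O or memory management shows up in benchmarks.
```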
 