Do you think there will be a mid gen refresh console from Sony and Microsoft?

Xbox One was 900p/30 and Xbox One X was dynamic 4k/30. I think PS5 Pro could be at least dynamic 4k/60. PS5 Pro could be released at the end of 2024, which would be four years after the PS5's release, same as Xbox One X. If it won't be at least dynamic 4k/60, then there's even less need for it.
 
Xbox One was 900p/30 and Xbox One X was dynamic 4k/30. I think PS5 Pro could be at least dynamic 4k/60. PS5 Pro could be released at the end of 2024, which would be four years after the PS5's release, same as Xbox One X. If it won't be at least dynamic 4k/60, then there's even less need for it.
Xbox One X was a 4x+ increase in GPU capability. That will clearly not be happening here. Better to look at PS4 to PS4 Pro, where it went from 900p-1080p/30 to 1080p-1440p/30.

That sounds like a stretch.
How so? The trend is quite clear. Even Spiderman 2, which is rooted in existing PS4 rendering tech and is much better optimized than the vast majority of games this generation will be, fits in that range.
 
Until we see substantial RT increases from AMD GPUs, I think they should stay away from a Pro console, to be honest.

imo the software gap is bigger than the hardware gap, though at least one part of it may well be due to the hardware gap (DLSS/XeSS looking much better than FSR).
6900XT-level RT performance can run Cyberpunk Ultra RT at 30fps with FSR2 Performance. With TV viewing distances, even Psycho RT could be possible with Ultra Performance mode.
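For reference, here's what those FSR 2 modes work out to as internal resolutions at a 4k output, using AMD's published per-axis scale factors (the script itself is just illustrative arithmetic):

```python
# Internal render resolution per FSR 2 quality mode at a 4k output target.
# Per-axis scale factors are AMD's published values for FSR 2.
OUTPUT_W, OUTPUT_H = 3840, 2160

FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,        # the "FSR2 Performance" above: 1080p internal
    "Ultra Performance": 3.0,  # 720p internal
}

for mode, factor in FSR2_MODES.items():
    print(f"{mode:>17}: {OUTPUT_W / factor:.0f}x{OUTPUT_H / factor:.0f} internal")
```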

As I said in my other posts, the path-tracing updates to games like Serious Sam had the 6900XT at about a third of the 4090's performance. But Cyberpunk's OD mode and Portal RTX just destroy the performance on RDNA2 cards, and the 7800XT fares similarly. Intel also fares badly, so it's very likely the software advantage of NVIDIA coming to the fore.


So the R&D into PT optimizations that has allowed Cyberpunk's OD mode to exist would be a better leveller than hardware improvements. And I wonder if Sony is willing to put out PT updates to their classic games.
 
1. Larian is a AA.5 studio. :) Seriously though, I don't consider anything less than $200m to be a AAA budget anymore. Why don't we say that my AAA is $200m and Seanspeed's is $80m and call it a day. We can each have our own AAA definition. LOL

2. I used to be a big believer in a mid-gen upgrade, but now that every dev team except Insomniac is struggling to get even one game out on existing hardware I feel like new hardware can wait until 2028 at this point.

3. The S is not a waste of silicon for those people who can only afford a $250 console, which is many people. It's also nice for PS users who just want to dip their toes into Xbox for a few key games they'd like to play. There are too many people around here desperate to prove that MS made a mistake with the S. Sales say otherwise.
 
Why does it need to be 4k?

1440p is a good input resolution for good upscaling results.
Because it's 2023 now and next year will be 2024. 4k/30 has become standard for many console games, and the same at 60fps should be on PS5 Pro if it gets released. Otherwise developers could just do 1080/60 and upscale from that. :)
Xbox One X was a 4x+ increase in GPU capability. That will clearly not be happening here.
Why was that possible then but not possible now?
Better to look at PS4 to PS4 pro where it went from being 900p-1080p/30 to 1080p-1440p/30.
Maybe, but how many people will buy this slightly better console? And also, there are a lot of PS5 games that run at 4k/30, and if the PS5 Pro won't run games at 4k/60, for many people that will be a failure.
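On the upscaling point above, the pixel math is why 1440p is considered a comfortable input for 4k; a quick check (plain arithmetic, nothing assumed beyond the resolutions named):

```python
# How many pixels the upscaler has to bridge to reach a 3840x2160 output.
target = 3840 * 2160
for w, h in [(1920, 1080), (2560, 1440)]:
    print(f"{w}x{h} -> 4k: {target / (w * h):.2f}x pixel count")
# 1080p -> 4k is a 4.00x jump; 1440p -> 4k is only 2.25x.
```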
 
Because it's 2023 now and next year will be 2024. 4k/30 has become standard for many console games, and the same at 60fps should be on PS5 Pro if it gets released. Otherwise developers could just do 1080/60 and upscale from that. :)

Why was that possible then but not possible now?

Maybe, but how many people will buy this slightly better console? And also, there are a lot of PS5 games that run at 4k/30, and if the PS5 Pro won't run games at 4k/60, for many people that will be a failure.
Because Xbox One was much lower on the performance and production cost spectrum than PS5 is. 4x PS5 is 4090 territory; 4x faster than Xbox One was 580 territory. Massive difference. Power efficiency has not been improving much at all this last generation, especially on the AMD side. AMD doesn't have the technology to even offer that level of performance, let alone in a TDP suitable for a console.
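To put rough numbers on that, a back-of-the-envelope comparison using commonly cited peak FP32 figures (TFLOPS are a crude cross-architecture proxy, so treat the ratios as ballpark only):

```python
# Commonly cited peak FP32 throughput (TFLOPS). Note that the 4090's figure
# counts Ada's dual-issue FP32, which inflates it relative to RDNA 2 numbers.
tflops = {
    "Xbox One": 1.31,
    "Xbox One X": 6.0,
    "PS4": 1.84,
    "PS4 Pro": 4.2,
    "PS5": 10.28,
    "RX 580": 6.17,
    "RTX 4090": 82.6,
}

print(f"One X / One:   {tflops['Xbox One X'] / tflops['Xbox One']:.1f}x")  # ~4.6x
print(f"PS4 Pro / PS4: {tflops['PS4 Pro'] / tflops['PS4']:.1f}x")          # ~2.3x
print(f"4x PS5:        {4 * tflops['PS5']:.1f} TFLOPS")                    # ~41 TFLOPS
```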

Only some cross-gen games run at 4k/30.
 
the path-tracing updates to games like Serious Sam had the 6900XT at about a third of the 4090's performance. But Cyberpunk's OD mode and Portal RTX just destroy the performance on RDNA2 cards
The Serious Sam/Doom mods are outliers; they are probably not well optimized for NVIDIA (relying more on shaders than RT acceleration). We have numerous other tests that show a substantial hit to RDNA2/RDNA3 once path tracing is enabled anywhere: Desordre, the 3DMark Ray Tracing Feature Test, Minecraft, Quake 1, Quake 2, Half-Life 1. Even heavy RT has the same effect: Dying Light 2, Hitman 3, Lego Builder's Journey, Guardians of the Galaxy, Chernobylite, The Witcher 3, Ratchet and Clank, etc.

But you might be right: NVIDIA invested in lots of optimizations for ReSTIR and RTXDI/RTXGI, as well as various denoisers (denoisers are often a big chunk of the load). I am sure this helps. DLSS 3.5 is going to be another big factor as well.
 
How so? The trend is quite clear. Even Spiderman 2, which is rooted in existing PS4 rendering tech and is much better optimized than the vast majority of games this generation will be, fits in that range.
SM2 received visual and technical enhancements beyond the original and operates at 1080p-1440p/60fps in performance mode (higher framerates with the 120Hz mode and/or VRR) and at 1440p-2160p/30fps in fidelity mode (or 40fps, or unlocked framerates with VRR).

It's not simply 1080p or 1440p, and that doesn't fit the 720p/1080p 60fps machine description you used at all.
 
Because it's 2023 now and next year will be 2024. 4k/30 has become standard for many console games, and the same at 60fps should be on PS5 Pro if it gets released. Otherwise developers could just do 1080/60 and upscale from that. :)

Amount of pretty pixels > amount of pixels.
 
SM2 received visual and technical enhancements beyond the original and operates at 1080p-1440p/60fps in performance mode (higher framerates with the 120Hz mode and/or VRR) and at 1440p-2160p/30fps in fidelity mode (or 40fps, or unlocked framerates with VRR).

It's not simply 1080p or 1440p, and that doesn't fit the 720p/1080p 60fps machine description you used at all.
Spiderman 2 is 1080p more often than it is higher, and it's at the top of what the PS5 achieves performance-wise in a modern game at 60fps. Most other games fall well below 1080p, which is why I listed it as a 720p-1080p machine for 60fps. Its technical enhancements are primarily world size and scene density; the quality of what is rendered is only minorly improved in a few select areas.
 
The Serious Sam/Doom mods are outliers; they are probably not well optimized for NVIDIA (relying more on shaders than RT acceleration). We have numerous other tests that show a substantial hit to RDNA2/RDNA3 once path tracing is enabled anywhere: Desordre, the 3DMark Ray Tracing Feature Test, Minecraft, Quake 1, Quake 2, Half-Life 1. Even heavy RT has the same effect: Dying Light 2, Hitman 3, Lego Builder's Journey, Guardians of the Galaxy, Chernobylite, The Witcher 3, Ratchet and Clank, etc.

But you might be right: NVIDIA invested in lots of optimizations for ReSTIR and RTXDI/RTXGI, as well as various denoisers (denoisers are often a big chunk of the load). I am sure this helps. DLSS 3.5 is going to be another big factor as well.

The Serious Sam/Doom mods were tested by PCGH, and the 2080Ti performed considerably worse than the 3070.


I will have to check if I have Q2 results for the 6800XT, but for Dying Light 2 and The Witcher 3 they showed a similar performance difference to the PT upgrades, with the 4090 around 3x the 6900XT.
 
Given there's no actual grading specification, the whole AA/AAA terminology is relative to the subjective assessment of whoever is grading them. In some cases a game could be AAA in terms of what's accomplished even if the studio is a small indie, while another massive investment could produce a B-grade game in terms of quality.
Wikipedia has an article on the use of these definitions over time, including AAA+ and AAAA.

The problem with using either head count or budget as a defining metric is that both can be inflated for different reasons. Take, for example, the 2,169 headcount for The Last of Us Part II, which was sourced from the end credits. Credit lists will inflate the headcount for any long project, because one post may be filled by multiple people over the project's lifetime and plenty of credited contributions will be minimal, measured in days or even hours depending on the role. It's why genuinely useful metrics like FTE (full-time equivalent) hours can be more meaningful.
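To make the FTE point concrete, a toy sketch; everything below except the 2,169 credit count is a made-up assumption:

```python
# Toy comparison of credited headcount vs. average full-time-equivalent
# staffing. Only the 2,169 credit count comes from the post above; the
# staffing split and durations are hypothetical.
credited = 2169
project_months = 60                           # hypothetical 5-year project

core, core_months = 350, 60                   # hypothetical full-project staff
support, support_months = credited - core, 4  # brief contributors

person_months = core * core_months + support * support_months
print(f"credited: {credited}, average FTE: {person_months / project_months:.0f}")
# -> credited: 2169, average FTE: 471
```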

Likewise, spending a lot of money is no guarantee of quality, nor do flat budget numbers help when the economic situation in the country of development is quite different from other countries; e.g. The Witcher 3's cost was estimated at $81m, with a peak development team of 250 people from a core of 150.
 
Wikipedia has an article on the use of these definitions over time, including AAA+ and AAAA.

The problem with using either head count or budget as a defining metric is that both can be inflated for different reasons. Take, for example, the 2,169 headcount for The Last of Us Part II, which was sourced from the end credits. Credit lists will inflate the headcount for any long project, because one post may be filled by multiple people over the project's lifetime and plenty of credited contributions will be minimal, measured in days or even hours depending on the role. It's why genuinely useful metrics like FTE (full-time equivalent) hours can be more meaningful.

Likewise, spending a lot of money is no guarantee of quality, nor do flat budget numbers help when the economic situation in the country of development is quite different from other countries; e.g. The Witcher 3's cost was estimated at $81m, with a peak development team of 250 people from a core of 150.
Ubisoft describing Skull and Bones as AAAA is pure comedy.
 
R&C is 1296p-4k in quality mode.
Mostly 1800p-2160p.
TLOU Remake is basically a PS4 game.
Not at all. Asset quality, texture resolution and physics are higher than in any PS4 game, including TLOU 2.
FF XVI is 1080p-1440p in quality mode.
Ok, forgot about that. I also played it some weeks ago, and the game looked very high-res in quality mode.

Of course I didn't mean that those games are always 4k; they are dynamic 4k. But for some years now there have been almost no console games that don't run at a dynamic resolution. Sad; this is one of the reasons why I liked consoles before the 8th gen more.
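For what it's worth, dynamic resolution is usually just a frame-time feedback loop. A minimal sketch of the idea; this exact controller is my own illustration, not any console's actual implementation:

```python
# Minimal dynamic-resolution heuristic: nudge the per-axis render scale
# toward the frame-time budget. Purely illustrative.
TARGET_MS = 16.7                 # 60fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # e.g. 1080p..2160p per axis at a 4k target

def next_scale(scale: float, frame_ms: float) -> float:
    error = (TARGET_MS - frame_ms) / TARGET_MS
    scale += 0.25 * error        # small proportional step to avoid oscillation
    return max(MIN_SCALE, min(MAX_SCALE, scale))

print(next_scale(1.0, 22.0))     # heavy 22ms frame -> ~0.92, resolution drops
```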
 
Mostly 1800p-2160p.

Not at all. Asset quality, texture resolution and physics are higher than in any PS4 game, including TLOU 2.

Ok, forgot about that. I also played it some weeks ago, and the game looked very high-res in quality mode.

Of course I didn't mean that those games are always 4k; they are dynamic 4k. But for some years now there have been almost no console games that don't run at a dynamic resolution. Sad; this is one of the reasons why I liked consoles before the 8th gen more.
I disagree on TLOU Remake looking any better than TLOU 2.

 
imo the software gap is bigger than the hardware gap, though at least one part of it may well be due to the hardware gap (DLSS/XeSS looking much better than FSR).
6900XT-level RT performance can run Cyberpunk Ultra RT at 30fps with FSR2 Performance. With TV viewing distances, even Psycho RT could be possible with Ultra Performance mode.

As I said in my other posts, the path-tracing updates to games like Serious Sam had the 6900XT at about a third of the 4090's performance. But Cyberpunk's OD mode and Portal RTX just destroy the performance on RDNA2 cards, and the 7800XT fares similarly. Intel also fares badly, so it's very likely the software advantage of NVIDIA coming to the fore.


So the R&D into PT optimizations that has allowed Cyberpunk's OD mode to exist would be a better leveller than hardware improvements. And I wonder if Sony is willing to put out PT updates to their classic games.
Cyberpunk is a lot more complex than any other "pathtracing" game: more geometry, more animation, more moving objects, more rays, etc.
 