Given the current form factor of PS5 I don't even see how they do a Pro model.
I think this will still be the case this generation too, as console developers seem resolution-greedy, at times pushing resolution higher than is honestly reasonable at the sacrifice of GPU visual settings. We already have examples of very sub-ultra settings (mainly medium, plus some lower than low) in Watch Dogs Legion. That game has what I would call "real" ultra settings, though... not like other games where the performance cost of Ultra is basically another game's "medium".
I see 2 possible culprits here (or a combination of these):
1 - The native 4K marketing point is already taking its victims, as developers feel pressured (by console makers? marketing divisions?) to release their games rendering at full 4K because it's now a selling point.
2 - Release window games were mostly developed on RDNA1 cards, and devs simply used the same code as the 8th-gen versions of their games (with RT being just a late addition), meaning the same games could now render at 4K where they ran at 1440p-1800p + reconstruction on the mid-gens.
I can also think of 3 - Series S needs to land between 1080p and 1440p, and if the dev has no time to optimize for both Microsoft consoles then Series X must always render at over twice the pixel count, so they can use about the same settings on both consoles (other than resolution). However, this wouldn't apply to PS5 first-party games, and we're seeing a bunch of those running at 4K, at least among the reveal titles.
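For what it's worth, the "over twice the pixel count" part checks out as simple arithmetic; here's the back-of-envelope math (the resolution pairs are just the common targets mentioned above, nothing measured from any specific game):

```python
# Pixel counts for the common console render targets.
res = {
    "1080p": 1920 * 1080,  # 2,073,600 pixels
    "1440p": 2560 * 1440,  # 3,686,400 pixels
    "4K":    3840 * 2160,  # 8,294,400 pixels
}

# Series X at 4K vs Series S at 1440p / 1080p:
print(res["4K"] / res["1440p"])  # 2.25x
print(res["4K"] / res["1080p"])  # 4.0x
```

So even at the Series S's best case (1440p), a native-4K Series X target is pushing 2.25x the pixels with the same settings.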
I would love 1080p 30fps with as many graphical features as possible in console games.
I'm more of a photo mode man than a gamer now.
Well, realistically you wouldn't need 30FPS for photo mode. Just 5FPS should be fine.
Very interesting. So is Oberon+ a smaller Oberon (6nm? Not 5nm EUVL?) for power consumption savings? They could still eke out a slight power boost, especially if going 5nm EUVL, and also get more of a die shrink than 6nm offers (actually this is my first time hearing of 6nm in any capacity. Maybe it's a rogue TSMC node; I might've seen a single roadmap with 6nm on it xD).
N6 is a partial-EUV node that uses the same design rules and tools as N7 (which is DUV-only), and TSMC expected many N7 designs to transition to N6. Performance is expected to be similar to N7 (and inferior to N7+, BTW), and density is only ~18% higher.
Changing the SoC from N7 to N6 makes sense because the transition should be pretty cheap, yields should be somewhat better thanks to the EUV layers, and the ~18% density improvement should lead to a ~260mm^2 SoC.
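Quick sanity check on that ~260mm^2 figure, assuming the widely reported ~308mm^2 launch die and taking the 18% density gain at face value (optimistic, since SRAM and analog blocks shrink less than logic):

```python
# Rough die-shrink estimate for Oberon on N6.
n7_area = 308        # mm^2, launch PS5 SoC (assumed, widely reported figure)
density_gain = 1.18  # N6 vs N7 density improvement (ideal, logic-only)

n6_area = n7_area / density_gain
print(round(n6_area))  # ~261 mm^2
```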
N6 is most of all a cost-saving measure for the SoC on Sony's part, not for the PSU or cooling system.
That said, I think we should expect Microsoft to transition their consoles to N6 as well, along with most of AMD's GPU/CPU designs that will last for another year or two.
A bare die from TSMC is ~$85-95 for the PS5 SoC. Say 5nm is roughly twice the price of 7nm at TSMC, if the link below is anything to go by. Let's be generous and say a 6nm chip is only 20% more than a 7nm one; that would be $102-114 per PS5 SoC. Even if the 6nm SoC produced a third of the heat of the 7nm version, there is no way you could make a $15-19 saving in the cooler that they have.
I don't know if an N6 wafer is 20% more expensive than an N7 one, but the 18% transistor density improvement alone would cover most of that price difference, meaning a $90 SoC on N7 wouldn't cost much more on N6.
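The point above can be made with some quick math. Assuming a 20% wafer price premium (the figure being debated, not a confirmed number) and ~18% more dies per wafer from the density gain, per-die cost barely moves:

```python
# Per-die cost scaling from N7 to N6, ignoring yield effects
# (which should actually favor the smaller N6 die slightly).
wafer_cost_ratio = 1.20      # assumed: N6 wafer costs 20% more than N7
dies_per_wafer_ratio = 1.18  # from the ~18% density improvement

per_die_cost_ratio = wafer_cost_ratio / dies_per_wafer_ratio
print(round(per_die_cost_ratio, 3))  # ~1.017, i.e. under 2% more per die
```

Under those assumptions a $90 N7 SoC comes out to roughly $92 on N6, not $102-114.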