I see 2 possible culprits here (or a combination of these):
N6 is a partial EUV node that uses the same tools and design rules as N7 (which is DUV-only), and TSMC expected many N7 designs to transition to N6. Performance is expected to be similar to N7 (and inferior to N7+, BTW), and density is only 18% higher.
Moving the SoC from N7 to N6 makes sense because the transition should be fairly cheap, yields should be somewhat better thanks to the EUV layers, and the 18% density improvement should shrink the SoC to roughly 260mm^2.
N6 is above all a cost-saving measure for the SoC on Sony's part, not for the PSU or cooling system.
That said, I think we should expect Microsoft to make the transition to N6 on their consoles as well, along with most of AMD's GPU/CPU designs that will stay on the market for another year or two.
I don't know whether an N6 wafer is 20% more expensive than an N7 one, but the 18% transistor density improvement alone would cover most of that price difference, meaning a $90 SoC on N7 wouldn't cost much more on N6.
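A rough back-of-the-envelope sketch of that cost argument (all numbers here are assumptions for illustration, including the ~307mm^2 N7 die size and the hypothetical 20% wafer premium; this is not actual TSMC pricing):

```python
# Per-die cost comparison for the N7 -> N6 transition discussed above.
# All inputs are illustrative assumptions, not real TSMC figures.
n7_die_mm2 = 307.0                  # assumed N7 console-class SoC size
n6_die_mm2 = n7_die_mm2 / 1.18      # 18% density gain -> ~260 mm^2
wafer_cost_ratio = 1.20             # hypothetical: N6 wafer 20% pricier

# Cost per die scales with wafer price and inversely with dies per wafer;
# dies per wafer is roughly inversely proportional to die area
# (ignoring edge loss and yield differences, which should favor N6).
per_die_cost_ratio = wafer_cost_ratio * (n6_die_mm2 / n7_die_mm2)

print(f"N6 die: ~{n6_die_mm2:.0f} mm^2")
print(f"Per-die cost ratio N6/N7: {per_die_cost_ratio:.3f}")
```

The ratio comes out to about 1.20 / 1.18 ≈ 1.02, i.e. a $90 N7 SoC would land around $92 on N6 under these assumptions, before counting the yield benefit of the EUV layers.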
That clears things up, thanks! So N6 actually could be a good candidate for a PS5 Slim, for example. The only issue is, as you just brought up, it performs worse than N7+, and IIRC the PS5 is on 7nm+ (as is the Series X, most likely). If that's the case, how does moving the design to N6 affect performance? Is it a trade-off of a smaller chip but slightly worse power consumption at the same clocks, where the savings from the denser node might bring cooling costs down enough to offset that?
At least at a cursory glance, it sounds like N6 would be better suited for something else, like a PS5 Pro, but at that point 5nm would be better since you get a bigger power reduction, a perf boost and a bigger density shrink... although the costs increase to match. Personally, I still think that if there are mid-gen refreshes, they'll be on 5nm at least, probably 5nm EUVL for at least one of them.
Miles Morales is doing a good job: 30fps with ray tracing, 60fps without. I would like to be given this choice in more titles.
Honestly, raw RT performance won't be the way to go for next-gen, but I think combinations of RT and other techniques like better SSR, plus artistic choices (if you look at Demon's Souls, for example, there are parts you'd have sworn were ray-traced, but IIRC the game has no RT in it), in a mix of hardware and software approaches, can give really smart results.
Personally, I'm still curious about Project Mara, because the RT in the interior shots was amazing, and I wonder if the scale of the game (i.e. a smaller game) brings better budget margins for pushing RT without compromising performance too much. All the same, if things like DirectML work as well as intended, that should free up a lot more of the performance budget to push RT. I hope that ends up being the case.
That said, relying on DirectML or an equivalent would mean a game-by-game approach, similar to what Nvidia is doing with DLSS on PC now, so it's probably best to temper expectations for that level of RT except in 1P games and massive 3P games that can afford that type of approach on multiple platforms, like the next GTA (or Cyberpunk, at least from what I saw yesterday).
I kind of agree, or at least I'm hoping developers find ways to make RT more efficient through software over time. Every console gen, the aim seems to be higher resolution or higher FPS: last gen 1080p at 30-60FPS, mid-gen 4K at 30-60FPS, and now we're aiming for 4K again but at 60-120FPS with RT. I wonder if next time around 4K 60-120 will still be the goal, with RT hardware simply being more mature? Maybe we'll finally see new hardware used strictly for improving graphics as opposed to pushing more pixels and more frames per second. I don't see 8K being a thing anytime soon, but who knows 7-ish years from now.
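For a sense of scale on those resolution/framerate targets, here's a quick sketch of the raw pixel throughput each one demands (fill rate only; it ignores per-pixel shading cost, upscaling, RT, and everything else that actually dominates the budget):

```python
# Raw pixel throughput for the console targets mentioned above.
targets = {
    "1080p @ 30": (1920 * 1080, 30),
    "4K @ 60":    (3840 * 2160, 60),
    "4K @ 120":   (3840 * 2160, 120),
    "8K @ 60":    (7680 * 4320, 60),
}

for name, (pixels, fps) in targets.items():
    # Gigapixels per second the GPU must produce just to fill the screen.
    print(f"{name}: {pixels * fps / 1e9:.2f} Gpix/s")
```

8K has 4x the pixels of 4K, so even 8K at 60 asks for double the throughput of 4K at 120, which is part of why 8K keeps getting pushed out.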
Honestly, I don't know why native resolution keeps ending up as a major push. I think we're good enough at 4K; even on massive home theater screens, 4K provides extremely good clarity. Are these companies expecting us to end up with cinema screens in our homes within 10 years? What about those of us in apartments? Real estate is a precious resource xD.
The focus going forward should be on graphical fidelity. I know some people keep saying we're reaching a point of diminishing returns... but are we really? Until I get a game with Fifth Element, Blade Runner 2049 or Endgame levels of CG fidelity in real-time, I think there's still a good ways to go.
That's even accounting for the increases in production workforce that would be needed, but I think smarter algorithms and maybe advances in things like GPT-style AI (which could assist massively in offloading coding and programming, expedite that process, and perhaps even help with asset creation... though a human touch would probably still be needed to give it some personality and guide things along) can help tremendously with that.