Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Apple is taking up all the wafer starts they can for 5 nm because they need the power savings for their mobile chip, but they're also paying quite a bit per chip.
Just to chip in on how TSMC is set up: they run lines for variations of the process at each node and operate different lines for different types of jobs. Apple likely have a semi-permanent line or two for their needs; TSMC will be cranking out silicon for them 24/7, and that will likely increase somewhat with Apple Silicon.

But TSMC also have a whole bunch of lower-volume lines that are set up to be swapped around for different customer needs. They almost always have some capacity there.
 
Given the current form factor of PS5 I don't even see how they do a Pro model.


I think this will still be the case this generation too, of course, as console developers seem resolution greedy - pushing resolution higher than is honestly reasonable at times, at the sacrifice of GPU visual settings. We already have examples of very sub-ultra settings (mainly medium + some lower than low) in Watch Dogs Legion. That game has what I would call "real" ultra settings though... not like other games where the performance cost of Ultra is basically another game's "medium".

I see 2 possible culprits here (or a combination of these):

1 - The Native 4K marketing point is already taking its victims, as developers feel pressured (from console makers? marketing divisions?) to release their games rendering at the full 4K because it's now a selling point.

2 - Release window games were mostly developed on RDNA1 cards, and devs simply used the same code as the 8th-gen versions of their games (with RT being just late additions), meaning the same games could now render at 4K when they ran at 1440p-1800p + reconstruction on the mid-gens.


I can also think of 3 - the Series S needs to be between 1080p and 1440p, and if the dev has no time to optimize separately for both Microsoft consoles then the Series X must always have over twice the pixel count so they can use about the same settings on both consoles (other than resolution); the pixel-count arithmetic is sketched below. However, this wouldn't apply to PS5 first-party games, and we're seeing a bunch of those running at 4K at least in the reveal titles.
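The "over twice the pixel count" figure is easy to sanity-check. This little sketch just compares common target resolutions and assumes nothing about the actual render resolutions of shipped games:

Code:
# Pixel counts of common render targets and their ratio to native 4K.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}
for name, pixels in resolutions.items():
    print(f"4K / {name} = {resolutions['4K'] / pixels:.2f}x")
# 4K is 4.00x 1080p and 2.25x 1440p, so a native-4K Series X target sits
# at or above ~2.25x a 1080p-1440p Series S target.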


I would love 1080p 30fps with as many graphical features as possible in console games.

I'm more of a photo mode man than a gamer now.
Well, realistically you wouldn't need 30FPS for photo mode. Just 5FPS should be fine ;)


Very interesting. So is Oberon+ a smaller Oberon (6nm? Not 5nm EUVL?) for power consumption savings? They could still eke out a slight power boost, especially if going 5nm EUVL, and also get more of a die shrink than 6nm (actually this is my first time hearing of 6nm in any capacity. Maybe it's a rogue TSMC node, I might've seen a single roadmap with 6nm on it xD).

N6 is a partial EUV node that reuses N7's design rules and tools (N7 itself being DUV-only), and TSMC expects many N7 designs to transition to N6. Performance is expected to be about the same as N7 (and inferior to N7+, BTW), and density is just 18% higher.
Moving the SoC from N7 to N6 makes sense because the transition should be pretty cheap, yields should be somewhat better thanks to the EUV layers, and the 18% density improvement should lead to a ~260mm^2 SoC.
N6 would be above all a cost-saving measure on the SoC for Sony's part, not for the PSU or cooling system.
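For a rough sense of where that ~260mm^2 figure comes from, here is a back-of-the-envelope sketch; the ~308 mm^2 N7 die size and the assumption that the whole die scales with the quoted logic density gain (I/O and analog usually scale much worse) are mine:

Code:
# Hypothetical area estimate for an N7 -> N6 shrink of the PS5 SoC.
# Assumes the entire die scales with TSMC's ~18% logic density gain,
# which is optimistic since I/O and analog blocks barely shrink.
n7_area_mm2 = 308.0      # approximate Oberon die size on N7
density_gain = 1.18      # ~18% higher logic density on N6

n6_area_mm2 = n7_area_mm2 / density_gain
print(f"{n6_area_mm2:.0f} mm^2")   # ~261 mm^2, in line with the ~260 mm^2 above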

That said, I think we should expect Microsoft to make the transition to N6 on their consoles as well, along with most of AMD's GPU/CPU designs that will last for another year or two.

A bare die from TSMC is ~$85-95 for the PS5 SoC. Say 5 nm is roughly twice the price of 7 nm at TSMC, if the link below is anything to go by. Let's be generous and say a 6 nm chip is only 20% more than a 7 nm one, so that would be $102-114 per PS5 SoC. Even if the 6 nm SoC produced a third of the heat of the 7 nm version, there is no way you could make a saving of $15-19 in the cooler that they have.
I don't know if an N6 wafer is 20% more expensive than an N7 one, but the 18% transistor density improvement alone would cover most of that price difference, meaning a $90 SoC on N7 wouldn't cost much more on N6.
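As a quick sanity check on that claim, here is a minimal sketch of the per-die cost under those assumptions (the $90 starting figure, the 20% wafer premium and the idea that dies per wafer scale linearly with density are just the numbers from this exchange, not real TSMC pricing):

Code:
# Per-die cost ratio for N7 -> N6, assuming dies per wafer grow with the
# 18% density gain and ignoring yield and edge-loss differences.
n7_die_cost = 90.0        # ~$90 N7 SoC figure quoted above
wafer_premium = 1.20      # assume an N6 wafer costs 20% more than an N7 one
density_gain = 1.18       # ~18% more candidate dies per wafer

n6_die_cost = n7_die_cost * wafer_premium / density_gain
print(f"${n6_die_cost:.2f}")   # ~$91.50, i.e. only a couple of percent more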
 
Miles Morales is doing a good job: 30fps with ray tracing, 60fps without. I would like to be given this choice in more titles.

Well as long as there is a choice

Though I think that choice will also expose the poor cost/benefit ratio of current RT implementations on consoles.

You are basically paying a significant performance hit for "mirrors everywhere" and slightly more accurate lighting that you will barely notice during gameplay imo.
 
So looking at these 6800 reviews kinda hope RT is used very minimally on the consoles. It's going to hold back what is pretty great general performance.

It will probably be the case that consoles only use RT for shadows or reflections and no GI; AMD talk about selective ray tracing. But one big engine does not use RT at all: Unreal Engine 5. Fable, Hellblade 2 and tons of other first-party MS games will use it. Same for Sony Bend and the new Sony San Diego studio rumored to reboot the Uncharted franchise.
 
So looking at these 6800 reviews kinda hope RT is used very minimally on the consoles. It's going to hold back what is pretty great general performance.
Kind of agree, or at least hoping developers find ways to make RT more efficient through software over time. Every console gen it seems like the aim is to reach a higher resolution or higher FPS. Last gen 1080p, 30-60FPS; mid-gen 4K, 30-60FPS; and now we're aiming for 4K again but at 60-120FPS with RT. Wondering if next time around 4K 60-120 will continue to be the goal along with RT hardware being more mature? Maybe we'll finally be able to see new hardware used strictly on improving graphics as opposed to pushing more pixels and more frames per second? I don't see 8K being a thing anytime soon, but who knows 7-ish years from now.
 
It is what I expected as an advantage for XSX.
Only if shader limited. I expect PS5 to really be close even after tools from MS get better.

------
At lower resolutions, the triangle sizes get smaller too, and PS5 may still have an advantage there by default. Still, not a good look for MS with what appears to be a growing trend. o_O
MS got jebaited. This almost looks like PS360 reverse. Oh the days where RSX would choke on vertices.
 
Wonder what the issue is on XSX. Would love to run a profiling tool on it. Unbalanced load? Poor utilization of the CUs? Or is the PS5 simply outright better? Only BC has been consistent and predictable. Can a dev with experience of XSX chime in? :D
 
Wonder what the issue is on XSX. Would love to run a profiling tool on it. Unbalanced load? Poor utilization of the CUs? Or is the PS5 simply outright better? Only BC has been consistent and predictable. Can a dev with experience of XSX chime in? :D
Cross-gen games being front-end heavy + memory allocation + really, really hot tools + PS5 dev kits being in devs' hands for more than 18 months now.

I expect the general difference to be similar to DMC5 at 4K - about 10% in favor of XSX - but that is not enough given a 50mm² bigger chip, a slower SSD (well, on paper) and the controller difference.

MS went way too conservative if they thought Sony would not push the PS5 to its absolute limit.
 
Well, I wasn't expecting that. I thought the FPS differences people were talking about were 1-2% and maybe only for a couple of seconds. But seeing a 20% difference across whole areas... I wasn't expecting that.
 