Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
The $100-more-expensive PS360 consoles sold massively better than their cheaper models. Why would ~90% of consumers prefer a $600 PS3 over a $500 one, yet skew so far the other way if the gap were only $50 larger, say a $400 PS5 versus a $550 PS5+?
A $400 PS5 would attract a different audience, one that cares about price more than graphics, if a PS5 Pro cost $150~200 more.
 
If it was gpu chiplet based of some sorts, it wouldn't really be an issue. But I'm not expecting that.

Why not?
We know the PS5 (and most probably Scarlett too) has 8 Zen2 cores. And we know TSMC has been mass producing the small 74mm^2 Zen2 CCD like crazy (goes into Epyc + Ryzen 3000 + Threadripper 3000), which has 8 Zen2 cores each.
Sure, the expected volume of a console should justify making a monolithic SoC. But the sheer economy of scale might be making the Zen2 CCD rather cheap to produce at the moment. Each 300mm wafer makes >750 CCDs, and assuming some $15K per wafer (assuming cost of 7nm = 2*16FF), that works out to around $20 per CCD as yields go north of 90%.
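The per-CCD arithmetic above can be sketched out quickly; the edge-loss factor is my assumption, and the wafer price and yield are the post's guesses, not fab figures:

```python
import math

WAFER_COST = 15_000   # assumed $/wafer for 7nm (= 2x 16FF, per the post)

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300, edge_loss=0.9):
    # Gross dies ~= wafer area / die area, derated for edge loss and
    # scribe lines; the 0.9 factor is an assumption, not a fab number.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2 * edge_loss)

gross = dies_per_wafer(74)                       # 74mm^2 Zen2 CCD
cost_per_good_ccd = WAFER_COST / (gross * 0.90)  # 90% yield, per the post
```

That comes out to well over 750 gross dies and roughly $19-20 per good CCD, close to the ~$20 figure in the post.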

Similar to what's happening in desktop and server CPUs, it could be cheaper to pair the existing 74mm^2 Zen2 CCD with e.g. a 300mm^2 chip containing the GPU + MCU + IO + a low-power ARM core, than to produce a monolithic 370mm^2 SoC.
Especially if there's relatively little development needed for the CPU part of the SoC, since the CCD is an off-the-shelf part at this point.

I'm not saying I'm sure the new consoles will be using the Zen 2 CCD, but there are plenty of good arguments for them to use it.
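As a rough sketch of that chiplet-vs-monolithic cost argument, here's the same arithmetic with an assumed Poisson yield model; the defect density (D0 = 0.1 defects/cm^2) and edge-loss factor are guesses for a mature 7nm process, and packaging/interconnect cost is ignored:

```python
import math

WAFER_COST = 15_000              # assumed $/300mm wafer at 7nm
WAFER_AREA = math.pi * 150**2    # mm^2

def cost_per_good_die(die_area_mm2, d0_per_cm2=0.1, edge_loss=0.9):
    # Silicon cost per *working* die: wafer cost split over gross dies,
    # scaled up by a Poisson yield model. Both D0 and edge_loss are
    # assumptions, not published figures.
    gross = WAFER_AREA / die_area_mm2 * edge_loss
    yield_frac = math.exp(-(die_area_mm2 / 100) * d0_per_cm2)
    return WAFER_COST / gross / yield_frac

mono = cost_per_good_die(370)                              # monolithic SoC
chiplet = cost_per_good_die(74) + cost_per_good_die(300)   # CCD + GPU/IO die
```

Under these assumptions the chiplet pairing wins on raw silicon cost, because the small CCD yields almost perfectly; whether it still wins after packaging is the open question.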
 
Is there any mass-produced product using GPU chiplets that we know of? If not, I can't see Sony going for something so out of left field. Their failed bet on Cell must still sting too much for them to try something so different.
 
A $400 PS5 would attract a different audience, one that cares about price more than graphics, if a PS5 Pro cost $150~200 more.
Sure, but how big is that market, and what should the hardware split be? Can you point to hard data, or even suggestive data, showing that a yield rate of x% would be enough to satisfy demand for the higher-end SKU? Can you point to real evidence that we wouldn't see something like 30-40% of buyers wanting the high-end model while only 20% of chips were good enough for it, or some similar scenario?

Again, PS360 generation, both consoles offered a lower price entry point for the more cost-sensitive consumer, yet that consumer just didn't exist among the early adopters. These companies can't just go ahead with a notion with little more than hand-waving optimism that it'll all work out okay in the end. ;)

Edit: In the event that binning is used, what's the timeline to improvement from the initial yield? Let's say 20% of chips are perfect for the Elite during initial manufacturing, how does that improve over time? Will it be stuck at 20% until a node shrink, or will things improve through other changes in the manufacturing?
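One way to frame that question is the classic Poisson yield model, where the defect-free fraction depends on die area and defect density D0. D0 falling as the process matures is the main route to better bins short of a node shrink; the die size and D0 values below are assumptions chosen to start near the 20% figure above:

```python
import math

def yield_poisson(die_area_cm2, d0_defects_per_cm2):
    # Classic Poisson yield model: fraction of dies with zero defects.
    return math.exp(-die_area_cm2 * d0_defects_per_cm2)

# Hypothetical 3.6 cm^2 console SoC as D0 improves over the ramp:
for d0 in (0.45, 0.30, 0.15, 0.10):
    print(f"D0={d0}: {yield_poisson(3.6, d0):.0%} defect-free")
```

With these numbers the defect-free share climbs from roughly 20% to about 70% purely through defect-density reduction, so the "Elite" bin wouldn't be stuck at its launch rate even without other changes.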
 
Again, PS360 generation, both consoles offered a lower price entry point for the more cost-sensitive consumer, yet that consumer just didn't exist among the early adopters. These companies can't just go ahead with a notion with little more than hand-waving optimism that it'll all work out okay in the end. ;)
Those are reasonable grounds for comparison.
But I think the Arcade was a bad SKU; the One S / One X pairing is better, though as you mentioned they didn't launch at the same time.
So is the One S All-Digital Edition, though that wasn't exactly a smooth launch given its RRP, and the Xbox One market in general has cratered.
 
Is there any mass-produced product using GPU chiplets that we know of? If not, I can't see Sony going for something so out of left field. Their failed bet on Cell must still sting too much for them to try something so different.
Cooling would also get even more costly and difficult... and are chiplets stable at quite high temperatures?
 
I explicitly said GPU chiplet, as that's where I would expect the biggest difference between the two models, e.g. the top model with two GPU chiplets, the lower model with one.

A CPU chiplet is possible, but I'm still expecting a monolithic SoC.

I thought "chiplets" would always refer to anything other than a monolithic die? I.e. the Xenos used a GPU composed of two chiplets.


Is there any mass produced product using GPU chiplets that we know of?
See above.

Also, 3dfx voodoo and voodoo 2.
:)
 
I thought "chiplets" would always refer to anything other than a monolithic die? I.e. the Xenos used a GPU composed of two chiplets.



See above.

Also, 3dfx voodoo and voodoo 2.
:)

Does Xenos really count? Aren't you talking about the GPU + eDRAM? That's hardly the same thing we're talking about.
The 3dfx Voodoo and Voodoo 2 were very simple hardware compared to today's, plus they weren't mass-produced on the level a console is.
 
Does Xenos really count? Aren't you talking about the GPU + eDRAM? That's hardly the same thing we're talking about.
The 3dfx Voodoo and Voodoo 2 were very simple hardware compared to today's, plus they weren't mass-produced on the level a console is.
IIRC Xenos had the ROPs in the eDRAM chip.
 
So how do they come to that conclusion? Are both going to have the same hardware?
Sounds like BS, all of it.

16 cores are the future though, Intel lags behind if they are going with 10.
 
So how do they come to that conclusion? Are both going to have the same hardware?
Sounds like BS, all of it.


I don't see where it says they are both going to have the same hardware. It just says that MS is pushing ray tracing more than Sony. Perhaps they will have more dedicated ray tracing at the expense of other features.
 
I don't see where it says they are both going to have the same hardware. It just says that MS is pushing ray tracing more than Sony. Perhaps they will have more dedicated ray tracing at the expense of other features.

Or they just have a better-funded console, like the One X vs the Pro.
 
The odds are pretty high it's the same feature, at best with some customizations to support it further.
Think feature sets and tier levels, not necessarily more raw performance.
That's the most common type of customization, as I see it.
 
Redgamingtech does have some very solid sources, as they did an exclusive reveal of Radeon VII with pictures of the card over a week before the actual release.

It's entirely possible that xbox's raytracing will be different from the PS5's, and being different means one will be more advanced than the other even if it never really translates into a substantial visual difference.

I wasn't aware of bitsandchips claiming that console makers are dual-sourcing CPU and GPU chiplets, nor of those Oberon clocks from Komachi matching the PS4 and PS4 Pro clocks, followed by 2GHz.
 
Well, if they're going with flash storage, that should save 10W?

Unlikely, most 2.5" drives don't exceed about 2.5-3.5 W. 3.5" drives are generally around 8-11 W, IIRC.

That's obviously not a cooling system for a 100W SoC.
This is a cooling system for a 120W SoC:

[image: cooling assembly for a 120W SoC]

That's also noisier than any of the current gen consoles. I'm not sure either company is willing to go back to higher than PS3/X360 launch noise levels.

Regards,
SB
 
Drives generally only hit their maximum power draw during initial spin-up, not during actual use, so the savings on power draw while gaming will never be that much.
 