Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Just food for thought...

Die sizes :

PS2 - 519mm2 ↓
PS3 - 493mm2 ↓
PS4 - 348mm2 ↓
Pro - 320mm2 ↓
PS5 - ???mm2 ?

Clocks :

PS5 - ???GHz ?
Pro - 911MHz ↑
PS4 - 800MHz ↑
PS3 - 500MHz ↑
PS2 - 147MHz ↑

...as we go further down the manufacturing nodes, there is a clear trend of chips getting smaller and smaller while frequencies get higher and higher.

This is no doubt a result of much higher chip design expenses and a higher cost per mm2 of silicon.

[attached image: nano3.png]


When people ask why Sony would go for a narrower and faster design (besides BC considerations - which are paramount to next-gen success and to the ability to transition PS4 players to PS5) - here is your answer.

Those rising chip design costs are easily justified on a business as big as PlayStation. It brings in $20bn a year in revenue.

Have any of the B3d threads surfaced good data on chip cost / size / yields etc?

The articles I found were really about mobile chips. They didn't give you the numbers to say how much more a chip of this size costs on 7nm+ than it did on 28nm or 16nm.

Is that extra acreage on the Series X really going to result in an extra $100 on the retail price?

(Not disputing the increasing costs of transistors, just trying to really quantify it)
 
Well, I remember another manufacturer making a wrong bet 2-3 years before the console launched that resulted in a much bigger mistake than this one would be...

*cough* 8GB of DDR3 *cough*

I remember people saying "Why didn't they just go with GDDR5?" Yeah, after the fact I am sure they would have gone with it, but not in 2011, when it looked like there was no chance high enough density GDDR5 would be on the market in time for launch.

But yeah, IMO we are talking about thin margins. Perhaps Navi was supposed to be ever so slightly better. Maybe the frequency sweet-spot target was not 1750MHz but 1900MHz, enough to turn the entire strategy on its head. We know Navi was late, we know AMD was strapped for cash back in 2017 and 2018, and I don't think it would be out of the ordinary if they just slightly missed the target - rendering the hypothetical PS5 design "illogical" when it was anything but that a few years ago.
Well, at least 2000MHz does look possible right now, and Sony seems to be trying to reach it (according to the Taiwanese dude).

Though I do wonder: if (a big if) the 2000MHz 9.2TF PS5 and the 1700MHz 12TF XSX end up real, will MS boost the clock a bit like with the XB1 (800→853MHz)? That tower design seems like it can push a little more. Although that also means this time they're the better-performing one, not as desperate as in the XB1 gen.
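For reference, here is a rough sketch of where those TF figures come from, assuming the usual 64 shaders per CU and 2 FLOPs per shader per clock (a back-of-envelope model, not anything taken from the leaks):

```python
# FP32 throughput estimate: CUs * 64 shaders/CU * 2 FLOPs/clock * clock (GHz) -> TFLOPS
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # /1000 converts GFLOPS to TFLOPS

print(tflops(36, 2.0))  # ~9.2 TF  - the rumored narrow-and-fast PS5
print(tflops(56, 1.7))  # ~12.2 TF - the rumored wide-and-slower XSX
```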
 
The camera is used by streamers playing games on PS4.
Oh, okay. Well, that's something else really. The camera as an interface is dead. For streaming, sure, but that's not why it was invented, and a far simpler solution could be used there, I'm sure. It certainly wasn't designed for streaming but for camera vision and object tracking. A streaming camera may be a thing.
 
18Gbps can't happen; the yield isn't there yet to even have it in sampling quantities. Once it is, the high-end bin volume becomes tied to the worldwide demand for GDDR6. There are expensive GPUs where they'll pay whatever it costs to get those 18Gbps parts. So I don't see how a console selling 20 million per year can procure those chips - that would be a big chunk of the production, and someone else would have to buy the lower-speed bins that sit in the middle of the bell curve.

I think 14 is the most logical, and 16 might be possible.
You are probably right. But they didn't use 18Gbps chips according to the GitHub leak: 16Gbps for Oberon A0 and around 17Gbps (maybe a bit more) for Oberon B0.
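For context, a quick sketch of what those per-pin speeds translate to in total bandwidth; the 256-bit bus width here is my assumption for illustration, not something stated in the leak:

```python
# GDDR6 bandwidth: per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte -> GB/s
def bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int = 256) -> float:
    return pin_speed_gbps * bus_width_bits / 8

for speed in (14, 16, 17, 18):
    print(f"{speed} Gbps on a 256-bit bus -> {bandwidth_gbs(speed):.0f} GB/s")
# 14 -> 448, 16 -> 512, 17 -> 544, 18 -> 576 GB/s
```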
 
Well, I remember another manufacturer making a wrong bet 2-3 years before the console launched that resulted in a much bigger mistake than this one would be...

*cough* 8GB of DDR3 *cough*

I remember people saying "Why didn't they just go with GDDR5?" Yeah, after the fact I am sure they would have gone with it, but not in 2011, when it looked like there was no chance high enough density GDDR5 would be on the market in time for launch.

But yeah, IMO we are talking about thin margins. Perhaps Navi was supposed to be ever so slightly better. Maybe the frequency sweet-spot target was not 1750MHz but 1900MHz, enough to turn the entire strategy on its head. We know Navi was late, we know AMD was strapped for cash back in 2017 and 2018, and I don't think it would be out of the ordinary if they just slightly missed the target - rendering the hypothetical PS5 design "illogical" when it was anything but that a few years ago.
The issue with the XBO design was its price-to-GPU-performance relative to the PS4. This isn't the same thing...

These rumors put the PS5 outside the performance sweet spot. Very few people would be saying much if the clock weren't raising questions about how to keep the system cool.
 
If some random forum poster says 13TF with nothing to back it up, it's taken as straight-up true. If there are test docs released straight from AMD, and DF believes they're for the PS5, we fight to hell and back to try and disprove them - the reason being, it isn't what we want it to be.

This GitHub leak is the most detailed leak we've ever gotten for consoles.
 
You are probably right. But they didn't use 18Gbps chips according to the GitHub leak: 16Gbps for Oberon A0 and around 17Gbps (maybe a bit more) for Oberon B0.
Maybe those clocks are for PS Now?

Years ago they were talking about reducing latency by using faster hardware. Clocking it at 2GHz but throttling 50% of the time between frames would fit much better in a reasonable thermal envelope, and it would cut latency in half. Heat management also works well in a rack. They can put 8 or 12 of those per U.

They will need a lot of PS4 Pros to fill out PS Now; maybe they won't make any PS4 Pro slim for consumers, but they have an incentive to make a 7nm one for lower datacenter power costs.
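To illustrate the latency argument (a toy model with made-up numbers, assuming render time scales inversely with GPU clock and ignoring memory-bound work):

```python
# Toy model: doubling the clock halves the busy period per frame, so the frame is
# ready sooner even though the GPU then sits idle (throttled) for the rest of the
# frame interval. All figures below are illustrative assumptions.
frame_interval_ms = 33.3            # 30 fps streaming target
busy_at_1ghz_ms = 20.0              # hypothetical render time at 1 GHz

busy_at_2ghz_ms = busy_at_1ghz_ms / 2                      # 10 ms: render latency halved
idle_fraction = 1 - busy_at_2ghz_ms / frame_interval_ms    # ~70% of each frame spent throttled
print(busy_at_2ghz_ms, round(idle_fraction, 2))
```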
 
You are probably right. But they didn't use 18Gbps chips according to the GitHub leak: 16Gbps for Oberon A0 and around 17Gbps (maybe a bit more) for Oberon B0.
And if they feel they need 18Gbps for launch and they're able to acquire sufficient quantity, they'll use that.

There are no design considerations or fabrication changes needed.
 
The issue with the XBO design was its price-to-GPU-performance relative to the PS4. This isn't the same thing...

These rumors put the PS5 outside the performance sweet spot. Very few people would be saying much if the clock weren't raising questions about how to keep the system cool.
Pretty simple actually: MS's design error, as well as the GPU deficit, stems from choosing DDR3 as main memory due to the 8GB requirement. Back in 2011 nothing indicated 8GB of GDDR5 would be ready for 2013; even in 2012 Samsung couldn't deliver it. When they opted for the slower memory, they also chose to dedicate a big chunk of an already big die to ESRAM, pretty much handing the GPU advantage to Sony on a silver platter. They were likely thinking Sony would have to opt for 4GB in that case, so it wasn't lose-lose.

What would the picture be if Samsung couldn't deliver 8GB of GDDR5 for Sony in 2013? Perhaps 4GB would have been an even bigger difference than 40% more TF.

In the end, a similar situation could have happened here. When Navi and the PS5 were in the design phase, maybe internal targets pointed at a higher frequency sweet spot where a Navi GPU with 36CUs at 2.0GHz made a ton of sense.

In that case, not only would Sony get the most out of their silicon, but they would have matched their own BC method requirement and likely landed around the magical 200W limit. Back then, going with 60CUs (with 4-6 deactivated) on a 7nm die perhaps looked wrong. You would have to clock your console lower than you would with a 36CU part, thus getting less bang for the buck out of your chip, and it would still probably emit more heat and pull more watts from the wall, rendering it too much for a console. On top of that, you would have a 15-20% bigger die on an expensive process.

If 36CUs at 1.8-2.0GHz turns out to be true, we should look into what triggered that decision back in 2017, and not why it doesn't make as much sense now as a 56CU part at 1.7GHz does.
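Putting rough numbers on that XBO/PS4 comparison (figures from memory, so treat them as approximate; same back-of-envelope TF formula as earlier in the thread):

```python
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

xbo_ddr3  = bandwidth_gbs(2.133, 256)   # ~68 GB/s main memory (DDR3-2133)
ps4_gddr5 = bandwidth_gbs(5.5, 256)     # ~176 GB/s main memory (5.5Gbps GDDR5)
# XBO's 32MB of ESRAM added a fast but tiny pool on top, and that SRAM ate die
# area that could otherwise have gone to CUs.

# The "40% more TF": 18 CUs @ 800MHz vs 12 CUs @ 853MHz
ps4_tf = 18 * 64 * 2 * 0.800 / 1000     # ~1.84 TF
xbo_tf = 12 * 64 * 2 * 0.853 / 1000     # ~1.31 TF
print(round(xbo_ddr3), round(ps4_gddr5), round(ps4_tf / xbo_tf, 2))   # ratio ~1.4x
```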
 
You are probably right. But they didn't use 18Gbps chips according to the GitHub leak: 16Gbps for Oberon A0 and around 17Gbps (maybe a bit more) for Oberon B0.
So, they either overclocked 16Gbps chips or actually went with downclocked 18Gbps chips (as rumored by PCB dev kit leak from May and Flute benchmark from July)?
 
And if they feel they need 18Gbps for launch and they're able to acquire sufficient quantity, they'll use that.

There are no design considerations or fabrication changes needed.
Mm... there's probably some consideration needed for the trace routing & memory bus to support higher frequencies. :p So it depends on whether they went overboard on the mainboard.
 
As for the 2GHz,

From the RX 480 to the RX 580, the improvement in clock speed on the same node with the same TDP was 12% in a year.

If the game clock for Navi 10 is 1755MHz, next year it might very well be 1966MHz.

It's clear to me that Sony pushed clocks beyond even the standard set by the 1X.

The same way that Xsx redefined the limits of overall console APU form factor and TDP, I think Sony is redefining console GPU clocks.

Narrow and fast vs wide and "slow".

Compute and BW advantage vs pixel, geometry, and texture advantage.

I think even with 12 vs 9, this is the closest that MS and Sony consoles have been in terms of architecture and performance.
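Putting numbers on that extrapolation (a sketch applying the ~12% same-node, same-TDP uplift quoted above; nothing here is a confirmed spec):

```python
navi10_game_clock_mhz = 1755
uplift = 0.12                       # RX 480 -> RX 580 style year-over-year improvement
refined_navi_mhz = navi10_game_clock_mhz * (1 + uplift)
print(round(refined_navi_mhz))      # ~1966 MHz, i.e. knocking on the door of 2.0 GHz
```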
 
This GitHub leak is the most detailed leak we've ever gotten for consoles.

It is also the one DF thinks is the PS5.

If 36CUs at 1.8-2.0GHz turns out to be true, we should look into what triggered that decision back in 2017, and not why it doesn't make as much sense now as a 56CU part at 1.7GHz does.

If 56CUs @ 1700MHz is true, then I don't think ~300MHz more at 36CUs is so strange. Narrow/fast vs wide/slow(er).
 
That would follow. But who knows, maybe they'll bundle a massive motion-capture camera with it. :runaway:

On a more serious note, I think it more likely that Sony would include PSVR/PSVR2 hardware in the main console than Microsoft would, so that would be an additional cost that Microsoft may skip. I still think VR is cool tech, but it's definitely not taken off as well as I thought it would. Sony may decide it's time to move on, or - like the camera, which they've never given up on - they may just keep iterating. As long as it's not costing them, and the majority of the R&D is already done, why not.


Bundling VR and raising the price would be a mistake, like Kinect 2.
 
Pretty simple actually: MS's design error, as well as the GPU deficit, stems from choosing DDR3 as main memory due to the 8GB requirement. Back in 2011 nothing indicated 8GB of GDDR5 would be ready for 2013; even in 2012 Samsung couldn't deliver it. When they opted for the slower memory, they also chose to dedicate a big chunk of an already big die to ESRAM, pretty much handing the GPU advantage to Sony on a silver platter. They were likely thinking Sony would have to opt for 4GB in that case, so it wasn't lose-lose.

What would the picture be if Samsung couldn't deliver 8GB of GDDR5 for Sony in 2013? Perhaps 4GB would have been an even bigger difference than 40% more TF.

In the end, a similar situation could have happened here. When Navi and the PS5 were in the design phase, maybe internal targets pointed at a higher frequency sweet spot where a Navi GPU with 36CUs at 2.0GHz made a ton of sense.

In that case, not only would Sony get the most out of their silicon, but they would have matched their own BC method requirement and likely landed around the magical 200W limit. Back then, going with 60CUs (with 4-6 deactivated) on a 7nm die perhaps looked wrong. You would have to clock your console lower than you would with a 36CU part, thus getting less bang for the buck out of your chip, and it would still probably emit more heat and pull more watts from the wall, rendering it too much for a console. On top of that, you would have a 15-20% bigger die on an expensive process.

If 36CUs at 1.8-2.0GHz turns out to be true, we should look into what triggered that decision back in 2017, and not why it doesn't make as much sense now as a 56CU part at 1.7GHz does.
Again it's not the same thing because there was nothing fundamentally questionable about DDR3 memory working in a mass market device....

That's not the situation here; the heat associated with these rumored clocks could result in a lot of problems. What many of us are wondering wrt these rumors is whether you can clock a console that high without issues.

Relative performance aside, even in isolation this decision looks odd.
 
Again it's not the same thing because there was nothing fundamentally questionable about DDR3 memory working in a mass market device....

That's not the situation here; the heat associated with these rumored clocks could result in a lot of problems. What many of us are wondering wrt these rumors is whether you can clock a console that high without issues.

Relative performance aside, even in isolation this decision looks odd.
Heat associated with the rumored clocks is a problem NOW, but perhaps it wasn't meant to be when the console was designed. That is the point.

What if Navi was meant to be clocked like Turing cards and you could deliver 36CUs at 2.0GHz in a ~200W box?

What if Samsung couldn't deliver 8GB of GDDR5 in 2013 and only did it a year later? You would probably say "The choice to go with GDDR5 instead of safe DDR3 memory was illogical".
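To illustrate why the clock is the sticking point (a toy model, not measured RDNA data): dynamic power scales roughly with frequency times voltage squared, and reaching higher clocks usually also requires more voltage, so power grows much faster than linearly unless the silicon's sweet spot really did move up.

```python
# Toy model: P_dynamic ~ C * f * V^2 (capacitance folded into the constant).
# Voltages and frequencies below are illustrative assumptions, not RDNA figures.
def relative_power(freq_ghz: float, voltage: float, base_freq=1.7, base_v=1.0) -> float:
    return (freq_ghz / base_freq) * (voltage / base_v) ** 2

print(relative_power(2.0, 1.0))   # ~1.18x: 2.0 GHz at the same voltage costs ~18% more power
print(relative_power(2.0, 1.1))   # ~1.42x: if 2.0 GHz needs ~10% more voltage, the cost jumps to ~42%
```

On those rough terms, whether a 36CU part at 2.0GHz fits in a ~200W box comes down almost entirely to the voltage the silicon needs at that frequency.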
 