Just food for thought...
Die sizes:
PS2 - 519mm2 ↓
PS3 - 493mm2 ↓
PS4 - 348mm2 ↓
Pro - 320mm2 ↓
PS5 - ???mm2 ?
Clocks:
PS5 - ???GHz ?
Pro - 911MHz ↑
PS4 - 800MHz ↑
PS3 - 500MHz ↑
PS2 - 147MHz ↑
...as we go further down the manufacturing nodes, there is a clear trend of chips getting smaller and smaller while frequencies get higher and higher.
This is no doubt a result of much higher chip design expenses and a higher cost per mm2 of silicon.
When people ask why Sony would go for a narrower and faster design (besides BC considerations, which are paramount to next-gen success and the ability to transition PS4 players to PS5) - here is your answer.
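(As a rough illustration of that cost pressure, here's a back-of-envelope sketch. The wafer prices and defect density below are made-up placeholders and the dies-per-wafer formula is the usual textbook approximation, so treat it as shape-of-the-trend only, not real foundry data.)

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Textbook approximation: usable wafer area / die area, minus an edge-loss term.
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_mm2=0.001):
    # Simple Poisson yield model: bigger dies mean fewer candidates AND worse yield.
    die_yield = math.exp(-defects_per_mm2 * die_area_mm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * die_yield)

# Hypothetical wafer prices: mature node vs. leading-edge 7nm (placeholders, not quotes).
for label, area, wafer_cost in [("PS4-sized die, cheap node ", 348, 3000),
                                ("Same-sized die, 7nm-class ", 348, 9000),
                                ("Smaller die, 7nm-class    ", 150, 9000)]:
    print(f"{label}: ~${cost_per_good_die(area, wafer_cost):.0f} per good die")
```

With these placeholder numbers, a ~150mm2 die on the expensive node ends up costing roughly the same per good die as a ~350mm2 die on the cheap node - which is the whole "go narrower, clock higher" argument in one line.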
> Well, I remember another manufacturer making a wrong bet 2-3 years before a console launched that resulted in a much bigger mistake than this one would be...
> *cough* 8GB of DDR3 *cough*
> I remember people saying "Why didn't they just go with GDDR5?" Yea, after the fact I am sure they would have gone with it, not in 2011, when it looked like there was no chance that high enough density GDDR5 would be on the market in time for launch.
> But yea, IMO we are talking about thin margins. Perhaps Navi was supposed to be ever so slightly better. Maybe the frequency sweet spot target was not 1750MHz but 1900MHz, enough to turn the entire strategy on its head. We know Navi was late, we know AMD was strapped for cash back in 2017 and 2018, and I don't think it would be out of the ordinary if they just slightly missed the target - rendering the hypothetical PS5 design "illogical" when it was anything but that a few years ago.

Well, at least 2000MHz does look possible right now, and Sony seems to be trying to reach it (according to the Taiwanese dude).
They should call the devkit.... RT kit?
> The camera is used by streamers playing games on PS4.

Oh, okay. Well that's something else really. Camera as an interface is dead. For streaming, sure, but that's not why it was invented and a far simpler solution could be used there, I'm sure. It certainly wasn't designed for streaming but for camera vision and object tracking. A streaming camera may be a thing.
> I don't think there is that much effort necessary here. But DXR feels too high level and abstract, so they could get some advantage here.

If Sony have any sense, they'll be using higher-level APIs to ensure future BC isn't going to hamper them.
> 18gbps can't happen, the yield isn't there yet to even have it in sampling quantity. Once it does, the high end bin volume becomes tied to the worldwide demand for GDDR6. There are expensive GPUs where they'll pay whatever it costs to get those 18gbps parts. So I don't see how a console selling 20 million per year can procure those chips, that would be a big chunk of the production and someone else would have to buy the lower speed bins that are in the middle of the bell curve.

You are probably right. But they didn't use 18gbps chips according to the GitHub leak: 16gbps for Oberon A0 and around 17gbps (maybe a bit more) for Oberon B0.
I think 14 is the most logical, and 16 might be possible.
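(For reference, the bandwidth arithmetic those speed bins imply, assuming a 256-bit bus - the bus width is my assumption for illustration, not something confirmed by the leak.)

```python
# GDDR6 bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
# The 256-bit bus here is an assumption for illustration, not a confirmed spec.
BUS_WIDTH_BITS = 256

for gbps in (14, 16, 17, 18):
    print(f"{gbps} Gbps on a {BUS_WIDTH_BITS}-bit bus -> {gbps * BUS_WIDTH_BITS / 8:.0f} GB/s")
# 14 -> 448 GB/s, 16 -> 512 GB/s, 17 -> 544 GB/s, 18 -> 576 GB/s
```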
> Well, I remember another manufacturer making a wrong bet 2-3 years before a console launched that resulted in a much bigger mistake than this one would be...
> *cough* 8GB of DDR3 *cough* ...

The issue with the XBO design was relative to PS4 in terms of price to GPU performance. This isn't the same thing...
If some random forum poster says 13TF with nothing to back it up, it's treated as straight-up true. If there are test docs released straight from AMD, and DF believes it is PS5, we fight to hell and back trying to disprove it - the reason being that it isn't what we want it to be.
> You are probably right. But they didn't use 18gbps chips according to the GitHub leak: 16gbps for Oberon A0 and around 17gbps (maybe a bit more) for Oberon B0.

Maybe those clocks are for PS Now?
> You are probably right. But they didn't use 18gbps chips according to the GitHub leak...

And if they feel they need 18gbps for launch and they are going to be able to acquire sufficient quantity, they'll use that.
> The issue with the XBO design was relative to PS4 in terms of price to GPU performance. This isn't the same thing...

Pretty simple actually: MS's design error, as well as the GPU deficit, stems from choosing DDR3 as the main memory due to the 8GB requirement. Back in 2011, nothing indicated that 8GB of GDDR5 would be ready for 2013; even in 2012 Samsung couldn't deliver it. When they opted for slower memory, they also chose to dedicate a big chunk of an already big die to ESRAM, pretty much handing the GPU advantage to Sony on a silver platter - likely thinking Sony would have to opt for 4GB in that case, so it wasn't lose-lose.
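(Quick refresher on the main-memory numbers behind that trade-off - these are the shipped 2013 specs, leaving the ESRAM's own bandwidth out of it.)

```python
# Main-memory bandwidth of the 2013 consoles (both on a 256-bit bus).
def bandwidth_gbs(data_rate_gbps_per_pin, bus_bits=256):
    return data_rate_gbps_per_pin * bus_bits / 8

print(f"XBO, DDR3-2133    : {bandwidth_gbs(2.133):.1f} GB/s (plus the 32MB ESRAM pool on-die)")
print(f"PS4, 5.5Gbps GDDR5: {bandwidth_gbs(5.5):.1f} GB/s")
```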
These rumors put the PS5 outside of the performance sweet spot. Very few people would be saying much if the clock wasn't raising questions about how to keep the system cool.
> You are probably right. But they didn't use 18gbps chips according to the GitHub leak...

So, they either overclocked 16Gbps chips or actually went with downclocked 18Gbps chips (as rumored by the PCB dev kit leak from May and the Flute benchmark from July)?
> And if they feel they need 18gbps for launch and they are going to be able to acquire sufficient quantity, they'll use that.

Mm... there's probably some consideration needed for the tracing & memory bus in supporting higher frequencies. So it depends on whether they went overboard on the mainboard.
There are no design considerations or fabrication changes needed.
This GitHub leak is the most detailed leak we've ever gotten for consoles.
That would follow. But who knows, maybe they'll bundle a massive motion-capture camera with it.
On a more serious note, I think it more likely that Sony would include PSVR/PSVR2 hardware in the main console than Microsoft, so that would be an additional cost that Microsoft may skip. I still think VR is cool tech, but it's definitely not taken off as well as I thought it would. Sony may decide it's time to move on, or - like the camera, which they've never given up on - they may just keep iterating. As long as it's not costing them, and the majority of the R&D is already done, why not.
> This GitHub leak is the most detailed leak we've ever gotten for consoles.

Better than this one?
> Pretty simple actually: MS's design error, as well as the GPU deficit, stems from choosing DDR3 as the main memory due to the 8GB requirement...

Again, it's not the same thing, because there was nothing fundamentally questionable about DDR3 memory working in a mass market device...
What would be the picture if Samsung couldn't deliver 8GB of GDDR5 for Sony in 2013? Perhaps 4GB would have been an even bigger difference than 40% more TF.
In the end, a similar situation could have happened here. When Navi and PS5 were in the design phase, maybe internal targets pointed at a higher frequency sweet spot, where a Navi GPU with 36CUs at 2.0GHz made a ton of sense.
In that case, not only would Sony get the most out of their silicon, they would also have matched their own BC method requirement and likely stayed around the magical 200W limit. Back then, going with 60CUs (with 4-6 deactivated) on a 7nm die perhaps looked wrong. You would have to clock your console lower than you would with a 36CU part, thus getting less bang for the buck out of your chip, and it would still probably emit more heat and pull more watts from the wall, rendering it too much for a console. On top of that, you would have a 15-20% bigger die on an expensive process.
If 36CUs at 1.8 - 2.0GHz turns out to be true, we should look into what triggered this decision back in 2017, and not why it doesn't make as much sense now as a 56CU part at 1.7GHz does.
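(Putting numbers on that 36CU-vs-56CU comparison, using the standard peak-FP32 formula for GCN/RDNA-style CUs. The CU counts and clocks are just the rumored figures being discussed in this thread, not confirmed specs.)

```python
def fp32_tflops(cus, clock_ghz, lanes_per_cu=64, ops_per_lane=2):
    # Peak FP32 = CUs x 64 shader lanes x 2 ops per clock (FMA) x clock.
    return cus * lanes_per_cu * ops_per_lane * clock_ghz / 1000

for cus, clk in [(36, 1.8), (36, 2.0), (56, 1.7)]:
    print(f"{cus} CU @ {clk} GHz -> {fp32_tflops(cus, clk):.2f} TF")
# 36 CU @ 1.8 GHz -> 8.29 TF, 36 CU @ 2.0 GHz -> 9.22 TF, 56 CU @ 1.7 GHz -> 12.19 TF
```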
> Again, it's not the same thing, because there was nothing fundamentally questionable about DDR3 memory working in a mass market device...

The heat associated with the rumored clocks is a problem NOW, but perhaps wasn't meant to be one when it was designed. That is the point.
That's not the situation here; the heat associated with these rumored clocks could result in a lot of problems. What many of us are wondering wrt these rumors is whether you can clock a console that high without issues.
Relative performance aside, even in isolation this decision looks odd.
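(On the "can you clock a console that high" question, here's a toy sketch of why the last few hundred MHz get expensive: dynamic power scales roughly with frequency times voltage squared, and voltage has to climb to reach higher clocks. The voltage/frequency points below are invented placeholders purely to show the shape of the curve, not real Navi numbers.)

```python
# Toy dynamic-power model: P ~ C x V^2 x f (capacitance folded into the constant).
# The voltage needed at each clock is an invented, plausible-looking V/F curve,
# NOT real silicon data - it only illustrates why the top of the curve is costly.
vf_curve = [(1.6, 0.95), (1.8, 1.05), (2.0, 1.20)]  # (GHz, volts) - hypothetical

base_clk, base_v = vf_curve[0]
base_power = base_v**2 * base_clk

for clk, volts in vf_curve:
    rel_perf = clk / base_clk
    rel_power = (volts**2 * clk) / base_power
    print(f"{clk} GHz @ {volts:.2f} V -> ~{rel_perf:.2f}x clock for ~{rel_power:.2f}x dynamic power")
```

Under these made-up numbers, the jump from 1.6GHz to 2.0GHz buys ~25% more clock for roughly double the GPU's dynamic power, which is exactly why the cooling question keeps coming up.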