Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

I just want to clarify for people who don't have the GitHub docs: Oberon is almost certainly the chip going into production, and 2.0GHz is most definitely not a "Pro boost". It clearly says "native", many times at that.

Given that it is a regression test, the only question is whether this is in fact 36CU native, or whether the version before it was 36CU native and this one might not be. I am almost certain it's 36CU, as the native values of the other chips do match their actual hardware, but let's see. One thing is clear: it was not a stress or Pro-boost test.
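For reference, the arithmetic behind the TFLOPs figures in this thread: peak FP32 throughput is CUs × 64 shader lanes × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch, assuming the leaked 36 CUs and the 2.0GHz "native" clock are both real:

#include <stdio.h>

/* Peak FP32 throughput: CUs x 64 lanes x 2 FLOPs per clock (FMA) x clock. */
int main(void) {
    int cus = 36;        /* assumption: the 36CU figure from the GitHub leak */
    double ghz = 2.0;    /* assumption: the "native" clock from the Oberon test */
    double tflops = cus * 64 * 2 * ghz / 1000.0;  /* GFLOPs -> TFLOPs */
    printf("%d CUs @ %.2f GHz = %.3f TFLOPs\n", cus, ghz, tflops);
    return 0;
}

That lands at 9.216 TFLOPs, which is also why the 8-9 TF estimates further down the thread assume clocks below 2.0GHz.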

It's all a ruse by Sony; they would have hidden their final unit test data in more ways than just one.
 
Someone enlighten me. I don’t have a ton of knowledge when it comes to taking a chip from the drawing board to final hardware in full production.

But where does this belief come from that Sony is distributing dev kits to developers with early revisions of their APU 16-18 months before launch? An A0 part at that? Unless AMD got extremely lucky, how would such a part see the light of day outside of AMD? No need for base-layer revisions or even a metal-layer revision?

“This shit so good we are going to chunk these hoes into a dev kit!!!”

“Ohh noo...wait we need about another 4 major revisions!!!”

None of this sounds realistic to me.

My understanding (really simplistic, owing to time and motivation constraints)...

Manufacture A0 chips. Do a ton of functional and operational testing to discover defects or errors. Frontend issues, or frontend and backend issues? Major revision (new base layers). Backend issues only? Minor revision (metal layers only). Rinse and repeat until you get an acceptable version of the chip.

Happy? Widen production volume and send chips to device vendors, where additional testing is performed: first on prototype boards, then inside the chosen form factors.

Happy still? Initiate mass production of chips.

Why would a software dev want to deal with early-revision chips and try to develop software on unstable hardware? Is it my software's fault, the SDK's, or the hardware's? That's a question I doubt any dev wants to continually ask during the software development process.
 
Back in early 2012 they were supposedly using equivalent PC parts in the meantime while waiting for the real SoC, and I knew people working at a major studio who got their real devkit with 8GB around December. No idea if that timing was early or late.

A revision doesn't mean the chip doesn't work, nor that it's unstable; it could be power related, max-clock related, some errata being handled by the API or compiler or microcode, etc... They would at least have the APIs ready to develop the game, and the final optimization can wait for the final specs and silicon.

I don't know much about how the chips are designed and revised. In the past I used early MCUs with pages after pages of errata on the silicon, but unless you do pure assembly, the compiler takes care of most of it. Some errata need workarounds by the dev, but as long as it's documented it just adds some slower temporary code which will be removed once that bug is fixed. I assume it's similar for game consoles, just with a lot more errata.
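For anyone unfamiliar with what an errata workaround looks like in code, here's a minimal sketch. Everything in it (the revision macros, the flush function, erratum #123) is made up for illustration; the point is just the pattern of documented temporary code that gets removed once the silicon is fixed:

/* Hypothetical erratum #123: a cache-flush operation can be dropped under
   load on silicon before rev B0, so the documented workaround is to issue
   it twice. CHIP_REV, REV_*, and flush_gpu_cache() are invented names. */
#define REV_A0 0x00
#define REV_B0 0x10
#define CHIP_REV REV_A0          /* set per-target by the build system */

static void flush_gpu_cache(void) { /* ...hardware register write... */ }

void safe_flush(void) {
#if CHIP_REV < REV_B0
    flush_gpu_cache();           /* workaround: extra flush, slightly slower */
#endif
    flush_gpu_cache();           /* the normal flush, kept on all revisions */
}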
 

I see where u are coming from. But at least 4 major revisions before a mass-produced part? And AMD is still willing to produce workarounds for errata just for some early dev kits?

Seems like the limited scale of these dev kits doesn't warrant the effort.
 
Well, it's impossible to know what the revisions solve or improve without the documentation. It could be anything. Does the naming of the revisions even mean they all existed sequentially?


I would be worried if they can't best Stadia.
In real-world benchmarks I suppose they only need a 7TF RDNA1 part or something.
 
My prediction: if it's 36 CUs, which seems to be the most solid leak, I'm expecting 8-9 TFLOPs.
I always felt 10 TFLOPs was the dream but wasn't going to be reached.

More than happy that we're getting fast storage (wasn't expecting that), and hopefully proper 3D audio that will actually be used.

The CPU will be a major improvement.
 
Here are mine for PS5:

SoC (N7, later to be implemented as N6):
- 8x Zen2 cores with 8MB L3 @ 3.2GHz
- Total 40 CUs / 160 TMUs / 64 ROPs at 2.15GHz for 11 TFLOPs
- 16GB GDDR6 18Gbps using 256bit for 576GB/s
- Embedded custom RT - not the one used by Microsoft and future Radeon GPUs (which should be the same)

Note: at TSMC's current N7 defect density of roughly 0.09 defects/cm² ("9%"), the yield for 100% functional chips at 320mm^2 works out to about 75-76% (see the sketch below). For a chip that has been through many revisions aimed at yield, this number might be significantly higher.
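For anyone who wants to check that number: reading the defect density as D0 = 0.09 defects/cm², both the simple Poisson yield model and Murphy's model land at roughly the quoted figure. A sketch, with the caveat that TSMC's real yield models are more involved:

#include <math.h>
#include <stdio.h>

int main(void) {
    double area = 3.20;      /* die area in cm^2 (320 mm^2) */
    double d0   = 0.09;      /* defects per cm^2, as read from the post */
    double ad   = area * d0;
    double poisson = exp(-ad);                      /* Y = e^(-A*D0), ~75.0% */
    double murphy  = pow((1.0 - exp(-ad)) / ad, 2); /* Murphy's model, ~75.5% */
    printf("Poisson yield: %.1f%%\n", poisson * 100.0);
    printf("Murphy yield:  %.1f%%\n", murphy * 100.0);
    return 0;
}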



Southbridge + Storage Controller (16FF+):
- Connected to the main SoC through a custom IF link or just a PCIe 4.0 x24 connection of 48GB/s.
- 2x ARM Cortex-A55 cores @ 1.5GHz with dedicated 4GB DDR4 3200MT/s (12.8GB/s) for non-critical OS tasks, hibernated apps (Netflix, Spotify, etc.) and active standby features.
- Cluster of custom RISC processors for storage management, with a dedicated 2x4GB DDR4 3200MT/s (25.6GB/s) as a storage buffer
- PCIe 4.0 x8 NVMe for embedded 512GB - 1TB NAND.
- Slot for an M.2 2280 NVMe x4 drive for cheap expansion beyond the embedded NAND.

- Dual WiFi controller: one WiFi AX 2.4/5GHz for Internet, one 60GHz WiFi AD for untethered PSVR2
- Bluetooth 5.1 with compatibility for standard wireless headphones via aptX Low Latency
- 2x USB-C on the front
- 2x USB-C + HDMI 2.1 + PS camera input + Ethernet + optical on the back.
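The bandwidth figures in this prediction are easy to sanity-check. A sketch using the standard formulas; note the 24-lane PCIe 4.0 width is the post's assumption, not a known spec:

#include <stdio.h>

int main(void) {
    /* GDDR6: per-pin rate (Gbps) x bus width (bits) / 8 bits-per-byte = GB/s */
    double gddr6 = 18.0 * 256.0 / 8.0;               /* 576 GB/s */

    /* PCIe 4.0: 16 GT/s per lane, 128b/130b encoding, per direction */
    double per_lane = 16.0 * (128.0 / 130.0) / 8.0;  /* ~1.97 GB/s */
    double x24_link = 24.0 * per_lane;               /* ~47.3 GB/s */

    printf("GDDR6 256-bit @ 18 Gbps: %.0f GB/s\n", gddr6);
    printf("PCIe 4.0 x24 link:       %.1f GB/s\n", x24_link);
    return 0;
}

Both come out where the post says: 576GB/s for the memory, and ~47GB/s (rounded to 48) for the interconnect.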
Very probable your prediction comes true... the final clock is probably going to be evaluated now and over the next couple of months, based on what silicon comes out of the foundries in quantity... I don't believe in >2.0GHz, so 10.2 TF should be the number...
 

The alternative is PC parts with equivalent compute power but a completely different architecture. How is that any more representative of their use case than early silicon with a few non-functioning features?

Also, with the advent of chiplets and multi-die packages, it’s easier to challenge the convention of a traditional dev kit. What’s to prevent them from constructing a custom APU that has vanilla Zen 2 and Navi 10 dies on it?
 
Didn't that BIOS also jump the clock from the 15xx range to the 17xx range? It wasn't just memory bandwidth that was increased. This was talked about in the AMD PC thread.

Edit: Yes it was, here's Pharma's post on it... speed went from 1560 to 1700.

AMD reconfigures Radeon RX 5600 XT
Media worldwide is hard at work on their 5600 XT reviews, and yes, yesterday the media samples required new firmware; I can confirm this personally. While we cannot confirm or say what triggered AMD to do that, we can confirm that there are significant changes in the firmware configuration.
...
The new OC models will get a board power allowance of 180 Watts in total; the TGP (GPU power) is increased from 150 towards 160 Watts. Add it all together and that's a valid tweak, as the boost clock just jumped from the advertised 1560 MHz towards, wait for it ... 1700 MHz.

In the end, the consumer wins here. For AMD, I am not sure how this will pan out; they are now actively cannibalizing their own Radeon RX 5700 (non-XT). Next week you can expect some reviews. But currently, we have to re-do all reviews from scratch, as this reconfiguration changes everything: power measurements, thermals, and performance.
https://www.guru3d.com/news-story/n...icing-amd-reconfigures-radeon-rx-5600-xt.html
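For context on the cannibalization point: the 5600 XT and the vanilla 5700 are both 36-CU Navi 10 parts, so the new boost clock puts the 5600 XT's peak throughput right on the 5700's heels. A rough peak-FLOPs comparison (real-world performance also depends on memory bandwidth and sustained clocks, which this ignores):

#include <stdio.h>

/* Peak FP32 TFLOPs: CUs x 64 lanes x 2 FLOPs per clock x MHz / 1e6 */
static double tflops(int cus, double mhz) {
    return cus * 64 * 2 * mhz / 1e6;
}

int main(void) {
    printf("5600 XT, old boost: %.2f TF\n", tflops(36, 1560)); /* ~7.19 */
    printf("5600 XT, new boost: %.2f TF\n", tflops(36, 1700)); /* ~7.83 */
    printf("5700, boost:        %.2f TF\n", tflops(36, 1725)); /* ~7.95 */
    return 0;
}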
 
So how realistic is this reddit leak everyone's been talking about lately? Seems too good to be true, as if someone's having a last hurrah with the leak before the actual reveal.
Have you considered a name change to ultraoptimist? :???:
 