Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

I heard people like to post 4chan leaks here; hey, I found a new one. The opposing one :LOL:.

[Image: ErDddC2.jpg]
 
As for the speed of the memory modules, let's just say the PS4 used downclocked 6Gbps chips (5.5Gbps) in 2013, when Nvidia's top-of-the-range Ti GPUs used 6Gbps. The PS4 Pro, on the other hand, used downclocked 7Gbps chips (6.8Gbps) in 2016, when binned chips were 7-8Gbps max.
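To put those downclocks in bandwidth terms, a quick back-of-the-envelope sketch (assuming the 256-bit buses both consoles actually shipped with; the helper name is just for illustration):

[code]
# Rough GDDR bandwidth from per-pin rate and bus width:
# bandwidth (GB/s) = pin rate (Gbps) * bus width (bits) / 8
def gddr_bandwidth_gbs(pin_rate_gbps, bus_width_bits):
    return pin_rate_gbps * bus_width_bits / 8

print(gddr_bandwidth_gbs(5.5, 256))  # 176.0 GB/s -> the PS4's official spec
print(gddr_bandwidth_gbs(6.8, 256))  # 217.6 GB/s -> the PS4 Pro's official spec
[/code]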

Early leaks of the PS4 specs had the memory clocks at 6Gbps, but that later dropped. Probably not coincidentally around the time of the change to clamshell mode memory - I've seen memory chip spec sheets that describe a drop in memory clocks required for stable clamshell mode operation. Pretty sure some of the nerds here have talked about it in the past too. ;) Then you're stuck with those clocks even when larger memory chips mean you no longer actually need clamshell.

I wonder if some of the development kits are using clamshell mode memory, or if Sony / MS are planning on using it early on to hit required quantities with sufficiently narrow buses (e.g. 256 bit bus with 8Gbit chips in clamshell for 16GB).
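To make that clamshell example concrete, a minimal sketch (assuming the usual 32-bit-per-chip GDDR interface, with clamshell running pairs of chips in x16 mode on a shared channel; the function name is made up):

[code]
# Chips required on a given bus, and the resulting capacity, with and without clamshell.
# Each GDDR6 chip exposes a 32-bit interface; in clamshell two chips share one
# 32-bit channel (x16 each), doubling chip count and capacity but not bandwidth.
def memory_config(bus_width_bits, chip_density_gbit, clamshell=False):
    chips = (bus_width_bits // 32) * (2 if clamshell else 1)
    capacity_gb = chips * chip_density_gbit // 8
    return chips, capacity_gb

print(memory_config(256, 8, clamshell=True))    # (16, 16) -> 16GB from 8Gbit chips
print(memory_config(256, 16, clamshell=False))  # (8, 16)  -> 16GB from half as many 16Gbit chips
[/code]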

Sony and MS need huge quantities in steady supply by mid 2020. We're talking tens of millions of chips per month by something like July - holding out for 16 Gbit chips in quantities that dwarf high end graphics cards might mean missing the launch window.
 
... the penalty from clamshell mode on the PS4 was 6 Gbps -> 5.5 Gbps.

If the same proportion of frequency loss was maintained for GDDR6, then 18 Gbps -> 16.5 Gbps.

If 16 Gb / 2 GB chips are not available in the vast quantities needed, then you simply have to go clamshell on the fastest 8 Gb chips you can get and take the frequency hit. :runaway:
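Spelling that out (a sketch that just applies the PS4's observed derating ratio to an 18 Gbps part, which is the poster's assumption rather than anything from a datasheet):

[code]
clamshell_ratio = 5.5 / 6.0            # the PS4's observed clamshell derating, ~0.917
print(round(18 * clamshell_ratio, 2))  # 16.5 Gbps
print(16.5 * 256 / 8)                  # 528.0 GB/s on a 256-bit bus
[/code]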
 
... the penalty from clamshell mode on the PS4 was 6 Gbps -> 5.5 Gbps.

If the same proportion of frequency loss was maintained for GDDR6, then 18 Gbps -> 16.5 Gbps.

If 16 Gb / 2 GB chips are not available in the vast quantities needed, then you simply have to go clamshell on the fastest 8 Gb chips you can get and take the frequency hit. :runaway:
May I introduce you to...


16 Samsung K4ZAF325BM-HC18 in clamshell configuration

Or this one that leaked a month later...

[Image: AMD-Flute-semicustom-APU-Zen-2-konzole-UserBenchmark-2__01.png]

Wow, would you look at that... 16 chips, 16GB, 528GB/s BW. And 13F9 (Oberon) to boot!
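Those numbers hang together if you run the arithmetic backwards (a quick sanity check assuming a 256-bit bus; note the listed K4ZAF325BM parts are 16Gbit / 2GB each, which is where the 32GB-vs-16GB wrinkle discussed below comes from):

[code]
bandwidth_gbs = 528
bus_width_bits = 256
print(bandwidth_gbs * 8 / bus_width_bits)  # 16.5 Gbps per pin -- the clamshell-derated 18 Gbps figure
print(16 * 2)                              # 32 GB if all 16 of those 2 GB chips are populated
[/code]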
 
This rumor has several issues. First, why would they post all that technical detail, but no part number from the SoC?

Second, the PS5 has a custom SSD controller without the need for dedicated DRAM, per Sony patents.
Perhaps because that way they could track you easily? Leaving it out means you can always say it was a guess.

That's a patent. We don't know what the PS5 has in dev kits, or in the retail console for that matter.

That first leak is interesting because it fits really well with the timeframe of the V dev kit release. When this was leaked (late May 2019) people said it couldn't be true because it was too early for an APU dev kit release, and yet in early summer we got confirmation that the actual patented V-style dev kits had been released.

It also fits with the Flute benchmark, which to all intents and purposes contains an Oberon chip, a 256-bit bus and 16GB of GDDR6 RAM (at a bandwidth which only >16Gbps chips can provide).

It fits really well. It doesn't have to be true, of course, but for me it is interesting because it came before the V dev kits, and it specified >16Gbps chips and 16GB of RAM before we got legit leaks confirming it (Flute and Git) + the die size would fit like a glove for a 36CU chip.
 
The other quirk around GDDR6 is the extensive articles/PR from Samsung over two years ago (January 2018) saying that 18Gbps chips had entered mass production, yet they still don't seem to exist as such. No corrections were ever made to the articles that I'm aware of.

Clearly something went wrong there.
 
Perhaps because that way they could track you easily? Leaving it out means you can always say it was a guess.

That's a patent. We don't know what the PS5 has in dev kits, or in the retail console for that matter.

That first leak is interesting because it fits really well with the timeframe of the V dev kit release. When this was leaked (late May 2019) people said it couldn't be true because it was too early for an APU dev kit release, and yet in early summer we got confirmation that the actual patented V-style dev kits had been released.

It also fits with the Flute benchmark, which to all intents and purposes contains an Oberon chip, a 256-bit bus and 16GB of GDDR6 RAM (at a bandwidth which only >16Gbps chips can provide).

It fits really well. It doesn't have to be true, of course, but for me it is interesting because it came before the V dev kits, and it specified >16Gbps chips and 16GB of RAM before we got legit leaks confirming it (Flute and Git) + the die size would fit like a glove for a 36CU chip.

I don't think that logic holds up. They went through the painstaking rigor of listing exact part numbers for all the major components on the board except the SoC, and you don't think that would be enough identifying information to pinpoint the source?

They would have enough knowledge to know the SoC is a part number and not a uniquely identifying serial code, and as @bitsandbytes points out, 32GB doesn’t match the Flute benchmark. I think this is a case of confirmation bias and only looking at the corroborating info.
 
So OsirisBlack created an account on ResetEra. The mods noticed and told him to check his PMs. The end result is that the account was banned. The message to the posters in their speculation thread was:

"Let's not concern ourselves any further and get back to the fun speculation"
So does that mean they failed their verification process and everyone should place appropriate levels of credibility on everything he presented on GAF?


He lost (any?) credibility with me when he claimed a mid-February PS5 reveal, and that obviously didn't happen.

Anyways, he was never much of a source; I've also read he was wrong about the PS4 Pro specs a long time ago (I'm sure he said it was much more powerful than it turned out to be, of course).

Anyways, this wait for the PS5 is Chinese water torture. I almost have fantasies of the day I groggily wake up, browse to ResetEra, and there's a thread about a big, credible breakthrough exposing the PS5 specs.
 
Is it possible to make an educated guess about the cost difference between Oberon and Arden?
What bothers me: Oberon at 2 GHz may have worse yields than Arden, ending up at a similar cost. And if so, might Sony have decided to go wide too?
AMD would only need to... 'scratch the Xbox logo off Arden', so they could deliver in time? (sort of serious sarcasm, this time.)
 
I don't think that logic holds up. They went through the painstaking rigor of listing exact part numbers for all the major components on the board except the SoC, and you don't think that would be enough identifying information to pinpoint the source?

They would have enough knowledge to know the SoC is a part number and not a uniquely identifying serial code, and as @bitsandbytes points out, 32GB doesn’t match the Flute benchmark. I think this is a case of confirmation bias and only looking at the corroborating info.
Well, yeah, but all the listed parts are publicly available; the SoC ID is not.

As I already said, that leak doesn't really change much anyway since we got Flute; it's just an interesting occurrence considering when it was leaked. Perhaps the Flute benchmark ran in actual console mode and the dev kit still has 32GB, although, as I already said, it makes little difference given that Flute corroborates the Oberon tests anyway.

Is it possible to make an educated guess about the cost difference between Oberon and Arden?
What bothers me: Oberon at 2 GHz may have worse yields than Arden, ending up at a similar cost. And if so, might Sony have decided to go wide too?
AMD would only need to... 'scratch the Xbox logo off Arden', so they could deliver in time? (sort of serious sarcasm, this time.)
What bothers me about 2.0GHz is: why would MS go for 1.7GHz when they could get much more out of their SoC with 2.0GHz (even with a lower CU count)? Either Sony aimed for very high clocks from the get-go, thinking die costs at 7nm would not allow MS to go higher than the low 40CUs, and therefore the bunch of chip steppings are there to make sure it runs as high as it can. Or it's not 2.0GHz...
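For a feel of the clocks-vs-CUs trade-off being argued here, a purely illustrative FLOPS comparison using the numbers floating around this thread (assumes the standard 64 ALUs per CU and 2 ops per clock for FMA; none of these configurations are confirmed):

[code]
# Peak FP32 = CUs * 64 ALUs * 2 ops (FMA) * clock (GHz), in TFLOPS
def teraflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(teraflops(36, 2.0))  # ~9.2 TF  -- narrow and fast (rumored Oberon-style config)
print(teraflops(56, 1.7))  # ~12.2 TF -- wide and slower (rumored Arden-style config)
[/code]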

There is no publicly available official data on 7nm die costs, but AMD did say the price per yielded mm2 on 7nm is almost twice that of 16nm for a 250mm2 chip.

I already speculated that Sony might have been in a similar situation due to the fact that a 40CU RDNA GPU will already be bigger than the GPU portion of the PS4 or PS4 Pro, ignoring the additional hardware logic associated with RT/VRS. From that point of view, 40CUs on the die make a lot of sense, and so do high clocks, which they may have needed to offset any MS advantage in CU count (which I don't think Sony ever believed could be 56CUs). Judging by AMD's data on 7nm process costs, a 40CU APU would be at the very least 50% more expensive than the 16nm one in the Pro, so exploiting the die as much as possible and clocking it to high heavens perhaps made sense at the time.
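To put a rough number on that 50% claim, a sketch of the scaling involved (the ~2x cost per yielded mm2 is AMD's public 7nm-vs-16nm figure; both die areas below are placeholder guesses, not measured values):

[code]
cost_per_mm2_ratio = 2.0     # 7nm vs 16nm cost per yielded mm2, per AMD
area_16nm_pro_mm2 = 320      # assumed PS4 Pro-class APU area (placeholder)
area_7nm_mm2 = 250           # assumed ~40CU next-gen APU area (placeholder)

relative_cost = (area_7nm_mm2 / area_16nm_pro_mm2) * cost_per_mm2_ratio
print(round(relative_cost, 2))  # ~1.56x -> in the "at least 50% more expensive" ballpark
[/code]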

But this is baseless speculation; not long now till we know for sure what they picked.
 
Either Sony aimed for very high clocks from the get-go, thinking die costs at 7nm would not allow MS to go higher than the low 40CUs, and therefore the bunch of chip steppings are there to make sure it runs as high as it can. Or it's not 2.0GHz...

Sony could have simply guessed wrong. At the time the decisions about the design had to be made, a lot of these variables were unknown. Last time, Sony gambled on the 8 GB of RAM and it paid off. This time, maybe their decisions won't pay off. I don't know.

I bet Sony didn’t think MS would allow the Xbox division to build such a large chip for XSX. All indications were that hardware wasn’t that important to MS.
 