Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status: Not open for further replies.
It's reasonable enough to explore. Cerny said they have hardware accelerated ray tracing.

I assume such solutions are meant to speed up some stage of their movie rendering pipeline. I have a hard time seeing how such a specific and most likely staged pipeline job matches the complex task of rendering game frames.

I'm sure nobody here expects consoles to run on movie render engines.
 
I guess so.

It'd just be 12Gbps GDDR6 on a 384-bit bus then. The 12Gbps bin might not be in Samsung's catalogue for the 16Gbit chips, but that doesn't mean they can't just clock them at that speed (especially since 12Gbps is an option for the 8Gbit bins).
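
For anyone checking the maths: GDDR6 peak bandwidth is just bus width in bytes times per-pin data rate. A minimal sketch in Python (the bus/speed combinations are the ones floated in this thread, not confirmed specs); note that all three land on the same 576GB/s:

```python
def gddr6_bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak bandwidth in GB/s: bytes moved per pin-clock times the data rate."""
    return bus_width_bits / 8 * pin_speed_gbps

print(gddr6_bandwidth_gbs(384, 12))  # 576.0 -- 12 x 32-bit chips at 12Gbps
print(gddr6_bandwidth_gbs(288, 16))  # 576.0 -- 9 chips at 16Gbps
print(gddr6_bandwidth_gbs(256, 18))  # 576.0 -- 8 chips at 18Gbps
```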

Anyway, a 384-bit bus means a larger chip.
It would also need 12Gb-density chips (for 18GB on 12 chips), which Samsung doesn't have either.
288 bit at 16Gbps is exactly 576GB/s, and 18GB with 16Gb chips. Is there anything in the architecture preventing 288bit with 9 chips? We are already supposing 320bit for XBSX anyway, so it might not need to be either 256 or 384.

Even if there's a requirement to have 2 channels per memory controller (wasn't GCN 64bit per controller or something?), GDDR6 has two channels per chip, so the granularity would no longer need pairs of chips.
 
288 bit at 16Gbps is exactly 576GB/s. Is there anything in the architecture preventing 288bit with 9 chips? We are already supposing 320bit for XBSX anyway, so it might not need to be either 256 or 384.

Even if there's a requirement to have 2 channels per memory controller (wasn't GCN 64bit per controller or something?), GDDR6 has two channels per chip, so the granularity would no longer need pairs of chips.
From the whitepaper:

Each memory controller drives two 32-bit GDDR6 DRAMs

I suppose you could just leave half of an MC disabled, but that would be a waste of space, and would be... odd.

Four slices of the L2 cache are associated with each 64-bit memory controller to absorb and reduce traffic. The cache is 16-way set-associative and has been enhanced with larger 128-byte cache lines to match the typical wave32 memory request.

Not sure if the connection to L2 has other effects as well.
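
To put numbers on why a half-disabled MC is odd, here's a rough sketch of the arithmetic for the hypothetical 288-bit configuration, assuming the whitepaper figures (32-bit chips, 64-bit MCs driving two chips each, four L2 slices per MC) carry over to the console parts:

```python
BUS_BITS = 288
chips = BUS_BITS // 32       # 9 GDDR6 chips at 32 bits each
mcs = BUS_BITS / 64          # 4.5 -- the awkward half-disabled controller
l2_slices = mcs * 4          # 18 L2 slices, half an MC's worth orphaned
print(chips, mcs, l2_slices) # 9 4.5 18.0
```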
 
Some confusion could come from the system having 16GB GDDR6 18Gbps + 2GB DDR4, which would be compatible with the reddit devkit PCB leak, which mentioned:

- 16*2 clamshell GDDR6 named "HC18" from Samsung (there's an HC14 on their website pointing to 14Gbps memory, so HC18 would mean 18Gbps) -> 576GB/s
- 3 * 2GB DDR4 2400MT/s chips, two of which are "close to the NAND", acting as storage cache
- Meaning a single 2GB chip is directly accessible by the SoC.

That would mean the SoC in a diagnostics tool would see its total memory as 16+2 = 18GB, even if the devs can't ever access the 2GB DDR4 (nor the whole 16GB GDDR6 for that matter).
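
Putting the leak's numbers together (a sketch; I'm reading "16*2 clamshell" as 16 chips paired two per 32-bit channel, and all of these are unverified leak figures):

```python
gddr6_chips = 16
chip_capacity_gb = 1                   # 8Gbit chips, inferred from the 16GB total
bus_bits = gddr6_chips // 2 * 32       # clamshell: two chips share each 32-bit channel
bandwidth_gbs = bus_bits / 8 * 18      # "HC18" read as 18Gbps
soc_visible_gb = gddr6_chips * chip_capacity_gb + 2  # plus the one SoC-facing 2GB DDR4 chip
print(bus_bits, bandwidth_gbs, soc_visible_gb)       # 256 576.0 18
```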
 
That "leak" is basically "make everything looks a bit better than XSX" and that's it.
What leak?!
The PCB leak I mentioned, which claims the PS5 is using a 316mm^2 die, meaning it's significantly smaller than the SeriesX's SoC?
 
See the follow up. I'm fine with speculating. I'm not fine with speculation that is deliberate trolling as opposed to speculation that is based on a genuine attempt to construct a reasonable result.
To be fair, this is a shit-posting thread. Proper consideration not based on wild rumours can and should be conducted in the Prediction thread. There's a difference between 'rumours' and 'baseless rumours', and robust rumours contributing to prediction should be held there rather than here. This thread has sadly become too much of a monster to moderate effectively unless one of us has a couple hours free...

That actually is a bit of a credit to B3D: even in the face of stupid, baseless rumours, a good part of the discussion tries to be technical and analytical! ;)
 
288 bit at 16Gbps is exactly 576GB/s, and 18GB with 16Gb chips. Is there anything in the architecture preventing 288bit with 9 chips? We are already supposing 320bit for XBSX anyway, so it might not need to be either 256 or 384.

Out of all the GPUs from Nvidia and AMD across history, only the 1080 Ti and 2080 Ti have a bus width that's not a multiple of 64 bits (plus some GPUs with only a 32-bit bus). I don't know why multiples of 64 bits are so common. Maybe an even number of memory controllers is easier to lay out on the chip?

EDIT: TY AISpark.
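
A quick way to see the point about 64-bit granularity, checking the bus widths in play in this thread (just arithmetic, assuming 64-bit MCs and 32-bit channels as in the whitepaper quoted earlier):

```python
for width in (256, 288, 320, 352, 384):
    channels = width // 32   # 32-bit GDDR6 channels
    mcs = width / 64         # 64-bit memory controllers
    flag = "  <- needs a half MC" if width % 64 else ""
    print(f"{width}-bit: {channels} channels, {mcs} MCs{flag}")
```

320-bit (the supposed XBSX width) divides evenly; 288-bit and the 1080 Ti/2080 Ti's 352-bit do not.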
 
It's reasonable enough to explore. Cerny said they have hardware accelerated ray tracing. The github leak did not indicate they have it.
That leaves three obvious possibilities:
a) the test was not comprehensive (thus it's AMD's solution and likely the same as MS's)
b) it's a different vendor's solution
c) they don't have it.

A and B are the most probable, but the data for option (b) is very weak: linkedin resumes and job postings. That leaves (a) as the most probable, since it should be the easiest to implement, and if AMD can provide it for MS, it should be provisionable for Sony.

Without more data points, B goes against Occam's Razor. It seems like wishful thinking on the part of those hoping for a one-up; it doesn't seem reasonable.
No, he didn't.
 
No, he didn't.
https://www.wired.com/story/exclusive-playstation-5/

Given the many questions he’s received since, he fears he may have been ambiguous about how the PS5 would accomplish this—and confirms that it’s not a software-level fix, which some had feared. “There is ray-tracing acceleration in the GPU hardware,” he says, “which I believe is the statement that people were looking for.” (A belief borne out by my own Twitter mentions, which for a couple of weeks in April made a graphics-rendering technique seem like the only thing the internet had ever cared about.)
 
There are? In this thread?
Funny, what I see is people repeating the github leaks (i.e. The Gospel) ad nauseam as some sort of rock solid super definite proof that the PS5 will turn out significantly slower than SeriesX, with zero percent chance of it bringing anything other than 36 CUs at 2GHz max, and making their presence in the forums a personal crusade to discredit every insider/source/rumor/leak that suggests otherwise.

And the cherry on top: the ones who accuse anyone who dares to entertain the veracity of any leak that isn't 100% defined by The Gospel of being in denial and living a wet dream.

I guess we're visiting different forums.

As for me, I'd rather have both consoles with identical capabilities and have them compete on games + services + features + gamer QoL, as that's where I think I stand to gain the most.
Probably because I'm just a consumer and I don't get paychecks from Microsoft or Sony.

Lol, this thread was even created because rumours of the PS5 being more powerful than Scarlett started appearing. That offended some people so much that they felt the need to contain that discussion away from the main console prediction thread.

Just go check the first page in this thread and see for yourself.
It's a stupid thread, tbh.

You have cheerleaders of one brand holding "2.1=5 > 2" posts from a role-playing insider to be of the same credibility as actual leaked AMD internal testing data that was deleted from Github and Twitter, or things like the Flute benchmark that was likewise deleted the very next day it leaked.

I mean, I don't know what we should discuss here. Riddles from GAF, or verified users from Resetera who went from seeing a vertical slice of a next-gen game to knowing the strategic reasons why Sony cancelled a 2019 release (?) and having the specs in hand (but asking to be banned from the forum once things turned the other way)? All of that while working night shifts in Alaska...?
 
I would just like to remind everyone that back in July, the Flute benchmark leaked.

[Image: Flute UserBenchmark screenshot]


A benchmark of a semi-custom SOC, Eng Sample 100-000000004-15_32/12/18_13F9, showing:
  • 8 core Zen 2 CPU
  • 16GB of RAM on 256bit bus
  • iGPU with PCI ID - 13F9
This benchmark was deleted from Userbenchmark the very next day it was posted online.

Now we know 13F9 is in fact Oberon A0.

[Image: screenshot identifying PCI ID 13F9 as Oberon A0]

Now, I don't need to remind people that the Github leak, the official AMD testing data that leaked last month, contained an SOC named Oberon. This chip contained BC1 and BC2 hardware configuration modes matching the PS4 and PS4 Pro to a T, as well as a 256bit bus in its native configuration. We know Oberon is a character from Shakespeare's A Midsummer Night's Dream. We know Flute, the benchmark posted above containing the iGPU with PCI ID 13F9 (Oberon A0), is codenamed after another character from A Midsummer Night's Dream, Francis Flute. We know the PS5's "V" devkit is codenamed Prospero, another character from a Shakespeare play.

So why should we now say, "Well yeah, hard data and obviously legit leaked benchmarks show one thing, but there's a guy on GAF who talks in riddles and says something completely different, so I'll go with that"? I mean, you can, for all I care, but why should people going by hard, legit data be called fanboys for following actual data? It's the best we have now, by far. What's the reason to throw it all away just because the guy talking in riddles says differently?
 
I would just like to remind everyone that back in July, the Flute benchmark leaked.

[Image: Flute UserBenchmark screenshot]


A benchmark of a semi-custom SOC, Eng Sample 100-000000004-15_32/12/18_13F9, showing:
  • 8 core Zen 2 CPU
  • 16GB of RAM on 256bit bus
  • iGPU with PCI ID - 13F9
This benchmark was deleted from Userbenchmark the very next day it was posted online.

Now we know 13F9 is in fact Oberon A0.

[Image: screenshot identifying PCI ID 13F9 as Oberon A0]

Now, I don't need to remind people that the Github leak, the official AMD testing data that leaked last month, contained an SOC named Oberon. This chip contained BC1 and BC2 hardware configuration modes matching the PS4 and PS4 Pro to a T, as well as a 256bit bus in its native configuration. We know Oberon is a character from Shakespeare's A Midsummer Night's Dream. We know Flute, the benchmark posted above containing the iGPU with PCI ID 13F9 (Oberon A0), is codenamed after another character from A Midsummer Night's Dream, Francis Flute. We know the PS5's "V" devkit is codenamed Prospero, another character from a Shakespeare play.

So why should we now say, "Well yeah, hard data and obviously legit leaked benchmarks show one thing, but there's a guy on GAF who talks in riddles and says something completely different, so I'll go with that"? I mean, you can, for all I care, but why should people going by hard, legit data be called fanboys for following actual data? It's the best we have now, by far. What's the reason to throw it all away just because the guy talking in riddles says differently?
So you are absolutely 100% sure the PS5 is 9.2TF as the github leak indicates?
Enough to say nobody should speculate anything else?
 
On B3D, we perhaps like to discuss numbers from spec sheets or leaks like the github one. Not 'insiders with a friend' who say 'it's powerful' or 'good gfx for what the specs are'. Or even worse, 'they are close in power', which could mean anything.
 
So you are absolutely 100% sure the PS5 is 9.2TF as the github leak indicates?
Enough to say nobody should speculate anything else?
No, what I am saying is that the hard data points in the other direction. I have no issue with speculating, but people are calling other people fanboys for going by hard data and not taking role-playing insiders who talk in riddles on other forums seriously.

I dunno what to say. AMD's semi-custom SOC benchmark named Flute, which leaked back in July, indicated 16GB on a 256bit bus and a fast memory setup (16Gbps+). Flute contains an iGPU named Oberon. Oberon is a legit semi-custom SOC from AMD, almost certainly associated with Sony. Oberon has a 256bit bus and 16Gbps+ memory modules.

Maybe it's not a 36CU chip and maybe 2GHz is not the retail clock (not saying that is set in stone), but I don't see why we should throw away matching hard numbers from multiple legit sources in favor of a guy talking in riddles.

That's where I stand. If tomorrow we get newer, different numbers from an equally good source, I will be the first to align my view.
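
For reference, the contested 9.2TF figure is just the github leak's 36CU/2GHz configuration plugged into the standard peak-FP32 formula (a sketch; neither number is a confirmed retail spec):

```python
def tflops(cus: int, clock_ghz: float) -> float:
    # CUs x 64 shader ALUs per CU x 2 ops per clock (FMA), in TFLOPS
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.0))           # 9.216 -- the github-leak figure
print(13.4 * 1000 / (36 * 128))  # ~2.91 GHz a 36CU part would need for the rumoured 13.4TF
```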
 
Which is fine, but there's no point belabouring the idea. This is a wild rumour thread, and if people want to believe in wild rumours, here's the place to do it. I've a crate of Kool-Aid here if anyone's thirsty. ;)
 
Some confusion could come from the system having 16GB GDDR6 18Gbps + 2GB DDR4, which would be compatible with the reddit devkit PCB leak, which mentioned:

- 16*2 clamshell GDDR6 named "HC18" from Samsung (there's an HC14 on their website pointing to 14Gbps memory, so HC18 would mean 18Gbps) -> 576GB/s
- 3 * 2GB DDR4 2400MT/s chips, two of which are "close to the NAND", acting as storage cache
- Meaning a single 2GB chip is directly accessible by the SoC.

That would mean the SoC in a diagnostics tool would see its total memory as 16+2 = 18GB, even if the devs can't ever access the 2GB DDR4 (nor the whole 16GB GDDR6 for that matter).
That would fit, but the die size in the PCB leak is too small for the 13.4TF rumour to be true. It's 316mm^2.
Which is fine, but there's no point belabouring the idea. This is a wild rumour thread, and if people want to believe in wild rumours, here's the place to do it. I've a crate of Kool-Aid here if anyone's thirsty. ;)
I agree, and you are right. I just think that even in a baseless-rumour thread on B3D, people should not insinuate that others are biased fanboys for taking legit, hard data at face value over the random 3214th rumor posted today on GAF/Reddit.

On another note, Navi 10 visualized:

[Image: Navi 10 die shot, visualized]


LINK
 