Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Anyone want to make a stab at the SRAM amounts across the chips?

76MB for everything....
Is this a lot of SRAM for current systems? I know the XB1 had 47MB, which at the time was considered a lot. Also, I couldn't help but notice in the Digital Foundry article that they mentioned backwards compatibility would be done in hardware and not in a VM. I found that surprising...
 

Was that for Xbox One titles, and not for the X360/Original Xbox games? I mean, those games still run within their own image, so there's no real need to run that image inside yet another image/VM.
 
They use an extension of DirectX, DirectStorage, and for textures a BCPack format for storage. I suppose all this technology will be available on PC and a standard will appear for gaming SSDs on PC.
This is something I would expect. PCs will of course evolve with DirectX and other industry standards :)
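To make that concrete, here's a totally made-up C++ sketch of what such a streaming path could look like from the game's side. This is not the real DirectStorage interface (nothing has been published for PC yet); the names are invented just to show the shape of it: the title queues lots of small reads of compressed tiles, and fixed-function hardware handles the decompression on the way into GPU memory.

```cpp
// Hypothetical sketch, NOT the real DirectStorage API (which has not been
// published for PC at this point). It only illustrates the model being
// discussed: the game queues many small reads of compressed tiles, and a
// hardware block (BCPack decode on Series X) decompresses them on the way
// into GPU memory instead of burning CPU cores on it.
#include <cstdint>
#include <cstdio>
#include <vector>

enum class Codec { None, BCPack, Zlib };   // assumed codec set

struct StreamRequest {                     // invented request record
    const char* file;                      // packed asset file on the NVMe SSD
    uint64_t    offset;                    // byte offset of the compressed tile
    uint32_t    compressedBytes;
    uint32_t    uncompressedBytes;
    Codec       codec;                     // handled by fixed-function hardware
};

struct StreamQueue {                       // invented queue object
    std::vector<StreamRequest> pending;
    void enqueue(const StreamRequest& r) { pending.push_back(r); }
    void submit() {                        // stand-in for kicking the DMA/decode engine
        for (const auto& r : pending)
            std::printf("read %s +%llu: %u -> %u bytes\n", r.file,
                        (unsigned long long)r.offset,
                        r.compressedBytes, r.uncompressedBytes);
        pending.clear();
    }
};

int main() {
    StreamQueue q;
    // e.g. one 64KB texture tile stored BCPack-compressed at roughly 2:1
    q.enqueue({"textures.pak", 1 << 20, 32 * 1024, 64 * 1024, Codec::BCPack});
    q.submit();
}
```

The point is the request granularity and the fact that decompression never touches the CPU; what the actual PC API ends up looking like is anyone's guess right now.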
 
Was that for Xbox One titles, and not for the X360/Original Xbox games? I mean, those games still run within their own image, so there's no real need to run that image inside yet another image/VM.
"It likely comes as no surprise to discover that Series X can technically run the entire Xbox One catalogue, but this time it's done with no emulation layer - it's baked in at the hardware level."
Looks like it's just for XB1, which makes sense. Also, unlike the One X, they're allowing the full CPU/GPU to be available for BC; on the One X only 50% of the GPU was made available.
 
Any idea how the RDNA2 RT hardware compares to Turing RT, Alex? Which implementation enjoys superior performance?
I want to do a video on this when we have a bit more information and also have good analogues on the PC side, if possible. At the moment though, I do not think we are seeing something that is drastically different performance wise to what we have seen on Turing. Which Turing card and which type of RT? Well, we have to wait a bit IMO.
The Xbox Series X not catching fire and not running at single-digit FPS in Minecraft are all positive signs.
 
I don’t know. Can we trust a former games journalist with verified industry sources? Maybe he just got lucky!
Now let's see if Schreier got lucky as well when he said both are above 2080. What a lucky guess that was for the xsx, huh?

Seems a number of assumptions about XBSX were off: a 360 mm² die, so no larger than usual; 900 GB/s total RAM bandwidth; hardware decompression on the SSD. It's a very strong platform!
To be fair, they were going off GitHub and a 60 CU count.
 
Seems a number of assumptions about XBSX were off: a 360 mm² die, so no larger than usual; 900 GB/s total RAM bandwidth; hardware decompression on the SSD. It's a very strong platform!
lmao, so if it was 20GB on a 320-bit bus it would be 1120GB/s? ;)
 
DF said:
Microsoft's solution for the memory sub-system saw it deliver a curious 320-bit interface, with ten 14gbps GDDR6 modules on the mainboard - six 2GB and four 1GB chips. How this all splits out for the developer is fascinating.

"Memory performance is asymmetrical - it's not something we could have done with the PC," explains Andrew Goossen "10 gigabytes of physical memory [runs at] 560GB/s. We call this GPU optimal memory. Six gigabytes [runs at] 336GB/s. We call this standard memory. GPU optimal and standard offer identical performance for CPU audio and file IO. The only hardware component that sees a difference in the GPU."

In terms of how the memory is allocated, games get a total of 13.5GB, which encompasses all 10GB of GPU optimal memory and 3.5GB of standard memory. This leaves 2.5GB of GDDR6 memory from the slower pool for the operating system and the front-end shell. From Microsoft's perspective, it is still a unified memory system, even if performance can vary. "In conversations with developers, it's typically easy for games to more than fill up their standard memory quota with CPU, audio data, stack data, and executable data, script data, and developers like such a trade-off when it gives them more potential bandwidth," says Goossen.

Everyone should go read the article when they have a chance. What MS is doing with memory/SSD is really interesting and worth reading in detail.

https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs
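For anyone wanting to sanity-check the numbers, here's a quick back-of-the-envelope in C++. It assumes each GDDR6 chip presents a 32-bit interface (so ten chips make the 320-bit bus) and that the 6GB "standard" region sits on the six 2GB chips; those assumptions are the usual reading, not something spelled out in the article.

```cpp
// Back-of-the-envelope for the figures in the DF quote: ten 14Gbps GDDR6
// chips on a 320-bit bus, six of them 2GB and four of them 1GB.
#include <cstdio>

int main() {
    const double gbps_per_pin  = 14.0;  // GDDR6 data rate per pin
    const int    bits_per_chip = 32;    // assumed 32-bit interface per chip

    double gpu_optimal = 10 * bits_per_chip * gbps_per_pin / 8.0; // all ten chips
    double standard    =  6 * bits_per_chip * gbps_per_pin / 8.0; // the six 2GB chips

    std::printf("GPU optimal (10GB): %.0f GB/s\n", gpu_optimal);  // 560
    std::printf("standard    (6GB) : %.0f GB/s\n", standard);     // 336

    // Allocation from the article: 10GB GPU-optimal + 3.5GB standard for the
    // game = 13.5GB, leaving 2.5GB of the slower pool for the OS/shell.
    std::printf("game-visible memory: %.1f GB\n", 10.0 + 3.5);
}
```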
 
Any idea where the internal SSD is? It's presumably just a single chip? Back of the SOC motherboard?

Edit: It was in the Austin Evans video. Where I thought it'd be.
 
That's wrong. It's 560GB/s total bandwidth, the bandwidth of the two pools isn't additive.
Ah, that makes a lot more sense! I did wonder, but the way it was phrased (two pools with separate BW, like PS3's split RAM) caught me off guard. Should pay more attention.
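A minimal sketch of why the two figures don't add up, assuming the usual reading of the DF description (the first 10GB interleaved across all ten chips, the last 6GB only across the extra capacity of the six 2GB chips, everything sharing the same ten channels):

```cpp
// Hypothetical interleave model; the 10/6 split per region is an assumption
// based on the chip sizes, not an official memory map.
#include <cstdio>

int chips_serving(double address_gb) {
    return address_gb < 10.0 ? 10 : 6;   // GPU-optimal vs standard region
}

int main() {
    const double per_chip_gbs = 32 * 14.0 / 8.0;   // 56 GB/s per GDDR6 chip
    std::printf("access in the 10GB region: %.0f GB/s peak\n",
                chips_serving(4.0) * per_chip_gbs);    // 560
    std::printf("access in the 6GB region : %.0f GB/s peak\n",
                chips_serving(12.0) * per_chip_gbs);   // 336
    // Traffic to both regions shares the same ten 32-bit channels, so the
    // system-wide peak stays 560 GB/s; the two figures never sum to ~900.
}
```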
 
To be fair, they were going off GitHub and a 60 CU count.
Die shot measurements and the like speculated 410+ mm². DF's article on 'breaking the rules' that looked at the case led us to imagine a huge, hot die and fancy cooling. Turns out the case is pretty mediocre in airflow design from the looks of it, which had us scratching our collective heads. And then it turns out the die isn't huge and hot. It's a normal console, but engineering has allowed a decent generational advance within the same die size. Although, cost-wise, is it dependent on 100% perfect dies without redundancy?
 
Also, GDDR6 has to be really expensive for the decision to create separate pools and put two different kinds of chips on there to make sense. There have to have been a lot of heated meetings about the BOM before they made the decision to have 2 different sizes of DRAM chips on the bus. I'd be willing to bet money that the original design had 20GB, which they then cut back on due to costs.
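Just to frame that trade-off with some made-up numbers (the per-chip costs below are placeholders, not actual GDDR6 pricing; only the delta matters):

```cpp
// Rough framing of the speculated BOM decision: the shipped board mixes
// six 2GB and four 1GB chips for 16GB, versus ten 2GB chips for 20GB.
#include <cstdio>

int main() {
    const double cost_1GB_chip = 1.0;   // placeholder, not a real price
    const double cost_2GB_chip = 1.8;   // placeholder, not a real price

    double shipped_16GB = 6 * cost_2GB_chip + 4 * cost_1GB_chip; // 6x2GB + 4x1GB
    double uniform_20GB = 10 * cost_2GB_chip;                    // 10x2GB

    std::printf("16GB mixed  : %.1f cost units\n", shipped_16GB);
    std::printf("20GB uniform: %.1f cost units (+%.1f per console)\n",
                uniform_20GB, uniform_20GB - shipped_16GB);
    // Multiply the per-console delta by tens of millions of units and the
    // asymmetric 6x2GB + 4x1GB layout looks like a deliberate cost-down
    // from a uniform 20GB config, while keeping the full 320-bit bus.
}
```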
 