Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
I'll admit this stuff goes way over my head; I'm going by estimates done by proelite and Liabe Brave, both of whom seem to include the MC/PHY in their estimates
See my edit. Your 30 sq mm is enough to cover 4 PHYs and 4 MCs from my estimations. Are these posters on Resetera? I did a huge breakdown there a while ago in the PS5 tech thread.
Yes, I'm aware, but my point was that the PS5 IO would replace the GPU IO, so that frees up some space
Now my question: is the 6900 XT IO likely to be bigger than or the same size as the 5700's? If it's 37mm² like the 5700, that would leave us with 18mm² after we halve it for the PS5 IO

There are two IO blocks being discussed here: Southbridge IO and SSD IO. PC GPUs don't have SSD IO hardware-acceleration blocks, so discussing SB IO for the 5700 or 6900 XT isn't ideal, and they're likely to have additional PC logic. The XSX IO block is a better starting point, with SB IO and SSD IO (VA) already included. Just add extra for the SSD IO Complex for PS5.

Yeah, it seems reasonable to add an extra 2-5mm² for the coherency engine, a beefier decompression block, and SRAM?

Before going further with the calculations, we need an estimate of the Navi 21 IO size.
As mentioned above, you are better off with the XSX IO size. I estimate that at around 13 sq mm. You are left with:

- SSD IO Complex
- SSD IO SRAM
- Multimedia logic (could be off-die)

Choose what you need to hit 305 sq mm.
 
Are these posters on Resetera? I did a huge breakdown there a while ago in the PS5 tech thread.
Yes, both do, and at least proelite posts here too. I saw that; I'd been following the Era tech thread up until it got closed ;( Nice analysis
There are two IO blocks being discussed here. Southbridge IO and SSD IO. PC GPUs don't have SSD IO hardware acceleration blocks, so discussing SB IO for 5700 or 6900XT isn't ideal, and likely to have additional PC logic.
I get that, but since the consoles' IO blocks cover everything, I include the discrete GPU's IO to free up space

As mentioned above, you are better off with the XSX IO size. I estimate that at around 13 sq mm.
Knowing the Navi 21 IO block size would help in knowing how much space can be freed
For example, say the Navi 21 IO is 46mm²; that means 23mm² is accounted for the PS5 IO in the halved 260mm² estimate

BTW, what is the 13mm² in your estimation? The excess of PS5 I/O over XSX's 18mm²?
 
Knowing the Navi 21 IO block size would help in knowing how much space can be freed
For example, say the Navi 21 IO is 46mm²; that means 23mm² is accounted for the PS5 IO in the halved 260mm² estimate
I see, you are looking to make adjustments. Do you have a high resolution die shot for Navi21 - ideally with its IO labelled like the XSX die shot from Hotchips?
BTW, what is the 13mm² in your estimation? The excess of PS5 I/O over XSX's 18mm²?
13 sq mm is roughly the IO block labelled on the XSX die. This should include SB IO and SSD IO (VA).
 
I was thinking about the Series X having 14 CUs per shader array, which is substantially higher than the usual 10 for RDNA2. Is it possible that Microsoft expects that, as the gen goes on, devs might choose to dedicate several CUs to machine learning/GPU compute tasks? I don't know if compute-type tasks would put a lower load on the shader array, but it might explain the unusually high CU-per-shader-array count.
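For reference, the CU-per-array arithmetic can be sanity-checked with a quick sketch. The CU counts and shader-engine layouts below are taken from public die breakdowns; treat them as assumptions:

```python
# CUs per shader array: XSX vs a typical RDNA2 part (Navi 21).
# Physical CU counts and SE/SA layouts are from public die breakdowns,
# used here as assumptions for a back-of-envelope check.

def cus_per_array(physical_cus: int, shader_engines: int, arrays_per_se: int = 2) -> float:
    """Physical CUs divided evenly across all shader arrays."""
    return physical_cus / (shader_engines * arrays_per_se)

xsx = cus_per_array(physical_cus=56, shader_engines=2)     # 56 CUs on die, 2 SEs
navi21 = cus_per_array(physical_cus=80, shader_engines=4)  # 80 CUs on die, 4 SEs

print(xsx, navi21)  # 14.0 10.0
```

56 physical CUs over 2 SEs with 2 arrays each gives the 14-per-array figure; Navi 21's 80 CUs over 4 SEs gives the usual 10.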
 
Are Sony SDKs developed in Japan or USA?

Sony has quite a few teams across the world that handle development tools for all SIE Worldwide Studios and third-party publishers, SIE XDev Europe (Europe) and the ICE Team (America) being two of them.

Edit: from Era...
[image: chart of SIE first-party services groups, from the Era first-party thread]

Sony XDev Europe:
XDev Studio Limited collaborates with independent development studios across Europe and other PAL territories to publish content to PlayStation platforms all over the world. XDev has helped to create and publish titles such as the LittleBigPlanet, Buzz!, MotorStorm and Invizimals series, Super Stardust HD, Heavenly Sword, Heavy Rain, Beyond: Two Souls, Tearaway, and Resogun.

In addition to funding projects, XDev offer full production, project management and game design support. Titles are also supported with community management, online production and dedicated outsourcing management facilities. XDev work directly with Marketing and PR teams in all Sony territories to promote and publish games worldwide.

Visual Arts Services Group:
The Visual Arts Services Group is a division of SCEA's Product Development Services Group. VASG is a full-production studio located in San Diego that specializes in animation, motion capture, cinematics, art, and scanning. It is a multi-award-winning team responsible for the cinematic performances in premier Sony PlayStation franchises like Uncharted, The Last of Us, Killzone, and other iconic series.

Sony San Mateo:
Sony San Mateo is a video game developer, part of SIE Worldwide Studios, which is owned by Sony Interactive Entertainment; it was established in 1998. As implied by its name, it is now based in San Mateo, California, having relocated from Foster City, California. It is mostly responsible for overseeing the development of first-party games by external developers. It co-developed the SOCOM US Navy SEALs series with Zipper Interactive and the Sly Cooper series with Sucker Punch Productions.

SN Systems:
SN Systems is a provider of development tools for games consoles, including the PlayStation 4, PlayStation 3, PlayStation 2, the original PlayStation, and handhelds (Vita and PSP). SN Systems was acquired by Sony Interactive Entertainment Inc. in 2005 to provide tools for the PlayStation 3 and future consoles.
 
Interesting. So, Matt and Fafalada were right about the XBSX/S GDK being available to game developers well before June (see the article date). Anyhow, from Codemasters' perspective the GDK toolsets were more mature and stable than any prior Microsoft SDK tools.

https://news.xbox.com/en-us/2020/06/10/inside-xbox-series-x-optimized-dirt-5/
Q: What is it like developing on Xbox Series X?

A: Transitioning development to a new console platform, like Xbox Series X, is usually very painful. You have to deal with new tools, new workflows, new ways of thinking.

This time around the team at Xbox brought me a new toolset called the Game Development Kit, which they already had up and running on Xbox One.

This meant that we could make the transition much earlier. In fact, we started doing the groundwork for Xbox Series X development long before we even received the hardware. This kind of thinking from Xbox allowed us to get a real head-start on next-gen development, so after receiving our early Xbox Series X hardware, we were up and running really quickly.

For me, the most important thing in making a videogame is the relationships. Working with Xbox, is a partnership – the team at Xbox is committed to helping us make a great videogame and they’ve shown that to me again and again. That means being open and honest about our experiences; what we’ve loved and perhaps even what we’ve found difficult in development has had meaningful, visible impact on the updates that I get for the tools for Series X. (Shout out to our development partner at Xbox, Richard Hackett! Thanks Rich!)

I’ve never worked on a console launch where, while we’re still months away from release, the tools have been this mature, this stable, this easy to work with.
 
I was thinking about the Series X having 14 CUs per shader array, which is substantially higher than the usual 10 for RDNA2. Is it possible that Microsoft expects that, as the gen goes on, devs might choose to dedicate several CUs to machine learning/GPU compute tasks? I don't know if compute-type tasks would put a lower load on the shader array, but it might explain the unusually high CU-per-shader-array count.
MS talks about ML, hence the INT4 and INT8 support, so more CUs can help, especially if they're not really being used for other tasks?
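The INT4/INT8 angle is easy to put numbers on. A rough sketch, assuming RDNA2's 4x/8x packed-math rate multipliers and the XSX's public 52-active-CU, 1.825GHz configuration:

```python
# Rough throughput math behind the INT8/INT4 ML figures.
# The 4x/8x rate multipliers assume RDNA2's packed-math dot-product
# paths; active CU count and clock are the XSX's public specs.

ACTIVE_CUS = 52
LANES_PER_CU = 64      # stream processors per CU
OPS_PER_LANE = 2       # one FMA counts as 2 ops
CLOCK_HZ = 1.825e9

fp32_tflops = ACTIVE_CUS * LANES_PER_CU * OPS_PER_LANE * CLOCK_HZ / 1e12
int8_tops = fp32_tflops * 4   # assumed 4x rate with packed INT8
int4_tops = fp32_tflops * 8   # assumed 8x rate with packed INT4

print(f"{fp32_tflops:.2f} TFLOPS FP32, {int8_tops:.1f} INT8 TOPS, {int4_tops:.1f} INT4 TOPS")
```

This lands on roughly the 12.15 TFLOPS and the ~49 INT8 / ~97 INT4 TOPS figures that have been quoted for the part.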
 
I’ve never worked on a console launch where, while we’re still months away from release, the tools have been this mature, this stable, this easy to work with.

So Codemasters are happy with the GDK, which makes sense: of all the launch games, theirs arguably has the most demanding and ambitious variety of performance targets, across 60Hz and 120Hz with different quality settings, and across a scary number of platforms. If the SDK for Xbox (four console hardware targets) had an issue, you'd think they would have just tossed out 120Hz across the board, even for PS5, just for platform parity. And 120Hz wasn't an eleventh-hour feature; they committed to 120Hz back on 3rd July, so they would have needed to be confident about delivering before making that statement.

So where is this perception that GDK is behind coming from? Are these just old statements re-surfacing and no longer reflective of how GDK is now?
 
Interesting. So, Matt and Fafalada were right about the XBSX/S GDK being available to game developers well before June (see the article date). Anyhow, from Codemasters' perspective the GDK toolsets were more mature and stable than any prior Microsoft SDK tools.

https://news.xbox.com/en-us/2020/06/10/inside-xbox-series-x-optimized-dirt-5/
Q: What is it like developing on Xbox Series X?

A: Transitioning development to a new console platform, like Xbox Series X, is usually very painful. You have to deal with new tools, new workflows, new ways of thinking.

This time around the team at Xbox brought me a new toolset called the Game Development Kit, which they already had up and running on Xbox One.

This meant that we could make the transition much earlier. In fact, we started doing the groundwork for Xbox Series X development long before we even received the hardware. This kind of thinking from Xbox allowed us to get a real head-start on next-gen development, so after receiving our early Xbox Series X hardware, we were up and running really quickly.

For me, the most important thing in making a videogame is the relationships. Working with Xbox, is a partnership – the team at Xbox is committed to helping us make a great videogame and they’ve shown that to me again and again. That means being open and honest about our experiences; what we’ve loved and perhaps even what we’ve found difficult in development has had meaningful, visible impact on the updates that I get for the tools for Series X. (Shout out to our development partner at Xbox, Richard Hackett! Thanks Rich!)

I’ve never worked on a console launch where, while we’re still months away from release, the tools have been this mature, this stable, this easy to work with.

@BRiT has been posting for quite a while that the GDK has been available since last year, so that's been a known quantity for a while. June was the version capable of supporting a publishable game iirc (the table of releases had been posted a few times, I'll go back and look for it later).

I'd expect an interview for Xbox.com to be focused on the positives, and it sounds like he's had a good experience. Some developers have told Digital Foundry that they've really not liked the move to GDK however. Launch games might seem to back that up.
 
So where is this perception that GDK is behind coming from? Are these just old statements re-surfacing and no longer reflective of how GDK is now?

I had seen the concept of the GDK being behind stemming from discussions on other forums first before it started showing up in the B3D discussions. Though I can see how that could stem from developer comments. Or even from the leaked June 2020 GDK Release Notes themselves. There are multiple statements of "Performance for _ in the June 2020 release isn't indicative of the final expected performance and will be improved in future releases" or "We're planning to move to _ in a future release of the GDK."

Even with that, there are many unknowns. We don't know how much of a performance impact those situations account for, if they have been improved since then, if they completed the move to _ in future releases of the GDK, if they have newer GDK releases, what has been fixed, what has been improved, and also if the early cross-generation multi-platform games are using later GDKs.
 
So where is this perception that GDK is behind coming from? Are these just old statements re-surfacing and no longer reflective of how GDK is now?

Console Launch PTSD. Even if the SDKs were bum and the Xbox was shown in a better light in head-to-heads, do you think there would be this much drama over tools? I have seen this shit way, way too many times: high expectations, failure to meet said expectations, then the coping process.
 
Console Launch PTSD. Even if the SDKs were bum and the Xbox was shown in a better light in head-to-heads, do you think there would be this much drama over tools? I have seen this shit way, way too many times: high expectations, failure to meet said expectations, then the coping process.

Or maybe it's the understanding that even if GDK was available since Nov 2019, the maturity of each GDK iteration could have varied wildly between revisions and something being available at a certain point is not the same as something being mature/optimized at a specific point?

There's also the fact that not all devs would have gotten revision updates simultaneously; or perhaps, needing to stabilize their code, they would actually stick to older updates, as newer ones might introduce features that break the stability of their code from the older versions. All factors to take into consideration; iroboto explained a lot of this in much more detail.

Make no mistake; MS cut it way too close, redesigning their dev tools this near to a new console launch. Even without COVID, deploying the new GDK only a year ahead of a new system launch, considering game dev is more complex than it used to be (and more platforms need support in terms of 3P games), just would not have been enough time. But I don't think acknowledging this means a need to underplay the state of possible devkit maturity on MS's end, considering this has been one of the more persistent rumors regarding their platforms since December 2019 (or at the latest January 2020).

@BRiT has been posting for quite a while that the GDK has been available since last year, so that's been a known quantity for a while. June was the version capable of supporting a publishable game iirc (the table of releases had been posted a few times, I'll go back and look for it later).

I'd expect an interview for Xbox.com to be focused on the positives, and it sounds like he's had a good experience. Some developers have told Digital Foundry that they've really not liked the move to GDK however. Launch games might seem to back that up.

Not even just that; it was only maybe a month or so ago that several developers (anonymously) were sourced supposedly saying what a pain it was to develop on Series S, and I recall reading somewhere that there was not a Series S profile mode for GDK until very recently.

There are so many wild factors potentially at play here, kind of like (and not as a meme) trying to figure out what those beeps could possibly mean when your computer fails a POST. Too few beep codes, too many possibilities. Such a pain.
 
I don’t see what the issue is. Maybe my memory is totally off, but this seems par for the course for MS.

Reports of eSRAM bandwidth boosted to 204 GB/s didn't happen until after the initial One reveal and E3 presentation. Discussions here about the poor state of dev kits in comparison to the PS4's. Devs choosing the higher-level version of the One's API due to issues with MS trying to deliver a performant lower-level API. Does no one remember MS having to customize its 3-OS system on the One, which was somewhat hardware-agnostic, due to the desire to increase performance prior to launch?

The only difference is we could easily point to the lack of hardware performance of the One as the culprit when we saw differences.
 
I think the PS5 GPU design being a match with the Navi 2x designs (SE layout, front/back end, high-frequency design, possibly enhanced with Infinity Cache) is what's allowing it to go head to head with a console that touts 20% higher theoretical peak compute performance. Granted, the XSX tools are less optimal than the PS5's, so equalizing that should iron out the more obvious performance dips. After all is said and done, I expect both consoles to be extremely close, within ~5% of each other, with each taking small leads (~5%) in scenes that cater to their strengths.


I see, you are looking to make adjustments. Do you have a high resolution die shot for Navi21 - ideally with its IO labelled like the XSX die shot from Hotchips?
Yeah! I don't have any shots yet, but I'm sure we'll get either a PS5 die shot or a Navi 21 one soon.
I really wish Cerny would do another deep dive now, after the games have shown what it's capable of.
13 sq mm is roughly the IO block labelled on the XSX die. This should include SB IO and SSD IO (VA).
That's great news; that means an extra 5mm² for the PS5 I/O in the 18mm² estimate.
I was thinking about the Series X having 14 CUs per shader array, which is substantially higher than the usual 10 for RDNA2. Is it possible that Microsoft expects that, as the gen goes on, devs might choose to dedicate several CUs to machine learning/GPU compute tasks? I don't know if compute-type tasks would put a lower load on the shader array, but it might explain the unusually high CU-per-shader-array count.
That setup is a cost-saving measure to save die space (1 SE); asynchronous compute should be more helpful to eke out extra performance.
 
There is no need to speculate on the Navi 21/PS5 layout when we know PS5 is 308mm² and XSX is 360mm². That means there is a 52mm² difference: same number of SEs, an additional 64-bit bus, and 16 extra CUs.

A dual CU is around ~4.5mm² and an additional 64-bit memory bus is ~15mm². Add to this the additional 1MB of L2 cache and it fits pretty much perfectly; I don't see where you can fit an additional 32 or 64MB of Infinity Cache (it needs its controllers as well).
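Plugging those per-block estimates in, the 52mm² gap is accounted for almost exactly. A quick check using the rough figures above (the ~1mm²/MB L2 number is an assumption, not a measured value):

```python
# Accounting for the area gap between XSX (360mm^2) and PS5 (308mm^2).
# Per-block areas are the rough estimates from the post, not measurements.

XSX_MM2, PS5_MM2 = 360, 308
gap = XSX_MM2 - PS5_MM2                     # 52mm^2

extra_blocks = {
    "8 extra dual-CUs (16 CUs)": 8 * 4.5,   # ~4.5mm^2 per dual-CU
    "extra 64-bit memory bus":   15.0,
    "extra 1MB L2 (assumed)":     1.0,      # assumed ~1mm^2 per MB
}
accounted = sum(extra_blocks.values())      # 36 + 15 + 1 = 52

print(gap, accounted)
```

With these placeholders the extra CUs, bus, and L2 consume the whole 52mm² difference, which is the point being made: there's no area left over for a 32 or 64MB Infinity Cache.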
 
Is the IC that important? I've seen some benchmarks and yes, you get a little bump in performance, but it doesn't seem too great to be honest. If the PS5 GPU is not bandwidth-starved with this memory configuration, is it really worth the cost and added power budget?
 
There is no need to speculate on the Navi 21/PS5 layout when we know PS5 is 308mm² and XSX is 360mm². That means there is a 52mm² difference: same number of SEs, an additional 64-bit bus, and 16 extra CUs.
PS5 is the Navi 21 design halved: same SA/SE layout and the same high-frequency pipeline. RDNA2 is optimized for high frequencies, getting the most utilization out of the hardware; this is emphasized for each part of the pipeline in the slides.
So far we know the XSX SA/SE layout goes beyond the 10/20 seen in Navi 21, and there are also some front-end changes to the ROPs.
Dual CU is around ~4.5mm² and additional 64bit memory bus is ~15mm². Add to this additional 1MB of L2 cache and it fits pretty much perfectly, I dont see where can you fit additional 32 or 64MB of Infinity cache (it needs to have controllers as well).
I suspect Infinity Cache is in play, though I'm not saying it's a done deal; right now I'm focused on the possibility.
I used some previously established die-size estimates to make the case for IC.

These are my preliminary estimates:
Navi 21's die size halved (260mm²) should account for 64MB of IC along with the needed MCs; it should also account for >18mm²* worth of I/O space.
260mm² for GPU and IO + 40mm² for CPU + 3mm² TE + 30mm² for 4 PHYs = 333mm²
So we come up 25mm² short using 64MB of IC. Halving that should be more than enough to free 25mm²; perhaps it could even be closer to 40 or 50.

*There are some unknowns that make it hard to determine how much IC can be used, but based on preliminary info there's more than enough room for 32MB+ of IC.
We need to know the Navi 21 & PS5 I/O block sizes; atm I'm using the Navi 10 I/O as a placeholder (37mm²). For comparison, XSX's is 13mm².
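Putting that estimate in one place, a quick sketch (all block sizes are this post's rough placeholders, not measured values):

```python
# Die-area budget check for the 64MB Infinity Cache scenario.
# Every figure below is the poster's rough estimate, not a measurement.

PS5_DIE_MM2 = 308   # measured PS5 die size

budget = {
    "Navi 21 halved (GPU + IO, incl. 64MB IC share)": 260,
    "Zen 2 CPU cluster":                               40,
    "Tempest Engine":                                   3,
    "4x GDDR6 PHYs":                                   30,
}
total = sum(budget.values())        # 260 + 40 + 3 + 30 = 333
overshoot = total - PS5_DIE_MM2     # 25 -> what halving the IC must free

print(total, overshoot)
```

The 25mm² overshoot is what halving the IC from 64MB to 32MB would need to claw back for the numbers to work.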
Is the IC that important? I've seen some benchmarks and yes, you get a little bump in performance, but it doesn't seem too great to be honest. If the PS5 GPU is not bandwidth-starved with this memory configuration, is it really worth the cost and added power budget?
AMD claims it helps with frequency scaling, getting more performance out of higher clocks than you would otherwise get.
[image: AMD RDNA2 slide on Infinity Cache and frequency scaling]
 