Digital Foundry Article Technical Discussion [2020]

Discussion in 'Console Technology' started by BRiT, Jan 1, 2020.

Thread Status:
Not open for further replies.
  1. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,994
    Likes Received:
    2,927
    Location:
    France
    Maybe the bottleneck is simply the whole design. No other AMD GPU has so many CUs per shader array. Because of this we know the L2 caches are quite busy relative to the teraflops (compared to PS5). And you have that unique split memory (which no other recent machine uses, for good reason: it causes additional memory contention). Finally there is some customization on PS5: some of it we know helps with bandwidth, other things we probably don't know about yet, particularly around the CPU caches.
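    (To put rough numbers on that, a quick back-of-the-envelope sketch using the publicly quoted specs; it only shows the ratios being talked about, not whether the L2 is actually a bottleneck:)

    [code]
    // Rough ratio check from the public spec sheets (illustrative only;
    // real L2 behaviour depends on hit rates, not raw DRAM bandwidth).
    #include <cstdio>

    int main() {
        // XSX: 52 active CUs across 4 shader arrays, 12.15 TFLOPS, 560 GB/s.
        // PS5: 36 active CUs across 4 shader arrays, ~10.28 TFLOPS, 448 GB/s.
        const double xsx_cus = 52, xsx_arrays = 4, xsx_tf = 12.15, xsx_bw = 560;
        const double ps5_cus = 36, ps5_arrays = 4, ps5_tf = 10.28, ps5_bw = 448;

        printf("XSX: %.0f CUs/array, %.1f GB/s per TFLOP\n",
               xsx_cus / xsx_arrays, xsx_bw / xsx_tf);   // 13 CUs/array, ~46.1
        printf("PS5: %.0f CUs/array, %.1f GB/s per TFLOP\n",
               ps5_cus / ps5_arrays, ps5_bw / ps5_tf);   // 9 CUs/array, ~43.6
        return 0;
    }
    [/code]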
     
  2. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,698
    Likes Received:
    3,932
    Location:
    Wrong thread
    Not much, and not by much. I don't think it will be much of a factor at all for a properly structured game.

    I don't think the memory setup particularly causes contention issues, at least not so much more than normal. And I don't think MS/AMD would have made a larger chip with a fatter bus that performs like a smaller one (not even talking about the PS5 here - thinking of the 5700XT).

    Most likely thing is that early software is just that. Final Xbox dev kits were late, dev environment was new, unfamiliar and late (and not universally liked). MS being MS. They're run by diktat and make decisions that seem odd to most of us.

    Developers don't actually care about most of the things we talk about here. Their favourite platform is generally the best one for making their game on. At the start of last gen that was PS4. At the start of this gen it appears to be PS5.
     
  3. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,958
    Likes Received:
    1,393
    Location:
    France
    But still, when I read that the XSX has more bandwidth... eeeeh, it's a little more complex than that.
     
  4. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,414
    Likes Received:
    2,038
    Maybe we can learn more from the dGPU variants soon, how they're doing and performing. At least, they all seem to be (very) highly clocked, or are able to be.
     
  5. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,320
    Likes Received:
    1,980
    Location:
    Maastricht, The Netherlands
    London Geezer and DSoup like this.
  6. VitaminB6

    Regular Newcomer

    Joined:
    Mar 22, 2017
    Messages:
    269
    Likes Received:
    373
    The slower portion of memory in the XBSX is still pretty fast though. At 336 GB/s it's still faster than the One X, which was 326 GB/s, so nothing to sneeze at I guess. I don't think we've heard from developers yet about the XBSX's memory setup, but I am curious what their thoughts are. It could be that the slower memory is easily used up by things that don't require massive amounts of bandwidth, or maybe it does make things a little more difficult for devs. Either way it's something that will get worked out over time, and at least it's nowhere near as complicated as last gen with the ESRAM in the XB1.
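    (For what it's worth, here's a naive model of how the blended figure degrades if some fraction of GPU traffic has to come from the slow pool. A sketch only; real contention behaviour is far messier:)

    [code]
    // Naive weighted model of XSX's split memory: if a fraction f of the
    // GPU's accesses land in the 336 GB/s pool instead of the 560 GB/s
    // pool, effective bandwidth degrades toward 336. Illustrative only.
    #include <cstdio>
    #include <initializer_list>

    double effective_bw(double f_slow) {
        const double fast = 560.0, slow = 336.0;  // GB/s, public figures
        // Harmonic weighting: time per byte is what adds up, not rates.
        return 1.0 / ((1.0 - f_slow) / fast + f_slow / slow);
    }

    int main() {
        for (double f : {0.0, 0.1, 0.25, 0.5})
            printf("%2.0f%% of traffic in slow pool -> %.0f GB/s effective\n",
                   f * 100, effective_bw(f));  // 560, 525, 480, 420
        return 0;
    }
    [/code]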
     
    thicc_gaf likes this.
  7. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,703
    Likes Received:
    10,842
    Location:
    London, UK
    NX Gamer is also confusing "4" and "5" in his videos. COVID-20 at work? Nobody dies, but people suffer dyscalculia, and only for these two numbers.
     
    turkey, London Geezer and BRiT like this.
  8. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    18,530
    Likes Received:
    20,562
    My response was not about your post, which is why it wasn't quoted.

    It was for the general discussion about UE5 and the next-gen games with crazy geometry and full GI. Developers who don't have the time budgeted from management to create all of that for themselves will have to wait until UE5 is available.

     
    pharma, PSman1700 and VitaminB6 like this.
  9. chris1515

    Legend Regular

    Joined:
    Jul 24, 2005
    Messages:
    6,087
    Likes Received:
    6,309
    Location:
    Barcelona Spain
    First, Unreal 4.26 has some huge changes in terms of the I/O stack, and the engine source is available to licensees, so they can change the I/O stack themselves.
     
  10. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,320
    Likes Received:
    1,980
    Location:
    Maastricht, The Netherlands
    Yes, but it should be clear that the new I/O stack is there for next-gen purposes, so that UE4 games can use the benefits right away. I'm sure you don't get all possible benefits until you do some work, but the way UE4 is set up to be scalable and abstract away some parts, it can be updated transparently to some extent, and I'm sure that work has been underway for a year or more.
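    (Loosely, the kind of abstraction that makes a transparent swap possible. This is a generic C++ sketch, not Epic's actual interface; all the names here are invented:)

    [code]
    // Generic sketch of why an abstracted I/O layer can be swapped out
    // underneath existing game code. Not UE4's real API; names invented.
    #include <cstdint>
    #include <cstdio>
    #include <fstream>
    #include <functional>
    #include <iterator>
    #include <memory>
    #include <string>
    #include <vector>

    struct IFileBackend {
        virtual ~IFileBackend() = default;
        virtual void ReadAsync(const std::string& path,
                               std::function<void(std::vector<uint8_t>)> done) = 0;
    };

    // Plain blocking backend standing in for "old" file I/O.
    struct BlockingBackend : IFileBackend {
        void ReadAsync(const std::string& path,
                       std::function<void(std::vector<uint8_t>)> done) override {
            std::ifstream f(path, std::ios::binary);
            std::vector<uint8_t> bytes((std::istreambuf_iterator<char>(f)),
                                       std::istreambuf_iterator<char>());
            done(std::move(bytes));
        }
    };

    // Game code only sees the interface, so the engine can later drop in a
    // console-specific backend (hardware decompression, priority queues...)
    // without this call site changing at all.
    void LoadAsset(IFileBackend& io, const std::string& path) {
        io.ReadAsync(path, [&](std::vector<uint8_t> bytes) {
            std::printf("%s: %zu bytes\n", path.c_str(), bytes.size());
        });
    }

    int main() {
        std::unique_ptr<IFileBackend> io = std::make_unique<BlockingBackend>();
        LoadAsset(*io, "example.bin");  // hypothetical asset path
    }
    [/code]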
     
    PSman1700, VitaminB6 and BRiT like this.
  11. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,610
    Likes Received:
    7,574
    Assuming the rumors are true that the GDK wasn't in a good, possibly even usable, state prior to this summer, it'd be hard to show games running if developers only got a decent GDK partway through the summer.

    It'd be interesting if part of the delay for Cyberpunk 2077 was the seemingly rushed nature of MS' GDK.

    Regards,
    SB
     
    thicc_gaf likes this.
  12. Shompola

    Newcomer

    Joined:
    Nov 14, 2005
    Messages:
    186
    Likes Received:
    34
    If it's true that the GDK is not up to snuff yet, that gives me some hope... Then maybe even BC could get better than it already is? Ironing out those 50-60 fps games such as Sekiro would be sweet.
     
  13. JPT

    JPT
    Veteran

    Joined:
    Apr 15, 2007
    Messages:
    2,315
    Likes Received:
    784
    Location:
    Oslo, Norway
    How would the GDK improve BC?
     
    milk and BRiT like this.
  14. Insight

    Newcomer

    Joined:
    Sep 30, 2020
    Messages:
    94
    Likes Received:
    273
    It'll add extra sauce :0
     
    JPT likes this.
  15. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    3,677
    Likes Received:
    3,591
    Location:
    France
    If anything, PS5 seems to have the more "exotic" architecture, so the learning curve could be higher on PS5, but that's just supposition of course.
    It's said that devs are not even using the GE (Geometry Engine) like they could, and won't for at least a year.
     
    DSoup likes this.
  16. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,703
    Likes Received:
    10,842
    Location:
    London, UK
    The I/O system is supposedly transparent. Changing clock speeds are something most multiplatform developers will be used to, though; it's the norm on PCs and mobile devices.
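    (The standard coping mechanism looks something like this: drive a render scale from measured GPU frame time, so clock changes get absorbed like any other load variation. A minimal sketch with arbitrary numbers:)

    [code]
    // Dynamic resolution scaling driven by measured GPU frame time: if
    // clocks dip and frames get more expensive, the render scale backs
    // off toward the frame-time target. Constants are arbitrary.
    #include <algorithm>
    #include <cstdio>
    #include <initializer_list>

    double UpdateResolutionScale(double scale, double gpu_ms, double target_ms) {
        const double error = target_ms / gpu_ms;  // >1: headroom, <1: over budget
        scale *= 1.0 + 0.1 * (error - 1.0);       // damped correction
        return std::clamp(scale, 0.5, 1.0);       // never below 50% of native
    }

    int main() {
        double scale = 1.0;
        // Simulated frame times: cost rises (clock dip?), scale adapts.
        for (double gpu_ms : {16.0, 17.5, 18.2, 17.0, 16.4}) {
            scale = UpdateResolutionScale(scale, gpu_ms, 16.6);
            std::printf("gpu %.1f ms -> scale %.3f\n", gpu_ms, scale);
        }
    }
    [/code]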
     
  17. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    3,677
    Likes Received:
    3,591
    Location:
    France
    Great times ahead, and we still have to see a game approaching what we saw in the UE5 demo.
     
  18. Shompola

    Newcomer

    Joined:
    Nov 14, 2005
    Messages:
    186
    Likes Received:
    34
    I was assuming that whatever the BC team is doing in the software layer uses the GDK as well, including compilers and libraries? It could then utilize the GPU better than it already does.
     
  19. thicc_gaf

    Regular Newcomer

    Joined:
    Oct 9, 2020
    Messages:
    262
    Likes Received:
    203
    I dunno; the thing with PS5 is, a lot of its "exotic" features are either simply Sony's variants of certain RDNA 2 features, or things like cache scrubbers, which aren't all that exotic a concept in the wider space of computing, though maybe they're new for a games console (I'd have to do some research and see if older consoles had them, or some variation of the concept implemented at the hardware level). Even the SSD I/O isn't terribly exotic; there are a lot of mostly repurposed Zen 2 cores (most likely) for things like the Coherency Engines, and the technology at the end of the day is commonly understood; it's still interfacing with NAND chips, etc.
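    (A toy illustration of what a cache scrubber buys you, going by the publicly described behaviour of targeted invalidation; entirely schematic, the real hardware works per-line from the I/O unit with no shader stall:)

    [code]
    // Toy model: when the SSD overwrites a region, evict just the affected
    // cache lines instead of flushing the whole GPU cache. Schematic only.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_set>

    struct GpuCacheModel {
        std::unordered_set<uint64_t> lines;            // resident line tags
        static constexpr uint64_t kLine = 64;          // bytes per line

        void FlushAll() { lines.clear(); }             // the blunt alternative
        void Scrub(uint64_t base, uint64_t size) {     // targeted invalidate
            for (uint64_t a = base / kLine; a <= (base + size - 1) / kLine; ++a)
                lines.erase(a);
        }
    };

    int main() {
        GpuCacheModel cache;
        for (uint64_t a = 0; a < 1024; a += 64) cache.lines.insert(a / 64);
        cache.Scrub(256, 128);                         // SSD wrote 128 B at 256
        std::printf("lines left: %zu of 16\n", cache.lines.size());  // 14
    }
    [/code]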

    Even the Tempest Engine isn't too "exotic": a repurposed CU designed to act more like an SPE from Cell. I would say the vast majority of 3P devs (and certainly Sony's 1P devs) are familiar to a large extent with Cell and the SPEs after having worked with them for several years. There's always been some talk about the extent to which it could be used for "boosting" graphics rendering, but realistically how would that function in practice? There will always be some audio to work on, though I understand PS5 has a regular DSP in it too for more standard audio tasks. Even so, what could the Tempest Engine actually do for graphics when it is a stripped-down CU?
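    (For a flavour of the SPE-like workload in question: convolving a dry signal with an HRTF-style impulse response. A toy C++ sketch; real Tempest code would be vectorized and DMA-fed, this is just the shape of the math:)

    [code]
    // Direct-form convolution of a dry voice with a (tiny, made-up) HRTF
    // impulse response: the streaming multiply-accumulate workload that
    // SPE-like DSP units are built to chew through.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    std::vector<float> Convolve(const std::vector<float>& dry,
                                const std::vector<float>& ir) {
        std::vector<float> wet(dry.size() + ir.size() - 1, 0.0f);
        for (std::size_t n = 0; n < dry.size(); ++n)
            for (std::size_t k = 0; k < ir.size(); ++k)
                wet[n + k] += dry[n] * ir[k];
        return wet;
    }

    int main() {
        std::vector<float> voice = {1.0f, 0.5f, 0.25f, 0.125f};  // toy signal
        std::vector<float> hrtf  = {0.6f, 0.3f, 0.1f};           // toy IR
        for (float s : Convolve(voice, hrtf)) std::printf("%.4f ", s);
        std::printf("\n");
    }
    [/code]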

    From everything Sony have said about the SSD I/O handling a lot of the work for devs on its own (or automating a large part of that process), to the information (however scant) that PS5 devkits are a lot like supercharged PS4 devkits (using similar, but much expanded, APIs), I don't see where the large learning curve is. Then again, that is probably also because I'm not on the "GE has RDNA 3" train either xD. At least IMHO there are no logical grounds to conclude any specific RDNA 3 features are in the GE, as the speculation is incredibly barren. I think the majority of it has come from people who were expecting Sony to talk more about what features the system has after AMD's RDNA 2 presentation, or who were looking for confirmation of PS5 features in that presentation and completely missed the obvious clues (AMD SmartShift, which is the tech behind PS5's variable frequency; Smart Access Memory, meanwhile, is resizable BAR, underlying support for which was always technically there in the PCIe 4.0, and even PCIe 3.0, standard).

    OTOH Series X is already a bit tepid out of the gate because the SDKs have been running late in maturity. The "split" memory (not really split the way the XBO's or PS3's memory was, let alone older consoles') might present a bit of a learning curve, though not much of one; the slower pool should be used for the CPU and audio as intended, but if graphics assets can spill over into it, the contention for bandwidth, while probably nothing significant, would have to be managed well by the developer. While I wouldn't say saturating a wider GPU is necessarily difficult (AMD, Nvidia, Intel etc. are all going wider and wider, so there must be some benefits to workload scalability to encourage this, otherwise why waste the money?), there may be certain workloads that require more optimization to do so versus a narrower GPU that's clocked faster, like the PS5's. And within XvA there are things like the mip-blending hardware in the GPU for SFS, which carries its own calculated risk since it needs a miss in order to trigger its use; that has to be accounted for by the developer on their end, and will probably need its own learning curve to master (see the sketch below).
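    (The control flow of that miss-then-stream behaviour, very roughly; this is not the D3D12 sampler-feedback API, all the structures here are invented:)

    [code]
    // SFS idea in miniature: a sample requests a mip that isn't resident,
    // the system clamps to the best resident mip for this frame, and the
    // miss is logged so the fine mip can be streamed in for later frames.
    #include <cstdio>
    #include <queue>

    struct Texture {
        int finest_resident_mip;   // 0 = full res; higher = coarser
    };

    std::queue<int> g_stream_requests;  // mips to fetch off the SSD

    int SampleMip(Texture& t, int wanted_mip) {
        if (wanted_mip < t.finest_resident_mip) {
            g_stream_requests.push(wanted_mip);  // record the miss
            return t.finest_resident_mip;        // render with what we have
        }
        return wanted_mip;
    }

    int main() {
        Texture t{3};                                      // only mips 3+ resident
        std::printf("sampled mip %d\n", SampleMip(t, 1));  // miss: clamps to 3
        t.finest_resident_mip = 1;                         // pretend stream done
        std::printf("sampled mip %d\n", SampleMip(t, 1));  // hit: returns 1
    }
    [/code]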

    Basically I think PS5 captures well the relative ease of use (and dev-environment maturity, with room to grow in optimizations) of the PS1 or 360. Not so much the Series X; I wouldn't say it's anywhere near PS3 levels of challenge, but there's maybe not another system to directly compare it to. The PS2 wasn't necessarily easy to use, but whatever challenges it presented were overcome relatively quickly. And while I don't think the Series X is at as much of a disadvantage as the Sega Saturn was in its day, I do think there are a few parallels atm. For starters, like the Saturn, the Series X had a somewhat tepid visual presentation coming right off of Sony having a very strong one (Halo Infinite vs. the UE5 demo, caveats aside). Like the Saturn, the SDK environment for Series X has been comparatively slow to mature compared to Sony's, who seem to have had everything up and running quickly. And like the Saturn, we're seeing the Series X perform a bit under par in some 3P multiplats in this early period, in large part due to that second point, but also due to the potential factor that it's just a somewhat more difficult design to master. There are probably a few more parallels I could touch on, but those seem like the main ones IMO.

    The big difference between Sega in 1995 and Microsoft today, though, is that Microsoft actually has a wealth of money and resources to throw at resolving all of these problems, and they're just a much larger company with a stronger presence in the console market (in terms of financial power at least). While there's maybe been some confusing marketing around the Series systems (analogous to some of Sega's confusing Saturn marketing at the start of that system's Western life), I think they're ironing it out now. I guess that means Microsoft needs their own "Virtua Fighter 2" moment: something to show the system can actually do all of the things it's said it can do, and outperform whatever's on Sony's platform at the time, at least in some key visual metric. Natural instinct would be to assume that game is Halo Infinite, but realistically it could be BMI, Scorn, or Exo-Mecha (imo); it just depends on how much MS are helping those studios. It could also be their Series X port of FS2020, which hopefully should drop in H1 2021 (and with a bevy of extra content).

    I just don't want them to take too long before showing such a thing off, because Sony won't be waiting long to give people R&C: Rift Apart (which is the closest to a Pixar film I've seen any game look), or Horizon Forbidden West. Even GT7 is looking pretty nice graphically, though I think that's probably still coming to PS4 as well (nothing wrong with that; cross-gen isn't the death knell it used to be earlier in the year, didn't 'ya know ;) ).
     
  20. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,994
    Likes Received:
    2,927
    Location:
    France
    I doubt MS can do much about either the CPU or the GPU compilers.
     