Xbox Series X [XBSX] [Release November 10 2020]

Discussion in 'Console Industry' started by Megadrive1988, Dec 13, 2019.

  1. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,032
    Likes Received:
    3,428
Thanks, that could be it.
Don't suppose you remember what the tile sizes are now?

I remember seeing what I think was a PRT plane demo last gen; it was impressive and looked so promising, much like this demo.
Hence why I hope version 2 is actually "fixed".
     
    #2621 Jay, Apr 23, 2021
    Last edited: Apr 23, 2021
  2. iroboto

    iroboto Daft Funk
    Legend Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,833
    Likes Received:
    18,633
    Location:
    The North
No change, still 64K IIRC. @DmitryKo would know best offhand.
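For context, that matches the fixed 64 KB tile size used by D3D12 tiled resources, where a tile's texel footprint depends only on the texel size. A rough sketch of the arithmetic in Python (the near-square power-of-two split is a simplification for illustration; the API defines the exact standard tile shapes per format):

```python
import math

TILE_BYTES = 64 * 1024  # D3D12 tiled resources use fixed 64 KB tiles

def tile_dims(bytes_per_texel):
    """Near-square power-of-two (width, height) of one 64 KB tile."""
    texels = int(TILE_BYTES / bytes_per_texel)
    width = 1 << math.ceil(math.log2(math.sqrt(texels)))
    return width, texels // width

# 32bpp RGBA8: 4 bytes/texel
print(tile_dims(4))    # (128, 128)
# BC1-compressed: 0.5 bytes/texel
print(tile_dims(0.5))  # (512, 256)
```

So an RGBA8 texture gets 128x128-texel tiles, while block-compressed formats fit far more texels into the same 64 KB.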
     
    Jay likes this.
  3. Ronaldo8

    Regular

    Joined:
    May 18, 2020
    Messages:
    292
    Likes Received:
    358
    What the MS engineer is saying in the video backs up what Andrew Goossen, one of the system architects, said in the DF deep dive. Goossen also justified the split memory bus by talking about issues encountered with signal integrity during testing of GDDR6:

    "it sounds like a somewhat complex situation, especially when Microsoft itself has already delivered a more traditional, wider memory interface in Xbox One X - but the notion of working with much faster GDDR6 memory presented some challenges. "When we talked to the system team there were a lot of issues around the complexity of signal integrity and what-not," explains Goossen. "As you know, with the Xbox One X, we went with the 384[-bit interface] but at these incredible speeds - 14gbps with the GDDR6 - we've pushed as hard as we could and we felt that 320 was a good compromise in terms of achieving as high performance as we could while at the same time building the system that would actually work and we could actually ship."

    https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

    Well, we just heard from said system team directly.
     
    VitaminB6 and Jay like this.
  4. Arnold Beckenbauer

    Veteran Subscriber

    Joined:
    Oct 11, 2006
    Messages:
    1,756
    Likes Received:
    722
    Location:
    Germany
    Pete likes this.
  5. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    2,756
    Likes Received:
    2,206
    An APU without the GPU part? Wouldn't that just be a CPU?
     
    JPT likes this.
  6. I can't open anything intelligible to me from that link, but couldn't it be a Series S SoC instead?
    All they had to do was place 2*16Gbit chips in clamshell for each channel to reach 16GB GDDR6, and for a GPU-less SoC that's already more than enough bandwidth for the CPU cores alone.
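As a sketch of that clamshell arithmetic (the 128-bit bus and 14 Gbps rate are assumptions for a Series S-like part, not confirmed figures for this SoC): two x16 chips share each 32-bit channel, doubling capacity while leaving bandwidth unchanged.

```python
# Clamshell sketch: two x16 chips share each 32-bit channel, doubling
# capacity without widening the bus. The 128-bit bus and 14 Gbps rate
# are assumed figures for illustration.
CHIP_GBIT = 16                          # 16Gbit = 2GB GDDR6 chips
bus_bits = 128
channels = bus_bits // 32               # 4 x 32-bit channels
chips = channels * 2                    # clamshell: 2 chips per channel
capacity_gb = chips * CHIP_GBIT // 8    # 16 GB total
bandwidth_gbs = bus_bits * 14.0 / 8     # same pins, same rate: 224 GB/s
print(capacity_gb, bandwidth_gbs)       # 16 224.0
```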

    But the GPU-less part is intriguing. For all I know the Series SoCs only have 2x PCIe 4.0 lanes for storage, so it's either for a headless datacenter or it has only 2 lanes of PCIe 4.0 for a dGPU and then any local storage would need to use USB.
     
  7. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
AMD has already abandoned the APU term anyway. Regardless, it obviously has the GPU; it's just disabled for one reason or another.

One of the marketing images confirms a 10-chip memory config around the SoC, just like the XSX.
     
    Deleted member 13524 likes this.
  8. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    13,999
    Likes Received:
    3,720
    I want this translated to something that a non-techie can understand :p
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
Remember when Egon said never to cross the streams? The streams started crossing when they tried a wide unified memory configuration instead of the split one they ended up using.
I think.
     
  10. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,511
    Likes Received:
    24,411
    This still is unified memory.
     
    RagnarokFF likes this.
  11. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    2,756
    Likes Received:
    2,206
    It would be bad.
     
The bolded part?
GDDR6's tight electrical signaling requirements (significantly tighter than GDDR5's) make it harder / more expensive to build a PCB with many memory channels. That's why they reduced the bus width from the 384-bit (12 x 32-bit channels) GDDR5 on the One X to 320-bit (10 x 32-bit channels) GDDR6 on the Series X.
But they still gained plenty of raw bandwidth anyway thanks to the latter's higher clock speeds. It wasn't really a tradeoff.
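As a sanity check on those figures, peak bandwidth is just channel count x channel width x per-pin data rate / 8. A quick sketch, using 6.8 Gbps GDDR5 for the One X and 14 Gbps GDDR6 for the Series X, per the published specs:

```python
def peak_bandwidth_gbs(channels, channel_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: total pins * per-pin rate / 8 bits."""
    return channels * channel_bits * gbps_per_pin / 8

one_x    = peak_bandwidth_gbs(12, 32, 6.8)   # 384-bit GDDR5 @ 6.8 Gbps
series_x = peak_bandwidth_gbs(10, 32, 14.0)  # 320-bit GDDR6 @ 14 Gbps
print(one_x, series_x)  # 326.4 560.0
```

Despite dropping two channels, the Series X's peak bandwidth is over 70% higher than the One X's.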
     
    Ronaldo8, Pete, dobwal and 1 other person like this.
  13. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    Except of course it was a trade-off: ‘that would actually work and we could actually ship’. They’d much rather have doubled the bandwidth in all directions if they could have afforded it.
     
    DSoup, Silent_Buddha and mr magoo like this.
It's a trade-off if you actually trade something off.
Compared to the One X, Microsoft didn't trade off bandwidth with the jump to GDDR6, nor RAM amount, nor latency. It was a win-win situation, except for cost, obviously.
     
  15. Tkumpathenurpahl

    Tkumpathenurpahl Oil Monsieur Geezer
    Veteran

    Joined:
    Apr 3, 2016
    Messages:
    1,910
    Likes Received:
    1,929
In a sense, though, they did. An extra 4GB of memory would've pretty much guaranteed the XSX thrashed the PS5 for the entire generation. As it is, they have a system that's going to do increasingly better than the PS5 as the generation goes on, but still in a way that might not be all that visible to a layman.

    I'm glad that Microsoft have gone with a two tier launch, and I hope the XSS is successful because I want it to demonstrate that a two tier launch has value. But I would've much rather seen a less compromised duo of a 40CU XSS with 12GB GDDR6, and a 60CU XSX with 20GB GDDR6. I think that would've put Sony in a much more difficult position.

    But, of course, hindsight is 20/20 ¯\_(ツ)_/¯
     
    PSman1700 likes this.
  16. Rikimaru

    Veteran

    Joined:
    Mar 18, 2015
    Messages:
    1,060
    Likes Received:
    426
While a two-tier system is a brilliant idea, it hinders VR a lot. I'm glad the PS5 did not go for it.
     
  17. Allandor

    Regular

    Joined:
    Oct 6, 2013
    Messages:
    842
    Likes Received:
    879
Why?
It works on PC as well, so why should it make things worse for VR? (Except for the visual quality, but even that would still be better than what the PS4 can deliver.)
I'm not really a fan of VR so far. The headset itself disturbs my experience, and simulator sickness is the other problem I have.
Also, VR demands 90-120 FPS to work well, so much of the extra power these consoles have gets cut in half. The PS4 showed that VR was acceptable to many people even on a low-end console, but in the end it was just a gimmick for most, who bought it once, had their fun, and then almost never touched it again. Just like the EyeToy and Kinect (v1) before it, only less successful. It's still a niche market and no longer gets as much attention as before. Sony could have prevented that if PSVR had been directly compatible with PS5 games, but they didn't want that, so you can imagine it wasn't a financial success so far; they seem to want to give it another try in a few years with PSVR2. They think this might be the future (like 3D TVs...), but so far the market hasn't fully accepted it.
     
This isn't a trade-off from adopting GDDR6. They could have had an extra 4GB with the current 10-channel arrangement, simply by using 16Gbit chips on all channels.
It would also prevent the memory contention issues the platform is apparently having, as all memory would be accessed at the same 560GB/s bandwidth.

Not getting 20GB of GDDR6 was a cost decision, not an architectural limitation. I doubt it's a supply limitation, considering the Series S and the PS5 are using plenty of 16Gbit chips.
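A small sketch of how the mixed chip capacities produce the two-speed arrangement. The chip counts and 14 Gbps rate are the publicly reported Series X figures; `describe` is a hypothetical helper for illustration, and the second call models the all-16Gbit configuration discussed above:

```python
PER_CHANNEL_GBS = 32 * 14.0 / 8  # one 32-bit channel of 14 Gbps GDDR6 = 56 GB/s

def describe(chips_gb):
    """chips_gb: capacity in GB of the chip on each 32-bit channel.

    Returns (total GB, fast GB, fast GB/s, slow GB, slow GB/s).
    """
    total = sum(chips_gb)
    fast_gb = len(chips_gb) * min(chips_gb)   # striped across every channel
    slow_gb = total - fast_gb                 # lives only on the larger chips
    fast_bw = len(chips_gb) * PER_CHANNEL_GBS
    slow_bw = sum(1 for c in chips_gb if c > min(chips_gb)) * PER_CHANNEL_GBS
    return total, fast_gb, fast_bw, slow_gb, slow_bw

# Shipping Series X: six 2GB (16Gbit) + four 1GB (8Gbit) chips
print(describe([2] * 6 + [1] * 4))   # (16, 10, 560.0, 6, 336.0)
# Hypothetical: 16Gbit chips on all ten channels
print(describe([2] * 10))            # (20, 20, 560.0, 0, 0.0)
```

With uniform chips, every byte stripes across all ten channels, so the slow region disappears entirely.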


I have yet to see a single developer making such a statement. I've seen more than enough developers stating that some engines will get more out of the I/O and the faster+narrower architecture on one console, and more out of compute on the other. None has specifically stated there's a clear long-term winner.
Feel free to provide examples for your claims, though.
     
  19. Allandor

    Regular

    Joined:
    Oct 6, 2013
    Messages:
    842
    Likes Received:
    879
Exactly. They have also included technologies to save memory and bandwidth, so limiting the GPU to "just" 10GB of fast memory wasn't really the problem in theory. There is still plenty of data that needs to be in memory but isn't accessed that often (like sound data, world data, AI, ...). The only catch is that it isn't really a split pool, yet in practice it behaves like one: memory still has the same latency everywhere, but reading from the "extra" memory is just a bit slower.
The only problem with the "10GB fast memory" right now is that the new concepts must be adopted to actually save memory and bandwidth. Once that is done, the memory and bandwidth should be used much, much more efficiently.
As for the Series S, I look at it like having a PC with a GTX 1060. You can play with it, games are fun, they just don't look that good.

Once the new technologies are adopted, I guess consoles can still look better (except for RT) than PC games running on "newer" hardware, just because the PC is not ready to adopt the new technologies, and it will still take some time until everyone has an NVMe SSD in their gaming system. So developers still need to use much more memory on the PC side to compensate for this.
     
  20. Rikimaru

    Veteran

    Joined:
    Mar 18, 2015
    Messages:
    1,060
    Likes Received:
    426
We already have games which render as low as 720p on the Series S. The S is too weak for next-gen VR.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.