Xbox One (Durango) Technical hardware investigation

Discussion in 'Console Technology' started by Love_In_Rio, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,525
    Likes Received:
    8,738
    Location:
    Cleveland
    Now they can unlock the special special sauce!
     
    temesgen and shredenvain like this.
  2. pMax

    Regular Newcomer

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    Cool, it sounds more like "thanks for being our OS updates beta tester for free so we save in QA" to me, but... who knows?
    (a.k.a. how can a game that runs in ring 3, under a hypervisor, break a console?)
     
  3. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,982
    Likes Received:
    5,799
    Location:
    London, UK
    Swings and roundabouts. It's an optional programme, and part of the goal of the programme is for users to help test new OS builds. While I think it's unlikely that anything hardware-destroying would slip through Microsoft's QA, it's a little reassurance for those in the programme that, should something bad happen, Microsoft have your back.
     
  4. pMax

    Regular Newcomer

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    Then I'd say it is needed. It may happen (not that the hardware breaks; software bricks are way more probable than that).
     
    shredenvain likes this.
  5. rekator

    Regular

    Joined:
    Dec 21, 2006
    Messages:
    779
    Likes Received:
    20
    Location:
    France
    Seems to be a very, very Red Hot Chili Peppers sauce… It can break a console… Wow!!! Maybe it's Flea who codes, so power-slap the One!! :mad::cool:
     
  6. mosen

    Regular

    Joined:
    Mar 30, 2013
    Messages:
    452
    Likes Received:
    152
    I found some patents that may be related to XB1:

    INSTRUCTION SET SPECIFIC EXECUTION ISOLATION


    GPU SELF THROTTLING

    CPU-GPU PARALLELIZATION

     
    temesgen, shredenvain and Starx like this.
  7. Starx

    Regular Newcomer

    Joined:
    Sep 29, 2013
    Messages:
    294
    Likes Received:
    148
  8. powdercore

    Newcomer

    Joined:
    Aug 6, 2014
    Messages:
    41
    Likes Received:
    29
    What's the significance of these details? I find these sentences most interesting.

    "The graphics core contains two graphics command and two compute command processors. Each command processor supports 16 work streams. The two geometry primitive engines, 12 compute units, and four render backend depth and color engines in the graphics core support two independent graphics contexts."

    Does this mean the GPU can do two graphics/compute at the same time? How does this differ from the current AMD GCN APU?
     
    temesgen and mosen like this.
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,742
    Likes Received:
    11,222
    Location:
    Under my bridge
    It's the same as the current AMD GCN APU. There's nothing different about XB1's core architecture. There are a few ancillary additions/tweaks, but the basic operation, the job scheduling and execution, is 100% vanilla GCN as per PC components. (For comparison, PS4 is exactly the same save for more queues to stack jobs for selection and maybe a couple of other little variations, of no importance to this thread.)
     
  10. Starx

    Regular Newcomer

    Joined:
    Sep 29, 2013
    Messages:
    294
    Likes Received:
    148
    A summary of the article :


    Main SoC


    • Unified, but not uniform, main memory
    • Universal host-guest virtual memory management
    • High bandwidth CPU cache coherency
    • Power islands matching features and performance to active tasks


    Main Memory

    • MMU hardware maps guest virtual addresses to guest physical addresses to physical addresses for virtualization and security.
    • The implementation sizes caching of fully translated page addresses and uses large pages where appropriate to avoid significant performance impact from the two-dimensional translation.
    • System software manages physical memory allocation.
    • System software and hardware keep page tables synchronized so that CPU, GPU, and other processors can share memory, pass pointers rather than copying data, and a linear data structure in a GPU or CPU virtual space can have physical pages scattered in DRAM and SRAM.
    • The GPU graphics core and several specialized processors share the GPU MMU, which supports 16 virtual spaces.
    • PCIe input and output and audio processors share the IO MMU, which supports virtual spaces for each PCI bus/device/function.
    • Each CPU core has its own MMU (CPU access to SRAM maps through a CPU MMU and the GPU MMU).
    • The design provides 32 GB/second peak DRAM access with hardware-maintained CPU cache coherency for data shared by the CPU, GPU, and other processors.
    • Hardware-maintained coherency improves performance and software reliability.
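
    The two-dimensional translation described above can be sketched in a few lines. This is a toy model, not Microsoft's implementation: the page size, table layout, and numbers below are all illustrative. The point is the composition of two lookups, guest virtual → guest physical → system physical, which is why the summary notes that the hardware caches fully translated addresses and favors large pages.

    ```python
    PAGE_SIZE = 4096  # illustrative 4 KiB pages

    def translate(addr, page_table):
        """Map one address through a page table {virtual_page: physical_page}."""
        page, offset = divmod(addr, PAGE_SIZE)
        if page not in page_table:
            raise MemoryError(f"page fault at {hex(addr)}")
        return page_table[page] * PAGE_SIZE + offset

    def two_dimensional_translate(guest_virtual, guest_table, host_table):
        """Guest virtual -> guest physical -> system physical."""
        guest_physical = translate(guest_virtual, guest_table)
        return translate(guest_physical, host_table)

    # Toy tables: guest page 1 lives at guest-physical page 7, which the
    # hypervisor has placed at system-physical page 42.
    guest_table = {1: 7}
    host_table = {7: 42}

    assert two_dimensional_translate(PAGE_SIZE + 0x10, guest_table, host_table) \
        == 42 * PAGE_SIZE + 0x10
    ```

    A real walk does this per level of a multi-level page table in each dimension, which is exactly the cost the translation caches are there to avoid.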


    CPU

    • The CPU contains minor modifications from earlier Jaguar implementations to support two clusters and increased CPU cache coherent bandwidth.


    GPU

    • The GPU contains AMD graphics technology supporting a customized version of Microsoft DirectX graphics features.
    • Hardware and software customizations provide more direct access to hardware resources than standard DirectX.
    • They reduce CPU overhead to manage graphics activity and combined CPU and GPU processing. Kinect makes extensive use of combined CPU-GPU computation.
    • The graphics core contains two graphics command and two compute command processors. Each command processor supports 16 work streams.
    • The two geometry primitive engines, 12 compute units, and four render backend depth and color engines in the graphics core support two independent graphics contexts.
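
    The front-end layout in the two bullets above can be sketched as data structures. This is my own illustrative model (the class and queue names are not AMD's): two graphics command processors and two compute command processors, each exposing 16 work streams that software submits command buffers to.

    ```python
    from collections import deque

    class CommandProcessor:
        """Toy model of a command processor with N independent work streams."""
        def __init__(self, name, num_streams=16):
            self.name = name
            self.streams = [deque() for _ in range(num_streams)]

        def submit(self, stream_id, command_buffer):
            self.streams[stream_id].append(command_buffer)

        def pending(self):
            return sum(len(s) for s in self.streams)

    # 2 graphics + 2 compute command processors, 16 work streams each
    front_end = (
        [CommandProcessor(f"gfx{i}") for i in range(2)]
        + [CommandProcessor(f"compute{i}") for i in range(2)]
    )

    front_end[0].submit(0, "draw scene")    # one graphics context on gfx0
    front_end[1].submit(0, "draw overlay")  # the other context on gfx1
    assert sum(cp.pending() for cp in front_end) == 2
    ```

    The "two independent graphics contexts" claim is then about the shared back end: the geometry engines, CUs, and render backends can make progress on work from both graphics command processors without a full state switch.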


    Independent GPU Processors and Functions


    • Eight independent processors and functions share the GPU MMU. These engines support applications and system services. They augment GPU and CPU processing, and are more power-performance efficient at their tasks.




    Audio Processors

    • The processors support applications and system services with multiple work queues. Collectively they would require two CPU cores to match their audio processing capability.
    • The four DSP cores are Tensilica-based designs incorporating standard and specialized instructions. Two include single precision vector floating point totaling 15.4 billion operations per second.
     
    temesgen and mosen like this.
  11. mosen

    Regular

    Joined:
    Mar 30, 2013
    Messages:
    452
    Likes Received:
    152
    Are you sure? I thought at least supporting "two independent graphics contexts" is new. Or it's not?
     
  12. Starx

    Regular Newcomer

    Joined:
    Sep 29, 2013
    Messages:
    294
    Likes Received:
    148
    IMO the next generation of AMD APUs will take advantage of this feature.


    http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit#AMD_Heterogeneous_System_Architecture
     
  13. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,906
    Likes Received:
    6,192
    IIRC, much earlier in this thread we did consider it new. But the second context, IIRC, was deemed for use as the GUI/third pane; I thought this was in reference to hitting the home button: it pops out and you see the Xbox home, but your game keeps running at a different resolution, with the game GUI running separately from the Xbox home?
     
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,742
    Likes Received:
    11,222
    Location:
    Under my bridge
    Nope. ;)
    As PS4 has two as well, I understand it's a standard GCN feature, although perhaps one that hasn't made it into the PC GPUs yet?
     
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,742
    Likes Received:
    11,222
    Location:
    Under my bridge
    There's much talk of the second graphics command processor in this DX12 thread.
     
    mosen likes this.
  16. mosen

    Regular

    Joined:
    Mar 30, 2013
    Messages:
    452
    Likes Received:
    152
    It was in the DX12 thread (thanks to Shifty), but at the time I was insisting that XB1's approach is different from PS4's. On PS4 there are 2 graphics command processors (GCPs), but one of them is exclusively for the system (HP) with reduced features, and the other one is for games with common features.

    On XB1 there are two customized GCPs with similar features that allow the hardware to render the game at high priority and the system at low priority. There is no HP GCP or LP GCP on XB1; it's only the render priority.
    http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

    From the article:

    If it was like PS4, they would only need to give some CUs or some HW allocations to the system (as this anonymous dev mentioned). But they changed many parts of the GPU to support 2 graphics contexts, so it seems to me that even games (at some point) could use this feature. However, I don't know what its benefit would be. :|

    Before, it was about the number of GCPs (regardless of how they differ in abilities/responsibilities), but now we know that "two geometry primitive engines, 12 compute units, and four render backend depth and color engines in the graphics core support two independent graphics contexts". So I think it should be more than a standard GCN feature (I'm not sure, but it seems new to me). Also, MS insisted before that they made some customizations to the graphics cores/GCN (see this slide; they said DX11.1+ means that there are some features above DX11.1 in XB1 HW), and they did it again in this article. Maybe they are referring to some DX12 features? :D
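
    The priority scheme described above, two equally capable GCPs whose contexts differ only in submission priority, can be sketched with a simple priority queue. This is a hypothetical model (names and the scheduling policy are mine, not documented XB1 behavior): the front end always drains high-priority game work before low-priority system work.

    ```python
    import heapq

    HIGH, LOW = 0, 1  # smaller number = higher priority

    class PriorityFrontEnd:
        """Toy arbiter over two contexts that differ only in priority."""
        def __init__(self):
            self._heap = []
            self._seq = 0  # preserves FIFO order within one priority level

        def submit(self, priority, work):
            heapq.heappush(self._heap, (priority, self._seq, work))
            self._seq += 1

        def next_work(self):
            return heapq.heappop(self._heap)[2]

    fe = PriorityFrontEnd()
    fe.submit(LOW, "compose system UI")
    fe.submit(HIGH, "render game frame")
    assert fe.next_work() == "render game frame"  # game work drains first
    ```

    Contrast this with a split where the system GCP is a reduced-feature unit: here both contexts pass through identical hardware, and only the arbitration order distinguishes game from system rendering.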
     
    #8096 mosen, Nov 28, 2014
    Last edited: Nov 29, 2014
    Starx likes this.
  17. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,019
    Likes Received:
    1,025
    When did 2 command processors become vanilla on GCN?
     
  18. temesgen

    Veteran Regular

    Joined:
    Jan 1, 2007
    Messages:
    1,536
    Likes Received:
    327
    They are doing a lot of things right ATM though. This is good not only for the brand, but it should also help keep Sony from returning to arrogance. As consumers, we all win regardless of which platform we choose.
     
    shredenvain likes this.
  19. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,544
    Likes Received:
    10
    Location:
    In the land of the drop bears
    My understanding was that GCN itself was DX11.1+, but there's no real point talking about that much in the PC space because the abstraction layers matter. On a console, though, that's a different story.
     
  20. mosen

    Regular

    Joined:
    Mar 30, 2013
    Messages:
    452
    Likes Received:
    152
    GCN may have more features than what DX11/11.1/11.2 requires, but the "+" in "DX11.1+" refers to additional features in XB1 HW/SW compared to standard DX11/11.1/11.2, according to John Sell at the Hot Chips conference and his recent article:

    Also, it may not be related to technical hardware discussions in this thread (since it won't add any technical knowledge to our discussion), but I think Spencer once said that there are some DX12 features in XB1 (he didn't want to talk about them, but somehow confirmed their existence).


    (listen from 28:00)
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.