Xbox One (Durango) Technical hardware investigation

Discussion in 'Console Technology' started by Love_In_Rio, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. Pozer

    Regular

    Joined:
    Feb 9, 2005
    Messages:
    664
    Likes Received:
    6
    Location:
    Ohio
    It's been floating around these forums since at least February. Guilty as charged, I'm afraid. http://forum.beyond3d.com/showpost.php?p=1714023&postcount=2955
     
  2. grndzro

    Newcomer

    Joined:
    Jul 11, 2013
    Messages:
    45
    Likes Received:
    0
    I think the best bet for a RAM upgrade would be to replace the RAM in the Xbox with higher-density chips and hope the firmware supports the upgrade.

    Because odds are it's the OS reserve that is specified, not the RAM the games use.

    I'm sure this console season will have unprecedented box modding due to the standardized nature of the hardware.
     
  3. damienw

    Regular

    Joined:
    Sep 29, 2008
    Messages:
    513
    Likes Received:
    61
    Location:
    LA
    Did you actually read the articles? Because they completely contradict each other and any argument for 12 GB.

    Link to Extremetech article

    Link to Examiner article
     
  4. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    Both articles say that dev kits have 12GB of RAM. And, if true, that means 12GB is (technically) feasible, not impossible at all.

    EDIT: This is interesting (from the examiner article):

     
    #5404 XpiderMX, Aug 7, 2013
    Last edited by a moderator: Aug 7, 2013
  5. DrJay24

    Veteran

    Joined:
    May 16, 2008
    Messages:
    3,894
    Likes Received:
    634
    Location:
    Internet
    The original dev kits (PCs) had 12GB, which was most likely a triple-channel configuration like 3 × 4GB DIMMs. There is no info on dev kits based on the final XB1 hardware.
     
  6. BoardBonobo

    BoardBonobo My hat is white(ish)!
    Veteran

    Joined:
    May 30, 2002
    Messages:
    3,605
    Likes Received:
    541
    Location:
    SurfMonkey's Cluster...
    Apparently the speed upgrade to the GPU was not a hardware one but a change to the OS, similar to the WiiU I guess. Does this mean it is just an overclock, not an actual upgrade? And if you can do that through the OS, then how long before someone hacks it to push the envelope?

    Source
     
  7. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    On most of these devices, clocks are based on some multiple of a base clock, selected in the firmware or BIOS.
    Depending on the device, various components will have different clock domains, so the GPU clock may or may not be tied to the ESRAM clock or the CPU clock.
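    The base-clock-times-multiplier scheme described above can be sketched as follows. The base clock, domain names, and multiplier values here are hypothetical, invented purely for illustration; the real firmware tables are not public.

```python
# Clocks derived from a common reference clock via per-domain multipliers,
# as on many SoCs where firmware selects the multipliers at boot.
BASE_CLOCK_MHZ = 100  # hypothetical reference clock

# Hypothetical clock domains; in this sketch the ESRAM is tied to the
# GPU domain, but as noted above it may or may not be on real hardware.
MULTIPLIERS = {
    "cpu": 17.5,    # -> 1750 MHz
    "gpu": 8.53,    # -> 853 MHz
    "esram": 8.53,  # same multiplier as the GPU domain here
}

def effective_clock_mhz(domain: str) -> float:
    """Effective clock = base clock x firmware-selected multiplier."""
    return BASE_CLOCK_MHZ * MULTIPLIERS[domain]
```

    Changing a clock in this model is just writing a new multiplier into the firmware table; no hardware change is involved, which is why a firmware or OS update alone can raise a clock.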
     
  8. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,763
    Likes Received:
    280
    Location:
    In the land of the drop bears
    Someone correct me if I'm wrong, but I'm pretty sure that clocks on at least any modern system are tied to software (BIOS/firmware, etc.) and not to physical hardware. I can imagine that dynamic clocking could actually be useful for them to a degree, to keep sound and heat down in situations where games are not open.
     
  9. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Yes, of course. It's the same as how you'd overclock the CPU on your PC: you go into the BIOS.

    I think that's what the MS person meant. They are somewhat correct, but it's more like a firmware change; they are just simplifying for a mass audience and/or don't fully understand it themselves.

    Of course a chip's clock speed is not some sort of physical property it acquires as it rolls off the line...
     
  10. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Clocks can be entirely controlled by the OS layer if need be; the only elements set at the hardware level (controlled by the PCB) are default "boot" voltages and speeds, which may not necessarily relate to any clocks that you would think of as running clocks.

    Our discrete GPUs actually have a microcontroller on them with arbitration code that dictates what clock/voltage state the chip should be operating at, dependent on a number of parameters: if the activity counters on the chip are high it will push to a high clock state for peak performance, and if activity is low it will drop to a lower clock/voltage to save power. Alternatively, if it calculates that activity is hitting a power threshold, it will modulate between states to keep power in check. The microcode to do this is loaded into the GPU from a table in the BIOS when the driver is loaded by the OS, but those limits can be further controlled by software in the OS.

    So when they say "the GPU is clocked higher" but also "we've not changed any hardware", these don't contradict each other; the likelihood is that whatever part of the system dictated the max clock/voltage state of the GPU portion (be that firmware or OS) has simply been updated.
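    The arbitration logic described above can be sketched roughly as follows. This is not AMD's actual microcode; the DVFS states, activity thresholds, and power limit are invented for illustration.

```python
# Hypothetical (clock MHz, voltage V, power W) DVFS states, low to high.
STATES = [(300, 0.85, 30), (550, 0.95, 60), (853, 1.05, 95)]
POWER_LIMIT_W = 90  # hypothetical board power threshold

def arbitrate(activity: float, measured_power_w: float, current: int) -> int:
    """Pick the next DVFS state index.

    activity: 0.0-1.0 utilisation derived from on-chip activity counters.
    measured_power_w: estimated power draw in the current state.
    current: index of the state currently in use.
    """
    # Over the power limit: step down a state to keep power in check.
    if measured_power_w > POWER_LIMIT_W and current > 0:
        return current - 1
    # High activity: push toward a higher clock state for peak performance.
    if activity > 0.8:
        return min(current + 1, len(STATES) - 1)
    # Low activity: drop to a lower clock/voltage state to save power.
    if activity < 0.3:
        return max(current - 1, 0)
    return current
```

    Called periodically, this oscillates between adjacent states when a busy workload keeps bumping into the power limit, which is the "modulate between states" behaviour described above.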
     
  11. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Yeah, there are utilities to overclock the CPU/GPU from within Windows nowadays on PC.

    I guess it doesn't matter; the point is more that parameters get loaded at boot, whether by the BIOS or the OS.
     
  12. Bigus Dickus

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    943
    Likes Received:
    16
    There have been tools to access GPU clocks from within Windows for nearly 15 years that I can recall, perhaps longer still?
     
  13. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
    Likes Received:
    0
    Out of interest and totally unrelated to any console : is it possible to disable/enable CUs via software?
     
  14. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    I'm not Dave, but you used to be able to buy certain GPUs and unlock more execution units (they weren't called CUs back then) by flashing them with certain BIOSes, IIRC. The X800 was one back in the day, which would suggest it's possible.

    Now I think they laser-cut any units they don't want you to use, so they're basically hardware-blocked on PC. But that's to prevent people gaming the system; I'm not sure if they'd do that on a console.
     
  15. GrimThorne

    Newcomer

    Joined:
    Mar 7, 2013
    Messages:
    221
    Likes Received:
    31
    You mean developers actually hacking the X1, pushing clock speeds without Microsoft's permission, and possibly losing their license to develop for the platform? You really think they'd take a risk like that?

    If anything, that article is yet another example of the kind of site covering gaming and gaming technology these days. This Adam Barnes doesn't seem to have a single clue about how upclocking on a modern system occurs. Hmmmm.
     
  16. GrimThorne

    Newcomer

    Joined:
    Mar 7, 2013
    Messages:
    221
    Likes Received:
    31
    Well, didn't Sony disable an SPE on the Cell?
     
  17. HeLL

    Newcomer

    Joined:
    Apr 1, 2013
    Messages:
    7
    Likes Received:
    0
    Looks like AMD will unveil Hawaii/Volcanic Islands tech in late September, launching in Q4.

    http://semiaccurate.com/2013/08/07/amd-to-launch-hawaii-in-hawaii/

    A little quote:


    I'm posting this here because I'm wondering if it's possible that Durango (or Orbis) is fully or partially based on the new tech, not Bonaire or Pitcairn, just like Xenos had unified shaders more than six months before the equivalent PC product launched. At least better HSA integration or efficiency improvements?

    By Q4, the AMD 7970 will be two years old.
     
  18. BoardBonobo

    BoardBonobo My hat is white(ish)!
    Veteran

    Joined:
    May 30, 2002
    Messages:
    3,605
    Likes Received:
    541
    Location:
    SurfMonkey's Cluster...
    Not developers; regular Joe hackers who like to tinker with hardware. Or is it more likely that MS will lock that feature down in production?

    I get the way in which CPUs/GPUs are upclocked, but there were a few people talking about hardware respins etc., and this kind of knocks that on the head.
     
  19. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
    Likes Received:
    0
  20. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,033
    Likes Received:
    3,428
    To be fair, it doesn't actually knock those on the head.
    The respins may have been what got the GPU from 800 to 853 MHz.

    Setting it in 'software' just means they could settle on the final clock speed once all the validation, TDP, yield results, etc. had been taken into account.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.