Xbox One (Durango) Technical hardware investigation

Discussion in 'Console Technology' started by Love_In_Rio, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. warb

    Veteran

    Joined:
    Sep 18, 2006
    Messages:
    1,057
    Location:
    UK
    More is always better, if it costs nothing to run and produce.

    You could gradually fill as much RAM as you can fit into a system and use the excess as a RAM drive for orders of magnitude faster access, over HDD/BD.
     
  2. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,098
    Location:
    Somewhere over the ocean
using the same not-fast-enough bus?

    NFE-bus is a nice name
     
  3. warb

    Veteran

    Joined:
    Sep 18, 2006
    Messages:
    1,057
    Location:
    UK
    Which bus is not fast enough, and for what?
     
  4. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,098
    Location:
    Somewhere over the ocean
    the memory bus
    if you take out the os reserved memory / bw, and then the bw needed for a buffer, you get very little of that ~70GB/s on the NFE-bus
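A back-of-envelope version of that argument, using purely assumed numbers (the reserve fraction and buffer-traffic figures below are guesses for illustration, not confirmed specs):

```python
# Rough DDR3 bandwidth budget for a hypothetical Durango-like setup.
# All figures are assumptions for illustration, not confirmed specs.
PEAK_DDR3_GBPS = 68.0      # ~256-bit bus at 2133 MT/s
os_reserve_frac = 0.10     # guess: share of bandwidth held back for the OS
framebuffer_gbps = 15.0    # guess: display/composition buffer traffic

# Subtract the OS share, then the buffer traffic, from the peak figure.
available = PEAK_DDR3_GBPS * (1 - os_reserve_frac) - framebuffer_gbps
print(f"Bandwidth left for game workloads: ~{available:.1f} GB/s")
```

With those guesses roughly a third of the peak figure is already spoken for, which is the shape of the complaint even if the exact numbers differ.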
     
  5. warb

    Veteran

    Joined:
    Sep 18, 2006
    Messages:
    1,057
    Location:
    UK
    There's eSRAM to make that your FE-bus.

    ~100MB/s from the HDD versus 70GB/s. If whatever you wanted to load was already in main memory... But 8GB is what there is, with some of a 3GB reserve holding apps for the fast switching.
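The gap being pointed at here can be put in numbers; a rough sketch using the ~100MB/s and ~70GB/s figures from the post (the 2 GB asset size is an arbitrary example):

```python
# Order-of-magnitude load-time comparison: streaming 2 GB of assets
# from HDD versus touching data already resident in main memory.
# Illustrative figures only, taken from the discussion above.
ASSET_GB = 2.0
HDD_GBPS = 0.1    # ~100 MB/s sustained HDD read
RAM_GBPS = 70.0   # ~70 GB/s main-memory bandwidth

hdd_seconds = ASSET_GB / HDD_GBPS   # tens of seconds
ram_seconds = ASSET_GB / RAM_GBPS   # tens of milliseconds
print(f"HDD: {hdd_seconds:.1f} s, RAM: {ram_seconds:.3f} s "
      f"(~{hdd_seconds / ram_seconds:.0f}x faster)")
```

That factor of several hundred is the "orders of magnitude" being argued over, assuming the data already made it into RAM in the first place.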
     
  6. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,098
    Location:
    Somewhere over the ocean
    wow! why haven't you said first? we have the magic esram!
    now we can eliminate the NFE-bus to keep cost down! :D
     
  7. Lycan

    Regular

    Joined:
    Oct 20, 2005
    Messages:
    251
    Well, they communicated the 8GB of memory figure less than two months ago, and 3 months after the PS4 reveal. If they intended to up it in response to Sony's last hour decision, they would have gone public about it at their meeting in May, or even later on the E3 stage...
    It simply doesn't make sense...
     
  8. SenjutsuSage

    Newcomer

    Joined:
    Feb 25, 2013
    Messages:
    235
    I'm not sure I agree, unless of course you're not referring to an extra 4GB of ram being used as a faster data source in place of taking stuff off the Hard Drive. You more than likely don't need any extra bandwidth at all to handle the extra memory, as the extra ram in itself would be its own reward if it saves you trips to the hard drive and allows extra space to store important game data.
     
  9. (((interference)))

    Veteran

    Joined:
    Sep 10, 2009
    Messages:
    2,493
    FWIW that's what I've been hearing too - unrelated to Dave.

    I said 'perhaps' we'd hear more by Friday! - haven't heard any more yet unfortunately.

Plus, I did say it was a rumour, so neither the upclock nor the RAM increase is confirmed.
    And it's not just that the veracity of the rumour isn't confirmed, but that the rumour itself does not claim MS are definitely increasing the clocks or RAM.

Apparently, the story is that MS has gone to devs for feedback on a few different things, two of them being an increased GPU clock and an increase to 12 GB of RAM (another thing on the list was storage for 'tombstoning' - I'm guessing flash cache).

    Devs have to list which spec changes they prefer in order of importance. So nothing is confirmed as of yet, just options MS is getting feedback on.

    However, I do have someone else telling me (also from a dev source) that we shouldn't be surprised to see 12GB of RAM in the final box and that it isn't as hard for MS to do as we think.

For what it's worth, the biggest complaint MS received from devs about the XB1 was the large system reservation - it was a bigger issue than even the low GPU flops.

Now, I don't know why devs wouldn't want more than 5GB of RAM (especially slow RAM), but it's not like MS doesn't have an impetus to increase the available memory.
     
    #4689 (((interference))), Jul 8, 2013
    Last edited by a moderator: Jul 9, 2013
  10. Cjail

    Cjail Fool
    Veteran

    Joined:
    Feb 1, 2013
    Messages:
    2,025
    If the problem is OS reservation then MS can still "free" that 1GB that they are now reserving for future updates; this would give devs more RAM for games and would cost MS nothing.
Adding more RAM instead is no doubt going to cost a lot of money.

    It would be a good trade-off in my opinion.

    P.S.
    My theory still stands anyway.
     
    #4690 Cjail, Jul 8, 2013
    Last edited by a moderator: Jul 8, 2013
  11. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
I just discovered the Engadget article about Xbox One's silicon lab. Very interesting stuff, but what really made me wonder was this:

    Photo of this chamber (you can also see some better shots in the video)

The article states that they were able to visit that lab a few days before the Xbox reveal event - at that time, beta devkits had already been sent out to devs... so why still test different boxes? Maybe the beta kits differ from the retail boxes in a large way? Was this already discussed?
     
  12. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
  13. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,543
    Location:
    In the land of the drop bears
They will be continually testing new prototypes until no new version of the box exists (i.e. when the next Xbox comes out). It's not really surprising that they are testing how well a few designs of the box handle extreme temp ranges at all :).
     
  14. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
    Yeah - but why not just test the ones with the actual finalized (beta) board? It just doesn't make much sense to me to still test versions of the board that aren't going into production. But I'm probably missing something - I don't know a thing about this stuff.
     
  15. Brad Grenz

    Brad Grenz Philosopher & Poet
    Veteran

    Joined:
    Mar 3, 2005
    Messages:
    2,531
    Location:
    Oregon
You might even say any suggestion of a secret, last-minute upgrade defies any and all logic.
     
  16. Ekim

    Newcomer

    Joined:
    Jul 3, 2013
    Messages:
    47
    Probably :p maybe they just test different clock setups
     
  17. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    8,011
    Location:
    London, UK
    Developer response to such a question would be predictable without even doing it. Devs whose projects are memory bound will vote for more RAM and devs whose projects are GPU-bound - standing on their own or in comparison to PlayStation 4 builds - will vote for a faster clock. Any other options put on the table, unless the impact to performance is obvious without testing, are unhelpful.

    Basing a decision affecting a 5+ year lifecycle device on problems developers are having this month would be terrible. And if Microsoft are making engineering decisions based on popular voting, what is the chief engineer doing?
     
  18. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    9,266
    Location:
    La-la land
    I think the impetus would rather be to decrease reservation (which is merely software), than add more hardware into the box, a much more complicated and expensive solution that requires more effort, cost, verification and so on. And, 12GB is not a natural power of two, so there may be performance implications as well, both insofar as electrical timing goes (you will have two memory devices per channel rather than just one, may affect latency etc), and interleaving (different # of pages between devices).
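The point about 12GB not being a power of two can be sketched with some arithmetic; the bus width and chip organization below are assumptions for illustration, not confirmed Durango specs:

```python
# Why 12 GB is awkward on an assumed 256-bit DDR3 bus populated with
# x16 devices. Illustrative layout only, not a confirmed spec.
BUS_BITS = 256
CHIP_BITS = 16                            # x16 DDR3 devices
chips_per_rank = BUS_BITS // CHIP_BITS    # 16 chips fill the bus once

def rank_capacity_gb(chip_gbit):
    """Capacity of one full rank built from chips of the given density."""
    return chips_per_rank * chip_gbit / 8  # Gbit -> GB per chip

# 8 GB: one uniform rank of 4 Gbit chips -> one device per channel,
# simple timing, even interleave across all channels.
print("one rank of 4 Gbit chips:", rank_capacity_gb(4), "GB")
# 12 GB: no single uniform rank hits it (4 Gbit gives 8 GB, 8 Gbit
# gives 16 GB), so you need a second, partially populated or
# mixed-density rank - i.e. two devices on some channels, with the
# electrical-timing and interleaving caveats described above.
print("one rank of 8 Gbit chips:", rank_capacity_gb(8), "GB")
```

The mismatch between ranks is what drives the uneven page counts and interleaving concerns in the post.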
     
  19. BeyondTed

    Newcomer

    Joined:
    May 20, 2013
    Messages:
    233
Reliability testing and qualification take months. The lower the possible temperature acceleration, the longer the required testing (and/or the larger the sample size). It is exponential with temperature.
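The "exponential with temperature" relationship here is usually modeled with an Arrhenius acceleration factor; a sketch with typical assumed values (the 0.7 eV activation energy and both temperatures are illustrative, not specific to this hardware):

```python
import math

# Arrhenius acceleration factor: how much faster thermally-activated
# failure mechanisms accrue at an elevated stress temperature than at
# the normal use temperature.
K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """AF = exp((Ea/k) * (1/T_use - 1/T_stress)), temperatures in Kelvin."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

# e.g. a 125 C oven standing in for 55 C field use, Ea = 0.7 eV:
af = arrhenius_af(0.7, 55, 125)
print(f"Each stress hour ~ {af:.0f} field hours")
```

With a lower stress temperature the factor collapses quickly, which is why limited temperature acceleration forces longer tests or bigger sample sizes, as the post says.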

    With something like a console it is very difficult to get either sufficient temperature acceleration or quantity.

Adding RAM (different density or more chips) is a much more minor change as far as qualification goes. The SoC, VRM, cooling solution and case design are much more serious contributors.

    Plus if you did end up with a dual foundry solution you have 2x the testing to do.

    Now the ICs and modules I work on are tested outside of the system in simpler test boards, making it easier. I don't know if MS is doing that, but we at least see the racks of systems. Quite possibly both are being done and both are still in progress.

    Maybe AMD is also doing tests on the SoC in simpler test boards but X SoC per board. Perhaps 10 per board in huge ovens/cooling chambers. (One of my higher power projects needed to be in cooling chamber due to total 1kW power dissipation. No need to run the oven to heat it up for temperature acceleration.)

Still don't think these changes are much in the grand scheme of things. (An up clock << 1175 MHz and different memory density or number of modules. Hey, do you fear for the reliability of your laptop or PC each time you replace the memory with standard non-overclocked memory in a different density? No, you don't, because those modules were qualified separately already, and because your motherboard and APU or CPU were qualified for the maximum number of DIMMs and chips per DIMM already. And because there are industry standards for DDR3, etc.)



    Keep in mind that AMD and Nvidia are experts at re-binning, re-clocking, re-fusing, (re-branding :roll:) and matching the same original silicon with a new laser mark with all kinds of new combos of memory, clock, cooler, PCB, VRM, etc. And new "name/brand". They do it all day and it is not that hard to do the derivatives or the re-brand.



The real difficulty would be a new SoC. THAT would be difficult. That rumor is fairly crazy, but we cannot say how crazy without access to insiders and the timeline. If it started early 2012 (almost two years before launch) (and if the rumor is not fake) then it is not so hard (not really any harder than the first SoC), especially if it is based upon the same blocks already fabricated and verified (that could actually make it *EASIER* than the first SoC design was!!!). (In other words, more CUs of the exact same CU design and another ESRAM block matching the existing ESRAM block.)

So if it started early, no problem. If it started late, big problem. If that new one is not done then you *really* do not want to say anything. If you announce the new SoC and then have to ship the old one due to a glitch/problem with the new one, you will *really* look bad. So a new SoC in process looks pretty much like a bad rumor. Nothing said except a leak or two, which might be totally bogus too. And if the leak is from a developer half a dozen times removed from the MS or AMD labs, then you cannot expect enough accuracy in the leak to be able to tell if it is real or not.



Not that I'm lending any credibility to the pastebin or similar. Just filling in some design- and industry-related facts for fun, since I am a big hardware fan and a hardware designer. So don't go full fanboy on me if you don't agree. It is just a big interest for me. And if the hardware rumor thread causes you to go full fanboy, then you don't need to read it, do you? (Unless you are a reputation management professional.)
     
    #4699 BeyondTed, Jul 8, 2013
    Last edited by a moderator: Jul 8, 2013
  20. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    425
Do you know that the complaints were specifically about the system reservation of DDR3?
(it seems likely that Durango reserves a number of resources - including GPU time)
     