Xbox One (Durango) Technical hardware investigation

Discussion in 'Console Technology' started by Love_In_Rio, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    What's happening on March 6?
     
  2. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    Absolutely nothing.

    In relation to what he was claiming, it's when his bogus rumor is proven false. Though I expect he'll create more FUD saying things have been delayed.
     
  3. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    I can't dismiss it out of hand. He very carefully stated that it is a decision only 2 months old, and I've been gone from the team for 3 months now. I still think it is bull, but I can't say I _know_ it's bull, since I don't. I've seen some of the executives make stupid decisions like this in the past, so it has a small chance of being true (but a very small chance).
     
  4. expletive

    Veteran

    Joined:
    Jun 4, 2005
    Messages:
    3,592
    Likes Received:
    69
    Location:
    Bridgewater, NJ
    Well, sure, but I suppose what really threw me off was your coming up with ideas as to why this would actually make sense (motherboard redesigns and such).


    All that said, and this is to anyone: before this craziness started I posted a link to the system diagram, which showed a block labeled "Display" that I assumed was for the HDMI output. However, this block also has a read speed that is more than triple its write speed. Wondering if this could represent the HDMI in, and why it would be triple what the HDMI out is unless there's more than one HDMI input. Anyone?
     
  5. RudeCurve

    Banned

    Joined:
    Jun 1, 2008
    Messages:
    2,831
    Likes Received:
    0
  6. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    I would have been much more likely to believe it if he had said they were dropping in a second stock mobile APU, say a 4-core Jaguar and a 1 or 2 CU GPU, with its own pool of DDR3, and then told developers, "Remember that system reservation we told you about? It's gone."
    Something like that would not be unprecedented; it's how the HD-A2/A20 and HD-A3/A30 HD DVD players worked.
    I'd go with display planes too.
     
  7. expletive

    Veteran

    Joined:
    Jun 4, 2005
    Messages:
    3,592
    Likes Received:
    69
    Location:
    Bridgewater, NJ
    I assume something like that (an OS APU) would be fairly easy to re-integrate into a more powerful single APU in a future revision of the console?
     
  8. Xovek

    Newcomer

    Joined:
    Feb 10, 2013
    Messages:
    19
    Likes Received:
    0
    Location:
    Mty
    Actually, what you describe is pointed to in the Yukon scheme.

    By the way, is there in fact a system reservation?
     
  9. DopeyFish

    Newcomer

    Joined:
    Jun 20, 2004
    Messages:
    134
    Likes Received:
    1
    Outside of the barely-rumored 2 cores and 3 GB of RAM... I haven't heard anything.

    I still think Yukon is largely in play.

    We won't know until someone details the system reservation for the GPU... That's the most important part of Yukon. If there is a reservation outside of the display planes (which are helpful for merging two framebuffers from two GPUs), then there's likely no Xenos on board, which also means likely no backwards compatibility.

    Right now the only things that appear to have changed from Yukon are the clocks and the RAM (DDR4 to DDR3); even the 2-core reservation fits into Yukon. eSRAM was even listed as an option in the leak (the first reference to its possibility).
     
  10. RobertR1

    RobertR1 Pro
    Legend

    Joined:
    Nov 2, 2005
    Messages:
    5,852
    Likes Received:
    1,297
    As a developer, that would be the elegant solution, no?

    You don't have to worry about shared resources, and from a user perspective your OS experience is never degraded regardless of what game you're playing.
     
  11. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    Of course there's a system reservation. The PS3 has a system reservation, the 360 has a system reservation, the PS4 will have a system reservation, irrespective of what some people think, and whatever console MS comes out with will have a system reservation. (By system reservation, I mean resources usable only by the system and not accessible to games, whether it's memory, cpu cores, gpu time, or some piece of hardware)
     
  12. Osamar

    Newcomer

    Joined:
    Sep 19, 2006
    Messages:
    231
    Likes Received:
    43
    Location:
    40,00ºN - 00,00ºE
    More speculation: Xbox 360 SoC + APU. Is an Xbox 360 enough for the new OS demands, even with more memory?
     
  13. Xovek

    Newcomer

    Joined:
    Feb 10, 2013
    Messages:
    19
    Likes Received:
    0
    Location:
    Mty
    Thank you for the answer.
     
  14. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    Apparently MS are having a meeting with developers behind closed doors.

    I for one kind of miss astrograd and interference. Reading Shifty's response to a question I asked, the reason for the ban doesn't seem to be that serious.

    I thought it must've been something truly egregious to warrant such a ban.

    And if they are banned I hope they are back in a couple of weeks or so, but banned till post-E3? That's a very long time.
     
  15. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    You're overcomplicating this. It wouldn't be two SoCs sharing 68GB/s DDR3 bandwidth, that's not possible because they'd each have a separate memory controller. It'd be two SoCs each with some bandwidth allotment (possibly still 68GB/s) to a static partition of DDR3 RAM (probably 4GB), where one has access to the other via a coherent interconnect like HTcc. This isn't a crazy exotic design, this is exactly what AMD has already done with its dual die server parts for years now. GPU sharing is also not especially complex because there isn't much data that has to be communicated between the two. The whole thing could be handled transparently in libraries if developers desired - both GPUs get the same primitives with different scissor windows, the same read assets duplicated to their eSRAMs, and different static partitions of render targets in their eSRAMs.

    I'm not saying this is a likely design at all but if MS wants to spend the money and deal with the cooling implications it's at least plausible.
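
    To make the split concrete, here's a minimal sketch of the scheme described above: both GPUs receive the same primitives, but with different scissor windows and with render targets statically partitioned across the two eSRAM pools. All names and structures here are made up purely for illustration; this is not any real console API.

    ```python
    # Hypothetical sketch: fan one frame's draw calls out to two GPUs,
    # each clipped to half the screen by its scissor window.

    FRAME_W, FRAME_H = 1920, 1080

    def build_command_lists(draw_calls):
        """Duplicate the frame's draw calls to two GPUs with split scissors."""
        scissors = [
            (0, 0, FRAME_W, FRAME_H // 2),             # GPU 0: top half
            (0, FRAME_H // 2, FRAME_W, FRAME_H // 2),  # GPU 1: bottom half
        ]
        command_lists = []
        for gpu_id, (x, y, w, h) in enumerate(scissors):
            cmds = [
                ("set_scissor", (x, y, w, h)),
                ("bind_render_target", f"esram{gpu_id}_partition"),  # static split
            ]
            # Identical primitives on both GPUs; the scissor does the partitioning.
            # Read-only assets (textures, vertex data) would simply be duplicated
            # into each GPU's eSRAM, so little cross-GPU traffic is needed.
            cmds += [("draw", dc) for dc in draw_calls]
            command_lists.append(cmds)
        return command_lists

    top_half, bottom_half = build_command_lists(["terrain", "characters", "ui"])
    print(top_half[0], bottom_half[0])  # different scissors, same draws
    ```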
     
  16. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Not really. There's one plane in front of the other. Unless you divide the GPUs into fore- and background somehow, the display planes won't help. But the logic of the display planes suits one frontbuffer, one UI buffer, and one system buffer: three planes.
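
    To illustrate the three-plane idea, here's a tiny back-to-front alpha-blend sketch. It shows how a plane compositor generally works, not Durango's actual scanout logic, and all values are invented for the example.

    ```python
    # Composite one pixel through a stack of display planes, back to front.
    # Each plane supplies straight-alpha (r, g, b, a) values in [0, 1].

    def composite(planes):
        r = g = b = 0.0
        for pr, pg, pb, pa in planes:  # ordered: game, UI, system
            r = pr * pa + r * (1.0 - pa)
            g = pg * pa + g * (1.0 - pa)
            b = pb * pa + b * (1.0 - pa)
        return (r, g, b)

    game_plane   = (0.2, 0.4, 0.8, 1.0)  # opaque game frontbuffer
    ui_plane     = (1.0, 1.0, 1.0, 0.5)  # half-transparent title UI
    system_plane = (0.0, 0.0, 0.0, 0.0)  # system overlay currently empty

    print(composite([game_plane, ui_plane, system_plane]))  # (0.6, 0.7, 0.9)
    ```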
     
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Okay, but that's not what I'd call an efficient solution, which makes a bit of a mockery of the other efficiency solutions. Surely MS would be better off with dual off-the-shelf APUs (4 CPU cores + 10+ CUs each)? I don't see the sense of designing a system clearly built around a pool of DDR3 for assets and eSRAM for GPU working space, plus support units to facilitate their operation, and then banging two of them together. ;) That said, the rumour wasn't two identical APUs, but two APUs, which is more plausible (if still unlikely).
     
  18. DopeyFish

    Newcomer

    Joined:
    Jun 20, 2004
    Messages:
    134
    Likes Received:
    1
    OS output from one GPU layered on top of the buffer from the game's GPU.

    It explains why they did it in logic; at least it's one of the reasons.
     
  19. jgp

    jgp
    Newcomer

    Joined:
    Feb 7, 2010
    Messages:
    15
    Likes Received:
    0
    The display planes VGLeaks article (IIRC) says the display hardware, apart from assembling the final displayed image from the various planes and sending it to the screen, can also output the same composite image to RAM.

    At 1080p@60Hz, 1.1GB/s allows for about 8 bytes per pixel, or maybe 2x frame buffers including alpha, which could be the "title" (i.e., game) buffer and the final one (i.e., with OS overlay) -- that kind of makes some sense to me... -- or else it's for 3D support (if that's still alive by then :p).

    The 3.9 GB/s could then be the corresponding input from the three display planes: triple the output, plus some more for overhead.

    Let me add that I'll believe the HDMI input when I see it, and I'm glad that the published specs from Sony have already shot it down for the PS4. Not because it's impossible, but rather because a) I've never seen it done before apart from AV receivers, and b) I can't imagine it makes business sense (but I'm sort of limited in this department).
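
    For what it's worth, here's that arithmetic spelled out. The only inputs are the leaked 1.1 GB/s write and 3.9 GB/s read figures; 1920x1080 at 60 Hz and decimal gigabytes are my assumptions.

    ```python
    # Sanity-check the per-pixel budget implied by the leaked display block figures.

    pixels_per_second = 1920 * 1080 * 60          # ~124.4 million pixels/s

    write_bpp = 1.1e9 / pixels_per_second         # composite image written to RAM
    read_bpp  = 3.9e9 / pixels_per_second         # the display planes being read

    print(f"write: {write_bpp:.1f} bytes/pixel")  # ~8.8 -> two 32-bit buffers
    print(f"read:  {read_bpp:.1f} bytes/pixel")   # ~31.3 -> ~3 planes + overhead
    ```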
     
  20. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    What off-the-shelf APU are you thinking of? Trinity isn't even GCN; Richland and Kabini aren't out yet. More to the point, AMD's SKUs for consumer products don't align well with what Sony and MS would prefer for a game console. That's why they're using a large number of relatively weak Jaguar cores instead of a smaller number of high-clocked Piledrivers: it's a better way to spend their power budget. This doesn't change whether you go with a one-chip or two-chip solution. AMD hasn't shown any public interest in bringing out an APU that marries Jaguar cores with anything even remotely as powerful as the 10 CU GCN part you propose. Kabini is something like 2 or 4 CUs.

    Furthermore, it's not like MS wouldn't want their own customization and IP in the SoC, not least of which is the eSRAM, but it would go far beyond that. They're also probably getting it at lower cost by footing some of the chip design on their end and merely licensing the IP they need from AMD, which for large volumes is cheaper than buying a complete design from them.

    MS would end up with two display controllers this way too. Maybe one would be disabled, but even two active could have uses.

    The DDR3 + eSRAM design is completely orthogonal to a multicore design. If anything, local memory oriented solutions can scale nicely. IMG's GPUs do just fine scaling cores with fast tile SRAM.

    IMO two different APUs is less plausible because it negates the cost benefit of only having to design one chip. 16 CPU cores is far-fetched, but if MS has plans out of the gate to use a few of them for their own purposes then it isn't that crazy.

    If MS did this it may not have been their first choice, and could be a reaction to Sony's plans (although the decision would have needed to be made a long time ago, of course). MS has more resources than Sony to put into design, and they have more cash to burn on early console sales; gambling it on a more powerful CPU + GPU makes sense since there's a fairly reliable timeline for cost reduction on those parts. But a 500mm^2 SoC is out of the question.
     