Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
Basically for the same reasons that Microsoft has to have a Kinect in every box, Sony has to have a camera system + something approximating Move controls in every box. To not include them would mean motion controls would fail for that respective console right off the bat, as game developers would then have no incentive to even think about using them.
SB

So the only difference is that one option adds to the gaming hardware while the other subtracts from it ... am I correct?

(with Kinect being a bit more expensive than Move).
 
Actually Shifty, I beg to differ.

For people discussing whether or not MS will increase the clocks etc. MS's impetus for doing so will obviously be related to whether or not they were expecting PS4 to be more powerful and whether they have already taken that into account in their business strategy.

So if this thread is about what hardware will most likely be in the final box, it is important to know what MS were thinking when they designed it.
That can be considered as a variable without having to discuss what MS's reasons are in depth. Pages and pages of people saying, "MS are reacting to Sony, no they're not, yes they are, they can't compete on power, yes they can, no they can't, they can compete on price, no they can't, yes they can," doesn't get us anywhere. We can talk about a low BOM for a cheap console, or an overclock to get more power, without having to talk about the competition and business. Whether MS have a master plan they are sticking to, or are cluelessly chopping and changing hardware design in a frantic panic to compete, we have rumours to track and interpret and insane, unjustified guesses to throw out as options.
 
Hehe, that is a pretty big coincidence when laid out like that.

Perhaps it's not so much that the Durango GPU is based off of Bonaire, but that the Bonaire GPU is based off of the Durango GPU. Then it makes a bit more sense, as AMD were looking for a product to fill the space between the 7770 and 7850 2 GB. Basically to fill the space that the 7850 1 GB would be vacating due to memory manufacturers stopping production of the low-density memory chips it was using.

If you then look at the features that were added to Bonaire versus Cape Verde, things start to fall into place. Better support for HSA features. 2x ACEs as well as 2x primitives per clock to ensure it wouldn't be at a disadvantage versus the 7850 1 GB. It may not generally be as fast as the 7850, but it ends up filling that price gap quite well.

Even if you think of it as just a really beefed-up version (more CUs, etc.) of what is going to go into Kabini and Temash (the PC Jaguar-based parts), it still indicates this is likely more similar to what is in Durango than something based on Cape Verde.

Now, before too many people get excited: it's still not going to improve performance much over Cape Verde for the majority of graphical workloads unless those 2 primitives/clock come into play.
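For a rough sense of where Bonaire lands between those parts, here's a back-of-envelope peak-throughput sketch. Reference clocks are from the public spec sheets of each card; the standard GCN formula is CUs × 64 lanes × 2 flops (FMA) × clock:

```python
def gflops(cus, clock_mhz, lanes_per_cu=64, flops_per_lane=2):
    """Peak single-precision GFLOPS for a GCN part (2 flops/lane via FMA)."""
    return cus * lanes_per_cu * flops_per_lane * clock_mhz / 1000

# Reference clocks from each card's public spec sheet.
parts = {
    "Cape Verde (7770, 10 CUs @ 1000 MHz)": (10, 1000),
    "Bonaire (7790, 14 CUs @ 1000 MHz)":    (14, 1000),
    "Pitcairn (7850, 16 CUs @ 860 MHz)":    (16, 860),
}
for name, (cus, mhz) in parts.items():
    print(f"{name}: {gflops(cus, mhz):.0f} GFLOPS")
```

Raw FLOPS put Bonaire well above the 7770 and roughly level with the 7850, though the 7850's extra ROPs and memory bandwidth keep it a notch ahead in practice, which matches the "fills the price gap but isn't generally as fast" reading above.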

And as always, since people have a tendency to jump on anyone just throwing out idle speculation, this is PURELY SPECULATION. :) It may be that it is more similar to Bonaire, or it may be more similar to Cape Verde, or it may bear no similarities to either and be entirely custom. :p Though that last is very unlikely. :D

There is one other bit that seems fairly interesting. Dave mentioned in the GPU architecture thread that a 192-bit bus was originally planned for Bonaire but that he championed the 128-bit one it eventually used. I wonder how long ago in the development stage that was, as it seems to imply that bus width is perhaps easier to change than otherwise thought. Then again, if this was over a year ago then yeah, it's as hard as thought. But I have to wonder just how long Bonaire was in development.

Regards,
SB
That's what I have been trying to say all along. Bonaire could have its foundations in Durango's GPU. It makes a lot of sense if you consider that AMD was very happy with the design and just adapted it to the PC world.

My theory is that they just increased the GPU clock and the number of CUs to match and emulate the extra performance you get on a console (though Durango's GPU can still potentially enjoy some more bandwidth), where draw calls aren't as heavy on the processors and you don't have to go through as many software/hardware layers.

This might indicate that the performance of Durango is closer to the 7850 than it is to the 7770, pretty much like the 7790.

EDIT: This is an extract from the article mentioning why AMD could be recycling certain technology in their designs; especially if they are very happy with a design and it works great, it makes sense.

http://www.extremetech.com/gaming/151367-amd-launches-radeon-7790-meet-the-xbox-720s-gpu

It’s no coincidence that Bonaire answers some of the questions we had after the Xbox Durango GPU leak early last month. According to VGLeaks’ data, Durango’s front end was capable of issuing up to two primitives per clock like Tahiti and Pitcairn, but the memory bandwidth figures pointed to a 128-bit bus. Now we have Bonaire — a 128-bit GPU that merges those two capabilities in a single part.

Spinning a new GCN part for Microsoft allows for a smaller die, lower manufacturing costs, and explains why AMD CEO Rory Read calls AMD’s console SoC’s “semi-custom” designs. It makes no sense for AMD to build and launch another GCN part just to hit a market target — but it makes a lot of sense for a cash-strapped company to design a new GPU that can target multiple markets simultaneously.
 
That's what I have been trying to say all along. Bonaire could have its foundations in Durango's GPU. It makes a lot of sense if you consider that AMD was very happy with the design and just adapted it to the PC world.

My theory is that they just increased the GPU clock and the number of CUs to match and emulate the extra performance you get on a console (though Durango's GPU can still potentially enjoy some more bandwidth), where draw calls aren't as heavy on the processors and you don't have to go through as many software/hardware layers.

This might indicate that the performance of Durango is closer to the 7850 than it is to the 7770, pretty much like the 7790.

It's more likely that Durango is based on Bonaire rather than the other way around. AMD isn't trying to emulate a console's performance with Bonaire; they are just adding a SKU to fill a price/performance segment they currently don't have a product for.
 
@Cyan:
What extremetech wrote doesn't make sense at all.

AMD has several building blocks: CUs with 1:16 DP rate, CUs with 1:4 DP rate (maybe even 1:2 CUs, but they didn't use them so far), setup/raster + tessellator blocks, command processor + ACEs, ROPs, L2 tiles + memory controller partitions, slow GDDR5 PHY (<5 Gbps), fast GDDR5 PHY (6+ Gbps), VCE block, PCIe controller + PHY, display engines + output PHYs, DMA controllers, and so on. They also have two versions of the caches, register files, and memory controller, one with ECC support and one without.
Then they have different iterations of the tech in the CUs, the ACEs, and memory controllers, currently probably (at least) two: the original GCN and the updated GCN 1.1, which supports some more instructions (and removes a few), the FLAT memory model, more compute queues, as well as a unified virtual address space with the host.

Having all this, one can basically play mix and match with the blocks. Of course one still needs to connect all of it into a working GPU; for instance, one needs to put crossbars with the right number of ports between the flexible number of CUs and the flexible number of L2 tiles and ROPs, just to name one example. And very likely one has to do a number of adaptations so that everything fits nicely together. That means it's not exactly just a copy & paste action.

And for Durango (as well as for the PS4) they needed to extend their palette of building blocks a bit and added some custom stuff (on top of leveraging their CPU blocks and north- and southbridge functionality also in use for their own APUs). It makes no sense at all that the Bonaire choice of a certain combination was connected in a causal way to Durango, or the other way around. The only connection would be a generally valid reasoning about the balancing of the chip, i.e. what amount of units in which combination makes sense for the variety of expected workloads given a certain die size and power budget, to achieve the best possible performance under these constraints. And there is only a limited amount of integer numbers (and even fewer powers of two) in the range of reasonably small numbers to come up with. To construct some causal connection from this simple fact is just irrational.
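A toy way to picture that mix-and-match argument: each SKU is just a choice of counts per reusable block family. The class and field names below are purely illustrative (not AMD's internal nomenclature); the counts follow the figures discussed in this thread:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GpuConfig:
    """One SKU = one pick per reusable block family (illustrative only)."""
    cus: int              # number of compute unit blocks
    prims_per_clock: int  # width of the setup/raster front end
    mem_partitions: int   # 64-bit memory controller partitions

    def bus_width_bits(self) -> int:
        # GCN memory controllers come in 64-bit partitions,
        # so a 128-bit bus is two partitions.
        return self.mem_partitions * 64

# Two SKUs assembled from the same block library, differing only in counts:
cape_verde = GpuConfig(cus=10, prims_per_clock=1, mem_partitions=2)
bonaire    = GpuConfig(cus=14, prims_per_clock=2, mem_partitions=2)
print(bonaire.bus_width_bits())  # 128
```

The point being: with so few reasonable values per knob, two independently balanced designs can easily land on the same combination without one being derived from the other.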
 
@Cyan:
What extremetech wrote doesn't make sense at all.

AMD has several building blocks: CUs with 1:16 DP rate, CUs with 1:4 DP rate (maybe even 1:2 CUs, but they didn't use them so far), setup/raster + tessellator blocks, command processor + ACEs, ROPs, L2 tiles + memory controller partitions, slow GDDR5 PHY (<5 Gbps), fast GDDR5 PHY (6+ Gbps), VCE block, PCIe controller + PHY, display engines + output PHYs, DMA controllers, and so on. ....

This is not software. It is not that easy to integrate this stuff together and turn it into a GPU. You have to think carefully about the layout, timings, interference between modules, power distribution, clock distribution, cooling, etc. If you can put these things together in a way that satisfies many real-world constraints, you will definitely want to recycle it as much as you can. For this reason, it may not be very far-fetched that Durango and Bonaire are very similar. In fact, it is certainly possible that in order to maximize yields at launch, MS may decide to give up 2 CUs and lower the clock frequency to 800 MHz. However, I really hope that is not the case...

Btw, I think that both Sony and MS are shooting for similarly priced consoles. Both have almost exactly the same CPU (there might be some differences between the two that we do not know); one has 18 CUs while the other has 12 CUs + 32 MB eSRAM. One decides to invest in fast but expensive memory + a cheap dual camera system, while the other goes with cheap memory but an expensive 2.5D camera system.

I think in the end it all boils down to the services that these boxes will provide. We know some of the services that Sony is investing in (cloud gaming and sharing features), and there may still be more. I am sure MS is investing in some as well. Especially if they can pull off this natural user interface (NUI) thing flawlessly this time and show some core games that really use it effectively, they may have a real chance at a big seller, as Nintendo did with the Wii. And the difference in power between the Wii and PS3/360 is way more than what it is between Durango and PS4 (if we assume the leaks are correct).

We are getting closer each day. We will all know in a couple of months at E3.
 
This is not software. It is not that easy to integrate this stuff together and turn it into a GPU. You have to think carefully about the layout, timings, interference between modules, power distribution, clock distribution, cooling, etc.
Have you read the middle part? I explicitly said it's not just copy&paste. ;)
If you can put these things together in a way that satisfies many real-world constraints, you will definitely want to recycle it as much as you can. For this reason, it may not be very far-fetched that Durango and Bonaire are very similar. In fact, it is certainly possible that in order to maximize yields at launch, MS may decide to give up 2 CUs and lower the clock frequency to 800 MHz. However, I really hope that is not the case...
Of course you reuse a lot of stuff, but much of it comes in rather rigid blocks and can likely be reused as those blocks. For instance, I would be surprised if the CU building blocks of 3 or 4 CUs didn't already include fairly well optimized clock distribution networks (as you brought this up). Sure, you always can (or have to) do some changes and tuning on the global scale. But that's not the point. As the PS4 SoC and also Durango are completely different chips than Bonaire anyway (they include two CPU modules, northbridge + southbridge, a wider memory interface; Durango also 32 MB eSRAM and a different memory type, the PS4 chip more ROPs), AMD has to do that anyway. That's also not just copy & pasting a Bonaire chip into an SoC; that wouldn't work either. One always needs careful planning.
As said, the only relatively loose connection is the generally valid consideration of what amount of which units makes sense given a certain memory bandwidth, die size, and power consumption budget. Similar requirements usually result in similar solutions. That's all.
As a comparison, look at planes. Most of them look fairly similar if you specify that they should be able to carry 200 passengers over 5000 miles at a speed of 550 mph. That doesn't exclude differences one sees on a closer look. ;)

Dave may correct me if that is wrong.
 
Maybe. MS might well pull a Nintendo and decide specs are immaterial, and just show features and services and content.

I'd bet when MS does show their next product it will be a full-featured reveal including actual working hardware. They might deflect from specs if they don't favor MS, but I don't think they'll avoid them entirely. See the Surface RT reveal.
 
It's more likely that Durango is based on Bonaire rather than the other way around. AMD isn't trying to emulate a console's performance with Bonaire; they are just adding a SKU to fill a price/performance segment they currently don't have a product for.

Or MS has populated dev kits with Bonaire to serve as a transitional part until final kits are ready. And AMD is using the opportunity to place Bonaire into retail since they are manufacturing the parts anyway.
 
Well you seem convinced, but I'm not. Sony could have the new PSEye lined up as a launch peripheral along with the old Move controllers (with day one SDK support). They will have optional OS features and some 1st party games supported for those who want them, while keeping them out of the console BOM and giving the core what they want for their $400. When the PS4 price drops they can push casual dance games and the like, just like both MS and Sony did this gen. While Kinect will be more closely coupled to the console and utilized in more games and the OS, most people don't seem that interested right now. They might feel a bit bitter having to buy something they don't want, something that kept the console specs low to be price competitive. I'm guessing MS made its decision when Kinect was riding the wave, but that wave is gone now and they have to hope it comes back or make it come back.

Or Sony is chasing the same old Kinect wave and will shove a bunch of stuff in the box and jack the price up. Who knows.

In many ways, I don't think we've seen what Kinect can be on a console yet. It never had the chance on the Xbox 360. On the new xbox, developers will know that every owner of that console has Kinect. That will change things.

That said, the Durango GPU doesn't have to be based on Bonaire, and Bonaire doesn't have to be based on Durango's GPU, but one thing is certainly clear. Bonaire is the closest thing that we can appropriately compare the Durango GPU to. Maybe they seem similar by chance or perhaps they truly are related. Either way, I think Microsoft has themselves a nice chip that will probably surprise us over the life of the console.
 
I'd bet when MS does show their next product it will be a full-featured reveal including actual working hardware. They might deflect from specs if they don't favor MS, but I don't think they'll avoid them entirely. See the Surface RT reveal.
I expect a full reveal too with final product, but no-one should be expecting full hardware knowledge. If we get it, we're lucky. Hardware know-how might only come from post-release tear-downs. Those managing to hang on because they think it'll only be a couple more months need to reevaluate their coping strategy, and be ready to face another 6+ months of not knowing. ;)
 
That's what I have been trying to say all along. Bonaire could have its foundations in Durango's GPU. It makes a lot of sense if you consider that AMD was very happy with the design and just adapted it to the PC world.

My theory is that they just increased the GPU clock and the number of CUs to match and emulate the extra performance you get on a console (though Durango's GPU can still potentially enjoy some more bandwidth), where draw calls aren't as heavy on the processors and you don't have to go through as many software/hardware layers.

This might indicate that the performance of Durango is closer to the 7850 than it is to the 7770, pretty much like the 7790.

EDIT: This is an extract from the article mentioning why AMD could be recycling certain technology in their designs; especially if they are very happy with a design and it works great, it makes sense.

http://www.extremetech.com/gaming/151367-amd-launches-radeon-7790-meet-the-xbox-720s-gpu


Well, at 14 CUs, 1075 MHz, and more than 800 SPs, the 7790 is close to the 7850, but Durango has been said to be 1.2 TF, which doesn't fall in line with the 7790.

The leak, by contrast, is 12 CUs at 800 MHz (which I consider a significant drop from 1075 MHz) and 768 SPs...

I think in the end it will be the same: it's not a 7770, but a heavily underclocked 7790 with a few more cuts, as well as some improvements.
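To put numbers on that comparison, here's the peak FP32 arithmetic using the figures discussed in this thread (the 7790's reference clock is 1000 MHz, with 1075 MHz corresponding to factory-overclocked cards; the Durango figures are leaked rumours, not confirmed specs):

```python
def peak_tflops(cus, clock_mhz):
    """Peak FP32 TFLOPS for a GCN part: CUs x 64 lanes x 2 flops (FMA) x clock."""
    return cus * 64 * 2 * clock_mhz / 1e6

print(peak_tflops(14, 1000))  # 7790 at reference clock -> 1.792
print(peak_tflops(14, 1075))  # overclocked 7790        -> 1.9264
print(peak_tflops(12, 800))   # rumoured Durango        -> 1.2288
```

So the rumoured 1.2 TF figure falls straight out of 12 CUs at 800 MHz, about a third below a reference 7790.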
 
I expect a full reveal too with final product, but no-one should be expecting full hardware knowledge. If we get it, we're lucky. Hardware know-how might only come from post-release tear-downs. Those managing to hang on because they think it'll only be a couple more months need to reevaluate their coping strategy, and be ready to face another 6+ months of not knowing. ;)

But Sony told us a lottttt. Sure, MS may not follow, especially since they may be inferior, but still.

There is a Microsoft event rumored for April 26 similar to the PS4 unveil.
 
Either way, I doubt MS would change their planned reveal because PS4 was announced early.

We'll know all about it months before it's released.
 
But Sony told us a lottttt. Sure, MS may not follow, especially since they may be inferior, but still.
Every scrap of info you feed to PR has to be part of your marketing strategy. If releasing spec info doesn't gain you anything, why do it? I see nothing MS can profit from by releasing specs unless they have a more powerful machine than PS4 and trump it. If the specs are inferior, even if the machine is as capable thanks to design efficiencies, Joe Public won't comprehend anything other than bigger numbers. So instead I'd feed hardware info in PR terms. "Unique efficiency gains." "Far greater utilisation." "Dedicated high performance, low latency RAM." "Custom hardware." Just chuck some buzzwords out there and then your loyal, clueless fanbase will take up the PR challenge of trying to convince forum goers that the hardware is magically more capable than they believe it to be based on rumoured (accurate) specs.

Nintendo is a perfect example. We've had two consoles with pretty clear rumours, yet Nintendo fanboys have snatched at any loose PR term or developer phrase and run with it, to the point of totally unrealistic ideas like a physics processing unit in Wii, or a console with less than half the power draw of PS360 managing to be far more powerful.

Assuming the rumoured specs are accurate or close to, MS will be far better off withholding technical details. They'll release a few specs that go toe-to-toe with the competition (8 core CPU, next-gen AMD graphics architecture) and couple it with PR speak (custom hardware, highly balanced system) and call it a day. The only people who'll be adversely affected are those with a technical interest. Fanboys will talk crap and live in denial regardless of specs, so it makes no difference to them. ;)

We just have the rumours, and then a teardown, IMO. I'll be gobsmacked if we get any technical info from any reveal. I don't see any public info broadcast that'd warrant talking about memory engines, processor interface with eSRAM, eSRAM latencies, etc. That's all info for the developers. We can only hope some leak info if we're to know.
 
@Cyan:
What extremetech wrote doesn't make sense at all.

AMD has several building blocks: CUs with 1:16 DP rate, CUs with 1:4 DP rate (maybe even 1:2 CUs, but they didn't use them so far), setup/raster + tessellator blocks, command processor + ACEs, ROPs, L2 tiles + memory controller partitions, slow GDDR5 PHY (<5 Gbps), fast GDDR5 PHY (6+ Gbps), VCE block, PCIe controller + PHY, display engines + output PHYs, DMA controllers, and so on. They also have two versions of the caches, register files, and memory controller, one with ECC support and one without.
Then they have different iterations of the tech in the CUs, the ACEs, and memory controllers, currently probably (at least) two: the original GCN and the updated GCN 1.1, which supports some more instructions (and removes a few), the FLAT memory model, more compute queues, as well as a unified virtual address space with the host.

Having all this, one can basically play mix and match with the blocks. Of course one still needs to connect all of it into a working GPU; for instance, one needs to put crossbars with the right number of ports between the flexible number of CUs and the flexible number of L2 tiles and ROPs, just to name one example. And very likely one has to do a number of adaptations so that everything fits nicely together. That means it's not exactly just a copy & paste action.

And for Durango (as well as for the PS4) they needed to extend their palette of building blocks a bit and added some custom stuff (on top of leveraging their CPU blocks and north- and southbridge functionality also in use for their own APUs). It makes no sense at all that the Bonaire choice of a certain combination was connected in a causal way to Durango, or the other way around. The only connection would be a generally valid reasoning about the balancing of the chip, i.e. what amount of units in which combination makes sense for the variety of expected workloads given a certain die size and power budget, to achieve the best possible performance under these constraints. And there is only a limited amount of integer numbers (and even fewer powers of two) in the range of reasonably small numbers to come up with. To construct some causal connection from this simple fact is just irrational.
That sounds very reasonable, Gispel, from a theoretical point of view. Practical raw numbers say the similarities between both GPUs are more than a coincidence.

There is another article backing up the theory which says that Durango's GPU is based on the Bonaire, or vice versa.

http://www.expertreviews.co.uk/graphics-cards/1298821/amd-launches-radeon-hd-7790-architecture-links-it-to-xbox-720-gpu

THE XBOX 720 CONNECTION

How does this relate to the next Xbox? Typically, AMD scales down its products rather than redesign its architecture to plug price holes like the one it claims to be filling with the 7790. A significant amount of work, time and investment must have gone into developing the Bonaire architecture - the kind usually reserved for a major upgrade, not just a gap-filler. It would need to have a secondary purpose to justify such high development costs.

When talking about his company's system-on-chip designs, AMD CEO Rory Read called them “semi-custom” - creating a new part for the next Xbox lets AMD concentrate on lowering manufacturing costs and working on smaller die sizes, without having to worry about direct sales of graphics cards.

The leaked Xbox Durango development kit backs this up - Durango is apparently able to issue two primitives per clock (like the Tahiti and Pitcairn cores) but only has a 128-bit memory bus. Until the HD 7790 was announced, no such combination existed.

The only person who could know something is bkillian, but I think that the console is being made in near-total secrecy, to the point that he doesn't know what is inside the console.

I think he just worked on the audio chip, without knowing exactly which other chips the console is going to have.

I'm not saying that he doesn't know, the possibility is there no matter how little, but I think the different Xbox departments don't know what the others are doing, only their supervisors -the bigwigs- know what the console is going to have inside.

So bkillian has to go with the flow and he barely flies in the face of rumours.
 
Assuming the rumoured specs are accurate or close to, MS will be far better off withholding technical details. They'll release a few specs that go toe-to-toe with the competition (8 core CPU, next-gen AMD graphics architecture) and couple it with PR speak (custom hardware, highly balanced system) and call it a day. The only people who'll be adversely affected are those with a technical interest. Fanboys will talk crap and live in denial regardless of specs, so it makes no difference to them. ;)

Well , sooner or later we'll have side by side videos/images .... ;)
 
That sounds very reasonable, Gispel, from a theoretical point of view. Practical raw numbers say the similarities between both GPUs are more than a coincidence.
What "practical raw numbers"? :rolleyes:
There is another article backing up the theory which says that Durango's GPU is based on the Bonaire, or vice versa.

http://www.expertreviews.co.uk/graphics-cards/1298821/amd-launches-radeon-hd-7790-architecture-links-it-to-xbox-720-gpu
That's just another article parroting the same stupid stuff. And they are wrong with their "facts" too, btw. They claim Durango is the same combination of a 2 prims/clock front end with a 128-bit memory interface as Bonaire. But we all know the memory system of Durango to differ significantly. It's a combination of a 256-bit DDR3 interface with an additional eSRAM memory pool. And on top of it Durango also has a CPU cache coherent interconnect through the northbridge to memory. In the light of this I feel quite confident to say that what "expert"reviews writes is simply ridiculous.
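The bandwidth arithmetic behind that objection is straightforward (peak GB/s = bus bits × per-pin rate in Gbps ÷ 8); the Durango and eSRAM figures below are from the VGLeaks rumours, not confirmed specs:

```python
def peak_gb_s(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_bits * gbps_per_pin / 8

bonaire_gddr5 = peak_gb_s(128, 6.0)    # 128-bit GDDR5 @ 6 Gbps -> 96 GB/s
durango_ddr3  = peak_gb_s(256, 2.133)  # rumoured 256-bit DDR3-2133 -> ~68.3 GB/s
durango_esram = 102.4                  # rumoured eSRAM bandwidth, GB/s
print(bonaire_gddr5, durango_ddr3, durango_ddr3 + durango_esram)
```

So a plain "same 128-bit bus" comparison doesn't hold: the rumoured Durango pairs a wider, slower DDR3 bus with a separate on-die pool, which behaves nothing like a single GDDR5 interface.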
 