Xbox One (Durango) Technical hardware investigation

There's a possibility of increasing clocks if the cooling solution is not finalized. Could be that they picked something lower for dev kits with the expectation of higher clocks in final units. Not sure how likely that is, but it isn't as unreasonable as last-minute changes to the APU.
 

It's possible, but any change like that needs to be planned months in advance rather than made reactively if they want good yields; otherwise, come launch time, they will have shortages. And this time around, shortages are a serious problem, because if your product isn't in stock, someone will simply buy a competitor's product that's also brand new and shiny.
 
A big thing being overlooked is that the new Xbox need not be more powerful than what we already know. Hardware specs alone don't make games; developers do.

The currently rumored, and widely expected, specs for Durango are more than enough for incredible games that are a pretty big jump over what we are getting from the 360. And what it all ultimately boils down to is that it's simple enough to address certain performance-related concerns by lowering the resolution. 1680x1050: I say get used to it, or some form of it, now, as we'll probably be seeing plenty of it on Durango.

For a number of reasons much too long to go into, I'd say the chances of a higher clock on the GPU are pretty high. I don't think Microsoft ended up in their current GPU range by coincidence. If we're wondering what extra planning Microsoft might have done to account for potential late-in-the-game changes, I'd say that the easily achievable, safe clock speeds evident on this GPU are something Microsoft would have carefully planned around.

http://www.amd.com/US/PRODUCTS/DESKTOP/GRAPHICS/7000/7770/Pages/radeon-7770.aspx#2

I believe 900-925MHz is a very strong possibility for the Durango GPU. I think they picked a GPU range that conveniently allows them some extra breathing room as far as proven safe clock speeds go. The inclusion of the ESRAM should also be a net positive as far as power and thermal concerns go. I believe they have the right balance to easily exceed the rumored 800MHz core clock for their GPU, and I will actually be pretty surprised if it isn't at least 100-125MHz higher.
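Just to put rough numbers on what those clock differences would mean, here's a back-of-envelope sketch assuming the rumored 12-CU GCN setup with 64 ALUs per CU and two FLOPs per ALU per clock; none of these figures are confirmed, it's purely napkin math.

```python
# Back-of-envelope ALU throughput for the rumored 12-CU Durango GPU at a few
# candidate clocks. The 12 CUs / 64 ALUs per CU / 2 FLOPs per clock figures
# are assumptions taken from the rumor mill, not confirmed specs.
CUS = 12
ALUS_PER_CU = 64
FLOPS_PER_ALU_PER_CLOCK = 2  # a fused multiply-add counts as two FLOPs

for clock_mhz in (800, 900, 925, 1000):
    gflops = CUS * ALUS_PER_CU * FLOPS_PER_ALU_PER_CLOCK * clock_mhz / 1000
    print(f"{clock_mhz} MHz -> {gflops:.0f} GFLOPS")

# 800 MHz  -> 1229 GFLOPS
# 900 MHz  -> 1382 GFLOPS
# 925 MHz  -> 1421 GFLOPS
# 1000 MHz -> 1536 GFLOPS
```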

Microsoft knows what they're doing. Toss in the various measures that can be taken on the developer side of things, in addition to the gains that come from focusing on a single spec, and I expect we'll find a far more capable system than people are giving it credit for. It's irrelevant how their specifications compare with a competitor's plans as long as they're providing developers with enough power to do what they want, and they most certainly are.

It has been suggested that there have been manufacturing challenges on Microsoft's end, likely to do with the SRAM. Many look at the SRAM as a last-minute solution to a problem they created by choosing DDR3, but I think it's actually a part of a more comprehensive strategy than Microsoft is being given credit for right now.
 
It's all fine and good to believe the beta kits are the final hardware, but they don't have to be. Many people claim that MS would be reacting to Sony's press announcements. This doesn't have to be the case.

The beta kits could simply have early chips in them, perhaps what AMD thought they could fab quickly and in the needed quantities. This could mean a quicker APU will appear in final dev kits.

The same could be said for clock speeds.

The beta 360 kits had different hardware than the final kits.
 

Yes, that's a reasonable account of how things are. There is no need for MS to "react" to Sony's announcement; that's fan fodder.

They have engineered a system that they and developers know will make great games able to go toe to toe with other systems. Agreed that come game time that theory will be borne out.
 

I would expect that if clocks are increasing, they would have been planning that all along. The beta kit clocks would be a minimum in a targeted range. Whether this actually happens is probably less likely than the clocks staying the same.
 
Not exactly a wholly appropriate comparison, but these benchmarks for the 7770 give you a pretty good ballpark for the strength of the GPU, and there are very legitimate reasons to expect the Durango GPU to exceed these performance numbers, even more so if Microsoft opts to pursue higher core clock speeds that are easily achievable in terms of stability, power consumption and thermal output.

As in higher than the rumored 800MHz, not higher than an actual 7770 GHz Edition.

http://www.tomshardware.com/reviews/radeon-hd-7770-7750-benchmark,3135-6.html

To be honest, if they opted for such a nicely positioned GPU, in my view, but then opt NOT to take advantage of a higher-than-800MHz core clock that is well within their ability to achieve without issue, then I will legitimately view that as a missed opportunity on their part. This range of GPU can handle those higher clocks very easily. It wouldn't even be accurate to call it overclocking, because it wouldn't be.

That class of GPU is officially designed to ship 100% stable at those higher clock speeds. 800MHz simply doesn't make sense to me, which is precisely why I don't expect the GPU to actually be 800MHz. I expect a minimum of 900-925MHz, potentially even higher at 950-975MHz. I'm practically predicting that Microsoft intended on greater than 800MHz for their GPU all along, and we will more or less find out that this is what they did when they reveal Durango.

In a lot of cases, you also get the sense that while sources can confirm to specific sites that the specs are "more or less" accurate, and I believe they indeed are, it never really strikes me as 100% definitive; even some developers, or whoever these sources might be, aren't willing to say with absolute certainty that nothing might be different. This is, after all, how we were so broadsided by 8GB of GDDR5. A higher-than-expected core clock on the GPU doesn't come anywhere close to being as exotic a difference in my view, particularly when we have clearly established and well documented precedent for such higher clocks on this range of GPU from AMD.

To be a full 200MHz below a 7770 GHz Edition would strike me as rather odd, unless there's some different GPU core clock SKU strategy that I'm not yet aware of and that Microsoft fully intends to market the hell out of, which I doubt to be the case.

Would one be Durango Silver (800MHz) and the other Durango Gold (900MHz+)? I don't know. I know the actual name won't be Durango, but I'm just tossing out theories here. 800MHz strikes me as odd for this class of GPU.
 
You're forgetting that the same chip has 32MB of RAM and a full CPU on board, too. If they were designing it for 800MHz, they MIGHT clock it higher with little repercussion, but it would be UNLIKELY to happen this easily (i.e. they would've done so before they knew anything about PS4).

There might also be other problems, like size. A PC card usually has a lot of room to breathe. In a console, it's MUCH more like a laptop. Making the HSF bigger isn't really an option here. And loudness WILL factor in just as well. They won't run a 60mm fan (or whatever) at full speed just to be able to "keep up". The negative PR in this regard cannot be ignored.
 

Loud noises hardly hurt the 360. That said, even considering what you say about the other aspects of the console, I still don't see it being enough to present a serious obstacle to a core clock increase. If we were talking about a 16-CU-and-above part, I would say it's much less likely in a console-sized box, but I really do see it as being possible for this specific setup.

Unless I'm mistaken, the choice of ESRAM might make this even more feasible, because it may end up saving more power and producing less heat than what they could have otherwise gone with. The ESRAM may not be an obstacle to this; it may be one of the primary reasons Microsoft is able to take advantage of higher core clocks for the GPU. The combination of ESRAM plus higher-than-expected GPU core clocks also sounds like it could account for a few of the reported manufacturing problems.

Why would developers potentially have 800MHz GPUs as opposed to the higher clock of the final hardware? Well, maybe there were so many manufacturing challenges that they figured it was best to keep the clocks lower so they could produce the necessary hardware to get machines into developers' hands, and then have time to sort out their yield-related issues for the higher-clocked parts. As far as developers are concerned, it makes little difference to them. They develop using the 800MHz part, and if they eventually get something a little faster, then that's great, but it doesn't exactly create major challenges for their development.

After all, a lot of devs were making PS4 titles on quite a bit less memory than what they would eventually get their hands on, weren't they?
 
Are 12 or 14 Bonaire CUs at 1GHz inherently hotter than 18 Pitcairn CUs at 800MHz, though? The point is we really don't know how hot either will be, but I think we can safely assume that MS will not make an unreliable console, nor will they make one that sounds like a vacuum cleaner.

The question is, if they weren't already planning on this chip at 1GHz, can they change it now? I think if Sony is able to double their memory so late in the game, which likely required them to rework their mainboard layout, then certainly MS can explore a more robust cooling solution (if such a change would even be necessary). Chances are they already looked at a number of possible cooling solutions throughout this process and it's just a matter of mixing and matching the parts they would now need.
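Just to put the CU-count-versus-clock question in rough throughput terms, here's a sketch comparing the hypothetical configurations above, using the same assumed 64 ALUs per GCN CU and two FLOPs per ALU per clock; it says nothing about the heat side, which is the real unknown.

```python
# Rough ALU-throughput comparison for the hypothetical configurations above.
# Assumptions: 64 ALUs per GCN CU, 2 FLOPs per ALU per clock. This compares
# raw arithmetic throughput only, not power or heat.
def gflops(cus, clock_mhz, alus_per_cu=64, flops_per_clock=2):
    return cus * alus_per_cu * flops_per_clock * clock_mhz / 1000

for name, cus, clock in [
    ("12 Bonaire-class CUs @ 1000 MHz", 12, 1000),
    ("14 Bonaire-class CUs @ 1000 MHz", 14, 1000),
    ("18 Pitcairn-class CUs @ 800 MHz", 18, 800),
]:
    print(f"{name}: {gflops(cus, clock):.0f} GFLOPS")

# 12 Bonaire-class CUs @ 1000 MHz: 1536 GFLOPS
# 14 Bonaire-class CUs @ 1000 MHz: 1792 GFLOPS
# 18 Pitcairn-class CUs @ 800 MHz: 1843 GFLOPS
```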
 
Is it possible the GPU speed is tied to another component, i.e. the ESRAM or whatever it's being called? That may complicate running at a higher clock; just from some dirty math, the 800MHz fits well with the reported bandwidth.
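For reference, the dirty math here is presumably that a 128-bytes-per-clock ESRAM interface (an assumption, implied by the rumored 102.4 GB/s figure) would tie the bandwidth directly to the GPU clock:

```python
# If the ESRAM moves 128 bytes per GPU clock (an assumption implied by the
# rumored 102.4 GB/s figure), its bandwidth scales directly with the clock.
BYTES_PER_CLOCK = 128  # assumed interface width, not confirmed

for clock_mhz in (800, 900, 1000):
    bandwidth_gb_s = BYTES_PER_CLOCK * clock_mhz / 1000
    print(f"{clock_mhz} MHz -> {bandwidth_gb_s:.1f} GB/s")

# 800 MHz  -> 102.4 GB/s  (matches the reported figure)
# 900 MHz  -> 115.2 GB/s
# 1000 MHz -> 128.0 GB/s
```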
 
I also think it would be pretty nice if devs were allowed to play around with the clocks under the TDP. A 2000MHz CPU clock with a lower GPU clock, or a 1000-ish MHz GPU clock with a lower CPU clock, would seem to come in handy... for both systems, in fact.
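Purely as an illustration of that kind of trade, here's a toy model where dynamic power is treated as scaling linearly with frequency at fixed voltage and the baseline clocks and CPU/GPU wattage split are entirely made up; real silicon would behave differently, especially once voltage changes come into play.

```python
# Toy model of trading CPU and GPU clocks under a fixed power budget.
# Dynamic power is approximated as linear in frequency at fixed voltage;
# the baseline clocks and wattage split below are entirely hypothetical.
CPU_BASE_MHZ, CPU_BASE_W = 1600, 30.0  # assumed CPU baseline
GPU_BASE_MHZ, GPU_BASE_W = 800, 70.0   # assumed GPU baseline
TDP_W = CPU_BASE_W + GPU_BASE_W        # fixed 100 W envelope

def cpu_watts(mhz):
    return CPU_BASE_W * mhz / CPU_BASE_MHZ

def gpu_mhz_for_budget(watts):
    return GPU_BASE_MHZ * watts / GPU_BASE_W

# Bump the CPU to 2000 MHz: how far would the GPU have to drop to fit the TDP?
cpu_w = cpu_watts(2000)
gpu_clock = gpu_mhz_for_budget(TDP_W - cpu_w)
print(f"CPU at 2000 MHz uses {cpu_w:.1f} W, leaving the GPU ~{gpu_clock:.0f} MHz")
# CPU at 2000 MHz uses 37.5 W, leaving the GPU ~714 MHz
```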
 
That's true, we really don't know what it is for a Bonaire-class GPU, but I suspect it's still quite a bit cooler than a Pitcairn-class GPU. It's far too similar, I think, to Cape Verde for it to run that much hotter by comparison, which means that a lot of the TDP savings should still very much be in play for Bonaire parts. And if the TDP savings are in play, then so, too, should be the core clock speed benefits.

I just find it particularly strange that Microsoft would even think about leaving nearly 200MHz of achievable clock speed on the table. It just wouldn't make any sense, unless Microsoft truly is putting too much focus into areas that won't gain them much of anything; at the very least, nothing close to what the extra core clock on the GPU could net them.

I'm also not entirely certain about this, but I would imagine the ESRAM also provides pretty important TDP savings, even more so than the GDDR5 that might otherwise take its place in a more straightforward design, such as a Bonaire desktop part.
 

Everything I have seen so far has indicated that the Xbox 720 GPU is GCN 1.0, so that would indicate that it is Pitcairn and not Bonaire.
 

What exactly, though? One version of the rumored 7790 (non-XT) has 12 CUs, 768 stream processors, 16 ROPs and 102.4 GB/s of bandwidth. Is there something in Pitcairn that seems more likely to you? Or is there something about Bonaire that screams GCN 2.0?

I understand 3dlinette's supposition that these specs aren't drawn from an infinite number of combinations (there's a limited pool of combinations that "make sense" at this point in GPU tech), but to me this bundle of coincidences at least warrants exploration as a distinct possibility (i.e. a DGPU based on Bonaire).
 
And even if it is GCN 1.0, the TDP benefits for GPUs in the range of the rumored Durango GPU are fairly well documented. And I'm not quite sure exactly what qualifies as GCN 2.0 at this point. Is there anything in particular that should scream GCN 2.0? We know very little about Bonaire, but I don't see anything about it that particularly screams GCN 2.0, either.
 

It was my understanding that Bonaire was GCN 2.0; if not, my mistake.
 
It will be interesting to see if Microsoft is really willing to leave all that achievable clock speed on the table without taking advantage of it.
 