Xbox One (Durango) Technical hardware investigation

I don't really see much of a difference between that and the GPU embedded in Durango.
The 2 extra CUs we're speaking of are only a ~16% increase, not negligible for sure, but the biggest reason behind the difference in FLOPS is the clock speed: 1GHz vs 800MHz is a 25% increase.

Durango at the same clock speed would push 1536 GFLOPS, not a crazy difference here. The other way around, that HD 7790 would have pushed ~1434 GFLOPS @800MHz, even less of a difference, but at what cost does the extra 200MHz come? I guess it is a considerable increase.
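For what it's worth, here is a quick back-of-the-envelope sketch of where those numbers come from, assuming the usual GCN arithmetic of 64 ALUs per CU and 2 FLOPs per ALU per cycle; the CU counts and clocks are just the rumored Durango figures and the 7790 specs being compared above, not confirmed hardware:

```python
# Rough GCN peak-throughput arithmetic: CUs * 64 ALUs * 2 FLOPs/cycle * clock.
# CU counts and clocks below are the rumored/compared figures from this thread,
# not confirmed hardware.
def peak_gflops(cus, clock_mhz, alus_per_cu=64, flops_per_cycle=2):
    return cus * alus_per_cu * flops_per_cycle * clock_mhz / 1000.0

print(peak_gflops(12, 800))    # rumored Durango: 1228.8 GFLOPS
print(peak_gflops(14, 1000))   # HD 7790 at 1 GHz: 1792.0 GFLOPS
print(peak_gflops(12, 1000))   # Durango config at 1 GHz: 1536.0 GFLOPS
print(peak_gflops(14, 800))    # 7790 config at 800 MHz: 1433.6 GFLOPS
```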

I think that Durango has to be really low power, closer to the HD 7750 than to the 7770, as an xx% increase in clock speed seems to have a greater impact on power consumption than the same increase in the number of CUs.

I do believe that power may have been the primary concern for MSFT; let's hope it at least reflects nicely in a sleek form factor.


Isn't the 7790 clocked at 1075MHz?
 
Because developers reading about an online rumor and actually adjusting their workflow based on it are two COMPLETELY different things.

The rumour came from a developer; ergo, developers knew about the possibility.

The way some go on, it was like a bolt out of the blue.
That Bonaire card should be a sweet choice ....
Had a thought earlier that it might be the PC equivalent card, run the 720 games on your windows 8 PC.

The extra grunt would make up for the lack of additional hardware like move engines and the extra layer of DX stuff it would have to wade through on PC.

Would make Windows ports incredibly easy.
 
The timing for such changes is off, though, because rumors for the PS4 GPU specs have existed for at least a year. I found this from April of last year, where the GPU specs were largely correct (AMD GPU, 18 CUs, 1.84 TFLOPS): http://www.ps3news.com/playstation-...ation-4-ps4-orbis-processing-specs-uncovered/
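As a sanity check, that 1.84 TFLOPS figure is at least internally consistent with the same GCN arithmetic used above, assuming 18 CUs at the rumored 800 MHz:

```python
# Sanity check of the rumored Orbis figure:
# 18 CUs * 64 ALUs * 2 FLOPs/cycle * 0.8 GHz
print(18 * 64 * 2 * 0.8)  # 1843.2 GFLOPS, i.e. the quoted ~1.84 TFLOPS
```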

And if that showed up on some rumor site, then certainly Microsoft was aware of it, but still felt no need to change what they were working on.

Well I'm approaching it strictly from the point that now that we have the entire Bonaire family for comparison, we can see what Durango may have been based upon. Or possibly the other way around. We can also see the obvious downclocking that's going on with Durango.

So anyone hoping for Microsoft to overclock would need to realize that it's NOT something they want to do at this point. But because the Bonaire family fits so well within Durango's profile, Microsoft could have chosen nearly any one of them for the final hardware.

I know how flakey that sounds, but I'm not as thoroughly convinced with the leaked specs as others are, and even the most knowledgeable seem to stop just short of giving a straight answer.
 
Shifty, what specific spec changes are you classifying as 'extremely implausible'? Adjusting clocks wouldn't be at all implausible, depending on how much they would be changed. Also, there is some reasonable skepticism about how old the VGLeaks info is and how much circular rumor mongering is actually happening vs independent sources corroborating info. SuperDaE was the primary source for VGLeaks, DF, and EDGE. His info was from Feb 2012.

So the question is how rigidly did EDGE/DF actually nail down the timelines based on info from their dev sources? DF says their info was only able to be tied back to May of last year. What about EDGE? Without knowing how they phrased their questions to their sources it's tough to know how accurate the 2012 specs were and how up to date they were.

What do we know about the dev kit scheduling? When were the alphas sent out? Betas?
 
So we have known about GCN for a while now, which as mentioned is scalable, and has already been in many configurations above 800MHz.

So ATI brings out a card with specs close to those rumored for the next Xbox and this "that must be IT / OVERCLOCK" hysteria starts?
 
So we have known about GCN for a while now, which as mentioned is scalable, and has already been in many configurations above 800MHz.

So ATI brings out a card with specs close to those rumored for the next Xbox and this "that must be IT / OVERCLOCK" hysteria starts?

It's all valid speculation until someone says this MUST be what it is. Which thankfully, no one appears to be doing.

And whatever Durango ends up being, it isn't going to be an overclock, obviously. If Durango launches with a 2 GHz Jaguar and a 1 GHz GPU, then that's what it is. Nothing was overclocked. If it's the 1.6 GHz and 800 MHz that are currently rumored, then that's what it is.

It is possible that final silicon is able to clock higher than expected. In which case clocking it higher than the original clocks released to developers is NOT an overclock. It just is what it is.

As an example, AMD's Radeon 7970 at launch had a target of 925 MHz. When they got final silicon, yields were quite good; almost all cards could reliably hit 1 GHz or more. But since 925 was the target, they stuck with it. If they had launched it at 1 GHz it would not have been overclocked; it just is what it is.

Anyway, personally, I'm not putting any stock into the people speculating that it could be higher. I'm doubtful that MS would do that even if yields on final silicon turned out to be far better than their wildest projections.

That said, I also wouldn't be surprised if it happened.

Either way, whatever clocks are released are going to be the clocks that are deemed to have the greatest stability and yield at the required performance. And until otherwise noted, that appears to be 1.6 GHz for the CPU and 800 MHz for the GPU.

Regards,
SB
 
Shifty, what specific spec changes are you classifying as 'extremely implausible'? Adjusting clocks wouldn't be at all implausible, depending on how much they would be changed. Also, there is some reasonable skepticism about how old the VGLeaks info is and how much circular rumor mongering is actually happening vs independent sources corroborating info. SuperDaE was the primary source for VGLeaks, DF, and EDGE. His info was from Feb 2012.

So the question is how rigidly did EDGE/DF actually nail down the timelines based on info from their dev sources? DF says their info was only able to be tied back to May of last year. What about EDGE? Without knowing how they phrased their questions to their sources it's tough to know how accurate the 2012 specs were and how up to date they were.

What do we know about the dev kit scheduling? When were the alphas sent out? Betas?

February of last year sounds about right to go from no silicon to a release machine this year; consoles take a long time to make. Any changes they made could have caused delays, and the beta dev kits with the final GPU silicon went out a month or two ago iirc, so any changes at this stage of the game mean not launching this year. And also probably a more expensive machine.
 
Any changes they made could have caused delays, and the beta dev kits with the final GPU silicon went out a month or two ago iirc, so any changes at this stage of the game mean not launching this year. And also probably a more expensive machine.

Improving clocks would necessitate delays? Wouldn't that be almost entirely up to MS and their yield tolerance? Sure, that equals $$$ obviously (and # of units manufactured by year end), but they may feel it is worth it.

Either way, I'm sure if devs felt their machine would be at a meaningful disadvantage compared to PS4 MS would make the necessary adjustments to satisfy devs as they have done in the past.
 
Improving clocks would necessitate delays? Wouldn't that be almost entirely up to MS and their yield tolerance? Sure, that equals $$$ obviously (and # of units manufactured by year end), but they may feel it is worth it.

Either way, I'm sure if devs felt their machine would be at a meaningful disadvantage compared to PS4 MS would make the necessary adjustments to satisfy devs as they have done in the past.

No, but if the yields are bad the launch may as well not happen. You're assuming devs could talk to Microsoft about the PS4; that would probably be a massive, massive violation of the NDA. Also, if you really want Microsoft to make changes, they can, but then they won't be launching this year.

How long do you think a console takes to go from paper to machine? Because I am led to believe that with all the design work, manufacturing and respins it takes a very, very long time, and they are going to need to start manufacturing units before the launch date, probably months before.
 
So it HAS been demonstrated that the console manufacturers will keep final decisions on hardware from developers. Is Microsoft doing the same? Doubtful, but it's possible.

Of course it's possible, lots of things are possible - the question is not whether it's possible or not, but whether it's likely or not, given what we know.

MS apparently knew that the PS4 was in all likelihood going to be more powerful than their box (since they'd have a high % of their BOM going to Kinect) and would have developed their strategy accordingly.

So they're definitely not going to be running around flailing their hands because the PS4 will have games that run at 30% higher resolution or something.
 
Would overclocking the gpu actually help?

I am assuming that MS finely tuned the performance of the GPU against the performance of their memory setup. And given that GPUs tend to be bandwidth starved, the proper course of action for increasing performance would seem to be to fiddle with the memory setup, or a combination of both, not just a simple OC of the GPU.

The 7790 is clocked higher, but the 102 GB/s of bandwidth it has access to is totally dedicated to the GPU, unlike Durango, which shares its memory with the CPU. It seems a simple OC would just starve the GPU even more, with the sole accomplishment of having the ALUs spend more time waiting around to be fed.
 
I don't see much point to overclocking, since short of a completely new chip it's still going to be weaker than the PS4. Overclocking will just make the yields worse and their chips more expensive.

This isn't like the Wii U, where the machine is too weak to get ports, the vast majority of people would be unable to differentiate between say BF4 running on PS4 and 720.

If I was MS I'd keep my BOM costs down as much as possible so I could undercut the PS4 on pricing.
 
Would overclocking the gpu actually help?

I am assuming that MS finely tuned the performance of the GPU against the performance of their memory setup. And given that GPUs tend to be bandwidth starved, the proper course of action for increasing performance would seem to be to fiddle with the memory setup, or a combination of both, not just a simple OC of the GPU.

The 7790 is clocked higher, but the 102 GB/s of bandwidth it has access to is totally dedicated to the GPU, unlike Durango, which shares its memory with the CPU. It seems a simple OC would just starve the GPU even more, with the sole accomplishment of having the ALUs spend more time waiting around to be fed.

The 102GB/s isn't shared with anything on Durango. It belongs 100% to the GPU and to memory clients that are a part of the GPU, such as the Move Engines. It isn't shared with the CPU at all. The 68GB/s of DDR3 is shared with the CPU.
 
Durango may well be less bandwidth starved than the 7790, PS4 or just about anything else. A GPU overclock may well also demand that the ESRAM is overclocked too, if there isn't any kind of flexibility with bus multipliers or whatever.
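To put rough numbers on that, here's a bytes-per-peak-FLOP sketch using the rumored figures. Treating the ESRAM and DDR3 pools as simply additive is a simplifying assumption (the DDR3 is shared with the CPU), and the 7790 and PS4 bandwidth numbers are the commonly quoted 96 GB/s and 176 GB/s, so treat this as illustration only:

```python
# Hypothetical bytes-per-peak-FLOP comparison. All figures are rumored or
# commonly quoted specs, and summing Durango's ESRAM + DDR3 pools ignores
# CPU contention.
configs = {
    "Durango (rumored, 800 MHz)":   (12 * 128 * 0.8, 102 + 68),
    "Durango (hypothetical 1 GHz)": (12 * 128 * 1.0, 102 + 68),
    "HD 7790 (1 GHz)":              (14 * 128 * 1.0, 96),
    "PS4 (rumored, 800 MHz)":       (18 * 128 * 0.8, 176),
}

for name, (gflops, gbps) in configs.items():
    print(f"{name}: {gbps / gflops:.3f} bytes/FLOP")
```

Note how the hypothetical overclock only lowers the ratio, since the memory pools stay put.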

The GPU is fine though. It's the CPU that matters, as that may lead to lower frame rates, which can't easily be rectified and can easily be noticed, while the GPU will just require lower resolutions, which almost no one will spot and only fanboys will care about (despite not being able to spot it either).
 
I don't see much point to overclocking, since short of a completely new chip it's still going to be weaker than the PS4. Overclocking will just make the yields worse and their chips more expensive.

This isn't like the Wii U, where the machine is too weak to get ports, the vast majority of people would be unable to differentiate between say BF4 running on PS4 and 720.

If I was MS I'd keep my BOM costs down as much as possible so I could undercut the PS4 on pricing.


yes, yes, yes, it will be less powerful component-wise than PS4... crisis averted. ;)
 
I think everyone is running around in circles. Let's just define some stuff so it doesn't get out of hand.

At some point last year, Microsoft had an idea of X CPU and Y GPU they were going to use at C1 and G1 clock speed respectively. At some point before the end of last year, that silicon really needed to be finalized in order for testing to be completed and production ramp-up to be good enough for yields to launch this year. This event would have had to have happened before the PlayStation Meeting. Whatever actual silicon they produced, the X and Y chip designs had to be locked down before the end of 2012 and cannot have changed since. Period.

However, clock speeds are flexible, if only because they are basically a function of money, not time. If you increase X's and Y's clock speeds even in May, you only need to worry about yields (money) and redesigning cooling (some time, some money, mostly noise). Almost no launch games will fully take advantage of those speeds beyond better frame rates.

All chips are designed with "optimal frequencies" based on their performance/power-usage curve. It is not "overclocking" when Microsoft changes the clock speeds of its chips unless they go outside of this optimal range. For the most part, the clock speed chosen is never close to the high end of this range, which, if they desire, leaves them some room to let it go higher. This still messes with yields. I'd liken it to how all Cell processors are designed with 8 SPEs, but Sony only validates 7 of them as working for use in the PS3. They very well could have said in July, "OK, the PS3 now has 8 SPEs!" because the silicon didn't change, but yields would have sucked (costing money and a time delay).
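As a rough illustration of why a clock bump costs more than it looks, and why a given percentage increase in clock tends to hurt power more than the same percentage increase in CU count, here's the textbook dynamic-power relation (P ∝ units × V² × f) with invented voltage numbers; nothing here reflects Durango's actual silicon:

```python
# Toy dynamic-power scaling: P ~ active_units * V^2 * f (voltages invented).
# A clock increase usually needs a voltage increase to keep marginal chips
# stable, so power grows faster than frequency; adding CUs at the same
# clock/voltage scales roughly linearly instead.
def relative_power(units, freq_mhz, voltage, base=(12, 800.0, 1.0)):
    u0, f0, v0 = base
    return (units / u0) * (freq_mhz / f0) * (voltage / v0) ** 2

print(relative_power(12, 1000, 1.00))  # +25% clock, same voltage: 1.25x
print(relative_power(12, 1000, 1.10))  # +25% clock, +10% voltage: ~1.51x
print(relative_power(15, 800, 1.00))   # +25% CUs at base clock: 1.25x
```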

Good enough?

tl;dr: Anything changing yields costs money, anything changing silicon costs time. It is too late to change silicon.
 