Xbox One (Durango) Technical hardware investigation

Fifty-three MHz?? What an arbitrary number. I think the better news is not necessarily the 80 GFLOPS or so, but the fact that the chip must be stable and running cool for them to even consider this.
Actually the round numbers you often see are pretty much just as arbitrary.
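
As a quick back-of-the-envelope check of that "80 GFLOPS or so" figure (a sketch only; the 12 CU / 768-lane GPU configuration is an assumption here, not something confirmed above):

cus = 12                      # assumed CU count (not stated in the thread)
lanes_per_cu = 64             # GCN ALU lanes per CU
flops_per_lane = 2            # an FMA counts as 2 FLOPs per clock
delta_ghz = 0.85333 - 0.800   # the ~53.33 MHz bump, expressed in GHz
extra_gflops = cus * lanes_per_cu * flops_per_lane * delta_ghz
print(round(extra_gflops, 1)) # ~81.9, i.e. the "80 GFLOPS or so" above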
 
Hi guys
This may be a dumb question and theory, but could this just not be a by-product of the design process?
By that I mean designing a graphics chip to run at a certain voltage and then finding out that, at that voltage, your chip runs at a higher frequency than planned for.

And if so, would this explain the eSRAM increase as well?

After all, all we really know is that Microsoft's engineers have worked on every chip in their box, i.e. making changes to base designs and, as in the case of SHAPE, designing their own chips.

If this has been the case, and we have no reason not to believe Microsoft on this, then really we can't compare the graphics chip inside the Xbox One to any chip on the market until it's released, because we have no idea how, where and in what way Microsoft played about with their graphics solution.

After all, Microsoft are closing out Hot Chips in August, which can mean only one of two things in my book: they're going to talk about Kinect, or about their APU and graphics solution.
Sorry if this is a dumb question, I'm not a tech guy so please forgive me :)
 
Actually the round numbers you often see are pretty much just as arbitrary.
Yes, they're only 'round' because we have a decimal system of representing value. If our numbering system was base 6, XB1's original clockspeed would have been 3412, and we'd be saying what an odd number, why not 3400 (792 MHz in base 10)?
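
A minimal sketch of the base-6 conversion being described, just to show the arithmetic:

def to_base(n, base):
    # repeatedly divide, collecting remainders as digits
    digits = ""
    while n:
        digits = str(n % base) + digits
        n //= base
    return digits or "0"

print(to_base(800, 6))  # "3412" -- the original 800 MHz clock in base 6
print(to_base(792, 6))  # "3400" -- the rounder-looking value, 792 MHz in base 10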

MrFox's explanation sounds very good. My question is why 6%? Why not a larger amount? What's the limiting factor that a little nudge more isn't possible (true of any chosen clockspeed)? Also, why wouldn't the CPU get a boost? The heat increase from taking 8 small CPU cores (25 mm^2, I believe) from 1.6 GHz to 1.7 GHz can't be that much. :???:
 
This is a 6% clock adjustment, it's a non-story.

The real story is why did they announce it? They announced a clock increase to a previously unannounced GPU clock rate? Just trying to make the news? They just let a lot of people down; people in these very forums were talking 1000-1100 MHz not too long ago, and now that the clock rate is known they have to move their hopes elsewhere.
 
Yes, they're only 'round' because we have a decimal system of representing value. If our numbering system was base 6, XB1's original clockspeed would have been 3412, and we'd be saying what an odd number, why not 3400 (792 MHz in base 10)?

MrFox's explanation sounds very good. My question is why 6%? Why not a larger amount? What's the limiting factor that a little nudge more isn't possible (true of any chosen clockspeed)? Also, why wouldn't the CPU get a boost? The heat increase from taking 8 small CPU cores (25 mm^2, I believe) from 1.6 GHz to 1.7 GHz can't be that much. :???:

If those small CPU cores also took a 0.05 V increase (or more) to reliably hit that clock, you are approaching what some may consider 'much'.
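
For a rough sense of why a voltage bump matters: dynamic power scales roughly with C*V^2*f, so it compounds with the frequency increase. The 1.10 V baseline below is purely an illustrative assumption:

v0, v1 = 1.10, 1.15    # assumed baseline core voltage and a +0.05 V bump
f0, f1 = 1.6, 1.7      # CPU clock in GHz, before and after
scale = (v1 / v0) ** 2 * (f1 / f0)   # dynamic power ~ C * V^2 * f
print(round(scale, 3))               # ~1.161, i.e. ~16% more dynamic power, not ~6%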
 
Yes, they're only 'round' because we have a decimal system of representing value. If our numbering system was base 6, XB1's original clockspeed would have been 3412, and we'd be saying what an odd number, why not 3400 (792 MHz in base 10)?

MrFox's explanation sounds very good. My question is why 6%? Why not a larger amount? What's the limiting factor that a little nudge more isn't possible (true of any chosen clockspeed)? Also, why wouldn't the CPU get a boost? The heat increase from taking 8 small CPU cores (25 mm^2, I believe) from 1.6 GHz to 1.7 GHz can't be that much. :???:

Someone mentioned the clock bump being the highest they could go without increasing voltage. That would be a pretty good reason to not push it further.
 
My question is why 6%? Why not a larger amount? What's the limiting factor that a little nudge more isn't possible (true of any chosen clockspeed)? Also, why wouldn't the CPU get a boost? The heat increase from taking 8 small CPU cores (25 mm^2, I believe) from 1.6 GHz to 1.7 GHz can't be that much. :???:
If I were to take a guess (and it is just that), I'd say they are looking at the (clock) margins applied to the original target yield point and reviewing them at a system level - for a given reliability target, system-level elements such as the quality of voltage delivery from the platform, the actual temperatures being run, and so on, can make a difference. Some yield data from more silicon lots may also be coming back, suggesting that they can target a slightly higher clock without impacting the expected yield.

So, why 6%? Yes, almost certainly because it's "free", in which case, why not?
 
This is a 6% clock adjustment, it's a non-story.

Yes and no. For every up-clock rumour (positive) there was a counter down-clock rumour (negative); this sets the record straight.
 
If I were to take a guess (and it is just that), I'd say they are looking at the (clock) margins applied to the original target yield point and reviewing them at a system level - for a given reliability target, system-level elements such as the quality of voltage delivery from the platform, the actual temperatures being run, and so on, can make a difference. Some yield data from more silicon lots may also be coming back, suggesting that they can target a slightly higher clock without impacting the expected yield.

So, why 6%? Yes, almost certainly because it's "free", in which case, why not?
The $64,000,000 question is how cautious Microsoft have been with this clock increase, due to what I can only guess is a relatively small sample of chips. I wonder if they have the scope to monitor the consoles' electronics and thermals remotely and, perhaps, after sampling millions of units actually being used by users for a few weeks, to crank the clock up another 50 MHz with a firmware update.
 
Yes, I consider it a major story since it quells many of the ridiculous rumours that have been floating around for a while. All we need now is confirmation that it only has 8GB and a real indication of the eSRAM impact, and we can all get on with our lives.

Incidentally, the clock increase improves the setup rate of the XB1 over the PS4 if you're interested in that kind of thing. Not that I'd expect that to have an appreciable impact on games.
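
For the curious, the setup-rate comparison works out roughly as below, assuming both GPUs set up two primitives per clock (an assumption about the parts, not something stated above):

prims_per_clock = 2                # assumed for both GPUs
xb1 = 853.33e6 * prims_per_clock   # ~1.71 Gprims/s
ps4 = 800.00e6 * prims_per_clock   # ~1.60 Gprims/s
print(round(xb1 / 1e9, 2), round(ps4 / 1e9, 2), round((xb1 / ps4 - 1) * 100, 1))
# -> 1.71 1.6 6.7  (a ~6.7% edge in setup rate)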
 
My best theory so far is that MS made a pact with the devil. 853.33 is in fact a 6.66% increase, which was part of Microsoft's incantations as a last-ditch effort to increase sales. His real name is Steve Ba'almer. :devilish:
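
The numerology more or less checks out; the 2560/3 MHz figure below is only a guess at where the repeating .33 comes from:

print(53.33 / 800 * 100)    # ~6.666 -- the quoted increase over 800 MHz
print(2560 / 3 / 800 - 1)   # ~0.0667 -- exactly 1/15, if the clock is really 2560/3 MHz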

Or... considering the PS4 had the exact same clock planned, it's probably AMD that made the 800 MHz recommendation as an estimate for the best efficiency; they are the ones who know best. Now that real final production has started, if it still has good yield at 853.33 without increasing the voltage, that's a no-brainer. I wonder if that means Sony published their final clock too quickly, and if they'll stick with it. They are surely bound to almost the same variables as Microsoft.
 
Yes, I consider it a major story since it quells many of the ridiculous rumours that have been floating around for a while. All we need now is confirmation that it only has 8GB and a real indication of the eSRAM impact, and we can all get on with our lives.

Incidentally, the clock increase improves the setup rate of the XB1 over the PS4 if you're interested in that kind of thing. Not that I'd expect that to have an appreciable impact on games.

I think I'd only ever heard of 50-100 MHz anyway, so this falls in line with that; a lot of people were actually calling ANY up-clock ridiculous. I do think it makes upward changes to the RAM seem more possible now, as MS has shown a willingness to deviate from their original specs.


Or... considering the PS4 had the exact same clock planned, it's probably AMD that made the 800 MHz recommendation as an estimate for the best efficiency; they are the ones who know best. Now that real final production has started, if it still has good yield at 853.33 without increasing the voltage, that's a no-brainer. I wonder if that means Sony published their final clock too quickly, and if they'll stick with it. They are surely bound to almost the same variables as Microsoft.

I remember reading that the CUs are the biggest heat generators on an APU. So even at the same voltage, wouldn't running the chip faster generate relatively more heat from the CUs? (50% as people enjoy pointing out ;) )
 
GB: Post-Xbox One launch and when this system is available, is there a reason for people to have proper Xbox One development kits? Is there a significant difference between what the developers get access to in terms of building their games?

Whitten: Our goal is for you to be able to have full access of the system and the services on Xbox Live. Also, this is a dev kit. This is the way that we will think about dev kits for people on my team that are working on Xbox One. There’s no “this is a second class sort of experience” type of thing. Right now, obviously, in the build-up to a platform launch, there’s lots of special builds and lots of special kits and all that kind of stuff, but that’s more time and place.

GB: But this isn’t a situation where, if you just pick up an Xbox One at Target, you’re only going to be able to access certain parts of the memory, certain parts of the graphics processor? This is going to allow you, at least eventually, once it’s all put into place, to be able to do everything that someone like Respawn is doing?

Whitten: That’s right.

This makes me feel that retail and dev kits will be indistinguishable. And if that's the case, a 12GB upgrade for the retail system is incoming.
 
^^^
As of today, MS still says retail units have 8GB, and we don't know for sure if devkits have 12GB.
Costs and time also play a role in this matter.
 
I remember reading that the CUs are the biggest heat generators on an APU. So even at the same voltage, wouldn't running the chip faster generate relatively more heat from the CUs? (50% as people enjoy pointing out ;) )
To me, the biggest glaring difference is actually the memory PHY, with the PS4 being GDDR5 and the Xbox being DDR3 - that's a whacking great big variable. Outside of that, decision points on operating temps, power supply components, and even things like the number of PCB layers are variables that will play into whether similar things can be done elsewhere.
 