Xbox One (Durango) Technical hardware investigation

Not sure where to post this; I would have gone with the now-defunct "predict the next generation etc." thread, but reading articles about Haswell and the embedded eDRAM included with the GT3e version of the chip, I can't help but feel like MSFT (and it applies to Sony too) somehow fell on the wrong side of the technology curve.

They fell on exactly the only spot they could given the time-frame they're launching in. They picked the absolutely worst time window to launch in (2013). Significantly greater technology is so close on the horizon; I would have preferred a system launch in mid-2015.
 
They're on the "don't have Intel's engineering or process tech" curve, and they weren't going to spend the billions of dollars Intel invests in its 22nm process and fabs.
What was doable? An IBM 32nm SRAM sized to 128MB, or 32nm eDRAM that is best known for being used for chips that cost as much as a small car?
I think it was doable by involving IBM in the project; they seemed willing to take on a contract (on top of Nintendo's) in the console realm. I've often read that they have spare fab capacity, and even without managing to sell a POWER-based CPU they may have been happy to make some money out of those fabs and put some extra money (R&D) into the project.
It would not have been free, for sure; that's an R&D expense, and then you have the cost of the chip itself. But I don't think a roughly 300 mm^2 chip on TSMC's 28nm process, a 256-bit bus, and all the R&D associated with the eSRAM and move engines came for free either.

IBM is used to dealing with off-chip L3, and I would think they could have found something workable alongside AMD's engineering team. Durango already looks like a massive engineering effort, with a custom sound-processing block, move engines, etc. Using another contractor on top of AMD (and in-house efforts) would definitely have raised the bill, but by how much? The solution they chose comes with its own expense (more on the production side, though I guess quite some money went into R&D too).

I'm not sure it's fair to compare the price IBM sells its CPUs for (top of the line at what they do, more often than not backed by proprietary software) with the price at which they could have sold an eDRAM chip in a sector they aren't competing in, just to make some money out of what seem to be not-that-busy foundries (not unprofitable, but extra money doesn't hurt either).

Anyway, just wondering. The solution might have proved less performant, IBM may have been unwilling to let either Sony or MSFT access its 32nm process (without at least securing a POWER-based CPU in next-gen consoles), etc. Still, it looks like a pretty tempting solution, especially since, contrary to what was speculated, Haswell doesn't seem to rely on a "fancy/complicated" interposer supporting a wide 512-bit bus, but on a narrower, high-frequency link.
 
They fell on exactly the only spot they could given the time-frame they're launching in. They picked the absolutely worst time window to launch in (2013). Significantly greater technology is so close on the horizon; I would have preferred a system launch in mid-2015.
I do agree upcoming memory technologies could be game changers, not to mention the possible jump "over" the ~22nm node to finer lithography + FinFETs.
I always hoped for a major system to be released this fall, and I think that if Nintendo were going to use 45nm they should have released a system in fall 2011.
Now both the PS4 and the new Xbox are launching well into the lifespan of the 28nm process (which could have a long life), yet really close to what could be substantial improvements / a good hit at the memory wall.
 
So we're back to debating completely unsubstantiated spec-bump hypotheses?

heh, it wasn't long ago most people were attacking the rumored specs. "We know nothing, they're just rumors"

Now they're canonized I guess.
If they release something above what vgleaks suggests, is that a spec bump or were they just wrong?

Guess that depends how closely the majority of the system tracks. If say, just the clocks or ram quantity are different, it suggests a bump.
 
heh, it wasn't long ago most people were attacking the rumored specs. "We know nothing, they're just rumors"

Now they're canonized I guess.


Guess that depends how closely the majority of the system tracks. If say, just the clocks or ram quantity are different, it suggests a bump.

The rumours from vgleaks are not canonized at all, but they are far more likely considering the PS4 specs turned out to be very accurate. Those rumours are at least a good starting point. If we're going to play the "what if?" game, then there are a million things we could talk about, mostly without reason.
 
Here's another from the "truckload of salt" dept, from ekim at GAF. I'm not sure what ekim does, but I want to say he's vaguely connected to the industry somehow, or he at least knows his tech stuff/does programming. Anyway he's one of the heavyweights in the spec threads.

http://www.neogaf.com/forum/showpost.php?p=56750682&postcount=1906

Ok - another one from me:
You remember Master_Jo who was banned in the 6-Months-Delay thread?
He was banned because he added "This is not the full story" to Thuway's leak about heating problems with devkits.

He contacted me after his ban on Twitter and elaborated on his statement.
It's basically that he works for a big partner (nothing gaming related) of Microsoft and has heard from MS employees that there are indeed heating issues, but that the reason for this is an upgrade of the specs.

I don't know what to make of this, but I guess a mod could confirm whether he really works for an MS partner via the email address he used for registration.

But yeah, that's about as hearsay as it gets, kind of debatable whether even worth posting. But it could give another possible angle on things.
 
I guess you can afford to be more economical in your approach when there isn't much pressure on the graphics performance of your design.
Haswell should do quite well, especially taking into account that power consumption was a really strong concern during the design (25 watts for the GT3e BGA version?). IMO Intel's GPUs are underestimated; I reread the RealWorldTech articles about both the Sandy Bridge and Ivy Bridge GPUs and I think they have pretty awesome tech in their hands. I find it neat how the GPU can act as SIMD2x4, SIMD8, or SIMD16 and deal with different workloads efficiently, and likewise the whole memory hierarchy/register files and how the shader cores communicate with it and/or the fixed-function hardware.
Actually, to my untrained brain, I'd be close to thinking they could very well have the best architecture in the world; both perf per mm^2 and perf per watt seem pretty awesome to me, and if you focus on compute performance they are in a league of their own.

Putting aside the difference in raw GPU arithmetic power for a second, Haswell should have 100+ GB/s of bandwidth to play with. Durango has (significantly) more, but the system may have to move render targets around every once in a while, may read textures from main memory most of the time, etc. On the other hand, Haswell's big L4 pool should free the link to main memory from most graphics-related traffic. The extra bandwidth in Durango comes at a cost: a 256-bit bus and 32MB of embedded eSRAM. Meanwhile it's unclear how far the link in Haswell could be pushed (I would think it is scaled/set for a lesser GPU than Durango's).
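To put some rough numbers on that comparison, here's a quick back-of-the-envelope sketch. The Durango figures are the rumored vgleaks ones and the Haswell figures are generic desktop DDR3 numbers, so treat all of them as illustrative rather than confirmed:

```python
# Back-of-the-envelope peak bandwidth figures, using the rumored/assumed numbers
# discussed above (vgleaks Durango specs, generic Haswell DDR3 assumptions).

def dram_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    """Peak DRAM bandwidth = bus width in bytes * transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mt_s * 1e6 / 1e9

# Rumored Durango main memory: 256-bit DDR3-2133
print(dram_bandwidth_gb_s(256, 2133))   # ~68 GB/s
# Rumored Durango eSRAM bandwidth, quoted from the leaked docs rather than derived
print(102)
# Typical dual-channel DDR3-1600 feeding a Haswell part, before the GT3e eDRAM link
print(dram_bandwidth_gb_s(128, 1600))   # ~25.6 GB/s
```

The point being that GT3e only gets into the 100+ GB/s range thanks to the eDRAM link, while its plain DRAM bandwidth is a fraction of Durango's.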
 
Here's another from the "truckload of salt" dept, from ekim at GAF. I'm not sure what ekim does, but I want to say he's vaguely connected to the industry somehow, or he at least knows his tech stuff/does programming. Anyway he's one of the heavyweights in the spec threads.

http://www.neogaf.com/forum/showpost.php?p=56750682&postcount=1906



But yeah, that's about as hearsay as it gets, kind of debatable whether even worth posting. But it could give another possible angle on things.
Truckload of salt aside, looking at the timeline and the issue being overheating, I would think of a beefy overclock rather than a more profound rework of the design.
We know that a GPU like Bonaire burns some power and can handle the heat. I'm mostly ignorant, so I ask bluntly: could some other parts of the design have been designed in such a way that they can't handle the extra heat generated by the GPU? Or, depending on how many voltage domains (or clock-speed domains) there are in the chip, could some parts have to deal with more voltage than they were designed for, with an adverse effect on the heat they generate? In that case I would not expect the extra heat to affect the GPU or CPU; I would think both should be able to handle more voltage and extra heat given a better cooling solution, but other parts of the chip designed to handle less voltage (/heat) could start to heat up too much, ultimately resulting in software errors/failures.

EDIT
Sorry my posts are shitty... I do a really bad job at making my point clear.
Say there are 2 clock domains in the chip but only one voltage domain (at least at a given time):
1.6 GHz for the CPU
0.8 GHz for the GPU, DSP, and eSRAM.
Now you raise clocks and voltage, like:
2 GHz for the CPU, 1 GHz for the GPU + DSP.
The CPU and GPU could deal with it to a given extent, but the DSP now has to cope with both the extra voltage and the increased clock speed and just can't handle it.
Troublesome: you can't add clock or power domains to the chip without a major rework (it could be more of a rework than adding CPU or GPU cores).
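For intuition on why a combined voltage and clock bump hits a block designed for lower voltage so hard, here's a minimal sketch of the usual dynamic-power scaling rule (P roughly proportional to C·V²·f). The voltage values are made up purely for illustration; nothing here comes from leaked figures:

```python
# Rough dynamic-power scaling for the hypothetical clock/voltage bump above.
# P_dynamic ~ C * V^2 * f, so a bump multiplies power by (f_new/f_old)*(V_new/V_old)^2.

def power_scale(f_old_ghz, f_new_ghz, v_old, v_new):
    """Factor by which dynamic power grows for a given clock/voltage bump."""
    return (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

# GPU/DSP/eSRAM domain: 0.8 GHz -> 1.0 GHz, with an assumed 1.0 V -> 1.1 V bump
print(power_scale(0.8, 1.0, 1.0, 1.1))   # ~1.51x dynamic power
# CPU domain: 1.6 GHz -> 2.0 GHz with the same assumed voltage bump
print(power_scale(1.6, 2.0, 1.0, 1.1))   # ~1.51x as well
```

A block sharing the raised voltage rail takes the V² penalty even if the higher clock itself is no problem for it.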
 
Assuming this rumor is true (which it probably isn't)

I don't believe for a second that the devkits' heatsinks weren't already built with margins allowing a change of voltage/clock; they probably didn't know what the final voltage or clock would be when they designed the devkits and were waiting for the final silicon data. Considering they had a problem with heat on the 360, it's fair to say they'd be much more careful this time (blah blah solder, yeah sure). It would also mean the highest fan speed isn't even enough, when it's supposed to be a noisy fan speed that, in consumer electronics, is there "just in case" for high altitudes and hot climates.

So.... here's a conspiracy theory... (my favorite!)

Maybe AMD had surprise power issues with the final silicon, and MS had to redesign the box with a better heat sink, causing a delay. When Sony announced the PS4, MS people joked something like "they didn't show the box, I wonder why?". If MS had power consumption problems with their SoC, it's a safe bet Sony did too, and they obviously knew that, because the designs are so similar. I.e. both companies suffered a similar delay but reacted differently to it.
 
The rumours from vgleaks are not canonized at all, but they are far more likely considering the PS4 specs turned out to be very accurate. Those rumours are at least a good starting point. If we're going to play the "what if?" game, then there are a million things we could talk about, mostly without reason.

I'm not going to say they're totally airtight, but faking the documentation we've got so far would have been one hell of a lot of work, far beyond what you see with most leaks, which are either rumors that got mangled by several layers of poor communication or someone who doesn't know anything looking for attention. If they're fake, they were probably deliberately set loose by professionals, maybe MS themselves (although I have no idea why that would happen, except maybe to throw Sony off).

I don't think I've ever seen something like that happen before.
 
I'm not going to say they're totally airtight, but faking the documentation we've got so far would have been one hell of a lot of work, far beyond what you see with most leaks, which are either rumors that got mangled by several layers of poor communication or someone who doesn't know anything looking for attention. If they're fake, they were probably deliberately set loose by professionals, maybe MS themselves (although I have no idea why that would happen, except maybe to throw Sony off).

I don't think I've ever seen something like that happen before.

My main issue has always been that Microsoft has never gone after them. When the Yukon/Roadmap was leaked, MS went after those sites aggressively to get it taken down. They really don't seem to care what VGleaks is putting out there.

When Adam Orth ran his mouth, he got fired and Microsoft made a statement. Again, with VGleaks, nothing. No action, not even a request to have it taken down. They went after SuperDaE with detectives and local authorities, but no interest in the site that's supposedly putting ALL of their plans out there for everyone to see?

I'm willing to believe that VGleaks has about half the truth. Maybe half.
 
I think it was doable by involving IBM in the project; they seemed willing to take on a contract (on top of Nintendo's) in the console realm.
Says who, and even if it were offered, why exactly would they have more to offer?

It would not have been free, for sure; that's an R&D expense, and then you have the cost of the chip itself. But I don't think a roughly 300 mm^2 chip on TSMC's 28nm process, a 256-bit bus, and all the R&D associated with the eSRAM and move engines came for free either.
Higher R&D for the on-die SRAM pool is a one-time cost, and pretty much everybody knows how to handle SRAM on-die. SRAM can be engineered for high redundancy, so yield impact is not as bad as raw die area would suggest. There's just one chip on the package, and the SRAM will shrink with any node transition.
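To illustrate why the on-die pool is the more tractable option, here's a minimal area back-of-the-envelope. The 6T bit-cell size and the overhead factor are assumptions chosen for illustration, not known Durango figures:

```python
# Rough area estimate for a 32MB on-die SRAM pool at 28nm.
# CELL_AREA_UM2 and OVERHEAD are assumptions, not disclosed numbers.

CELL_AREA_UM2 = 0.15   # assumed 6T SRAM bit-cell area at 28nm (published cells are ~0.12-0.16 um^2)
OVERHEAD = 1.5         # assumed factor for decoders, sense amps, redundancy, ECC

bits = 32 * 1024 * 1024 * 8                 # 32 MB of SRAM
array_mm2 = bits * CELL_AREA_UM2 / 1e6      # raw bit-cell area in mm^2
total_mm2 = array_mm2 * OVERHEAD

print(round(array_mm2, 1))   # ~40 mm^2 of raw cells
print(round(total_mm2, 1))   # ~60 mm^2 with overhead
```

Tens of mm² on the main die is a real but bounded cost, and as noted it shrinks with every node transition, unlike a separate eDRAM die.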

A separate eDRAM die like Intel's adds maybe an extra $50 per unit, and it will not integrate onto a die built with a standard logic process, meaning it stays an extra component even if the main APU shrinks.

I'm not sure it's fair to compare the price IBM sells its CPUs for (top of the line at what they do, more often than not backed by proprietary software) with the price at which they could have sold an eDRAM chip in a sector they aren't competing in, just to make some money out of what seem to be not-that-busy foundries (not unprofitable, but extra money doesn't hurt either).
This is correct, IBM has a yield tolerance that is buffered by selling its chips for that much money and selling services for more money than they'd get selling my organs on the black market. Microsoft wants an affordable component.

Anyway, just wondering. The solution might have proved less performant, IBM may have been unwilling to let either Sony or MSFT access its 32nm process (without at least securing a POWER-based CPU in next-gen consoles), etc.
Are you certain they'd want to use IBM's process?
 
So.... here's a conspiracy theory... (my favorite!)

Maybe AMD had surprise power issues with the final silicon, and MS had to redesign the box with a better heat sink, causing a delay. When Sony announced the PS4, MS people joked something like "they didn't show the box, I wonder why?". If MS had power consumption problems with their SoC, it's a safe bet Sony did too, and they obviously knew that, because the designs are so similar. I.e. both companies suffered a similar delay but reacted differently to it.

Does that really make sense to anybody for the vgleaks Durango specs? The 1GHz 7770/7790 TDPs are around 80/85W and the 7750 is at 55W, so I would assume Durango to be less than 100W TDP.
 
Does that really make sense to anybody for the vgleaks Durango specs? The 1GHz 7770/7790 TDPs are around 80/85W and the 7750 is at 55W, so I would assume Durango to be less than 100W TDP.

Somebody told me the Durango PSU is rated at 220 watts, higher than the 360's at launch. No idea if that has any validity.

At the least you're probably looking at a fair amount more than your calculations. You have to throw in the eSRAM, Kinect stuff, USB stuff, HDD, Blu-ray, the monster audio chip, the 8-core CPU, etc.
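For what it's worth, here's a very crude budget along those lines. Every figure is a guess made for illustration (nothing is leaked or measured), but it shows why a GPU-only estimate undershoots the system's total draw:

```python
# Very rough system power budget; all values are illustrative guesses.

budget_w = {
    "GPU (7770/7790-class)": 80,
    "8-core Jaguar CPU": 25,
    "eSRAM + move engines": 10,
    "DDR3 memory": 10,
    "HDD + Blu-ray drive": 10,
    "Audio block, I/O, USB, Kinect port": 15,
}

total = sum(budget_w.values())
print(total)            # ~150 W of DC load on the PSU
print(total / 0.85)     # ~175 W at the wall, at an assumed ~85% PSU efficiency
```

Which would make a 220 W PSU rating plausible without implying anything exotic.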
 
there are indeed heating issues, but that the reason for this is an upgrade of the specs.

It's more an upgrade of the clocks, I suppose. The 2 GHz on the CPU is trivial, that alone can't help the system much; I think it's 960-1050 MHz on the GPU, which is the very normal range for AMD's 77x0 GPUs, nothing extraordinary to do, but a nice +25% in GPU power.
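Quick sanity check on that +25% figure, using the rumored 12-CU GPU from the vgleaks docs; the 1 GHz clock here is the hypothetical bump being discussed, not a confirmed spec:

```python
# Theoretical FP32 throughput for the rumored 12-CU Durango GPU at two clocks.
# 12 CUs * 64 lanes * 2 FLOPs (FMA) per clock; the 1 GHz figure is hypothetical.

def gflops(compute_units, clock_ghz):
    return compute_units * 64 * 2 * clock_ghz

base = gflops(12, 0.800)            # ~1229 GFLOPS at the rumored 800 MHz
bumped = gflops(12, 1.000)          # ~1536 GFLOPS at a hypothetical 1 GHz
print(base, bumped, bumped / base)  # ratio 1.25, i.e. the +25% mentioned above
```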
 
The only weirdness about such a thing is that I presume any rejiggering of Durango specs came in response to Sony's 8GB announcement in late February.

But nothing about the overheating rumors suggested it's only something that just now came up or is connected to a spec change *shrug*
 
@GrimThorne

The First Amendment protects the press, so MS is not allowed to shut news sites up even if it wants to.
Also the FBI went after SuperDae anyway.
 