Xbox One (Durango) Technical hardware investigation

It could also be about energy efficiency. That could involve both the eSRAM and the audio DSP. And perhaps even more such as any energy efficiency benefits of the move engines or customization of the GPU.

If it turns out that the GPU is not stock then they might talk about how they achieved certain performance versus energy consumption improvements.

http://www.vgleaks.com/world-exclusive-durangos-move-engines/

(Suggesting energy efficiency from some of the MS research papers that were floating around.)

Very unknown right now but it holds the promise to be quite interesting if they spill the beans on some of the internal details related to some of the debated questions.
 
And the Xbox One isn't running at a nominal 2.3 GHz (3.3 GHz with Turbo), or running two 7200 rpm drives. The systems aren't directly comparable, but there are components in the MM running at much higher speeds than the Xbox One. The problem for both boxes is the same.

If we dive into the world of PC hardware for a moment.

A Core i7-3970X (Intel rated 150 watt TDP) at 3.5-4.0 GHz (depending on load) with 6 cores and 12 threads draws ~147 watts for the system when 1 thread is loaded and ~265 watts when 100% loaded (all 6 cores). That is for the entire system with minimal load on the GPU (in idle state). [http://www.xbitlabs.com/articles/cpu/display/core-i7-3970x_7.html#sect0 ]

Total system power with a 7790 when running a game (in this case Crysis 3), which doesn't fully load the CPU, is 377 watts. Since it isn't loading the CPU fully as in the test above (basically identical testbed hardware), the power draw due to the 7790 is between 112 watts and 230 watts. Since Crysis 3 is good at using multiple cores, it's likely closer to 112 than 230. [http://www.xbitlabs.com/articles/gr...-hd-7790-geforce-gtx-650ti-boost_8.html#sect0 ]

I used the 3970X initially in order for us to somewhat isolate the potential power draw of the 7790 as they are using the same basic hardware platform.
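The subtraction above can be sketched as a quick back-of-the-envelope calculation. The wattage figures come from the xbitlabs reviews linked above; the assumption is that the testbed is otherwise identical between runs:

```python
# Rough isolation of GPU power draw from whole-system wall-socket
# measurements. Figures are from the xbitlabs reviews cited above;
# the testbed is assumed identical between the two test setups.

SYSTEM_1_THREAD_GPU_IDLE = 147   # W: 3970X system, 1 thread loaded, GPU idle
SYSTEM_ALL_CORES_GPU_IDLE = 265  # W: 3970X system, all 6 cores loaded, GPU idle
SYSTEM_GAMING_WITH_7790 = 377    # W: same testbed + HD 7790 running Crysis 3

# The game loads the GPU fully but the CPU only partially, so the
# 7790's contribution lies somewhere between these two bounds:
upper_bound = SYSTEM_GAMING_WITH_7790 - SYSTEM_1_THREAD_GPU_IDLE   # 230 W
lower_bound = SYSTEM_GAMING_WITH_7790 - SYSTEM_ALL_CORES_GPU_IDLE  # 112 W

print(f"HD 7790 draw: between {lower_bound} W and {upper_bound} W")
```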

That certainly makes it seem like the CPU is drawing the majority of the power. But what if we compare it to a current high-end CPU that isn't factory clocked to such an extent that the power target is well past the bend in the power curve, where power requirements shoot up far faster than clock speed?

Take a Core i7-4770K (Intel rated 84 watt TDP) at 3.7-3.9 GHz with 4 cores and 8 threads (still far more powerful than the CPU core in the next gen consoles), which draws 95 watts with 1 core loaded and 164 watts at 100% load. Now it's starting to look more equal. [http://www.xbitlabs.com/articles/cpu/display/core-i7-4770k_11.html#sect0 ]

Power requirements for the CPU in the next gen consoles are going to be far lower than for the Core i7-4770K; likely AMD would rate it somewhere between 10 and 20 watts in isolation from the GPU core. The majority of the rest is going to be consumed by the GPU core. Which makes sense: minus the board, component, and memory power requirements of a discrete card, that's right about where you would expect the GPU alone to be given its specs.

Now go to the PS4 GPU, where you have 50% more compute units (the most power-hungry units in a GPU). If those are pushed to the max, the SOC is going to consume significantly more power than the Xbox One SOC when pushed to the max.

As for a "2D idle state" where you are just playing 1080p video: a 7850 still requires a fan for cooling when playing 1080p video. You likely could play 1080p video passively cooled with a 7850, but it would require a heatsink much larger than what you could comfortably fit in the PS4 enclosure, especially when you take into consideration that there aren't many vents to prevent heat from pooling within the case without active airflow. Combine that with the fact that there isn't a lot of empty space in the PS4 enclosure (compared to a PC enclosure, with lots of space available for passive cooling of moderately powerful GPUs) to absorb that heat before heat buildup starts to compound, leading to a rapid cascading increase in internal case temperatures and thus chip temperatures.

Basically, the PS4 case was designed more around aesthetics, with a cooling system then developed to enable that. The cooling system could be quite efficient, but due to the design of the case, it requires active cooling at all times. It is similar in philosophy to a fully shrouded GPU cooler: very efficient for the space involved, especially when you do not know the case dimensions (and thus the available space or airflow to accommodate an open cooler), but it requires active cooling, without which the GPU would quickly fry itself even doing something as simple as video playback. An open (minimally shrouded) cooler, on the other hand, can be quieter with greater cooling in a large case with lots of open air (possibly even passively cooled at lower load states), but could be disastrously bad and noisy if put into a case with limited internal space and limited airflow (like the PS4).

Looking at the case design of the Xbox One, combined with the lower-power SOC, it should be possible to passively cool something like that in lower load states. With the PS4 case design combined with a more powerful and power-hungry SOC, that just isn't going to be the case.

Regards,
SB
 
If we dive into the world of PC hardware for a moment.
Sorry, I didn't get your point? It's moot to discuss desktop CPUs and GPUs when the Xbox One is based on Jaguar, which is, quoting AMD, aimed at "low-power, subnotebook, netbook, ultra-thin and small form factor markets". I'm ignoring the PS4 stuff; the mods will only trim it.
 
Sorry, I didn't get your point? It's moot to discuss desktop CPUs and GPUs when the Xbox One is based on Jaguar, which is, quoting AMD, aimed at "low-power, subnotebook, netbook, ultra-thin and small form factor markets". I'm ignoring the PS4 stuff; the mods will only trim it.

Power requirements don't change. You can scale them somewhat by designing for a specific target, but the power for the GPU resources isn't going to change significantly compared to their desktop counterparts. While the CPU is obviously a mobile design, and thus low power, the GPU configuration isn't when it's configured to be powerful enough to run next gen games.

Hence, the power combined with the form factor of the two consoles is going to dictate the cooling required, and whether or not a console design is going to require some level of active cooling at all times.

Regards,
SB
 
Basically, the PS4 case was designed more around aesthetics, with a cooling system then developed to enable that. The cooling system could be quite efficient, but due to the design of the case, it requires active cooling at all times. It is similar in philosophy to a fully shrouded GPU cooler: very efficient for the space involved, especially when you do not know the case dimensions (and thus the available space or airflow to accommodate an open cooler), but it requires active cooling, without which the GPU would quickly fry itself even doing something as simple as video playback. An open (minimally shrouded) cooler, on the other hand, can be quieter with greater cooling in a large case with lots of open air (possibly even passively cooled at lower load states), but could be disastrously bad and noisy if put into a case with limited internal space and limited airflow (like the PS4).

Looking at the case design of the Xbox One, combined with the lower-power SOC, it should be possible to passively cool something like that in lower load states. With the PS4 case design combined with a more powerful and power-hungry SOC, that just isn't going to be the case.

Regards,
SB

So many assertions without evidence. Unless you were on the design team, I'm not sure you would know how or why the PS4 case was designed the way it was. You also don't know which APU uses more power, though you assert the XB1 is lower power.
 
You also don't know which APU uses more power, though you assert the XB1 is lower power.
Unless there is a severe leakage problem with the eSRAM, or Durango and Orbis are produced on significantly different processes, or MS decided on a significant upclock at the last minute, that's a quite fair assumption given the similarity of the building blocks of both designs.
 
Unless there is a severe leakage problem with the eSRAM, or Durango and Orbis are produced on significantly different processes, or MS decided on a significant upclock at the last minute, that's a quite fair assumption given the similarity of the building blocks of both designs.

If it's such common knowledge, I'd love to see a source or a derivation based on the various components of the APU. I've only seen various people say they are both ~100 W consoles, but that is just a guesstimate, I think.
 
You also don't know which APU uses more power, though you assert the XB1 is lower power.

For desktop GPUs, the only benchmarks that could push a GPU above and beyond safe operating temperatures (before both companies started implementing robust hardware and software guards against it) were ones which fully loaded the shaders (or compute portions, as they are nowadays), like FurMark. Doing so rapidly escalated the amount of power consumed and heat produced. Considering the PS4 SOC has 50% more shader (compute) resources, I think that's a fairly safe assertion.

I suppose Sony could have asked AMD to aggressively throttle the GPU above a certain power threshold in order to keep it in a similar power envelope as the Xbox One SOC, but then why bother having 50% more compute resources?

Other than that, the CPU portion is basically the same. The only other major difference is the eSRAM, and that isn't going to consume nearly as much power as 6 CUs worth of compute.

There's no magic fairy dust. Considering the similarities of the two architectures' building blocks, you either have less powerful hardware and less power consumption, or more powerful hardware and more power consumption. It isn't like we have one on 28 nm and another on 20 nm, or different architectures (Nvidia versus AMD, or VLIW4/5 versus GCN).
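As a toy illustration of that scaling argument: if shader-array power scales roughly linearly with active CUs at equal clock and process (an assumption, not a measurement), a 50% CU increase implies a proportionally larger GPU power budget. The Xbox One GPU wattage below is invented purely for illustration; only the 12-vs-18 CU counts come from the leaked specs:

```python
# Toy linear-scaling model of GPU shader power versus CU count.
# Assumes equal clocks and process node between the two SOCs.

XB1_CUS = 12                # CU count from the vgleaks Durango specs
PS4_CUS = 18                # CU count from the leaked Orbis specs
XB1_GPU_POWER_EST = 60.0    # W: assumed value, purely illustrative

# Under linear scaling, power per CU carries over directly:
per_cu = XB1_GPU_POWER_EST / XB1_CUS
ps4_gpu_power_est = per_cu * PS4_CUS

print(f"PS4 GPU estimate under linear scaling: {ps4_gpu_power_est:.0f} W")
```

Whatever baseline you pick, the 50% CU increase carries through the model unchanged, which is the point of the argument above.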

Regards,
SB
 
Other than that, the CPU portion is basically the same. The only other major difference is the eSRAM, and that isn't going to consume nearly as much power as 6 CUs worth of compute.

Regards,
SB

That's great, but I'd like to know how you know this. Are you repeating a source you won't disclose, or did you derive it?

Move engines, SHAPE, and eSRAM all contribute to the very large >400 mm^2 APU, so they must also contribute to the heat generated. I'd like to know the numbers, not just more assumptions.
 
Nothing's come up to justify why those things would be larger power contributors than a 50% increase in CUs and a GDDR5 bus.
Unless something is seriously different from the large caches in CPUs, the eSRAM shouldn't draw a large amount of power.
The audio hardware and move engines shouldn't require too much area and a good fraction of the hardware is very specialized. The move engines should be almost negligible.
 
I don't think so. Even if it doesn't somehow magically make up for some performance difference between the two machines, that in itself is pretty irrelevant. If the eSRAM implementation is deemed significant enough to warrant a full disclosure of how it was incorporated into the GPU and what its benefits are going to be, then it matters very little whether the Xbox One's eSRAM solution is ultimately deemed more complex or inferior to what another console might be doing. I've always hated comparing the two directly, because I think the Xbox One is its own very capable system. What's more relevant is that devs have plenty of power to do amazing things with the system, and that it's a big enough jump over what the 360 was capable of. It meets both those standards quite easily.

While I do agree that all kinds of breakthroughs should get their own chance of explanation, Microsoft themselves have been very quiet on a lot of the architecture. Going by one of their employees' statements, it's because it remains of little importance, as they wish to focus on just the games.

The assumption can go both ways: in Microsoft's strategic favor, or lending credence to the rumors of the setup being not too grand.

I've been wondering about the events as far back as when the XB1's launch was pushed back a month. What was the reason for the delay? The rumors have remained unchanged in the months since Sony's reveal, when there was enough time for things to change.
 
If the rumor is true that MS is upping the CPU speed to 2 GHz, it would cause even more binning headaches, since the chips would need to be binned for CPU speed, GPU speed, voltage requirements, and eSRAM heat tolerance.

If the current binning problems include eSRAM heat issues, then adding another 20 W to the problem could be disastrous.
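A toy sketch of why each additional binning criterion compounds the problem: assuming the criteria are independent, the combined yield is the product of the individual pass rates, so every extra requirement shrinks it multiplicatively. All pass rates here are invented for illustration:

```python
# Toy yield model: binning on several independent criteria multiplies
# the pass rates, so each extra requirement (CPU clock, GPU clock,
# voltage, eSRAM thermals) compounds the binning problem.
# All pass rates below are invented purely for illustration.

pass_rates = {
    "cpu_clock": 0.95,
    "gpu_clock": 0.90,
    "voltage": 0.92,
    "esram_thermals": 0.85,
}

# Combined yield under the independence assumption:
yield_estimate = 1.0
for criterion, rate in pass_rates.items():
    yield_estimate *= rate

print(f"Combined bin yield: {yield_estimate:.1%}")
```

Even with each individual pass rate at 85% or above, the combined yield in this sketch drops to roughly two thirds, which is why adding one more binning criterion late in production is painful.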
 
There is no such rumour that they are upping the CPU speed to 2 GHz...

There is a rumour that they're looking at increasing the GPU clock, but I haven't heard that they're looking at doing this for the CPU as well.
 
Nothing's come up to justify why those things would be larger power contributors than a 50% increase in CUs and a GDDR5 bus.
Unless something is seriously different from the large caches in CPUs, the eSRAM shouldn't draw a large amount of power.
The audio hardware and move engines shouldn't require too much area and a good fraction of the hardware is very specialized. The move engines should be almost negligible.

Are there any links or sources that say the CU type/architecture is the same as PS4?

Or it is just an assumption that it is the same?
 
It's an assumption based on AMD not having any other CU or CPU architectures that'd fit the bill. There may be little modifications like cache sizes or register counts or memory buses, but there aren't going to be any major differences in logic silicon and what the components can do. XB1 isn't going to have GCN2 or 8 Steamroller cores.
 
Are there any links or sources that say the CU type/architecture is the same as PS4?

Or it is just an assumption that it is the same?

It's an assumption based on what AMD/ATI has at the moment, and also on what the vgleaks documentation has told us: that, aside from the eSRAM, it's pretty much stock GCN.
 
It's an assumption based on AMD not having any other CU or CPU architectures that'd fit the bill. There may be little modifications like cache sizes or register counts or memory buses, but there aren't going to be any major differences in logic silicon and what the components can do.

Do you know if this 1024-bit wide (4x 256-bit wide) L2 interface fits with the GCN assumption?

I ask since someone suggested to me that GCN and the PS4 have 64-bit wide L2 interfaces, not 256. (And further suggested that this is a hint of a more advanced version.) Is this incorrect?

http://www.vgleaks.com/durango-gpu/cache/



XB1 isn't going to have GCN2 or 8 Steamroller cores.

The 8 steamroller cores seems reasonably off the table but how can you be so sure about GCN2?

GCN showed up in January 2012, and it is nearly two full years from January 2012 to November 2013.

Can one really assert that it is GCN 1.0 and not 1.5, or 2.0, or heavily customized? They had two more years to work on it.



I wouldn't be shocked either way. GCN 1.0 is not unreasonable but I do not think the other options can be so easily dismissed without sources as opposed to assumptions.
 