Xbox One (Durango) Technical hardware investigation

"Andrew said it pretty well: we really wanted to build a high performance, power-efficient box,"

"Having ESRAM costs very little power and has the opportunity to give you very high bandwidth. You can reduce the bandwidth on external memory - that saves a lot of power consumption and the commodity memory is cheaper as well so you can afford more. That's really a driving force behind that... if you want a high memory capacity, relatively low power and a lot of bandwidth there are not too many ways of solving that."

That's OT platform comparison. Irrespective of what rivals are doing, MS had a choice for their console: go high power draw and high performance, or go low-end, and they have chosen the low end. The article admits that, which helps explain some of the technical choices (like no second GPU ;)). Whether that's the right choice or not is a business discussion rather than a technical investigation.

High performance low power consumption...not low performance low power consumption....it's not a WiiU...
 
Are the XB1 and Wii U operating in similar power envelopes? I expected the XB1 to be more potent than the Wii U and closer to the PS4 in terms of graphical fidelity at HDTV resolutions. Does anyone happen to have the relative TDPs of these systems?
No. I was only referring to a similar design philosophy regarding platform power. Nintendo could have gone with a more powerful machine but chose low power consumption. MS have done similar. PS4 isn't much different either. All in all, there's been a complete break with console tradition this last wave of hardware, a transition which isn't worth discussing here.
 
I really think a good portion of the X1 design was driven by future costs and power draw in years 5+.

At 14/16nm, the size of the chip should be in the realm of high-end mobile chips (< 150mm²) and the power draw should be less than half of what it is now, if not closer to a third. I think they'll be able to take this design to lower price points (and sooner) than they could with the 360.
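A rough back-of-envelope on the die size (my own numbers, assuming the commonly reported ~363mm² at 28nm; ideal scaling, which real shrinks never quite hit):

```python
# Back-of-envelope die-area scaling for the XB1 SoC (assumed ~363 mm^2 at 28nm).
# Ideal area scales with the square of the feature-size ratio; real shrinks do
# worse because IO, analog and SRAM don't scale as well as logic.
die_28nm_mm2 = 363.0  # commonly reported figure, treat as approximate

for node in (20.0, 16.0, 14.0):
    ideal = die_28nm_mm2 * (node / 28.0) ** 2
    print(f"{node:.0f}nm: ~{ideal:.0f} mm^2 (ideal scaling)")

# 16nm: ~118 mm^2, 14nm: ~91 mm^2, i.e. in the "high end mobile chip" size
# range, which is what the "< 150mm²" guess above is getting at.
```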
 
No. I was only referring to a similar design philosophy regarding platform power. Nintendo could have gone with a more powerful machine but chose low power consumption. MS have done similar. PS4 isn't much different either. All in all, there's been a complete break with console tradition this last wave of hardware, a transition which isn't worth discussing here.

Well, it depends on what you compare it to. The problem is more that in 2005 PC GPUs just kept going up in power draw (to about 300 watts anyway, where it seems they were forced to plateau as well). OTOH, consoles are stuck at a reasonable envelope.

So in 2005 we could have console GPUs near the top PC parts, but now we can't.

I just don't think low power draw was the principal consideration here, though it was probably a key factor, along with cost. The other thing is MS have made their decisions, so now they have to defend them. Had they made other decisions they would be pointing out different positives.
 
After sleeping on the article, the clear takeaway seems to be that MS is saying 12-14 CUs is basically the usable limit due to "balance". But balance with what? MS says there are dozens of possible bottlenecks, but the obvious candidate is the CPU.

But it's a bit odd they won't come out and say it's the CPU. Though I guess you can argue they sort of did.

The thing is, AMD doesn't have a lot of good CPU choices; Piledriver is very large and very power-hungry. If we rule Intel out, the options were basically Nvidia+ARM or AMD, and all AMD really had to offer was Jaguar. ARM would presumably have been even worse.

So that may be why these systems are in the low-end CPU position they are.

The other point that came to me is that MS said, based on profiling software, the GPU upclock gained more than enabling the 2 extra CUs. But if you profile software written for 12 CUs, of course that will be the case. It might have been different had the software been shader-limited.
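To put rough numbers on it (my own arithmetic, using the publicly reported 800MHz to 853MHz upclock and 12-of-14 active CUs):

```python
# Rough comparison of the two options MS profiled: the 53MHz GPU upclock vs.
# enabling the 2 redundant CUs. Figures are the publicly reported ones.
BASE_CLOCK = 800e6    # Hz
UPCLOCK    = 853e6    # Hz
ALUS_PER_CU = 64      # GCN: 64 lanes per CU, 2 flops per lane per clock (FMA)

def gflops(cus, clock_hz):
    return cus * ALUS_PER_CU * 2 * clock_hz / 1e9

print("12 CUs @ 800MHz:", gflops(12, BASE_CLOCK))   # ~1229 GFLOPS
print("12 CUs @ 853MHz:", gflops(12, UPCLOCK))      # ~1310 GFLOPS (+6.6%)
print("14 CUs @ 800MHz:", gflops(14, BASE_CLOCK))   # ~1434 GFLOPS (+16.7% ALU only)

# The upclock is the smaller ALU gain, but it speeds up everything on the GPU
# (front end, ROPs, caches), whereas extra CUs only help if the workload is
# actually shader-limited, which is the point being made above.
```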
 
After sleeping on the article, the clear takeaway seems to be that MS is saying 12-14 CUs is basically the usable limit due to "balance". But balance with what? MS says there are dozens of possible bottlenecks, but the obvious candidate is the CPU.

But it's a bit odd they won't come out and say it's the CPU. Though I guess you can argue they sort of did.

The thing is, AMD doesn't have a lot of good CPU choices; Piledriver is very large and very power-hungry. If we rule Intel out, the options were basically Nvidia+ARM or AMD, and all AMD really had to offer was Jaguar. ARM would presumably have been even worse.

So that may be why these systems are in the low-end CPU position they are.

Good point... that explains why they dedicated a monster DSP to audio, to take additional burden off the CPU. Basically they looked at the main bottleneck (the CPU) and designed a system around that. I think they did a darn fine job all things considered, i.e. cost, power consumption etc. :D
 
Was this ever in doubt? I assumed the system was perfectly flexible, as the GPU has read/write access to both pools. It'd be odd to limit the GPU to only outputting to ESRAM.

Even knowing the GPU can output to both pools, I can see some people assuming each buffer had to live entirely in one or the other... which I guess would be the case if memory access weren't all virtualized, even across different tiles of the same texture.
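As a toy illustration (a purely hypothetical data structure, not the actual XB1 memory API): with page-granular virtual addressing, individual pages of a single surface can land in either pool.

```python
# Hypothetical sketch of page-granular placement. Nothing here is the real XB1
# API; it just illustrates that with virtualized addressing a single render
# target does not have to live wholly in ESRAM or wholly in DDR3.
PAGE_SIZE = 64 * 1024  # 64KB pages, as in typical GPU tiled/virtual memory

def build_page_table(surface_bytes, esram_pages_available):
    """Map each page of one surface to a pool; fill ESRAM first in this toy."""
    n_pages = (surface_bytes + PAGE_SIZE - 1) // PAGE_SIZE
    table = {}
    for page in range(n_pages):
        table[page] = "ESRAM" if page < esram_pages_available else "DDR3"
    return table

# e.g. a 1080p 4x16-bit render target (~16MB) with only 8MB of ESRAM left over:
rt_bytes = 1920 * 1080 * 8
table = build_page_table(rt_bytes, esram_pages_available=(8 * 1024 * 1024) // PAGE_SIZE)
print(sum(1 for p in table.values() if p == "ESRAM"), "pages in ESRAM,",
      sum(1 for p in table.values() if p == "DDR3"), "pages in DDR3")
```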

Did you miss that part where DF mentioned Wipeout HD, a PS3 game? This technique is not new.



Well, why is your resolution dynamically dropping if you are CPU-bound? It's the GPU that is the issue in these cases.

Is it possible that the CPU is the bottleneck and leaves little time for the GPU to do its job, but solving the bottleneck on the CPU side is hard, so they drop the resolution instead so the GPU still has enough time to maintain the target framerate?

Doesn't seem like the most straightforward way to solve a bottleneck, but who knows XD
 
I think in most cases games are CPU bound, but there will always be corner cases where the game will be GPU bound, which is why dynamic resolution was added.
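As a sketch of what "dynamic resolution" amounts to in practice (my own simplified controller, not taken from any particular engine): drop the render resolution when GPU frame time overshoots the budget, raise it back when there's headroom.

```python
# Simplified dynamic-resolution controller (illustrative only, not from any
# shipping engine): scale the render resolution so GPU frame time stays under
# budget, independently of whether the CPU is also struggling.
TARGET_MS = 33.3             # 30fps budget
MIN_SCALE, MAX_SCALE = 0.75, 1.0

def update_scale(scale, gpu_frame_ms):
    # GPU cost is roughly proportional to pixel count, i.e. scale^2,
    # so nudge the scale toward the value that would hit the budget.
    ideal = scale * (TARGET_MS / gpu_frame_ms) ** 0.5
    new_scale = scale + 0.25 * (ideal - scale)   # move gently to avoid popping
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

scale = 1.0
for gpu_ms in (30.0, 36.0, 38.0, 34.0, 31.0, 28.0):   # fake per-frame GPU times
    scale = update_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:.0f}ms -> render at {int(1920 * scale)}x{int(1080 * scale)}")
```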
 
I sort of preferred the quantum gravity explanation to be honest, LOL!

Seriously, I think that dev-controllable hardware scaler might be the most important piece of tech in the Bone.
 
Doubtful. The ROPs can handle 4x16-bit pixel formats at full tilt. The ESRAM matches this perfectly with 109GB/s of bandwidth. On XB1, writes from the ROPs to the ESRAM are basically free (i.e. they will almost never stall).
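For reference, the arithmetic behind that (using the known 16 ROPs and 853MHz clock):

```python
# Back-of-envelope: why 4x16-bit render targets line up with the ESRAM's
# 109GB/s of bandwidth per direction.
ROPS         = 16        # colour ops per clock on XB1's GPU
CLOCK_HZ     = 853e6
BYTES_PER_PX = 4 * 2     # 4 channels x 16 bits = 8 bytes

fill_bytes_per_s = ROPS * CLOCK_HZ * BYTES_PER_PX
print(fill_bytes_per_s / 1e9, "GB/s")   # ~109.2 GB/s

# i.e. even with every ROP writing an FP16 pixel every cycle, the ROPs can't
# generate more write traffic than the ESRAM can take in one direction, so
# ROP-to-ESRAM writes shouldn't stall on bandwidth.
```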

The only obvious place where more ROPs could be useful is in shadow buffer creation. However, if shadow buffer rendering shifts to PRT based techniques, already demoed by MS, the burden is likely to shift somewhat away from the ROPs.
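As a toy illustration of why PRT helps here (this is just the general idea, not MS's actual demo): only the shadow-map tiles that visible receivers actually sample need to be backed and rendered.

```python
# Toy illustration of the PRT idea for shadow maps (not MS's demo): find which
# shadow-map tiles visible pixels actually sample, and only back/render those
# tiles instead of filling the whole buffer.
import random

SHADOW_RES = 4096
TILE       = 128                       # PRT tile size in texels (illustrative)
tiles_per_side = SHADOW_RES // TILE

# Pretend these are shadow-map UVs from projecting visible receiver pixels
# into light space (random here just so the example runs).
random.seed(0)
receiver_uvs = [(random.random() * 0.4, random.random() * 0.4) for _ in range(100_000)]

needed = {(int(u * SHADOW_RES) // TILE, int(v * SHADOW_RES) // TILE)
          for (u, v) in receiver_uvs}

total = tiles_per_side * tiles_per_side
print(f"tiles needed: {len(needed)} of {total} "
      f"({100 * len(needed) / total:.1f}% of the fill/ROP work)")
```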

Cheers

These are non-trivial statements. The placement of hardware-based (Tier 2) PRT then becomes an extremely critical component of their design implementation.
 
The only obvious place where more ROPs could be useful is in shadow buffer creation. However, if shadow buffer rendering shifts to PRT based techniques, already demoed by MS, the burden is likely to shift somewhat away from the ROPs.

Cheers
Is there a paper or slides for the MS demo?
 
I think in most cases games are CPU bound, but there will always be corner cases where the game will be GPU bound, which is why dynamic resolution was added.

Well, their spin is working; you guys are running away with this narrative. I'd like some data that says games are generally CPU bound. That certainly isn't true on most PCs, and it wasn't true last gen on even less capable CPUs.

Sub-HD, no AA, low-res textures, etc. are common, and they are there to cut back on GPU load, not CPU.
 
I sort of preferred the quantum gravity explanation to be honest, LOL!

Seriously, I think that dev-controllable hardware scaler might be the most important piece of tech in the Bone.
Somewhat agree.

A couple of days ago I was asking if anyone knew whether it had a dedicated scaler or if it was just part of a standard GPU.

All things equal, native resolution is best. Thought I'd say that first.
But:

If it didn't have a scaler, then it might have had to output at 1080p just to look as good as, say, 900p upscaled, with all that entails resource-wise.

We have to wait and see just how good the scaler actually is, but with it also being dev-controllable, adaptive etc., it could be a huge benefit to the XB1.
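Rough pixel-count arithmetic on what the scaler buys you (my own numbers, assuming 1600x900 for "900p"):

```python
# Pixel counts: rendering at 900p and scaling up vs. rendering native 1080p.
w1080, h1080 = 1920, 1080
w900,  h900  = 1600, 900

native = w1080 * h1080          # 2,073,600 px
scaled = w900 * h900            # 1,440,000 px
print(f"900p is {scaled / native:.0%} of the 1080p pixel count "
      f"(~{1 - scaled / native:.0%} less shading/fill work per frame)")
```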
 
Well, their spin is working; you guys are running away with this narrative. I'd like some data that says games are generally CPU bound. That certainly isn't true on most PCs, and it wasn't true last gen on even less capable CPUs.

Sub-HD, no AA, low-res textures, etc. are common, and they are there to cut back on GPU load, not CPU.

From what I've seen, the games that run on PS3/360/PC are more GPU bound if anything, apart from the few titles that limit player numbers due to CPU issues (yes, I'm looking at you, BF3).

I can hardly think of a recent game that really stresses the CPU more than the GPU, apart from a few that suffer from badly written code/engines that even top-of-the-line i7s can't overcome.
 
From what I've seen, the games that run on PS3/360/PC are more GPU bound if anything, apart from the few titles that limit player numbers due to CPU issues (yes, I'm looking at you, BF3).

I can hardly think of a recent game that really stresses the CPU more than the GPU, apart from a few that suffer from badly written code/engines that even top-of-the-line i7s can't overcome.

ERP spoke about games potentially being CPU bound this generation, earlier in the summer IIRC.
 
I can hardly think of a recent game that really stresses the CPU more than the GPU, apart from a few that suffer from badly written code/engines that even top-of-the-line i7s can't overcome.

You mean PC games. That's because PC games need to offer a similar experience on the lowest common denominator. That means extra CPU grunt is only used for physics eye candy and to support advanced rendering techniques, not gameplay.

It's kind of silly to talk about balance and which platform has the better one. Developers will pile on features until whatever performance potential there is gets filled.

Cheers
 
I've heard that pretty much any quad-core 3GHz chip is enough to stop most games being CPU bound.

But that's PCs, and I don't think you can say that the next-gen CPUs are anywhere close to that.

Will the consoles be fine? Yes, they have to be, so they will be. That doesn't mean their CPUs aren't the relative weak link, though.

Comparing against PS360 is probably less valid than comparing against PCs, as that's what they're basically modelled after...
 
I sort of preferred the quantum gravity explanation to be honest, LOL!

Seriously, I think that dev-controllable hardware scaler might be the most important piece of tech in the Bone.

Glad ya remembered my field of research! :D



Charlie over at S|A had suggested the display planes (the scalers you are talking about) might be a bigger deal than anyone expects, as per insider info. He claimed that after HotChips. I mention it only because he also claimed to have other insider info saying the real-world eSRAM bandwidth was 140GB/s-150GB/s, which is verbatim the range given offhand by Baker in the DF article.

Just something I thought I'd note... I had personally heard 142GB/s for eSRAM, but I found it interesting that Charlie's specific range was quoted by Baker precisely. Wonder if Charlie is right about the display planes too.
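For what it's worth, here's how I read those numbers as hanging together, based on the DF interview's own description (the 7-of-8-cycles detail is their stated explanation for the ~204GB/s peak):

```python
# How the quoted eSRAM figures relate (per the DF interview's description;
# treat the cycle detail as their explanation, not something I've measured).
one_way = 853e6 * 128 / 1e9          # 128 bytes/cycle in one direction ~= 109 GB/s
peak    = one_way * (1 + 7 / 8)      # read every cycle + write 7 of 8 ~= 204 GB/s
real    = (140, 150)                 # what Baker says they measure in real code

print(f"one-way: {one_way:.0f} GB/s, combined peak: {peak:.0f} GB/s, "
      f"measured: {real[0]}-{real[1]} GB/s "
      f"(~{real[0] / peak:.0%}-{real[1] / peak:.0%} of peak)")
```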



Slightly off topic, and mods are welcome to edit or move this part if need be, but does this info about how a 53MHz clock boost did more to improve real game code than 2 extra CUs (in terms of graphics rendering, which is what MS was testing) suggest anything useful for the 14+4 speculation about PS4's setup?

If PS4 is balanced at 14 CUs, that's 1.435 TFLOPS of raw performance, but if the gain thereafter for every 2 CUs is similar to what MS's testing showed (in real game code, I mean, not on-paper spec), then by my calculations you'd be looking at a huge falloff in utilisation of the extra CUs' raw spec, around 60% (effectively ~1.6 TFLOPS from 18 CUs vs 1.435 TFLOPS for only 14). Maybe that is what Sony/Cerny meant in that regard? Not sure what thread this part should go in... so apologies in advance. :/
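Putting rough numbers on that line of reasoning (my own speculative arithmetic, simply reusing MS's "2 extra CUs gained less than the 6.6% upclock" observation and assuming, purely for argument's sake, that it transfers to PS4):

```python
# Raw vs. hypothetical "effective" throughput if each extra pair of CUs beyond
# 14 delivered no more than the 6.6% the XB1 upclock was measured to deliver.
# This is speculative arithmetic, not a measurement of PS4.
def tflops(cus, clock_ghz=0.8):
    return cus * 64 * 2 * clock_ghz / 1000

raw_14 = tflops(14)                       # ~1.43 TFLOPS
raw_18 = tflops(18)                       # ~1.84 TFLOPS
effective_18 = raw_14 * 1.066 ** 2        # 14 CUs plus two "at most 6.6%" steps

gain_used = (effective_18 - raw_14) / (raw_18 - raw_14)
print(f"raw 14 CU: {raw_14:.2f} TF, raw 18 CU: {raw_18:.2f} TF, "
      f"effective 18 CU under this assumption: {effective_18:.2f} TF")
print(f"i.e. only ~{gain_used:.0%} of the extra CUs' raw flops show up")
```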
 