Xbox One (Durango) Technical hardware investigation

The bad technical news just doesn't stop coming for the Xbox One. With all the leaked OS overhead, it's starting to feel like the One is about half as powerful as the PS4 in overall raw gaming power. The physical design doesn't inspire either. It's going to be interesting to see how the public reacts, and how the pricing shakes out. I'm surprised more devs aren't speaking up.
I would be surprised, nay, I would be Shocked!, if the PS4 does not reserve some amount of GPU also. You're sounding like the folks who were insisting games would get to use the full 8GB on the PS4 too. From all indications, it seems they're reserving a chunk of memory and what might be 2 cores as well.

Remember, they're claiming to provide most of the features MS is. Game DVR, stereo camera with some sort of skeletal tracking, multitasking, instant switching. It would be folly to assume they can provide the same features without the same costs.
 
Now that the unveil has come and gone, I think I should summarize this for those having a hard time. bkillian can tell me how wrong I am :D

Back in 2009-2010, MS was setting their targets. Fabs were having issues and certain technologies kept sliding (DDR4, FinFET, silicon interposers, stacked memory, etc.). The current generation was getting long in the tooth, and Sony and Nintendo were surely going to release before those other technologies materialized. So, like Sony, they had to go with what was on the board.

They clearly set out with a number of goals. First, the next Xbox was to realize the original vision for the Xbox platform: a central entertainment hub for all things digital media. That means the focus of the platform is squarely Xbox LIVE more so than the hardware itself. The next issue was costs (BOM as well as TDP), specifically cost reduction--if they wanted it to be a media hub that displaces/complements cable boxes, gets integrated into other devices, and becomes cost-effective enough to eventually squeeze out the Rokus of the world, driving the BOM down over time was vital.

So they walk into a room with AMD and get (a) the same general technology options and (b) the same BOM/TDP constraints as Sony. Both chose APUs for many reasons, so their choices became really obvious.

MS chose the following:

- Having decided to fully embrace the media box, MS needed 2 distinct OSes, hence the hypervisor hosting a Win8 variant and the GameOS. This would require a lot of memory.

- DDR3 offers high densities and is very cheap (but slow). At the time it looked like MS could get away with 8GB of DDR3 versus 2GB of GDDR5. The trade-off was speed for space (see the rough bandwidth math after this list). As the APU was going to be a big chip, pad limits weren't a huge issue over the lifetime of the design. The bet was they could reserve 3GB of memory for the OS and be at 5GB vs. 2GB. Or even if Sony went with 4GB, at least 500MB was going to the OS, so it would still be a 1.5GB advantage.

- Embedded memory to resolve the bandwidth shortage. A separate eDRAM module requires another bus and would likely repeat the Xenos issues, so that was off the board as it distracted from the SOC goal. eDRAM on die would be costly due to process issues. ESRAM was the next best option (as other variants were off the table for licensing?). The problem is that ESRAM takes up a lot of die real estate. The good news is that it scales well with process shrinks. The ESRAM doesn't appear over-engineered (i.e. just enough bandwidth for what the system can use; the numbers after this list bear that out). Surprisingly--probably due to how complex these chips are these days and AMD's limited resources--the ESRAM is not shared between the GPU and CPU.

- Storage. For the media and gaming goals they need a delivery system and local storage. Online is still too slow and lacks penetration, so BDR (a Blu-ray drive) was the easy choice--and consumers get to pay the licensing fee for movie playback. SSDs are too expensive/small, so a single-platter HDD would do.
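
To put rough numbers on the speed-for-space trade and the "just enough bandwidth" point, here's a back-of-envelope sketch in Python. The bus widths, memory speeds, and clocks are the commonly leaked figures, not confirmed specs:

[code]
# Peak bandwidth math from the leaked (unconfirmed) figures:
# both consoles on 256-bit main memory buses, DDR3-2133 vs 5.5Gbps GDDR5,
# plus a 1024-bit ESRAM port running at the 800MHz GPU clock.
BUS_BYTES = 256 // 8  # 32 bytes per transfer on a 256-bit bus

ddr3_bw = BUS_BYTES * 2.133e9 / 1e9   # ~68.3 GB/s
gddr5_bw = BUS_BYTES * 5.5e9 / 1e9    # ~176.0 GB/s
esram_bw = (1024 // 8) * 0.8e9 / 1e9  # ~102.4 GB/s

print(f"DDR3-2133, 256-bit:     {ddr3_bw:6.1f} GB/s")
print(f"GDDR5 5.5Gbps, 256-bit: {gddr5_bw:6.1f} GB/s")
print(f"32MB ESRAM, 1024-bit:   {esram_bw:6.1f} GB/s")
[/code]

So DDR3 alone gives up roughly 108 GB/s to GDDR5, and the ESRAM is there to paper over exactly that gap for whatever working set fits in 32MB.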


This is where things went wrong for MS: GDDR5 densities grew, allowing Sony to match the memory footprint.


So back to MS's platform. They obviously have a HUGE multitasking/multimedia edge--the entire platform is built around apps.

On the reverse side, the technology budgets clearly show gaming was not the first priority, with "everything else having to fit in after that."

Anyways, it is a cascade effect. MS has fewer CUs because they have ESRAM taking up die space. They have ESRAM to compensate for slow DDR3. And they have slow DDR3 because [strike]they have a TON more memory[/strike]. And they have a [strike]ton more memory[/strike] so they can compete in gaming while also serving up all their media services.

It all blows up with Sony obtaining 8GB of GDDR5.

Reverse the flow: if MS had 8GB of GDDR5, they would not have needed ESRAM. And if they didn't need ESRAM, they would have the same number of CUs as Sony.

And the crappy part for MS is that I am betting their BOM out of the gate will be on par with Sony's.

Now not all is doom and gloom.

First, GDDR5 is going to be costly for a long time. On the flip side, the ESRAM is going to shrink QUICKLY, so MS's APU will become more affordable, quicker (in theory--if process reductions continue at a snail's pace, I think this was a horrible gamble). So MS has much cheaper, commodity DDR3 and will get to leverage more cost savings on node reductions.

MS also has Kinect (this is where their launch BOM may be higher)--the tech finally looks fantastic, but they need games to prove its worth. Kinect's killer apps are probably the dashboard and Skype (which MS paid BILLIONS for).

Ahhh yeah, this thing was getting expensive, so the embedded 360 SOC was tossed out. Hey, this saves money and MAKES money when you resell the XBLA games!

(Cha-ching MBA's with dollar signs in their eyes!)

Anyways, for all these reasons I highly doubt MS ever entertained changing the HW once they knew what Sony had -- EVERYTHING was tied into this strategy. I am sure they expected to be bested by 50% in CU performance.

Where MS was gonna hit back is that they're close enough for rough parity--resolution reduction (something studies show most consumers do NOT notice) levels the playing field. The extra memory in theory could have helped with load times, and their killer apps were (a) XBL/media services and (b) Kinect.
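
The pixel math is why I say that. Dropping from 1080p to something like 900p sheds roughly the same fraction of per-frame work as the leaked compute deficit (a rough sketch that assumes the workload scales with resolution):

[code]
# Pixel count at a reduced render resolution vs the leaked CU ratio.
p1080 = 1920 * 1080
p900 = 1600 * 900

print(f"900p / 1080p pixels:   {p900 / p1080:.0%}")  # ~69%
print(f"12 CU / 18 CU compute: {12 / 18:.0%}")       # ~67%
[/code]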

Unfortunately the wheels on the bus have fallen off for the core market: the platform has 33% less compute, a similar reduction in texture performance, 50% less fill-rate, a more complex dev environment due to the ESRAM, slower main memory, less available main memory, and probably fewer CPU resources (Sony has some helper logic for background tasks, and MS has to power all that media crap somehow)--all in an expensive box sporting multiple OSes running side by side, a box clearly not aimed at gaming as the prime use but as an equal, important function living alongside the media suite, all tied into Kinect.
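
For those wondering where those percentages come from, they fall straight out of the leaked unit counts at a shared 800MHz clock (assumed figures, nothing official):

[code]
# Leaked GPU configs: Durango 12 CUs / 16 ROPs, Orbis 18 CUs / 32 ROPs,
# both at 800MHz. Each AMD GCN CU is 64 lanes doing 2 flops/clock (FMA).
CLK = 0.8e9

def gpu_peaks(cus, rops):
    tflops = cus * 64 * 2 * CLK / 1e12
    gpix = rops * CLK / 1e9
    return tflops, gpix

xb1 = gpu_peaks(cus=12, rops=16)  # (1.23 TFLOPS, 12.8 Gpix/s)
ps4 = gpu_peaks(cus=18, rops=32)  # (1.84 TFLOPS, 25.6 Gpix/s)

print(f"compute deficit:   {1 - xb1[0] / ps4[0]:.0%}")  # 33%
print(f"fill-rate deficit: {1 - xb1[1] / ps4[1]:.0%}")  # 50%
[/code]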

This will make many consumers super duper happy.

This will make many gamers-only very unhappy.


If Sony were at 2GB, or even 4GB, of GDDR5, I think the discussion would look a lot different. It obviously all comes down to games. But as a gaming platform, it is pretty clear the Xbox One is at a disadvantage across the performance board. Multiplatform games are going to suffer. Performance-wise, I think the Xbox One will be like the GameCube next to the PS2 and the original Xbox--it ran the same stuff and played the same, just not always as pretty or as smoothly.

So stop dreaming of hardware changes or upclocks. All these scenarios were thought out long ago. MS knew Sony would have more gaming beef. That was by design.

MS is content with a "very similar" gaming experience, because 33% less compute means better cost reduction down the road, and it allowed MS to make KINECT and MEDIA (XBL) equal, full-time partners in regards to access and utility.

As a consumer you have to choose: are all the extra media things MS is doing worth a few less-pretty pixels here and there, or not?

Obviously price, exclusives, multiplatform performance, how used games and online pricing play out (and stuff like 3rd-party indie publishing) will all be important. But in regards to the hardware and what it was designed to do: do you want something that can overlay your cable box and interact with it, with a killer 3D interface in Kinect always on, Skype and apps always ready to be used, and "almost as good as Sony" gaming performance? Or do you want tier-1 console gaming (but not PC!) with the apps more secondary (something like a 360 on steroids)?

I know what I would choose... I think many will be surprised by what general consumers choose. A much better Kinect and media coming out of your ears (especially the likes of the NFL) will appeal to a lot of non-core gamers who are nonetheless technophiles.
 
I would be surprised, nay, I would be Shocked!, if the PS4 does not reserve some amount of GPU also. You're sounding like the folks who were insisting games would get to use the full 8GB on the PS4 too. From all indications, it seems they're reserving a chunk of memory and what might be 2 cores as well.

Remember, they're claiming to provide most of the features MS is. Game DVR, stereo camera with some sort of skeletal tracking, multitasking, instant switching. It would be folly to assume they can provide the same features without the same costs.

I would be shocked if Sony didn't have some GPU as well as CPU resources allocated, but it does seem they are aiming to do a little less, and they also have some dedicated logic (ARM cores?) for some background tasks. Not sure where the Xbox One stands on these.
 
From what I saw, the intent is not to provide multiple running game VMs, but rather to provide something very lightweight to the game, much like the 360, where the bulk of the game OS is statically linked to the application.
So you basically always have exactly 2 VMs running: one for the game and one for the app OS, the latter of which can be running multiple applications.

I personally am assuming there is some cost to using the cloud compute, so I think you'll still see p2p multiplayer games. The easy solution for keeping the other players in the game when you answer a Skype call seems, to me, to be leaving your game instance running in the VM. You no longer need to render or animate anything, so there is probably plenty of performance in 4 cores to keep it running.

I just realized how much of a pain having 2 VMs, and allocating one of them for dedicated servers, would cause devs doing cross-platform work. NVM, probably won't work.
 
You're close on some things, very far away on others. It was originally 4GB, and got raised to 8GB when the app OS could not fit. And there never was an embedded 360 SOC.
 
What was the nature of this GPU reservation on Xbox 360? It's the first I'm hearing of it.

You can virtualize CPU time, but that means partitioning off a fixed amount of resources (in this case, two cores). Virtualizing the GPU is another story.
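
The CPU half really is that simple conceptually--fixed partitioning is just core affinity. A toy, Linux-flavored illustration (nothing to do with MS's actual hypervisor; the 6/2 split is hypothetical):

[code]
import os

# Hypothetical split: 6 cores for the game partition, 2 for the app OS.
GAME_CORES = {0, 1, 2, 3, 4, 5}
OS_CORES = {6, 7}

# Pin the current process to the "game" partition (Linux-only API).
os.sched_setaffinity(0, GAME_CORES)
print(f"now restricted to cores: {sorted(os.sched_getaffinity(0))}")
[/code]

There is no equally simple knob for the GPU, hence the need for something heavier.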

Microsoft is using a streamlined version of RemoteFX to virtualize the GPU. I don't know much about it, but AMD does have GPU virtualization support for RemoteFX in their pro GPU lines.
 
I just realized how much of a pain having 2 VMs, and allocating one of them for dedicated servers, would cause devs doing cross-platform work. NVM, probably won't work.

You are able to install apps on the OS side. I don't know how much CPU power that would require, but there is a good amount of memory on the OS side. Maybe that is something that would be available to install as a System app.
 
First, GDDR5 is going to be costly for a long time. On the flip side, the ESRAM is going to shrink QUICKLY, so MS's APU will become more affordable, quicker (in theory--if process reductions continue at a snail's pace, I think this was a horrible gamble). So MS has much cheaper, commodity DDR3 and will get to leverage more cost savings on node reductions.
Can we really say 2133 DDR3 is common? I mean, I'm sure it's cheaper than GDDR5, but 2133 isn't really mass produced. It's a binned SKU, isn't it?
 
You are able to install apps on the OS side. I don't know how much CPU power that would require, but there is a good amount of memory on the OS side. Maybe that is something that would be available to install as a System app.

Dedicated Multiplayer Server Apps :LOL:
 
Unfortunately the wheels on the bus have fallen off for the core market

I think you underestimate the upsides of the ESRAM and the Xbox memory system. It should be able to provide at least some help in shrinking the performance gaps implied by the high-level specs.
 
Now that the unveil has come and gone, I think I should summarize this for those having a hard time. bkillian can tell me how wrong I am :D

*snip*.

That all looks pretty sound to me. We know the PS4 is going to have a hardware advantage; I think the question we all really want answered is how that advantage will actually play out in 3rd-party games. I've said this before: even if Microsoft were terrible and only reading Internet rumors, they've known since at least last May that the PS4 was likely to have 1.84 TFLOPS of performance, and I would hope they have a bit better corporate espionage than vgleaks.com.

My opinion is that even if the performance difference is overwhelmingly obvious to even the lay person, that kind of gap never stopped the PSOne, PS2, DS, or 3DS from beating their competitors in the market by a rather large margin. You will go where your friends are.

I guess we'll have to wait for Digital Foundry to do a breakdown on a launch title like Watch Dogs or Battlefield 4.
 
All RAM is binned, but yes, 2133 is rather high. 1600 is the highest standard speed I've seen supported.

Both 1866 and 2133 are real speeds purchased from foundries. MR FoX has posted the info before.
Well, more to the point, I was wondering how supply-constrained 2133 DDR3 would be. CPU support, or whether it exists at all, is not a concern, since it clearly will exist in the XBO and it clearly will be supported.
 
I think you underestimate the upsides of the ESRAM and the Xbox memory system. It should be able to provide at least some help in shrinking the performance gaps implied by the high-level specs.

It surely has a big effect and helps the Xbox One, but it doesn't count for much when you compare it against the other machine.

Even if you factor in the full bandwidth on that 32MB, when you compare it to 8GB with slightly higher bandwidth across the whole 8GB, it's insignificant. It's there because MS didn't think GDDR5 was the way to go, for reasons mentioned by others; it's not there because it's a better solution.
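
To put numbers on "insignificant" (same leaked figures as before; the combined peak is optimistic, since real workloads rarely saturate both pools at once):

[code]
# Coverage and optimistic combined peak vs a uniform GDDR5 pool.
esram_frac = 32 / (8 * 1024)  # 32MB out of 8GB
combined_bw = 68.3 + 102.4    # DDR3 + ESRAM peaks, GB/s

print(f"ESRAM covers {esram_frac:.2%} of memory")             # ~0.39%
print(f"combined peak {combined_bw:.1f} GB/s vs 176.0 GB/s")  # 170.7
[/code]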
 
I would be surprised, nay, I would be Shocked!, if the PS4 does not reserve some amount of GPU also.
Doesn't the system having some priority job queues suggest there is no GPU reservation?

If the OS needs GPU time, it just takes it and other things have to wait.
 
It's there because MS didn't think GDDR5 was the way to go, for reasons mentioned by others; it's not there because it's a better solution.

It's not a better solution performance-wise (at least not for bandwidth). But it might very well be the right solution once you consider power consumption and lifetime costs.

Cheers
 
Doesn't the system having some priority job queues suggest there is no GPU reservation?

If the OS needs GPU time, it just takes it and other things have to wait.

Do you really want the OS to just grab GPU time during a hectic firefight if the game was using 100% of the GPU? Or would you rather have developers target 90% of GPU resources so that the OS can never cause a stutter when it puts up an overlay--a friend logging in, a message received, a purchased video finishing its download, an incoming Skype call (or whatever other communications programs the user is running), etc.?
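
A concrete way to see the trade-off (illustrative numbers only; the size of any actual reservation isn't public):

[code]
# Frame budget at 60fps with a hypothetical fixed 10% OS reservation,
# vs an unreserved GPU that the OS raids for a 2ms overlay mid-frame.
frame_ms = 1000 / 60              # ~16.67ms per frame
reserved_budget = frame_ms * 0.9  # what the game plans around: ~15.0ms
os_grab_ms = 2.0                  # hypothetical overlay cost

print(f"with reservation: game budget {reserved_budget:.1f}ms, overlay free")
print(f"without: frame takes {frame_ms + os_grab_ms:.1f}ms -> missed vsync")
[/code]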

Regards,
SB
 
To be honest, I don't want, or even see the need for, the OS to grab anything unless I tell it to by bringing up the XMB, at which point I expect the game to pause anyway, so it won't make any difference.

What could the OS possibly want the GPU for anyway when I'm gaming?

The tasks you mention do not require a 10% reservation, or even 1%.
 