Xbox One (Durango) Technical hardware investigation

Because engineering entirely new, completely custom functionality is easier than adding more compute units to an existing scalable design?

I think he means that exaggerating the importance of simple functional blocks might deflect criticism of performance deficits. Since we know so little about these "blocks", it's a bit premature to be making claims about the engineering expense they represent.
 
The more I think about it, the more I'm fascinated by the HDMI in... it's a big enabler for my fanboy dream of GT5-like functionality built in from the start.
But OTOH, the best way to enable that would have been a simple Light Peak connection (like on the Sony Vaio Z). HDMI is kind of dirty to work with, and unidirectional.
 
Well, if it can move the needle so the Durango GPU is "like" 1.5TF instead of 1.2, I think that's all it needs, combined with more RAM.

What I find a bit odd is why MS spent all this engineering time and effort over just adding a few more CUs. Nothing's more expensive these days than people; they'll quickly get more expensive than silicon.

It would seem MS/AMD expended a lot of effort on these "DMA engines", which is kind of weird.

Well, I guess if you look at it from the angle of "I am buying this ... thing ... one way or another so I might as well try to defend my purchase... and also reward the kickbacks on leaks" :LOL: :p

And no, no matter how much you and Proelite argue "fake flops", it doesn't work that way. I have a long history of arguing the importance of architecture and workloads, but that is with some architecture to discuss. All we know is that it's AMD, GCN, and 12 CUs. Orbis is the same except with 50% more CUs.

You cannot go and count make believe MS fairy dust to inflate flops -- a flop is a flop. And there is no point pretending the pixie dust does crap until there is some real information on it.

As for the CUs: it would be very wrong to assume they "cut out" CUs in exchange for a huge investment in pixie dust. Closer to reality: it isn't a couple of CUs, it's the GPU across the board, with the balance of CUs and TMUs (and probably ROPs, too) all taking a hit of 33%.

MS invested 1/3 less in their GPU than Sony did. Get over it, people: MS shifted their budget away from competing with Sony. Very likely, based on their own leaked documents, BOM prices and Kinect are the center of the platform, no longer the core gamers who wanted the biggest and best in Xbox1.

As for the knocks against Sony, calling 1.8TF of Orbis and "1.5TF of Durango pixie dust" pretty much a wash: I hope people realize that (a) Orbis has its own pixie dust and (b) Orbis is said to have 192GB/s of bandwidth. A 7850 usually has about 150GB/s, so Sony has basically upped the bandwidth to meet the CPU's needs while preserving a full GPU bandwidth budget. And unless you think AMD, NV, and Intel are idiots, there is something to be said for their long history of balancing CUs, ROPs, and TMUs against bus width/bandwidth.
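If it helps, here's the back-of-envelope version of that bandwidth argument. The Orbis figure is the rumored one quoted above and the 7850 number is its retail spec, so treat this as an illustrative sketch, not confirmed math:

```python
# Rough bandwidth-headroom check. The Orbis number is a rumor; the
# 7850 number is the retail card's memory-bandwidth spec.
orbis_bw_gbs = 192.0    # GB/s, rumored Orbis total bandwidth
hd7850_bw_gbs = 153.6   # GB/s, retail HD 7850 memory bandwidth

cpu_headroom = orbis_bw_gbs - hd7850_bw_gbs
print(f"Headroom beyond a 7850-class GPU budget: ~{cpu_headroom:.0f} GB/s")
# ~38 GB/s left over: enough to feed the CPU without cutting into a
# full discrete-GPU bandwidth budget, which is exactly the point above.
```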

The cringe-worthy part of all of this is that Sony seems to have really embraced "multiplatform" development. Orbis is sounding like an easy platform to quickly port to and from the PC. This and Sony's horde of 1st-party devs should allow Orbis to stretch its legs.

Durango? MS's strong suit has usually been not going esoteric. Now they find themselves the odd man out: Sony/PC are very similar, and Durango comes trotting along with 2/3 the power and *demands* you bend over backwards to bring it up to parity. I guess they still have Bungie ... FASA ... Ensemble ... Carbonated ... Flight ... oh snap, MS has a very limited portfolio of developers working on core AAA games. Hint hint.

What did you expect from a platform that pretty much spits in the face of the most preferred gaming perspective (FPS), and does it with Kinect, saying "You do not need to be able to move! No controller for you! Or buttons!!"

Snappy comments aside, this design seems pretty simple:

Cost saving.

Embedded memory is there to address the cost of high-bandwidth (high power, large pad area) memory. MS wanted a lot of memory (OS folks, set-top box) and it is hard to have:

* A lot of memory
* A lot of bandwidth
* Cheap

So you take 1 and 3 and solve the bandwidth problem on-chip with die space that will scale down. MS's design is cheap (RE: the leaked PDF wanted a BOM around $200 with the SoC at $50), and it opens the door to scaling. Cheap will become cheaper.
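To put rough numbers on "solving bandwidth on-chip": these are the leaked/rumored Durango figures, so the sketch below is only as good as the leaks it's based on.

```python
# Rumored Durango memory setup from the leaks: cheap, plentiful DDR3
# for capacity, plus a small on-die SRAM pool that supplies bandwidth.
ddr3_bw_gbs = 68.0     # GB/s, rumored 256-bit DDR3 main pool
esram_bw_gbs = 102.0   # GB/s, rumored 32 MB embedded SRAM
esram_mb = 32

print(f"Peak aggregate bandwidth: {ddr3_bw_gbs + esram_bw_gbs:.0f} GB/s")
# Caveat: only what fits in the 32 MB pool (mostly render targets) sees
# the fast path, so the aggregate is a best case. The upside is that
# SRAM is plain logic-process die area, which shrinks at every node:
# "cheap will become cheaper".
```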

A larger, more power-hungry GPU that would require better memory (wider bus, more expensive) while also not allowing as much memory was not in the cards.

Durango, good or bad (and most people I know think good, but they are not core gamers), is really shooting for a different demographic. The design documents and console design all aim at keeping costs very low to push into being a major player in the set-top box market. The savings on console BOM will allow a better Kinect device and cheaper retail prices, AND the huge amount of cheap memory means MS can go crazy with apps (prediction: XBL will NOT be free).

And before anyone backpedals about how a 1TF+ console is so awesome compared to 2005 when the 360 came out: this is 2013. You can get a 2 TFLOPS GPU at retail for $160 after all the extra markup (and it has a PCB, high-speed memory, etc). The MS leaked PDF made it clear silicon costs were taking a HUGE cut, and the Durango leak, if real, confirms this. Total silicon area dedicated to core gaming is going to drop over 30%.

My real concern as a gamer is that if Sony comes out with a 50% edge in compute, Durango will (a) suffer in IQ, (b) suffer in framerate, or both. The 360 regularly owned the PS3 in these areas, and as a more core gamer that concerns me. The last thing I want is an entire generation--10 years, ouch--of torn frames, chugginess, and blurry crap.

No thank you.
 
The more I think about it, the more I'm fascinated by the HDMI in... it's a big enabler for my fanboy dream of GT5-like functionality built in from the start.
But OTOH, the best way to enable that would have been a simple Light Peak connection (like on the Sony Vaio Z). HDMI is kind of dirty to work with, and unidirectional.

I can't imagine the HDMI in will be anything more than the Google TV use case of displaying live TV from a cable box. The iterations I've seen use an IR blaster to change the channel, but ideally they'd leverage the remote-control part of the HDMI spec (CEC) instead for a much less clumsy implementation. Microsoft already has plenty of experience with manipulating live TV EPG data and should be well suited to integrate your existing cable TV package into a Bing search for media. E.g. a search for "Spiderman Movie" would yield something like:

"The Amazing Spiderman": VUDU and Xbox Movies ($5.99 HD, $4.99 SD)
"Spiderman 2": HBO HD, 7:00PM
"Spiderman 3": STRZ HD, 1:30PM

etc.
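If that's the feature, the plumbing is conceptually trivial: match a query against the EPG and the VOD storefronts, then interleave the hits. A toy sketch of that idea follows; every title, channel, and price in it is a made-up placeholder, and this is my guess at the behavior, not anything from the leaks:

```python
# Toy unified media search: merge live-TV EPG listings with VOD
# storefront entries for a single title query. All data is invented.
epg = [
    {"title": "Spiderman 2", "channel": "HBO HD",  "time": "7:00PM"},
    {"title": "Spiderman 3", "channel": "STRZ HD", "time": "1:30PM"},
]
vod = [
    {"title": "The Amazing Spiderman", "store": "VUDU / Xbox Movies",
     "price_hd": 5.99, "price_sd": 4.99},
]

def search(query):
    """Return combined live-TV and VOD hits for a title query."""
    q = query.lower()
    hits = [f'{e["title"]}: {e["channel"]} {e["time"]}'
            for e in epg if q in e["title"].lower()]
    hits += [f'{v["title"]}: {v["store"]} (${v["price_hd"]} HD, ${v["price_sd"]} SD)'
             for v in vod if q in v["title"].lower()]
    return hits

print("\n".join(search("spiderman")))
```

The hard part in practice wouldn't be the matching, it would be keeping the EPG data and storefront catalogs fresh and mapped to your specific cable package.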

Not that this wouldn't be a welcome feature. Quite honestly, if I could pass my cable box through the Durango this way and it plays BR movies, I would never need to change inputs on my TV. I don't think it's any more than that, though.
 
I tend to agree with Acert; MS has shifted focus away from the core to a broader demographic.

Bkilian is also saying that's one of the reasons why he left: "the core gamers who headed Xbox have been replaced by MBAs with $ signs in their eyes".

Plus, they didn't know what Sony was targeting with Orbis; their own target was 6-8x the 360, and Durango is definitely that.

My real concern as a gamer is that if Sony comes out with a 50% edge in compute, Durango will (a) suffer in IQ, (b) suffer in framerate, or both. The 360 regularly owned the PS3 in these areas, and as a more core gamer that concerns me. The last thing I want is an entire generation--10 years, ouch--of torn frames, chugginess, and blurry crap.

No thank you.

Yeah, that's my fear too; almost all my MP titles are on 360 for this reason.

Plus, the PS3 controller is terrible compared to the 360's for shooters, especially if you use a Razer gamepad like I do.
But hopefully the rumours are true about Sony ditching the DualShock design for something else.

It seems I'll have to buy both machines again (I'll need the 720 for Halo and Forza at least).
Having just one machine means you have half as many exclusives to be tempted by, which, when you have little time to play games, is actually a good thing.
 
Well, I guess if you look at it from the angle of "I am buying this ... thing ... one way or another so I might as well try to defend my purchase... and also reward the kickbacks on leaks" :LOL: :p

....

I would agree with all of this (edited down for space reasons) except for the fact that MS is apparently coming out with an app-enabled set-top box as well (unless this is it, oh god no). In that case there's way too much overlap between a set-top box and a "cheap" console.
 
My PC gamer mind has questions.

1. Why go for 8 'weak' cores over 4 'stronger' cores?
Power consumption, and they're tinier: more chips out of those expensive wafers.
2. How much of that 8GB RAM is reserved for the OS?
Unknown; rumors have it at 3GB, which sounds high.
3. Why use SRAM? The biggest benefit in the past was helping to reduce the MSAA performance hit; people hardly use MSAA these days, making its biggest selling point moot. And why only 32MB? Which in my opinion is not enough; 64MB would have been better.
It still allows the use of cheaper RAM, and the size is not that relevant if the GPU can also render into main RAM.
4. Why give the machine such low bandwidth? 68GB/s for the main GPU tasks and the CPU?
It is not that low; Trinity renders quite a lot of stuff with less than half that (while at the same time feeding the CPU and doing texture reads and rendering). 68GB/s is more than twice what an AMD APU gets, with another ~100GB/s of SRAM for rendering only. A couple of years ago, GPUs with a 128-bit bus and GDDR5 did not have much more to play with.
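To make that comparison concrete, here are the numbers side by side. Trinity's figure is its real dual-channel DDR3-1600 spec; the Durango figures are the rumored ones from the leaks, so this is a sketch for scale, nothing more:

```python
# Memory bandwidth available to a few integrated-GPU systems, for scale.
# Trinity's number is its dual-channel DDR3-1600 spec (2 x 12.8 GB/s);
# the Durango figures are rumors from the leaks.
systems = {
    "AMD Trinity APU (dual-channel DDR3-1600)": 25.6,
    "Rumored Durango main RAM (DDR3)":          68.0,
    "Rumored Durango embedded SRAM":           102.0,
}
for name, bw_gbs in systems.items():
    print(f"{name}: {bw_gbs} GB/s")
# 68 GB/s is over 2.5x Trinity's shared CPU+GPU bandwidth, and the SRAM
# takes most render-target traffic off the main bus on top of that.
```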
I think the SRAM will be a big pain in the arse and a waste of transistors, just like the EDRAM was a waste in the 360.
The EDRAM is the reason the 360 performs as it does and won so many comparisons at DF and elsewhere.
 
1.2 TF... Oh well, I guess I can't complain. Videogame consoles are TOYS after all, not serious high-powered enthusiast PCs.

I now see why bkilian has a low opinion of Charlie. A huge SoC that taped out at the end of 2011, my foot. The Jaguar core was still being designed.
 
Yes it was very late, but MS could do it because the devkits already had 2x as much memory and they used the same board as the retail units.
As a result, it wasn't until years afterwards that devkits had more memory than the retail units again.

Why did it take them that long to update the dev kits? I'm sure games suffered as a result...
 
liolio said:
Unknown; rumors have it at 3GB, which sounds high.
Seems reasonable to me... If you have 68GB/s, that's only ~1GB/frame @ 60fps and ~2GB/frame @ 30fps that you can actually do something with. I would think 5GB for the game would be more than sufficient.
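The arithmetic checks out; here's the one-liner version, assuming the rumored 68GB/s main-memory figure:

```python
# Per-frame bandwidth budget at the rumored 68 GB/s main-memory figure.
bw_gbs = 68.0
for fps in (60, 30):
    print(f"{fps} fps -> ~{bw_gbs / fps:.2f} GB touchable per frame")
# 60 fps -> ~1.13 GB/frame, 30 fps -> ~2.27 GB/frame. If you can only
# touch a GB or two per frame, a ~5 GB game allocation already holds a
# few frames' worth of working set, which is the argument above.
```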
 
It pretty much is 3GB; Richard from DF double-sourced that rumour (the latest source was from CES), plus we independently have AndyH, thuway and Karak on GAF all saying 3GB, and 2 cores as well.

I think only aegies and proelite are saying otherwise.
 
I'm wrapping my head around those secretive graphics auxiliaries that would make Durango match or exceed anything Orbis comes up with. Are they aimed at helping with GPGPU integration:?:
Eager to know more...
 
I'm wrapping my head around those secretive graphics auxiliaries that would make Durango match or exceed anything Orbis comes up with. Are they aimed at helping with GPGPU integration:?:
Eager to know more...
If we want to spend time deducing it, surely there's a finite list of things that:

1. Are necessary for *every* frame the console renders
2. Can clearly be done more efficiently with fixed function hardware so the transistor budget is worth it
 
Why did it take them that long to update the dev kits? I'm sure games suffered as a result...

I guess they just didn't want to spend the money to design and manufacture a special mobo to handle an extra 4-8 chips. :p On the other hand, the GPU itself was hard-limited to addressing 512MB max IIRC, so maybe it took a while to come up with the 1GB solution? I can't remember what they said about the kit. There is of course also just being lazy and waiting for 1Gbit chips to show up.
 