Xbox One (Durango) Technical hardware investigation

Major Nelson just tweeted a new video interview with Marc Whitten about the Xbox architecture...

and all they talk about is cloud, Kinect, voice, the controller, Skype, everything but hard specs.

Ugh.

http://news.xbox.com/2013/05/marc-whitten-and-major-nelson-discuss-xbox-one-architecture

I guess the interesting part is that they talk cloud right at the beginning.



yeah, they are definitely hiding.

Honestly, with all the talk of manufacturing difficulty and overheating devkits at a rather pedestrian 1.6GHz/800MHz clock in a huge case... maybe it's a smarter idea for MS to cut the cord on this one.

Bite the bullet and drop in a Kaveri + GPU if they have to.

Should be a speed bump for their devs to get switched over and they have flexibility WRT yields...
 
yeah, they are definitely hiding.

Nah, just a bunch of technical guys brainwashed/taken hostage by their marketing department.

In other words, they played marketing BS-bingo before the press event.

Incredible -> tick
Amazing -> tick
Powerful -> tick
Cloud -> tick
etc.

Maybe the "force" from Cupertino is not far away ;-)
 
If we're talking a hypothetical difference between an 800MHz GPU and a 1GHz GPU then I do think it makes a difference. The first will be (and routinely is) spun as 50% weaker than PS4's GPU, the second would be spun as 20% weaker than PS4's GPU, and I think there absolutely is a difference in the kind of impact that has on potential buyers. You may think that the numerous sites reporting things like this have no real effect on how the product is perceived and made no real contribution to the big stock spike for Sony, but I don't agree with that.

It isn't really about what the real perceived difference in game quality will be, it's about getting the feeling that MS is at least trying to bring forth about the same level of hardware capability. 20% will "feel" a lot closer to that than 50%, and I do believe this matters psychologically. And the irrational psychological reaction to a product does impact sales.

1.2TF is 33% weaker than 1.8TF.
Microsoft could save its $50 million and correct a few forumers at GAF et al. on their math.
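For reference, both framings come from the same pair of numbers; here's a quick sketch of the arithmetic (the 1.2TF/1.8TF figures are the commonly reported peak-FLOPS numbers, not anything confirmed in this thread):

```python
# Same two peak-FLOPS figures, two ways of phrasing the gap.
# 1.2 TF and 1.8 TF are the widely reported (unconfirmed) Durango/Orbis numbers.
xb1_tf, ps4_tf = 1.2, 1.8

ps4_advantage = (ps4_tf / xb1_tf - 1) * 100   # "PS4 is X% stronger"
xb1_deficit = (1 - xb1_tf / ps4_tf) * 100     # "Xbox One is X% weaker"

print(f"PS4 stronger by {ps4_advantage:.0f}%")   # ~50%
print(f"Xbox One weaker by {xb1_deficit:.0f}%")  # ~33%
```

Same hardware gap either way; the headline percentage just depends on which console you take as the baseline.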
 
1.2TF is 33% weaker than 1.8TF.
Microsoft could save its $50 million and correct a few forums on their math.

Current APUs like Trinity and Llano were already memory constrained; higher memory speed => better performance.

I think Microsoft produced a limping dog .. ehh.. Xbone here.
 
Yup, and you aren't gaining anything, either.

if being more competitive allows you to maintain a higher price, you save money.

Good point.

@3dilettante

Upping the spec isn't simply a matter of appeasing fanboys; it is about focusing their efforts on a market that is readily willing to drop $500 on a top-notch console, and not wasting time chasing casuals who get their fill of gaming for free on iOS/Android tablets.

Sure, there are costs associated with cooling a hotter-running 1GHz/2GHz chip, but there are costs associated with under-performing and losing marketshare as well.

1.5TF puts them within 20% of 1.8TF
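Where the 1.5TF figure comes from, roughly: a sketch assuming the leaked 12-CU GPU with the clock bumped from 800MHz to 1GHz (none of these figures are confirmed):

```python
# Peak-FLOPS scaling with GPU clock, assuming the leaked 12-CU part.
CUS = 12
FLOPS_PER_CU_PER_CYCLE = 64 * 2   # 64 lanes per CU, an FMA counted as 2 FLOPs

def peak_tflops(clock_ghz):
    return CUS * FLOPS_PER_CU_PER_CYCLE * clock_ghz / 1000.0

for clock_ghz in (0.8, 1.0):
    tf = peak_tflops(clock_ghz)
    gap = (1 - tf / 1.8) * 100    # vs. the reported 1.8 TF Orbis figure
    print(f"{clock_ghz*1000:.0f} MHz -> {tf:.2f} TF, {gap:.0f}% behind 1.8 TF")
# 800 MHz -> ~1.23 TF (~32% behind); 1000 MHz -> ~1.54 TF (~15% behind)
```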
 
I was making a point. We don't know what that percentage is. You said pick a percentage, so I did... at the percentage I picked it's probably worth it.

At that percentage, it means the design can hop around in that clock range and TDP at will.
The question would be why AMD isn't selling 4-5 GHz Jaguars.
 
Actually, upping the spec is mostly about appeasing forum warriors. Most people don't know what a CU is or whether 170GB/s is enough. If you want to sell console hardware, you sell the games and the features. If their games showing at E3 are bad and look pathetic, that will matter; having more jiggawatz is just a number for fanboys to froth over.
 
1.2TF is 33% weaker than 1.8TF.
Microsoft could save its $50 million and correct a few forumers at GAF et al. on their math.

So I said XBox One's GPU at 50% weaker when I meant to say PS4's GPU 50% stronger. Sue me. The latter is something that's being reported and that same number has an impact.

This idea that only some tiny handful of forum nerds care about specs sounds absurd to me. People don't have to know what the numbers mean, they'll still be listening to whatever the press says about how much stronger one is than the other.
 
Exophase, James Carr and rangers are on the money.

Why are some of us making comparisons to 4-5GHz Jaguars, and treating suggestions of clocking parts at speeds they already retail at as wishing for unicorns?

I'm not saying it will happen, but I would say it is entirely feasible and needed.
 
Exophase, James Carr and rangers are on the money.

Why are some of us making comparisons to 4-5GHz Jaguars, and treating suggestions of clocking parts at speeds they already retail at as wishing for unicorns?

I'm not saying it will happen, but I would say it is entirely feasible and needed.

Exactly. There is a huge difference between a 200-400MHz bump on the CPU and a 200MHz bump on the GPU portion of the APU, versus a 3-4x increase in clock speeds.

We don't know the tipping point of when yields go to hell vs clock speeds.
 
So I said XBox One's GPU at 50% weaker when I meant to say PS4's GPU 50% stronger. Sue me. The latter is something that's being reported and that same number has an impact.

Just changing the messaging gets the number down to 33%
The sad thing is that so many of the angry forum warriors are using the former percentage to justify their anger.

This is a marketing job, and the population of internet posters who have an inkling of what a TFLOP is, but don't know percentages, and have no understanding of the non-linear relationship between peak and graphical output are a poor metric to use.

Microsoft spent hundreds of millions of dollars for NFL features, and probably got back four orders of magnitude more customers. That's a way better return on investment than tens of millions to please GAF.

Exactly. There is a huge difference between a 200-400MHz bump on the CPU and a 200MHz bump on the GPU portion of the APU, versus a 3-4x increase in clock speeds.

We don't know the tipping point of when yields go to hell vs clock speeds.

Anand pointed out where the curve tips for Jaguar, past the 1.6 GHz/800MHz range.
edit: Probably slightly worse, since the best power trade-off is around 500 MHz for the GPU.
 
Just changing the messaging gets the number down to 33%
The sad thing is that so many of the angry forum warriors are using the former percentage to justify their anger.

This is a marketing job, and the population of internet posters who have an inkling of what a TFLOP is, but don't know percentages, and have no understanding of the non-linear relationship between peak and graphical output are a poor metric to use.

Microsoft spent hundreds of millions of dollars for NFL features, and probably got back four orders of magnitude more customers. That's a way better return on investment than tens of millions to please GAF.

Microsoft doesn't control the image of the console exclusively through their marketing. What's reported on sites like Anandtech and Digital Foundry DOES matter and is not controlled by them. You can be as denigrating as you want about "angry forum warriors" or whatever but I don't think it's correct at all that that's the extent of who gets the message.

What does "got back four orders of magnitude more customers" mean? Four more than what? You mean they got a 10000x return on investment for hundreds of millions of dollars? What are you saying?

On a personal level I don't care at all how powerful either console is. And I get that it's frustrating when people keep dreaming about secret sauce. But MS raising the clock here is far from impossible. Realistic? I don't really know. But it's possible in a way that lots of other things wouldn't be. And it wouldn't necessarily cost tens or hundreds of millions of dollars. Maybe it would, but we don't really know, not least of all because we don't know when such a change could have been made.

Anand pointed out where the curve tips for Jaguar, past the 1.6 GHz/800MHz range.
edit: Probably slightly worse, since the best power trade-off is around 500 MHz for the GPU.

We can't use that to conclude a damn thing about what the extra power cost would have been going from 800MHz to 1GHz, much less what the impact would have been on their existing power and cooling capabilities.
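For a first-order sense of why that's hard to pin down: dynamic power scales roughly with frequency times voltage squared, so the cost of the same 25% clock bump depends almost entirely on whether a voltage bump is needed. A rough sketch, with the voltage figures purely illustrative (no Durango voltages are known):

```python
# First-order dynamic-power scaling: P_dyn is roughly proportional to f * V^2.
# The voltages below are purely illustrative, not known Durango figures.
def relative_dynamic_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

base_f, base_v = 800e6, 1.00            # 800 MHz at a nominal 1.00 V (assumed)

for v_new in (1.00, 1.05, 1.10):        # same voltage, +5%, +10%
    scale = relative_dynamic_power(1000e6, base_f, v_new, base_v)
    print(f"1 GHz at {v_new:.2f} V -> {scale:.2f}x dynamic power")
# ~1.25x, ~1.38x, ~1.51x; the answer hinges on the unknown voltage requirement.
```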

If you're going to ask why so few Jaguar bins go over 1.6GHz you may as well also ask why few (as in currently none) Bonaire bins go under 1GHz. Then consider that a lot of binning is artificial market segmentation and not strictly the performance limits of the device. Then consider that Jaguar is designed for devices with totally different requirements. You can design cooling for a 50W Kabini, but is there a market for it? And if there is, would it eat at AMD's higher end markets? Going by what Kabini does or doesn't do as arguments for what XBox One can or can't do makes no sense. Why be so sure that 800MHz is a hard GPU limit but going from 12CU to 18CU is fine?
 
You can't forget that if this system is less conducive to used games, GameStop is going to arm its associates with these facts and push the PS4 to keep their used sales up.

You see this in mobile carrier stores, where they push Android phones simply because their subsidy is less than what it is on an iPhone.
 
What does "got back four orders of magnitude more customers" mean? Four more than what?
Four orders of magnitude more than the hundreds to a few thousand forum posters who care about the TFLOP count.

You mean they got a 10000x return on investment for hundreds of millions of dollars? What are you saying?
Impinging on yields by 1% over something like 10 million consoles costs millions; I went with $5 million.
At 10%, it's $50 million. The people it wins over number in the hundreds, probably, and they might change their mind for some other reason anyway.
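The back-of-envelope behind those figures, assuming a cost of roughly $50 per lost die (that per-die figure is my assumption, chosen to reproduce the $5M/$50M numbers above; the real cost is not public):

```python
# Back-of-envelope for the yield-hit cost estimate above.
# The $50 cost per lost die is an assumption picked to match the $5M/$50M figures.
CONSOLES = 10_000_000
COST_PER_LOST_DIE = 50            # USD, assumed

for yield_hit in (0.01, 0.10):    # 1% and 10% yield impact
    extra_cost = CONSOLES * yield_hit * COST_PER_LOST_DIE
    print(f"{yield_hit:.0%} yield hit -> ~${extra_cost / 1e6:.0f}M")
# 1% -> ~$5M, 10% -> ~$50M
```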

But MS raising the clock here is far from impossible. Realistic? I don't really know.
I'm not saying it's impossible. I'm saying there are a million better things to do.

If you're going to ask why so few Jaguar bins go over 1.6GHz you may as well also ask why few (as in currently none) Bonaire bins go under 1GHz.
There will be, although they may be OEM or Chinese market SKUs.
(edit: That's too strongly worded. I strongly suspect there will be, as AMD does eventually create relatively obscure SKUs with lowered specs.)
There's also the lack of two Jaguar modules and miscellaneous accelerators.
I'm also keeping an eye out for an explanation for some of the bandwidth numbers for the coherent bus and the northbridge to L2 bandwidth numbers, which are a bit higher than Orbis has.

I'm also waiting to see which process the chip is being manufactured on. Is it TSMC's process that Bonaire uses, or the GF process?
A difference in process can explain having wider margins if the process is subject to greater device variability.
 
You can't forget that if this system is less conducive to used games, GameStop is going to arm its associates with these facts and push the PS4 to keep their used sales up.

You see this in mobile carrier stores, where they push Android phones simply because their subsidy is less than what it is on an iPhone.

I get a feeling there must be some kind of agreement in place between them both; I can't imagine Microsoft being that stupid as to be the only next-gen console trying to restrict its consumers.

If Xbox One doesn't sort out the huge question marks that I and many others have, then I for one will make my mind up and change back to Sony.

1. I want clocks to get as high as the TDP will allow.
2. I want to see a real demo of a cloud-optimised game that makes a credible difference versus not having it at all.
3. I want this used-game malarkey explained fully, and for it not to be totally restrictive.

If we get good answers to those questions, then I might consider an Xbone :)
 
If they wanted to go 1GHz/2GHz, the devkits would have been clocked that way.

I don't think we can completely rule out an expensive knee jerk reaction from Microsoft, but the odds are that they won't. Why do that when you can simply exaggerate numbers and be creative with maths? They told us the number of transistors, which is a completely meaningless number. But it's a bigger number, which pleases the crowd.

If games are at parity in screenshot comparisons, the numbers won't matter.
 
Exophase, James Carr and rangers are on the money.

Why are some of us making comparisons to 4-5GHz Jaguars, and treating suggestions of clocking parts at speeds they already retail at as wishing for unicorns?
If the arbitrary yield impact for a given TDP between say 1.6 GHz and 2.0 GHz for an 8-core Jaguar is .0000001, that is basically saying Jaguar can scale its clocks better than Bulldozer.



1. I want clocks to get as high as the TDP will allow.
That would be what they're doing.
What else would they be setting the TDP for?
 
Four orders of magnitude more than the hundreds to a few thousand forum posters who care about the TFLOP count.


Impinging on yields by 1% over something like 10 million consoles costs millions; I went with $5 million.
At 10%, it's $50 million. The people it wins over number in the hundreds, probably, and they might change their mind for some other reason anyway.


I'm not saying it's impossible. I'm saying there are a million better things to do.


There will be, although they may be OEM or Chinese market SKUs.
There's also the lack of two Jaguar modules and miscellaneous accelerators.
I'm also keeping an eye out for an explanation for some of the bandwidth numbers for the coherent bus and the northbridge to L2 bandwidth numbers, which are a bit higher than Orbis has.

I'm also waiting to see which process the chip is being manufactured on. Is it TSMC's process that Bonaire uses, or the GF process?
A difference in process can explain having wider margins if the process is subject to greater device variability.


The more I think about it, the more I agree with you.

Even if they overclock to 1.5TF, I'd still go with the 1.8TF PS4. They'd still have anti-consumer policies that I don't agree with, and they'd still have a developer deficit.

The design is simply not strong enough to compete, given the policies they intend to implement.
 
Four orders of magnitude more than the hundreds to a few thousand forum posters who care about the TFLOP count.

You're fixating on TFLOP count, but you know that increasing the GPU clock would improve everything related to GPU performance except main memory bandwidth - and since it'd probably also improve eSRAM bandwidth, it'd scale pretty well.
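To illustrate the scaling point: with the leaked figures (eSRAM delivering 128 bytes per GPU cycle, DDR3 at a fixed ~68GB/s; neither is officially confirmed), eSRAM bandwidth moves with the GPU clock while main memory bandwidth does not:

```python
# Bandwidth scaling with GPU clock, using the leaked Durango figures.
# Assumes eSRAM delivers 128 bytes/cycle at the GPU clock and DDR3 stays at
# ~68 GB/s; both numbers come from leaks, not official specs.
DDR3_GBPS = 68.0

def esram_gbps(clock_ghz):
    return 128 * clock_ghz        # bytes/cycle * GHz -> GB/s

for clock_ghz in (0.8, 1.0):
    print(f"{clock_ghz*1000:.0f} MHz GPU: eSRAM {esram_gbps(clock_ghz):.1f} GB/s, "
          f"DDR3 {DDR3_GBPS:.0f} GB/s (unchanged)")
# 800 MHz -> 102.4 GB/s eSRAM; 1000 MHz -> 128.0 GB/s eSRAM
```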

Impinging on yields by 1% over something like 10 million consoles costs millions; I went with $5 million.
At 10%, it's $50 million. The people it wins over number in the hundreds, probably, and they might change their mind for some other reason anyway.

These numbers are completely arbitrary. Nobody knows what the actual impact to yield is, if 800MHz is really at any kind of inflection point. I've seen products where pretty much all of them will clock significantly higher than what they're shipped at without any problems. Initial specifications can be overly conservative. You can't know the exact manufacturing limits years in advance. I've even seen a few products get a spec bump after release when nothing in their manufacturing changed, just because the ones making it realized there was more headroom. This is probably more common before release. I'm sure there are plenty of examples in the opposite direction (where specs had to be lowered), probably plenty more at that, but that doesn't mean it can't happen both ways.

The idea that only hundreds of people would care about a higher clock speed is absurd. You grossly underestimate the number of people who pay attention to this, whether or not they understand it. I suppose you think Sony has completely wasted their time releasing any specifications at all. I bet they must also be kicking themselves for going with 18 CUs when apparently 12-14 would have made zero difference to anyone.

Hell, let's look at your argument in reverse - if fusing off a couple CUs increases yields by 1% for Sony and no one cares anyway why shouldn't they do it? Because they already released specs? Guess they should have kept it open then, huh? But I don't think Sony agrees with you.

I suppose you also think that AMD wasted resources with their 1GHz edition discrete GPUs, which just featured a modest clock bump (much more modest than what we're considering here). Yes, I know it's not the same since those naturally come out of the binning, I'm not saying they're the same, but I am saying that if only a few hundred forum readers cared about THAT that it wouldn't have even been worth making the bin for. But I suspect more than that cared, and I suspect more would care about an 800MHz to 1GHz bump for XBox One's GPU.

Putting aside real impact, I think plenty of people would respect MS more for trying to close the gap. More than they respect them for being completely coy about specs, and certainly more than they'd respect them if they tried to outright pass off that the two consoles have basically the same capabilities when everyone in the gaming media is saying otherwise. But that seemed to be what you were suggesting they do.

I'm not saying it's impossible. I'm saying there are a million better things to do.

A million huh. I can think of a few things but this would be pretty high on my list.

There will be, although they may be OEM or Chinese market SKUs.

Do you have a source for this? I couldn't find anything.

There's also the lack of two Jaguar modules and miscellaneous accelerators.

There's also the lack of two CUs. This really isn't the point. I don't buy the claim that the other stuff on the SoC is limiting the performance of the same GPU design (although I suppose the eSRAM could be).

It's pretty much a given that a console with a 1GHz GPU (and most likely 8x2GHz Jaguars, if you'd like; I've kept that proposal completely out of anything I've said) would use less power than the original XBox 360. And it'd probably use less power than the PS4. So it's probably a tangible design.

Maybe it's not worth any noteworthy impact to yield. But maybe it doesn't have to come with one.

I'm also waiting to see which process the chip is being manufactured on. Is it TSMC's process that Bonaire uses, or the GF process?

I don't think GF's 28nm is going to have anywhere close to the density of TSMC's, but Bonaire's die size fits in with the others exactly where you'd expect it to. The very idea that a refresh chip like this would be made on a different process would be mind boggling if not for AMD's massive obligation to GF.

But if you consider GF Bonaire a possibility why not GF XBox One SoC?
 