Xbox One (Durango) Technical hardware investigation

Just spotted over on the GPU side of the forum: some site got an early look at the HD 7790 "Bonaire" GPU from AMD. It's supposed to slot between the 7770 and the 7850 and launch in April.

Most interesting to me is that it's 768 SPs (12 CUs), exactly the count Durango is rumored at.

Even more interesting, this model is clocked at 1075 MHz, and at that speed it approaches a 7850 (basically the PS4 GPU).

My wishful thinking: this indicates a lot of clock headroom for MS to play with. I hope for an overclock of the Durango GPU to at least 1 GHz.

http://translate.google.com/translate?hl=en&sl=&tl=en&u=http%3A%2F%2Fwww.sweclockers.com%2Fnyhet%2F16702-radeon-hd-7790-bonaire-i-sweclockers-testlabb

That's a most interesting find.

There are too many coincidences here not to have meaning. In fact they don't seem, well.... coincidences at all.

This new graphics card shares exactly the same bandwidth, has the exact same number of SPs and the same number of CUs... and the only changes are the memory interface and perhaps the clock rate.

Does it mean that Durango's GPU is GCN 2.0 too?

Facts:

- It's not the first time AMD has developed new technology for the PC based on hardware they created for a console. Xenos comes to mind.

- This means the 7790 Bonaire could well be a derivative of the work they have done on Durango's GPU so far.

- CUs, SPs and bandwidth aside, there is another interesting coincidence. Bonaire is going to be launched in April -April 26th maybe? The same day Durango is going to be revealed? :smile:

Yet I don't buy the performance rumours. The performance gap between the 7790 and the 7850 is so small that you have to wonder how that is possible, taking into account that the PS4's 7850 has 18 CUs vs Bonaire's 12 CUs.

Not to mention the bandwidth of the 7790 is almost half of the 7850's. GCN 2.0 exists for a reason, but I don't think it can perform miracles.
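For what it's worth, the bandwidth figures are just bus width times effective data rate. A minimal sketch, using the commonly cited retail specs for the two cards (the early-look figures for Bonaire may differ) and the leaked DDR3 setup for Durango:

[code]
# Bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s).
# The specific numbers below are assumptions based on retail/leaked specs.
def bandwidth_gbps(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbps(128, 6.0))    # HD 7790: 96 GB/s
print(bandwidth_gbps(256, 4.8))    # HD 7850: 153.6 GB/s
print(bandwidth_gbps(256, 2.133))  # Durango DDR3 per the leaks: ~68 GB/s (ESRAM on top of that)
[/code]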

According to this article:

http://www.tomshardware.com/news/Bonaire-XT-7790-AMD-Radeon,21199.html

....the 7790 is a very impressive performer, and apparently AMD doesn't allow manufacturers to release overclocked versions of the card for fear of it surpassing the 7850 and creating a conflict within their own GPU lineup. I am still wondering where those numbers come from, though, because they don't seem very realistic to me.
 
The 7850 has 16 CUs.

BTW, so what if one PC chip is basically identical to Durango's GPU portion? The base architecture is very flexible and scalable, allowing multiple configurations, as there already are on PC. The changes in GCN 2.0 are bound to be quite small anyway.
 
Fixed already, thanks for the clarification. I meant the PS4, which is expected to perform in line with the 7850's numbers, although the PS4's GPU is better than the 7850 on paper, so that's a factor too.

As for your question, there are too many coincidences: same company, same bandwidth, same CUs, same SPs, and they developed another console's GPU in the past and then carried those ideas and technology forward to create similar products in the PC space.

It makes sense to me.
 

Yet I don't buy the performance rumours. The performance gap between the 7790 and the 7850 is so small that you have to wonder how that is possible, taking into account that the PS4's 7850 has 18 CUs vs Bonaire's 12 CUs.

That still doesn't make too much sense. This Bonaire, as tested, has 12 CUs at 1075 MHz; the 7850 has 16 CUs at 860 MHz and the PS4 18 CUs at 800 MHz. Do the math on them and there is nothing weird about the performance.
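For anyone who wants to actually do that math, here's a quick sketch using the standard GCN figures of 64 shaders per CU and 2 FLOPs (one FMA) per shader per clock; the Durango entry assumes the rumoured 12 CUs at 800 MHz:

[code]
# Peak single-precision throughput for a GCN GPU:
# 64 shaders per CU, 2 FLOPs per shader per clock.
def gcn_gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

print(gcn_gflops(12, 1075))  # Bonaire as tested:  ~1651 GFLOPS
print(gcn_gflops(16, 860))   # HD 7850:            ~1761 GFLOPS
print(gcn_gflops(18, 800))   # PS4 (rumoured):     ~1843 GFLOPS
print(gcn_gflops(12, 800))   # Durango (rumoured): ~1229 GFLOPS
[/code]

The gaps track CU count times clock almost exactly; nothing exotic is needed to explain them.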

What makes sense to you? You think this Bonaire packs some "cool" new tech primarily developed for Durango? No, it's just a minor revision of their current GCN tech, and it will probably be used in Durango as well, because that's what AMD will have at the time. The fact that Durango shares many elements with this Bonaire is not really newsworthy in itself, as we already knew the actual specs of Durango. This chip is just another revision of AMD's GPU tech, now with 12 CUs.
 
That's a most interesting find.

There are too many coincidences here not to have meaning. In fact they don't seem, well.... coincidences at all.
I think I disagree about a deeper link being necessary. It's not coincidental that a lot of things in digital designs scale by powers of two, and for the things in GCN that don't, you find quantities that like to change in multiples of 4 or 16.

GDDR5 channels are 32 bits, and only a handful of total bus widths are used.

We're not talking about the whole space of real numbers here for design options, and only a few numbers fall in the category of "not stupid".
If you want a design that is better than what is currently designed with 10 CUs and not quite as good as a design with 16, there really aren't enough choices to scream conspiracy.
If a combination of choices is a good optimum design, then a large difference would mean that either this design or Durango is going out of its way to be worse.
 
A Gearbox developer talks about the power of the PS4 and the next Xbox, among other things, and gives a hugely enigmatic response to one question, hinting at surprises as if he knew something, when he actually knows everything, or should. It also gives hope to people who are waiting for a very capable machine.

http://www.thesixthaxis.com/2013/03...and-xbox-720-power-lowest-common-denominator/

That's a pretty worthless statement - he's just playing the "I have to pretend I don't know what MS are doing" card.

The beta kits have the same hardware as the vgleaks specs - and they went out at the end of last year.

But sure, maybe MS did throw out all the work they did on the Durango chip and decided to spend another few hundred million to commission a new 2+TF chip in the past few months, which they're aiming to have out by the end of the year......

At most what we can hope for is for MS to overclock the GPU (and perhaps the CPU too) to narrow the gap with the PS4 - but this is still quite unlikely, given what happened last time with RRoD.

And why would they need to overclock it anyway? Few consumers will be able to tell the difference between COD7 running at 1080p on PS4 and 900p on the 720.
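To put that resolution gap in perspective, a trivial pixel-count comparison (assuming exactly those two render resolutions):

[code]
# Pixels per frame at the two hypothetical render resolutions.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_900p  = 1600 * 900   # 1,440,000
print(pixels_1080p / pixels_900p)  # ~1.44, i.e. ~44% more pixels at 1080p
[/code]

That ~44% is in the same ballpark as the rumoured ~50% compute gap, which is the point being made.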

MS will be counting on Kinect and the tight integration into the Windows ecosystem to sell consoles, not on having 30% more power than the competition.
 
That's a pretty worthless statement - he's just playing the "I have to pretend I don't know what MS are doing" card.

The beta kits have the same hardware as the vgleaks specs - and they went out at the end of last year.

But sure, maybe MS did throw out all the work they did on the Durango chip and decided to spend another few hundred million to commission a new 2+TF chip in the past few months, which they're aiming to have out by the end of the year......

Not saying that I believe it, but it isn't impossible for them to have worked on GPUs with the same architecture but with different numbers of CUs enabled/disabled. Tell the devs to expect a baseline, but pick the version that has the best performance with acceptable yield in the end.
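As a toy illustration of why you'd design with spare CUs and bin afterwards, here's a made-up yield calculation; the 14-CU die and the per-CU defect rate are purely hypothetical numbers, not anything leaked:

[code]
# Toy binomial yield model: each CU independently defective with probability p.
from math import comb

def yield_at_least(good_needed, total_cus, p_defect):
    """Probability that at least good_needed of total_cus CUs are functional."""
    q = 1.0 - p_defect
    return sum(comb(total_cus, k) * q**k * p_defect**(total_cus - k)
               for k in range(good_needed, total_cus + 1))

p = 0.03  # made-up per-CU defect rate
print(yield_at_least(14, 14, p))  # need all 14 CUs good: ~65% of dies usable
print(yield_at_least(12, 14, p))  # ship 12 of 14:        ~99% of dies usable
[/code]

So a baseline spec for devs plus a later decision on how many CUs to enable is at least plausible from a yield standpoint, whatever the likelihood in practice.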

Not impossible but highly unlikely.

The few Xbox fans still clinging onto the hopes (ropes) of spec upgrades can run with this instead. :devilish:
 

Isn't a GPU with more CUs going to be bigger and hotter, therefore requiring a redesign?
 
That's a pretty worthless statement - he's just playing the "I have to pretend I don't know what MS are doing" card.

The beta kits have the same hardware as the vgleaks specs - and they went out at the end of last year.

And the vgleaks specs are supposed to be from the same documents, which date back as far as Feb 2012, right? So, obvious question: why would Microsoft need to send out beta kits that have the EXACT specifications of the kits that developers already possess?

That wouldn't just be redundant, it would be ridiculous.
 

Err, allegedly the initial kits contained only generic standard PC hardware. The beta kits supposedly contain essentially final hardware, including the GPU with ESRAM, etc.
 

I see.

So Microsoft spent billions and hired this team of world-class SoC engineers just so they could sign off on a mid-range 7000 series and what appears to be a very basic CPU design from AMD?

It still doesn't make sense, but there it is.
 
They spent billions???

They've been spending about $2 billion on R&D a quarter since 2010; $2.53 billion in the second quarter of the 2013 fiscal year alone.

Of course we don't know how much of that R&D went towards the next Xbox, but it would seem plausible that some of it did go towards the console, given its imminent arrival.

Also, I'm sure its team of engineers didn't come cheap. Some of those guys are individually responsible for filing more technology patents than whole companies combined.

It just seems odd to have that kind of talent gathered together, and then not use them for what they're renowned for.
 
I doubt that R&D costs are directly proportional to the number of repeatable shader blocks that a vendor decides to put on their chip. Roar Powah is more likely to be influenced by things like cost, power, market requirements, etc. than by how much R&D MS are prepared to spend.

R&D for things like the Kinect 2, the esram, the display planes, and various other system-related stuff (like the southbridge and various IOs) will be costs that need to be covered regardless of how "high end" the GPU and CPU are. And developing the OS and its related services is probably burning through several tens of millions of dollars a quarter.
 
On the subject of [strike]overclocking[/strike] running processors beyond their originally anticipated clock speeds, here's some crazy dude who got his Bobcat-powered system up from 1.6 to 2.36 GHz despite only having an itty bitty little cooler:

http://www.xtremesystems.org/forums/showthread.php?266200-GIGABYTE-E350N-USB3-who-is-your-daddy

So MS (and/or Sony) probably could boost CPU clocks without a major (or perhaps any) rework of the silicon, but power and cooling requirements would obviously be different. I don't think the question is so much whether it's possible as whether there was time and whether it was worth the added cost, disruption and risk.
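For a rough feel of why the power and cooling budgets move, the usual first-order approximation is dynamic power scaling with frequency times voltage squared; the numbers below are illustrative assumptions, not actual Jaguar or Durango figures:

[code]
# First-order dynamic power scaling: P is roughly proportional to f * V^2.
# Baseline and overclocked values are assumptions for illustration only.
base_f, base_v = 1.6, 1.00  # GHz, volts
oc_f,   oc_v   = 2.0, 1.10  # +25% clock with a modest voltage bump
scale = (oc_f / base_f) * (oc_v / base_v) ** 2
print(f"~{scale:.2f}x dynamic power")  # ~1.51x
[/code]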
 
I see.

So Microsoft spent billions and hired this team of world-class SoC engineers just so they could sign off on a mid-range 7000 series and what appears to be a very basic CPU design from AMD?
It's not a midrange 7000. There's a whole system that had to be designed around a long-term business plan, including evaluations of multiple designs. They had to come up with a solution for substantial RAM without blowing the bank on GDDR5, which meant designing an eSRAM system and the DMEs and whatnot. They may also be designing the hardware with an eye on cross-device APIs for supporting Windows, so had to match Durango's design and ongoing API development to Surface and Windows 8 designs and APIs.
 
On the subject of [strike]overclocking[/strike] running processors beyond their originally anticipated clock speeds, here's some crazy dude who got his Bobcat-powered system up from 1.6 to 2.36 GHz despite only having an itty bitty little cooler:

http://www.xtremesystems.org/forums/showthread.php?266200-GIGABYTE-E350N-USB3-who-is-your-daddy

So MS (and/or Sony) probably could boost CPU clocks without a major (or perhaps any) rework of the silicon, but power and cooling requirements would obviously be different. I don't think the question is so much whether it's possible as whether there was time and whether it was worth the added cost, disruption and risk.

Agreed. If it wasn't in the plans before now.............it's a little too late.
 
If there is BC hardware on the Durango mainboard (Xenon and/or Xenos) is there an easy-ish way those could effectively be used for compute tasks for Durango games?
 

That would prevent them from removing BC after the first 18 months.

It would also push max power consumption and thermals up.
 