AMD/ATI for Xbox Next?

AzBat

Agent of the Bat
Legend
Small rumor from Fudzilla of all places, but they're reporting MS has already selected AMD/ATI for the next Xbox console...

We've learned from industry sources that AMD/ATI has already won the GPU deal for the next-generation Xbox console. It looks like Microsoft was happy with the first Xenos GPU and wants to continue with the same vendor, especially since the new ATI GPU should maintain compatibility with legacy games.


The console refresh was supposed to happen in 2010, but due to the recession both Microsoft and Sony have decided to push their plans to 2012 and keep the Xbox 360 and PlayStation 3 alive for longer than originally planned.

We don't know what the GPU will look like, but judging from the timeline for delivery we suspect that it might be a 28nm part.


http://www.fudzilla.com/content/view/15936/1/

Let the speculation begin! :)

Tommy McClain
 
Fair enough, the article does only mention the GPU contract. :)

At least going by RV870's power characteristics, I'd hope the idle power consumption will be very good...
 
Small rumor from Fudzilla of all places, but they're reporting MS has already selected AMD/ATI for the next Xbox console...

http://www.fudzilla.com/content/view/15936/1/

Let the speculation begin! :)

Tommy McClain
Thanks for posting it, but if it were to happen it would be like "Duh". MS and AMD/ATI have seemed kind of lovey-dovey since the 360 launched, so choosing them makes sense. Intel doesn't seem to be a good fit because what they are doing is speculated to challenge Microsoft's whole DirectX thing, and NVIDIA seems to be moving away from DirectX as well. I predict that backwards compatibility will be a shoo-in: a DirectX 11-compliant offshoot of the Xenos with enough eDRAM to fit a 64-bit 720p 4xMSAA framebuffer without tiling. Since I don't know the problems devs currently have with the Xenos, I have no idea what needs to be fixed though.
 
1080p support will be an absolute minimum next gen, so if they do go with the eDRAM again I'd wager it would be large enough to do at least 4xMSAA at 1080p without needing to jump through hoops.
 
Nice. Knowing its specs will help, though. Basically, MS is the company that decides which way GPUs will go. Apart from raw compute, all the fancy new features are decided by them and them alone.

Besides, the R&D money will subsidize their next-gen GPU costs.

However, NVIDIA followed the Xbox 1's GPU with NV30, and ATI followed the Xbox 360's Xenos with R600. Let's just hope AMD (if they indeed have the contract) are third time lucky. :)
 
If we're hearing about this so early... would it be prudent for us to assume that Microsoft is pushing to be first outta the gate again?

(I make the assumption based on the fact that Sony have shown consistently this gen that they're terrible at keeping secrets and so figure we'd have heard something about any deals made for PS4... as for Nintendo... well... they don't really need to be first)
 
If we're hearing about this so early... would it be prudent for us to assume that Microsoft is pushing to be first outta the gate again?

(I make the assumption based on the fact that Sony have shown consistently this gen that they're terrible at keeping secrets and so figure we'd have heard something about any deals made for PS4... as for Nintendo... well... they don't really need to be first)

Well, there were "rumours" back in Feb of this year that Intel had already won the GPU element of the PS4, slated for release in 2012.

But then, there have been the same rumours about ATI/MS since early this year too.
 
Thanks for posting it, but if it were to happen it would be like "Duh". MS and AMD/ATI have seemed kind of lovey-dovey since the 360 launched, so choosing them makes sense. Intel doesn't seem to be a good fit because what they are doing is speculated to challenge Microsoft's whole DirectX thing, and NVIDIA seems to be moving away from DirectX as well. I predict that backwards compatibility will be a shoo-in: a DirectX 11-compliant offshoot of the Xenos with enough eDRAM to fit a 64-bit 720p 4xMSAA framebuffer without tiling. Since I don't know the problems devs currently have with the Xenos, I have no idea what needs to be fixed though.

Given MS's prior experience with Intel on their first gaming console, I expect MS would be the last manufacturer to want to incorporate a Larrabee-like GPU from Intel, regardless of the circumstances surrounding DirectX.
 
If we're hearing about this so early... would it be prudent for us to assume that Microsoft is pushing to be first outta the gate again?

They'll want to extend this generation as far as possible, so what this likely means is that in the event of an early announcement by one of the other two, they'll be ready.

1080p support will be an absolute minimum next gen,

The only sensible metric would be the number of ROPs and their capabilities (pixel, texel and zixel rates; see the sketch at the end of this post), but "designed for 1080p" is a completely useless statement considering devs will be pushing the machines any way they can. Resolution can always be sacrificed for more complex shaders, lighting algorithms, shadow maps, overdraw (crowds), alpha effects and so on.

so if they do go with the eDRAM again I'd wager it would be large enough to do at least 4xMSAA at 1080p without needing to jump through hoops.
There are a lot of "ifs" associated with this. Beyond the question of how the eDRAM would be used, the size of such a chip will come down to the number of bits and the eDRAM process node.
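
For instance, a rough back-of-envelope for those ROP rates, using Xenos's commonly cited figures (8 ROPs at 500MHz); purely illustrative, not a prediction for the next part:

Code:
# Rough ROP throughput math, using Xenos-like figures purely for illustration.
def pixel_fillrate_gpix_s(rops, clock_mhz):
    # Peak pixel fillrate: one pixel per ROP per clock.
    return rops * clock_mhz * 1e6 / 1e9

def fill_time_ms(width, height, overdraw, rops, clock_mhz):
    # Time to write every pixel 'overdraw' times at peak rate.
    pixels = width * height * overdraw
    return pixels / (pixel_fillrate_gpix_s(rops, clock_mhz) * 1e9) * 1e3

print(pixel_fillrate_gpix_s(8, 500))        # Xenos-like: 4.0 Gpix/s
print(fill_time_ms(1920, 1080, 4, 8, 500))  # ~2.1 ms of a 16.7 ms frame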
 
1920×1080 × 32-bit × 4xMSAA is already ~32MB, and I'm not sure how feasible it is to fit even that amount of eDRAM on a die. And as AlStrong says, devs can do a lot of different things with a given amount of memory, from deferred shading through various HDR hacks to whatever they can come up with.
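
To make that arithmetic concrete, here's a quick sketch covering the buffer sizes mentioned in this thread (each line counts only the buffers noted in its comment):

Code:
# Render-target sizes for the configurations discussed in this thread.
def buffer_mb(width, height, bytes_per_pixel, msaa_samples):
    return width * height * bytes_per_pixel * msaa_samples / (1024 * 1024)

print(buffer_mb(1920, 1080, 4, 4))   # 1080p, 32-bit color, 4xMSAA: ~31.6 MB
print(buffer_mb(1280, 720, 8, 4))    # 720p, 64-bit color, 4xMSAA:  ~28.1 MB
# Today's 360 case: 720p, 4xMSAA, 32-bit color + 32-bit Z = ~28.1 MB total,
# which is why 10MB of eDRAM forces tiling.
print(buffer_mb(1280, 720, 4, 4) * 2)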

On the main topic, well, if they do get a GPU then maybe this also means no Larrabee for the CPU?
 
Indeed... even if they just double the existing eDRAM (16 ROPs, DX11 compliance, 20MB), they'd need at least 55nm for similar die space, assuming perfect scaling (vs 80nm). FWIW, eDRAM process nodes are not on par with conventional processes due to the nature of their design.
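
The back-of-envelope behind that claim, assuming the idealized rule that area scales with the square of the feature size (real eDRAM processes scale worse, as noted):

Code:
# Doubling eDRAM capacity (10MB -> 20MB) while shrinking 80nm -> 55nm.
capacity_ratio = 20 / 10                    # 2x the bits
area_per_bit_ratio = (55 / 80) ** 2         # ~0.47 under perfect scaling
print(capacity_ratio * area_per_bit_ratio)  # ~0.95: roughly the same die area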

I think what will be more interesting from an IQ standpoint are shader-based approaches to the aliasing issues as well as post-DX11 features.
 
Not sure how I feel about an AMD CPU. ;)

I would think anything from AMD would be a performance boost over the PPEs. I would be curious to know if it would be possible for AMD to go with a CPU design built around a fair number of AMD64 cores. These are relatively small and relatively fast (especially if they see some vector extensions... Fusion?). Not as fast per core as the new AMD stuff, and probably not as small as PPEs, but it would seem to be a chip that could be a nice middle ground, especially with some FP extensions.

Which could always come in the form of multi-GPU, if AMD can address those issues. I am sure GF would love to snag the GPU and CPU contracts (80-100M chips in a 5-year window?), and going with "DX" chips continues to leverage MS's investments and the "accessibility" mantra. While some may cringe at a few traditional CPUs paired with a very large (or a number of) Fermi-style GPUs, I think it would offer a huge marketing blurb (3TFLOPs+; quick math below), an instantly huge graphics upgrade, and, with OpenCL (DirectCompute, etc.), some legs for squeezing performance out of the hardware in years 5-8. So you get the instant eye candy to sell the new platform, some ease of access, etc.

I am curious about eDRAM... maybe a global scratchpad will find its way in? Based on AMD's new GPUs, getting 1080p with some MSAA seems "cheap" enough in many cases to question whether you need 50mm2-100mm2 of silicon dedicated to the framebuffer. A scratchpad would be bigger per Mb, but something that could be used system-wide could offer high-bandwidth memory for all the clients. With all the new buffers (G-buffers, A-buffer, etc.) that devs are using for advanced rendering, I'm not sure how a dedicated eDRAM framebuffer would go over with them.
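
For what it's worth, on that 3TFLOPs marketing blurb above: theoretical shader throughput is usually quoted as ALUs × 2 ops (one multiply-add per clock) × clock. Plugging in RV870/Cypress-class figures (1600 ALUs at 850MHz) shows how close current PC parts already are; the 950MHz line is a purely hypothetical bump:

Code:
# Peak single-precision throughput: ALUs * 2 FLOPs (multiply-add) * clock.
def peak_tflops(alus, clock_ghz):
    return alus * 2 * clock_ghz / 1000.0  # GFLOPs -> TFLOPs

print(peak_tflops(1600, 0.85))  # RV870/Cypress-class: ~2.72 TFLOPs
print(peak_tflops(1600, 0.95))  # hypothetical clock bump: ~3.04 TFLOPs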

It will be interesting to see what routes MS takes to control dev costs next gen while also allowing for cutting-edge technology. Hopefully Epic cons them into 4GB of memory :p
 
I know this is going to stir things a bit, but don't you guys think, given the direction Microsoft has taken in the last few years, that the 720 will just end up being a souped-up 360? If that were the case, do you think that a souped-up Xenos, or two (some implementation of CrossFire or whatever AMD calls it now), won't be enough for next-gen graphics?
 
It wouldn't be enough for me to buy one ;)

A 2x 360 offers what compelling reason to get a new console in terms of software? A 360+Natal is essentially an "HD Wii."

There will always be some hardcore gamers, and while there are diminishing returns, I would not expect all 3 to abandon the technology segment. If MS did decide to abandon this market, it pretty much means Sony would capture it de facto.

While there will be some tempering of chip area due to process shrink projections (i.e. don't expect a 500mm2 Fermi GPU!) and power issues, I wouldn't expect something drastically smaller than what the current consoles shipped on 90nm in terms of silicon budgets either.

Oh well, if MS dumps the high end, I hear Uncharted 2 should look pretty snazzy on the PS4 :)
 
I know this is going to stir things a bit, but don't you guys think, given the direction Microsoft has taken in the last few years, that the 720 will just end up being a souped-up 360? If that were the case, do you think that a souped-up Xenos, or two (some implementation of CrossFire or whatever AMD calls it now), won't be enough for next-gen graphics?

IMHO, that is very unlikely. MS and Sony are both already trying to respond to the Wii's motion controls and extend the life cycles of their consoles. They will not, however, forgo the opportunity to leverage their hardware power as a differentiating factor against Nintendo next time around.

Plus, MS and Sony are direct competitors, so they won't take the risk of going into the next round of consoles underpowered. At best they might aim for a slightly lower initial launch price (as initial sales were a bit slow due to the high price tags), but apart from that I don't see them rehashing old hardware (BC proved not to be that much of a distinguishing factor).
 