AMD/ATI for Xbox Next?

It's not gonna have a separate AMD CPU. Intel, despite their past history, will do everything and anything to prevent it.

I suspect it will be an IBM CPU as well. Like an ATI GPU, it's simply the path of least resistance.

I don't know though; if you take the article at face value, maybe it will be an all-AMD box. I can see that happening.

Well, it seems that AMD (ATI) is working on a new design for the 2012 timeframe, as a response to Fermi and Larrabee.

The response to Fermi is already out; it's called the HD 5870. They will be competitors in 2-3 months, not 2012.
 
It's not gonna have a separate AMD CPU. Intel, despite their past history, will do everything and anything to prevent it.

Why? What evidence in the past do you have for this?

I can see MS being "loyal" to its Wintel fellow, I suppose. But OTOH, AMD doing the whole box makes a lot of sense too. Why go with separate GPU/CPU vendors when AMD will probably push hard for the whole contract with some sort of discounted package deal?
 
There's no EDRAM in 5870, 5850, or any other desktop GPU, or the PS3 GPU, or any forthcoming GPU, or any past GPU, except Xenos.

EDRAM is an implementation detail. The bigger point is that GPUs will adopt TBDR sooner rather than later. You can use SRAM to do EDRAM's job too; LRB does that. Bottom line: on-chip/on-die framebuffers are here to stay.
I do think EDRAM makes sense for WiiHD and possibly PS4; I agree with you that it's needed there to save power, but not in the next Xbox.
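To illustrate the trade-off TBDR makes for an on-chip framebuffer, here is a minimal, purely illustrative Python sketch of the binning step (the tile size and data layout are arbitrary assumptions, not any real GPU's design): each triangle is assigned to every tile its bounding box overlaps, so a triangle spanning N tiles gets processed N times.

```python
TILE = 32  # hypothetical tile size in pixels

def bin_triangles(tris, width, height):
    """Assign each triangle to every screen tile its bounding box overlaps."""
    bins = {}  # (tile_x, tile_y) -> triangles to shade in that tile
    for tri in tris:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # Clamp the bounding box to the screen.
        x0, x1 = max(0, min(xs)), min(width - 1, max(xs))
        y0, y1 = max(0, min(ys)), min(height - 1, max(ys))
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

# A triangle spanning two tile rows and two tile columns is binned four times:
tri = [(8, 8), (40, 8), (8, 40)]
bins = bin_triangles([tri], 64, 64)
print(sorted(bins))  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Each tile can then be shaded against a small on-chip buffer (EDRAM or SRAM); the price is that binning overhead grows with triangle count, which is the geometry/setup cost mentioned elsewhere in the thread.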

Why does edram make sense for PS4 but not xbox720?
 
BC is crucial to capture as many early adopters as possible and give the others a clear upgrade path. The next gen won't be uber-powerful anyway, methinks, thanks to the Wii model and the need to keep costs low.

I don't agree. I bought new consoles at launch and never cared if I could play my old games on them. If I want to play the old games, I've already got the old console, so I can always play them. Besides, I don't buy a new console just to play old games. The same goes for new buyers, I think. Why buy an Xbox Next if you want to play X360 games?
 
I don't agree. I bought new consoles at launch and never cared if I could play my old games on them. If I want to play the old games, I've already got the old console, so I can always play them. Besides, I don't buy a new console just to play old games. The same goes for new buyers, I think. Why buy an Xbox Next if you want to play X360 games?

What if the BC improved the performance and graphics of the last gen games? I think that's what the next Xbox will do.
 
It's not gonna have a separate AMD CPU. Intel, despite their past history, will do everything and anything to prevent it.
Intel may try, but what if AMD/ATI offers a sexy (read: not too big, while performant) "fusion" chip?

(Big "if", as it's not possible now due to the difference in process.) If the system were to ship soon and AMD offered, for example, a Propus (~170mm²) and a Juniper (~170mm²) combined into a single chip around the same size as a Cypress, that would be hard for MS to refuse:
It would be a potent system.
Investing more die size in the GPU could be tempting, but only if the competitor also uses an ATI chip; otherwise ATI's huge perf/mm² advantage could make up for it. Even if they lose in raw power, they can make up for it by, say, adding more RAM than the competitor can if they want prices to be even.
It would save money with regard to the mobo layout, cooling solution, etc.

This is how things could look now; it could look even better depending on what GF 32/28nm looks like.
 
Why does edram make sense for PS4 but not xbox720?

Because Sony seems to know engineering better? They will want the power savings: less heat means a more reliable system.

I think a massive amount of EDRAM, 32MB or more, is likely in PS4's GPU, but I just can't see it for Xbox 3.
 
EDRAM is mostly a waste and could hurt the next Xbox badly if it is not large enough; no one would care about the unremarkable things EDRAM brings if it stopped the next Xbox from running most games at 1920x1080.
 
How likely would a custom LRB chip as a CPU for PS4 be?

One designed to perform a similar function to the Cell BE this gen (i.e. assisting the RSX by doing more graphics tasks, e.g. pre-culling geometry, post-processing, etc.).

How would that affect the possibility for BC?

If such a concept is even possible, could Sony use LRB as a second GPU, or would they be more likely to go with NV or ATI?
 
EDRAM is mostly a waste and could hurt the next Xbox badly if it is not large enough; no one would care about the unremarkable things EDRAM brings if it stopped the next Xbox from running most games at 1920x1080.
The point is that nobody cares about the native resolution at which games are rendered, outside of some Internet bubbles. The choice should be made on other considerations: cost, whether it makes sense relative to where GPUs are heading, what RAM will be available, etc.
 
EDRAM is mostly a waste and could hurt the next Xbox badly if it is not large enough; no one would care about the unremarkable things EDRAM brings if it stopped the next Xbox from running most games at 1920x1080.
I kind of hope devs don't agree with you on this, because if they do we can kiss it goodbye along with the fat pipe it provided. Last gen, certain devs were asking Microsoft to dump the hard drive in the Xbox, and lo and behold, the hard drive is now optional when it is probably needed the most. If the next Xbox can hold a 720p 64-bit buffer with 4xMSAA in EDRAM with only 1 tile, it could probably do 1920x1080 with very few problems, unless devs have something to say about it. Most games on the 360 run at 720p, including Microsoft's graphical showcase game. If anything, I think EDRAM helped this gen, and next gen it would probably do the same.
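The framebuffer arithmetic behind that claim is easy to check. A back-of-the-envelope sketch, assuming 8 bytes per sample (e.g. 32-bit colour plus 32-bit depth/stencil, i.e. the "64-bit buffer" case):

```python
MIB = 1024 * 1024

def framebuffer_bytes(width, height, msaa_samples, bytes_per_sample=8):
    """Size of a multisampled colour+depth buffer; 8 bytes/sample assumes
    32-bit colour + 32-bit depth/stencil (the '64-bit buffer' case)."""
    return width * height * msaa_samples * bytes_per_sample

for w, h in [(1280, 720), (1920, 1080)]:
    size = framebuffer_bytes(w, h, 4)
    print(f"{w}x{h} 4xMSAA: {size / MIB:.1f} MiB")
# 1280x720 4xMSAA:  ~28.1 MiB -> one tile would need ~32MB of EDRAM
# 1920x1080 4xMSAA: ~63.3 MiB -> one tile would need ~64MB
```

So holding 720p with 4xMSAA in a single tile implies roughly three times the 360's 10MB pool, and untiled 1080p with 4xMSAA roughly doubles that again.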
 
I'm on the disc-based side but for DD trade/sell , something like this would work ;
They would add an option to games that creates a custom PS Store / XBL Marketplace code. When you decide to sell your game, you generate the code: the game connects to the store, creates the code, and is removed from both your account and console. You'll no longer be able to play it until you re-purchase it. So you'd be able to sell your games via a code.
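As a sketch only — the store data structures, token format, and function names below are all made up for illustration, not any real PSN/XBL mechanism — the proposed trade-in flow could look like:

```python
import secrets

# Hypothetical in-memory 'store': account name -> set of owned game ids.
libraries = {"alice": {"halo_next"}, "bob": set()}
resale_tokens = {}  # token -> game id, single-use

def sell_game(account, game_id):
    """Revoke the seller's licence and mint a one-time resale code."""
    libraries[account].remove(game_id)  # seller can no longer play it
    token = secrets.token_hex(8)
    resale_tokens[token] = game_id
    return token

def redeem_code(account, token):
    """Buyer redeems the code; the token is consumed on use."""
    game_id = resale_tokens.pop(token)
    libraries[account].add(game_id)

code = sell_game("alice", "halo_next")   # alice loses the licence
redeem_code("bob", code)                 # bob gains it; code is spent
```

The key property is that the licence and the code are mutually exclusive: generating the code revokes playback, and redeeming it consumes the code, so only one party can play at a time.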
 
Because Sony seems to know engineering better? They will want the power savings: less heat means a more reliable system.

I think a massive amount of EDRAM, 32MB or more, is likely in PS4's GPU, but I just can't see it for Xbox 3.

So, EDRAM = good engineering, therefore Microsoft will absolutely NOT use it, because their main characteristic is being bad at engineering?
 
AFAIK, it does not interfere with deferred rendering, but I could be wrong.
It shouldn't be any different than any other use of MRTs. It's just the geometry processing cost associated with tiling.

Setup rates haven't quite seen any significant advancements though...
 
Small rumor from Fudzilla of all places, but they're reporting MS has already selected AMD/ATI for the next Xbox console...

http://www.fudzilla.com/content/view/15936/1/

Let the speculation begin! :)

Tommy McClain

We've learned from industry sources that AMD/ATI has already won the GPU deal for the next-generation Xbox console. It looks like Microsoft was happy with the first Xenos GPU and it wants to continue using the same, especially since the new ATI GPU should keep compatibility with legacy games.


The console refresh was supposed to happen in 2010, but due to the recession both Microsoft and Sony have decided to push their plans to 2012 and keep the Xbox 360 and PlayStation 3 alive for longer than originally planned.

We don't know what the GPU looks like, but judging from the timeline in which it is supposed to be delivered, we suspect that it might be a 28nm part.

That is an incoherent and vague rumour. What is a console refresh, by the way?

What I would like to know at this stage is what kind of cost reduction MS has up its sleeve for the 360. As far as I know, both the CPU and GPU are still at 65nm and have been for quite a long time.

I expect some major cost reduction to take place at least within a year from now, well in time for the Natal release. But what will it be? Can the GPU go through one more shrink, considering the size of the die and all the connections it has, as it interfaces with the CPU, GDDR3 DRAM and EDRAM? Will they have to merge the GPU and CPU, or will they merge the GPU and EDRAM? Could the ATI contract be the work involved in reshaping the GPU for cost reductions? Considering how vague it is, the rumour could mean a lot of things.
 
That is an incoherent and vague rumour. What is a console refresh, by the way?
By the way it makes it sound, and given that 2010 is five years after the 360, I think it plainly means next-generation systems.
What I would like to know at this stage is what kind of cost reduction MS has up its sleeve for the 360. As far as I know, both the CPU and GPU are still at 65nm and have been for quite a long time.

I expect some major cost reduction to take place at least within a year from now, well in time for the Natal release. But what will it be? Can the GPU go through one more shrink, considering the size of the die and all the connections it has, as it interfaces with the CPU, GDDR3 DRAM and EDRAM? Will they have to merge the GPU and CPU, or will they merge the GPU and EDRAM? Could the ATI contract be the work involved in reshaping the GPU for cost reductions? Considering how vague it is, the rumour could mean a lot of things.
I expect this too; in fact I expect MS to stand pretty much still for a while, letting Sony regain ground, because they have some serious R&D costs in front of them. IMHO, with Natal coming next year and what they consider a relaunch, they have to be in a position to launch the whole thing for cheap.
For the chips, I don't expect them to merge anything, but I could see them pack all three chips on a single package. In my opinion they should really work hard to have:
the Xenon @ 45nm
the Xenos @ 40nm
the EDRAM @ 65nm
Actually, Xenos @ 40nm would be really tiny. I wonder how much it would cost to ask ATI to change the memory controller for a GDDR5 one; GDDR5 is cheaper than GDDR3 and they could use even lower-quality parts. They could move to a single 64-bit-wide bus for the whole system.
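A rough sanity check on the 64-bit GDDR5 idea (the per-pin data rates below are illustrative period-typical numbers, not confirmed parts):

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for a bus of the given width and per-pin rate."""
    return bus_bits / 8 * gbps_per_pin

# Xbox 360-style 128-bit GDDR3 at 1.4 Gbps/pin:
print(round(bandwidth_gbs(128, 1.4), 1))  # 22.4 GB/s
# Hypothetical 64-bit GDDR5 at 4.0 Gbps/pin:
print(bandwidth_gbs(64, 4.0))             # 32.0 GB/s
```

So even a single 64-bit GDDR5 channel could exceed the 360's main-memory bandwidth, which is what makes the cost-down plausible.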

Back to the rumor: I think that, true or not, it is really about next gen anyway.
 
That would seriously impede die size reduction.

On a similar note, DRAM chips will need some pretty big increases in density. I don't think sticking 16 x 1Gb chips on the board for 2GB of system memory will be all that nice for the motherboard and wiring, especially on a 128-bit bus.
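The chip-count maths behind that worry, as a quick sketch (assuming all chips sit on one 128-bit bus as a single rank, with per-chip width set accordingly):

```python
def chips_needed(total_gbytes, chip_gbits, bus_bits=128):
    """Number of DRAM chips to reach a capacity, and the per-chip I/O width
    if they all share one bus as a single rank."""
    n = total_gbytes * 8 // chip_gbits   # gigabytes -> gigabits
    return n, bus_bits // n

print(chips_needed(2, 1))  # (16, 8)  -> sixteen x8 packages for 2GB of 1Gb chips
print(chips_needed(2, 2))  # (8, 16)  -> 2Gb chips halve the count (x16 parts)
```

Sixteen packages hanging off one 128-bit bus is exactly the routing headache the post describes; each density doubling halves the chip count.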
 