AMD/ATI for Xbox Next?

However, as far as this particular rumor is concerned, I'm betting on a standard GPU/CPU configuration.

THAT said, I wouldn't be surprised if there were provisions in place for AMD to design the GPU such that it would be trivial to later integrate it into a CPU à la Fusion.

This.

BTW, how does MS provide 100% backward compatibility in hardware without having to go the route of actually including separate hardware to do it? MS should mandate that you be able to download all your XBLA, GoD & XBLIG titles directly from Live onto your Xbox Next and have them work seamlessly with no bugs on day 1. Is there any way for them to do this and expand the hardware capabilities for future titles too?

Tommy McClain
 
Since no one has brought it up, what about MS's alleged "forward compatibility?" Were there ever any statements as to how far this concept would go?
Possibilities
1 - Titles just play in higher res, more AA, AF, etc.
2 - More characters, cars, etc. on screen. Maybe better draw distance, etc.
3 - Something in between?

If they are serious about this, and it would provide a bridge between generations that allows developers to make games for both systems, thereby drastically increasing the install base, that has to have an effect on their hardware choices, doesn't it? I'm certainly not qualified to speculate on the specifics, but would love to read something from someone who is qualified :)
 
This.

BTW, how does MS provide 100% backward compatibility in hardware without having to go the route of actually including separate hardware to do it? MS should mandate that you be able to download all your XBLA, GoD & XBLIG titles directly from Live onto your Xbox Next and have them work seamlessly with no bugs on day 1. Is there any way for them to do this and expand the hardware capabilities for future titles too?

Tommy McClain

I'd imagine BC will be a lot easier to maintain with regards to the GPU this time around.

It's the CPU side of things that would be more questionable if they went with, say, an AMD CPU.

And, obviously BC will be easier for titles that don't code down to the metal.

It'll be interesting to see how that pans out, as there's always going to be some unforeseen case that'll trip up BC.

Regards,
SB
 
I hope MS allows BC to die. I would rather not have new hardware a) impacted negatively in performance for BC or b) cost more for BC or worse, c) both!

By 2012 a 360 will cost under $150, easy. I see no reason why a console in 2012+ needs to run last-gen games if that means it castrates the new system. No thanks. If they can get XBLA games to work, major thumbs up. But I couldn't care less about disc-based games.
 
With multiplatform and XNA/DX developments, I wouldn't think there are too many that do low-level coding, especially given some of the comments from devs on the board here. Seems to me like it would just be like on PC...
 
With multiplatform and XNA/DX developments, I wouldn't think there are too many that do low-level coding, especially given some of the comments from devs on the board here. Seems to me like it would just be like on PC...

Is there a runtime? Does it guarantee realtime performance? I mean, if your engine is running at 60fps, would moving up in hardware to Xbox Next give us a higher framerate, or the same 60fps? Would not guaranteeing performance bring in any subtle bugs? I mean, if we tie some piece of logic to each frame (which closed hardware allows, since you can make 'unsafe' assumptions), there might be consequences. Of course, since it's all multiprocessed anyway, if everything's strictly well-behaved there shouldn't be TOO many problems like that (is that even a realistic assumption?).

Not that it's untenable at all (probably just a runtime or patch away from working) but I'm not sure how PC-like it really is.
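
As a purely hypothetical illustration of what I mean (names and numbers are made up), logic keyed to the frame counter breaks if BC simply lets the old game run faster, while logic keyed to wall-clock time carries over cleanly:

Code:
# Hypothetical example: on a closed box you "know" the game runs at 60fps,
# so a 3-second cooldown gets hard-coded as 180 frames.
COOLDOWN_FRAMES = 180                          # assumes a locked 60fps

def cooldown_over_fixed(frames_elapsed):
    return frames_elapsed >= COOLDOWN_FRAMES   # at 120fps this is only 1.5s

# The framerate-independent version keys off elapsed time instead, so it
# behaves the same whether BC runs the old game at 60 or 120fps.
COOLDOWN_SECONDS = 3.0

def cooldown_over_delta(seconds_elapsed):
    return seconds_elapsed >= COOLDOWN_SECONDS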
 
I hope MS allows BC to die. I would rather not have new hardware a) impacted negatively in performance for BC or b) cost more for BC or worse, c) both!

Exfrickenactly.

Sony dumped BC completely after making a big deal out of it in the beginning, so why can't others?

I don't want even a tiny bit of performance, let alone a lot, sacrificed for BC. And there's that nasty rumor that 360 performance is being held back by DX for BC reasons too.

And BC gets more troublesome as each generation goes on, not to mention I'm sure most people want BC with all previous generations, which is even harder to do at a reasonable cost. Is the next Xbox supposed to be compatible with Xbox 1 too (or PS4 with PS2 and PS1 as well as PS3)? The only sure way I see to do that is to include all the actual original hardware, which would be costly.

Just drop it entirely. If people really want the old games so badly, well, if they don't own the original system, they're stupid. So it boils down to a minor convenience, one less box hooked up to the TV. Well, get over it. This tiny minority of BC terrorists has been holding us hostage far too long :0
 
Exfrickenactly.

Sony dumped BC completely after making a big deal out of it in the beginning, so why can't others?

Because as Sony has discovered, BC offers a valuable revenue stream that loses its appeal, and a lot of its $$$, if you have to port old games to a new platform.
 
Oh... I'm a BC terrorist now? That's a first. LOL I'm afraid that if MS wasn't going to put much effort into BC, they wouldn't be going with the same GPU vendor. And the bad thing about supporting XBLA only is that it's not a subset of the full retail disc games; it uses the same libraries & tools as the big boys (see UE on Shadow Complex). Now, Indie games would be, but I don't see a lot of people worried that their 80-MS-Point games aren't going to work on the next system. LOL I bet it will be one of those all-or-nothing deals. Just remember MS is trying to sell a service as much as it is selling a platform. People are going to expect the digital content they purchased on the old service to be BC with the new service. Yes, I'm one of those. Let's just hope it's done in a smart way, not like most previous attempts.

Tommy McClain
 
BC is crucial to capture as many early adopters as possible and give others a clear upgrade path. The next gen won't be uber powerful anyway, methinks, thanks to the Wii model and the need to keep costs low.
 
1920*1080 * 32 bit * 4x MSAA is already ~32MB, and I'm not sure how feasible it is to fit even that amount of EDRAM on a die. And as AlStrong says, devs can do a lot of different things with a given amount of memory, from deferred shading through various HDR hacks to whatever they can come up with.

On the main topic, well, if they do get a GPU then maybe this also means no Larrabee for the CPU?

They had 10MB of eDRAM on 90nm. You don't think 32MB would be possible on 32nm or even 28nm in 2012? That's a little over a 3x increase. There are also better technologies out there than eDRAM; they could use T-RAM or Z-RAM, even TTRAM.

I'm hoping for more than 4 gigs of RAM.
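
Rough numbers for both points, assuming 4 bytes per color sample, 4 bytes per depth/stencil sample, and ideal area scaling for the eDRAM (all assumptions mine):

Code:
# 1080p with 4x MSAA (4-byte color + 4-byte depth/stencil per sample)
width, height, samples = 1920, 1080, 4
color = width * height * samples * 4
depth = width * height * samples * 4
print(color / 2**20)                  # ~31.6 MB for color alone
print((color + depth) / 2**20)        # ~63.3 MB with depth/stencil

# 10 MB of eDRAM at 90nm, ideally scaled to 32nm
print(10 * (90 / 32) ** 2)            # ~79 MB worth of the same silicon area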
 
They had 10MB of eDRAM on 90nm. You don't think 32MB would be possible on 32nm or even 28nm in 2012? That's a little over a 3x increase. There are also better technologies out there than eDRAM; they could use T-RAM or Z-RAM, even TTRAM.

I'm hoping for more than 4 gigs of RAM.

No more eDRAM either, please. I think it's arguable how effective it even was on the 360 (it gives, but it takes away). Wasting even more transistors on it going forward seems like a bad idea.

I'm no programmer, but I'd make a strong educated guess that if you had applied the eDRAM transistors to Xenos shaders instead, coupled with split RAM pools à la PS3, you would have come out stronger on balance. In fact that seems almost inarguable: given that, pound for pound, the rest of Xenos seems to match up well with RSX, how much better would Xenos plus 80 million transistors be?

And doesn't it interfere with a lot of modern rendering techniques (deferred rendering) going forward?

Can't see it happening.
 
Because as Sony has discovered, BC offers a valuable revenue stream that loses its appeal, and a lot of its $$$, if you have to port old games to a new platform.

Actually, the word is that much of the reason Sony dropped BC from the PS3 is that they wanted to start selling those games as downloadable "classics" in the future and reap the revenue (they would also no doubt rather you buy a profitable PS2 to play PS2 games than a money-losing PS3).

If anything it seems the profit motive favors no BC as well.
 
Oh... I'm a BC terrorist now? That's a first. LOL I'm afraid that if MS wasn't going to put much effort into BC, they wouldn't be going with the same GPU vendor.

There could be countless reasons for going with the same GPU vendor, especially given the MS-Nvidia squabble on Xbox 1. Plus plain old inertia, which probably counts for a ton here; given the exceedingly complex technology involved, keeping the same partner would be a huge help. Not to mention, it seems obvious given recent developments that ATI would be the company you'd want in a console regardless. In a way, they're rapidly becoming the only company that does strictly gaming-focused GPUs, which is exactly what you want in a console.

I don't understand why you want BC. Just play it on your Xbox/360 if it's that big a deal to you.

Also, do you think that Sony will have full BC with all their previous PSN download titles on PS4 as well? Nintendo on WiiHD/whatever?

If you can do BC for almost free, only then is it worth it. In fact, I see a competitive advantage waiting to be exploited if one company burdens itself with BC and the other doesn't (perhaps Sony saw this as well when they dropped BC almost as fast as it came!).
 
No more eDRAM either, please. I think it's arguable how effective it even was on the 360 (it gives, but it takes away). Wasting even more transistors on it going forward seems like a bad idea.

I'm no programmer, but I'd make a strong educated guess that if you had applied the eDRAM transistors to Xenos shaders instead, coupled with split RAM pools à la PS3, you would have come out stronger on balance. In fact that seems almost inarguable: given that, pound for pound, the rest of Xenos seems to match up well with RSX, how much better would Xenos plus 80 million transistors be?

And doesn't it interfere with a lot of modern rendering techniques (deferred rendering) going forward?

Can't see it happening.

EDRAM was meant to save bandwidth and power, which it does.

AFAIK, it does not interfere with deferred rendering, but I could be wrong.

Unified memory is another cool innovation of the Xbox. It enables dynamic rebalancing of memory usage between the CPU and GPU (sort of like unified shaders).

More ALUs built from the extra 80M transistors wouldn't be worth giving up the bandwidth and power savings.

Going forward, the two constraints on GPUs are bandwidth and power. TBDR (of which the present Xenos is an example) saves both (look at the mobile space), so my hunch is that future GPUs will take this route anyway.
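
To put rough numbers on the bandwidth point (a back-of-the-envelope sketch; the per-sample byte counts and read+write traffic are my assumptions for blended color plus Z test/update):

Code:
# Worst-case ROP traffic that Xenos keeps on the eDRAM daughter die instead
# of main memory: 8 pixels/clk, 4x MSAA, 4 B color + 4 B depth per sample,
# each both read and written, at a 500 MHz GPU clock.
pixels_per_clk, samples = 8, 4
bytes_per_sample = (4 + 4) * 2            # color + depth, read and write
clock_hz = 500e6
edram_traffic = pixels_per_clk * samples * bytes_per_sample * clock_hz
print(edram_traffic / 1e9)                # ~256 GB/s, vs ~22.4 GB/s of GDDR3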
 
Well, it seems that AMD (ATI) is working on a new design for the 2012 time range, a response to Fermi and Larrabee.
 
Yep, and in "meager" 720p too.

If the EDRAM is specced to handle 1080p x 4xAA, and allows texturing from it (No.1 GPU request for the next Xbox, I'm sure), it will make for a killer deferred 720p renderer with the fattest, richest G-buffer you can imagine :)
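
For a sense of scale, a rough G-buffer budget under my own assumptions (32 MB of eDRAM to play with, 4-byte render targets, no MSAA on the G-buffer):

Code:
# How many full-resolution 32-bit render targets fit in 32 MB at 720p
width, height, bytes_per_pixel = 1280, 720, 4
per_target_mb = width * height * bytes_per_pixel / 2**20
print(per_target_mb)                      # ~3.5 MB per render target
print(int(32 // per_target_mb))           # ~9 targets, a very fat G-buffer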
Joshua Luna said:
I am curious about eDRAM... maybe a global scratchpad will find its way in? Based on AMD's new GPUs, getting 1080p with some MSAA seems "cheap" enough in many cases to question whether you need to have 50mm2-100mm2 of silicon dedicated to the framebuffer. A scratchpad would be bigger-per-Mb but something that could be used systemwide could offer high bandwidth memory for all the clients. Not sure how with all the new buffers (Gbuffers, Abuffer, etc) devs are using for advanced rendering how a dedicated eDRAM for a framebuffer would go over with devs.
Sorry Joshua, I split your post in two for convenience.
If I understand right, both of you would want a scratch-pad memory. Doesn't that imply the eDRAM would have to be on-chip to provide the maximum benefit (huge bandwidth to this local storage)?
IBM has managed to stuff eDRAM into their latest POWER CPU, but they did so at 45nm using their SOI process. Do you think other foundries will be able to do so at 32/28nm?
GlobalFoundries has agreements with IBM, so maybe they could, but would that hold on all their processes? I mean, ATI may end up using a "CPU process" for the GPU, likely to be less dense than their standard one. (Maybe it's a false assumption, but the processes used by CPU manufacturers, whether IBM, AMD or Intel, tend to be less dense than their TSMC counterparts, for example.)

I would think anything from AMD would be a performance boost over the PPEs. I would be curious to know if it would be possible for AMD to go with a CPU design with a fair number of AMD64 cores. These are relatively small and relatively fast (especially if they get some vector extensions... Fusion?). Not as fast per core as the new AMD stuff, and probably not as small as PPEs, but it would seem to be a chip that could be a nice middle ground, especially with some FP extensions.

Which could always come in the form of multi-GPU, if AMD can address those issues. I am sure GF would love to snag the GPU and CPU contracts (80-100M chips in a 5-year window?), and going with "DX" chips continues to leverage MS's investments and the "accessibility" mantra. While some may cringe at a few traditional CPUs paired with a very large (or a number of) Fermi-style GPUs, I think it would offer a huge marketing blurb (3 TFLOPs+), an instantly huge graphics upgrade, and, with OpenCL (DirectCompute, etc.), some legs for squeezing performance out of the hardware in years 5-8. So you get the instant eye candy to sell the new platform, some ease of access, etc.

It will be interesting to see what routes MS takes to control dev costs next gen while also allowing for cutting-edge technology. Hopefully Epic cons them into 4GB of memory :p
Switching to x86 for the CPU could be interesting for MS if they want to further leverage the Xbox/PC gaming dynamic. It will be interesting to see what AMD could offer.
One Xenon core is 28 million transistors; AMD's are ~50 million, if my memory serves right.
As the difference between AltiVec and SSE is not what it used to be overall, the question is whether IBM can match the performance that AMD's OoO CPUs provide for the same silicon budget. My gut feeling is no. It would be interesting if AMD were allowed to use the same tech as IBM and could use eDRAM instead of SRAM for the L2/L3 cache; the resulting gain in die size could make up for the larger cores.
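
A quick sanity check on those budgets, using just the per-core figures quoted above and ignoring caches and uncore (which dominate real die area):

Code:
# Transistors per core, as quoted: 28M for a Xenon core, ~50M for an AMD core
xenon_core, amd_core = 28e6, 50e6
xenon_logic = 3 * xenon_core              # 3-core Xenon: ~84M of core logic
print(xenon_logic / amd_core)             # ~1.7 AMD OoO cores in the same budget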
 
It's not gonna have a separate AMD CPU. Intel, despite their past history, will do everything and anything to prevent it.
 
Going forward, the two constraints on GPUs are bandwidth and power. TBDR (of which the present Xenos is an example) saves both (look at the mobile space), so my hunch is that future GPUs will take this route anyway.

There's no EDRAM in 5870, 5850, or any other desktop GPU, or the PS3 GPU, or any forthcoming GPU, or any past GPU, except Xenos.

I do think EDRAM makes sense for WiiHD and possibly PS4, I agree with you it's needed there to save power, but not in the next Xbox.
 