Next gen RAM choices *spawn

EDRAM sucks, because only the successful consoles have it?



Always much cheaper, yet the (profitable) PS2, GC, Wii and 360 used edram instead of fatter buses? Unlike the more expensive, higher-BOM, and unprofitable Xbox and PS3?

It's possibly not safe to assume that wider buses are always cheaper than embedded memory.



Xbox CGPU is already a SoC. That the edram hasn't been included in the main die doesn't mean it's not possible.


Different eras though in bandwidth, sizes, and memory prices. PS2 and GC, their edram was their ram; 360 was a bit of a tweener where MS wasn't sure if the bw to main mem was enough. PS4 with 2-4GB at 192GB/s of bw makes no sense to complicate the mem structure with edram. 720 with 8GB of DDR4 at 77GB/s of bw is again a bit of a tweener bandwidth-wise but may be enough that edram is more of a hindrance than help.

This coming gen is really tough for mem choices. All have risks and none is a slam dunk.
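
For what it's worth, both of those bandwidth figures drop straight out of bus width times per-pin data rate. Quick sanity check (the 6.0Gbps GDDR5 and 2.4GT/s DDR4 speed grades are my own guesses, picked to reproduce the rumoured numbers, not known specs):

```python
# bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 6.0))  # 192.0 -> the rumoured PS4 GDDR5 figure
print(bandwidth_gb_s(256, 2.4))  # 76.8  -> the rumoured ~77GB/s DDR4 figure
```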
 
If PS4 is only 2GB... then that might not be the best solution, however if it's 4GB with a massive 192GB/s then that will easily be the best imo.
 
So 190mm^2 is the limit (more or less) for a 256-bit bus? Thanks.

I know it's proprietary MS stuff, but do you have any idea how the bus for the 360's edram would impact on the minimum GPU size? The GPU got as small as 121 mm^2 with the Jasper revision, and was already well under 190mm^2 by the Falcon revision in 2007, so they appear to have been seeing real benefits of avoiding a 256-bit bus quite early on. The edram bus BW is quite high, but seems physically small enough to work with a pretty tiny chip.

The 360S CGPU die is 168 mm^2, apparently.



That would indicate either starting at a reasonably low BOM and not planning on as long a generation as this one, or moving away from the current, tech-driven business model, I guess.



Room for another 360 shrink yet then, even without the edram, if the economics were ever to work out.


A quick scan of AMD and Nvidia cards shows a 256-bit bus for the 3870 @192mm^2. It might go lower, but @166mm^2 other cards are at a 128-bit bus.
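
That tracks with the pad-limit intuition: logic area scales down with process, but the edge available for a memory interface only scales with the square root of the area. A square-die sketch using those two public die sizes (pure geometry, no actual PHY figures):

```python
import math

# Perimeter of a square die of the given area -- pure geometry, not
# a measurement of any actual memory PHY.
for name, area_mm2, bus_bits in [("HD 3870", 192, 256),
                                 ("128-bit card", 166, 128)]:
    side = math.sqrt(area_mm2)
    print(f"{name}: {side:.1f} mm/side, {4 * side:.1f} mm perimeter, "
          f"{bus_bits}-bit bus")
```

Halving the area only takes ~30% off the perimeter, which is why the bus ends up setting the floor on die size after a shrink or two.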

I don't think anyone should be considering 2 shrinks. 14nm will need double or triple patterning, FinFETs, and possibly EUV. That's not a shrink but pretty much a complete redesign.
 
Different eras though in bandwidth, sizes, and memory prices. PS2 and GC, their edram was their ram; 360 was a bit of a tweener where MS wasn't sure if the bw to main mem was enough. PS4 with 2-4GB at 192GB/s of bw makes no sense to complicate the mem structure with edram. 720 with 8GB of DDR4 at 77GB/s of bw is again a bit of a tweener bandwidth-wise but may be enough that edram is more of a hindrance than help.

I think the PS2 had "normal" non-embedded ram for its main memory, with only the GS having 4MB of embedded ram, but absolutely, things change. With the 360 there was also the issue of possibly launching with 256MB, and then the memory benefits of resolving MSAA as you copy off the edram, so that may have factored into things a little too.

This coming gen is really tough for mem choices. All have risks and none is a slam dunk.

Yeah definitely. With additional background OS tasks even the role of the system itself is less clear. 8GB DDR4 + 32MB of EDRAM would probably trump 2GB of GDDR5, but there are so many other questions like bus width, OS reservations, possibly accessing the edram with the CPU...

I'm intrigued by the idea of 12GB in Durango dev kits, but 8 planned for the final system. Does 12GB in dev kits (as opposed to 16) indicate split memory pools with one pool doubled to 8GB, or is it 4 + 8 in a single pool, with MS just trying to shave a few dollars off early dev kits?

A quick scan of AMD and Nvidia cards shows a 256-bit bus for the 3870 @192mm^2. It might go lower, but @166mm^2 other cards are at a 128-bit bus.

I don't think anyone should be considering 2 shrinks. 14nm will need double or triple patterning, FinFETs, and possibly EUV. That's not a shrink but pretty much a complete redesign.

Even without two full nodes, 128-bit buses might work if you had a separate CPU and GPU with split memory pools. Say 4GB of low-latency DDR4 for the eight-core CPU, 4GB of GDDR5 for the GPU, and a small amount of edram on the GPU. Perhaps 32nm IBM for the GPU and 28nm GF for the 8-core CPU. Each chip could shrink at its own pace and not be limited by a large bus. That's two chips, two off-chip memory buses and three memory pools total though, so maybe that's getting a bit messy.
 
Xbox CGPU is already a SoC. That the edram hasn't been included in the main die doesn't mean it's not possible.
IBM PowerPC A2 has 32 MB of EDRAM included in the SoC (die photo: http://www.power.org/events/2010_ISSCC/Wire_Speed_Presentation_5.5_-_Final4.pdf). It's in mass production (already used in many supercomputers). The chip is fabbed at 45 nm (the same process as the newest XCGPU). So including the EDRAM would definitely be possible now... at least if you can have as good margins as you do with supercomputers :)
 
IBM PowerPC A2 has 32 MB of EDRAM included in the SoC (die photo: http://www.power.org/events/2010_ISSCC/Wire_Speed_Presentation_5.5_-_Final4.pdf). It's in mass production (already used in many supercomputers). The chip is fabbed at 45 nm (the same process as the newest XCGPU). So including the EDRAM would definitely be possible now... at least if you can have as good margins as you do with supercomputers :)

Thanks! So cost must be the issue then.

I'm guessing that either it's cheaper to keep making the edram daughter die on 65nm, cheaper to make the XCGPU with another fab who can't do the edram (isn't it someone other than IBM who actually manufactures the current XCGPU?) or the cost of engineering a suitable on-chip emulator for the current off-chip bus (as they had to do for the CPU FSB) is off-putting. Or maybe it's some of all three.

I'm still secretly hoping that the WiiU uses a CGPU with embedded memory, despite all the rumours suggesting otherwise. IBM being able to do it and being the only named fab so far means I've not totally given up yet!
 
I think it's great that the PS4 needs lots of bandwidth, as it means we're not looking at a Trinity-level APU. If the benefits of EDRAM won't be there for Sony, like they have been for Sony and other manufacturers in the past, then it makes sense to go with something else.

So basically, EDRAM is good if you want your system to suck. Thx.

If AMD are rapidly developing a large high-performance APU for Sony, and they don't intend to get into a decade-long cost reduction slog against MS, then it's a very different situation than it was for the 360.

Where are you getting all these conjectures? I still don't know why it's ok for one (the poorer) manufacturer to not care about costs but the other can't do x, y, and z because of them... Also, so you're saying graphics are all that matters (since apparently, if one has them, cost reductions are not needed for that player)?

I also fail to see why it's an iota different from 360, since that was a very bleeding-edge performance device at the time. More so than PS4 will be, almost certainly (as Xenos was a top-tier GPU of the era, while Pitcairn is more mid-range)!


It may not have been profitable even owning the IP. The HDD, poor sales of memory cards, greater power and cooling requirements, the bigger case, more memory, more complex motherboard etc. up against the super-cheap $129 PS2 might still have lost a bucket of money.

Maybe, but to be honest we're just theorizing. Nobody but MS and Sony know the true intricacies. I just think it's a pretty reasonable theory that EDRAM was a net performance and cost loss for 360. I'm also open to the possibility that I may be wrong, but I'll never know.

Not owning the IP was not the only issue working against the Xbox. Sony own the PS3 IP, iirc, and it's only the high retail price of the 360 that's letting them charge so much. If they were up against a $129 fully featured Xbox they'd be stuffed.

I'm not sure what you're veering off to, but all we know is PS3 has reduced price more and is currently at price parity (depending how you look at it). Maybe the $129 Xbox doesn't exist because the EDRAM is too expensive? TBH we're just going in circles here.


I never claimed that edram would always be cheaper, just that you can't say that a wider bus is always cheaper than embedded memory. And it almost certainly isn't. Come to think of it, WiiU has some kind of embedded memory too, doesn't it?

Apparently. But those are rumored specs and you seem to have an aversion to those :p


And using a larger chip, larger and more complex mobo, and more memory chips (with worse $/MB), too. Which was probably on MS's mind when they made their decision.

There are pluses and minuses; I tend to think the minuses of EDRAM outweigh the pluses, but I'm no expert (BTW, don't we have an EDRAM thread?). Also I don't see why dual RAM pools mean more chips.

Even without two full nodes, 128-bit buses might work if you had a separate CPU and GPU with split memory pools. Say 4GB low latency DDR4 for the eight core CPU, 4GB GDDR5 for the GPU and a small amount of edram on the GPU. Perhaps 32nm IBM for the GPU and 28nm GF for the 8 core CPU. Each chip could shrink at its own pace and not be limited by a large bus. That's two chips, two off chip memory buses and three memory pools total though, so maybe that's getting a bit messy.

I like this sort of speculation. I cannot wait to learn about the final Durango system design to see exactly what they've done. Although Sony's design seems quite traditional, almost boring (but that doesn't mean low performance!).

It's always been how they tackle the bandwidth that's interesting; Sony's solution seems to be just a bog-standard 256-bit bus. MS=?
 
Well, wrt the 360 and the edram I can't help but think of the comment of a member here that at some point MS was not heading for HD.
Considering SD resolution, the size of the edram would have been more than enough in a lot of cases.
MS decided otherwise and went HD with twice the main memory and the same daughter die, which ultimately allowed them to extend their product life cycle.

I'm not sure how the 360 would have performed with a 256-bit bus and GDDR3 or XDR ram; 44GB/s is not that much bandwidth. Now if the trade-off had been 1GB of GDDR3 (if doable, and I'm not sure it was) vs 512MB it could have proved interesting. As few games use virtual texturing and it only arrived at the end of this gen, the extra memory may have yielded MS a greater (more easily perceived) competitive advantage than better handling of transparencies. Definitely better textures are pretty obvious to anyone.

But if it's 512MB (256-bit bus) vs 512MB + edram, I'm not sure MS made the wrong decision.
EDIT
Not to mention that edram saves quite some memory.
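
To put a rough number on that saving, assuming 32-bit colour and 32-bit Z per sample (my assumption, not a quoted spec):

```python
# 720p with 4xMSAA, assuming 32-bit colour + 32-bit Z per sample.
w, h, samples = 1280, 720, 4
bytes_per_sample = 4 + 4                      # colour + depth

msaa_mib = w * h * samples * bytes_per_sample / 2**20
resolved_mib = w * h * 4 / 2**20              # single-sample colour after resolve

print(round(msaa_mib, 1))      # ~28.1 MiB -- held in edram, ~3 tiles of 10MB
print(round(resolved_mib, 1))  # ~3.5 MiB  -- all that main memory ever sees
```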
 
So basically, EDRAM is good if you want your system to suck. Thx.

no u? :eek:

Where are you getting all these conjectures? I still don't know why it's ok for one (the poorer) manufacturer to not care about costs but the other can't do x, y, and z because of them... Also, so you're saying graphics are all that matters (since apparently, if one has them, cost reductions are not needed for that player)?

Who said Sony don't care about costs? Who said graphics are all that matters?

Pretty strong rumours indicate that AMD got the contract early this year (February), and that they are handling everything processor related. Sony's R&D will be slashed compared to Cell and PS3 and there probably wouldn't be time to work with IBM even if Sony wanted to throw lots of money their way. Even a large APU would be cheaper than RSX + Cell.

I also fail to see why it's an iota different from 360, since that was a very bleeding-edge performance device at the time. More so than PS4 will be, almost certainly (as Xenos was a top-tier GPU of the era, while Pitcairn is more mid-range)!

You can't see why it's an iota different from the 360? Off the top of my head I'd pick:

- The requirement of sampling from buffers making the 360's edram solution outdated
- Using a (larger) APU instead of separate (smaller) CPU and GPU
- Sony cutting back R&D
- Sony not having the time to work on a customised part like MS did
- No IBM involvement
- Process shrinks and their cost benefits slowing down

... as being notable differences. There are probably other differences too.

Maybe, but to be honest we're just theorizing. Nobody but MS and Sony know the true intricacies. I just think it's a pretty reasonable theory that EDRAM was a net performance and cost loss for 360. I'm also open to the possibility that I may be wrong, but I'll never know.

I think the balance of evidence shows it was a net win for the 360. You have talented developers flat out stating on this board that the edram is crucial to the 360 performing the way it does, and you have measurements showing that MS shrank the GPU way below the size demanded by a 256-bit bus (twice) in the time in which they were combating RROD and bringing the 360 into profitability. We also know that profitability through shrinking was MS's business plan from the beginning.

We also know that MS could make the current XCGPU on IBM's 45nm process if they wanted (and Nintendo are doing this for WiiU), but they chose to keep making part of it on an older and undoubtedly cheaper node, at least for the moment.

There are pluses and minuses; I tend to think the minuses of EDRAM outweigh the pluses, but I'm no expert (BTW, don't we have an EDRAM thread?). Also I don't see why dual RAM pools mean more chips.

I never said that dual ram pools mean more chips - it doesn't have to. A 256-bit bus would have demanded double the minimum number of memory chips for the 360 though, which would have meant a larger and more complex motherboard and worse $/MB for the platform in the long run.
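
The chip-count arithmetic is simple enough, assuming the x32 GDDR3 devices of the era:

```python
# Minimum device count = bus width / per-chip interface width (x32 GDDR3).
def min_chips(bus_bits, chip_bits=32):
    return bus_bits // chip_bits

print(min_chips(128))      # 4 -- the 360S arrangement
print(min_chips(256))      # 8 -- a single 256-bit bus
print(2 * min_chips(128))  # 8 -- two 128-bit pools, PS3-style
```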

I like this sort of speculation. I cannot wait to learn about the final Durango system design to see exactly what they've done. Although Sony's design seems quite traditional, almost boring (but that doesn't mean low performance!).

It's always been how they tackle the bandwidth that's interesting; Sony's solution seems to be just a bog-standard 256-bit bus. MS=?

I'm looking forward to seeing what Nintendo do too!

Sony moving from the exotic PS3 to what might appear to be a pretty capable PC graphics chip with a couple of AMD CPU modules bolted on would be a huge change, but it might put them on the right side of MS wrt PC multiplatform games, which at a stroke would give their games strength in numbers. If they were the only vendor offering fully unified memory and practically unlimited buffer sizes that would be a win too. I hope they go for 4GB of ram.
 
Well, wrt the 360 and the edram I can't help but think of the comment of a member here that at some point MS was not heading for HD.
Considering SD resolution, the size of the edram would have been more than enough in a lot of cases.
MS decided otherwise and went HD with twice the main memory and the same daughter die, which ultimately allowed them to extend their product life cycle.

Yeah, I think it might have been ERP, pointing out that SD with 4xMSAA would fit perfectly in the edram. They definitely had one eye on HD resolutions with the hardware tiling support but it looks like focus shifted rapidly towards HD in 2004/2005. The 360 has survived and prospered despite some pretty huge changes in MS strategy and changes in the industry itself, some starting back before the 360 was even released.
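
The SD arithmetic does check out, under the same assumed 32-bit colour + 32-bit Z per sample:

```python
# 640x480, 4xMSAA, 32-bit colour + 32-bit Z per sample:
print(640 * 480 * 4 * 8 / 2**20)  # ~9.4 MiB -- fits the 10MB edram, no tiling
```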
 
Who said Sony don't care about costs? Who said graphics are all that matters?

You basically said as long as Sony is going for a high-performance product, they won't have to worry about shrinks or reducing costs. What else would be your logic there? Your own quotes:

I think it's great that the PS4 needs lots of bandwidth, as it means we're not looking at a Trinity-level APU. If the benefits of EDRAM won't be there for Sony, like they have been for Sony and other manufacturers in the past, then it makes sense to go with something else.


If AMD are rapidly developing a large high-performance APU for Sony, and they don't intend to get into a decade-long cost reduction slog against MS, then it's a very different situation than it was for the 360.

I don't understand this at all; it's not reconcilable. How would the benefits not be there? Sony does not want to shrink but it's crucial for MS? Sony does not need to cost reduce but MS does? Sony can wave a magic wand and be above a "decade-long cost reduction slog" against their competitor, but it's crucial MS not use a 256-bit bus? You're not being consistent at all.

I took you as saying "As long as Sony takes the graphics high road, they don't need to compete on price". That's where I took the graphics statement from. If you are arguing from some other point, please feel free to explain it...


You can't see why it's an iota different from the 360? Off the top of my head I'd pick:

- The requirement of sampling from buffers making the 360's edram solution outdated
- Using a (larger) APU instead of separate (smaller) CPU and GPU
- Sony cutting back R&D
- Sony not having the time to work on a customised part like MS did
- No IBM involvement
- Process shrinks and their cost benefits slowing down

... as being notable differences. There are probably other differences too.

Again, none of these points apply to MS? Why so much double standard? Why does Sony not have time, BTW? They've had six years.

I'm just not getting your points at all. Why can you not understand the inconsistency in what you're writing? You're basically laying out giant arguments for why EDRAM is necessary for MS. But then you turn around and completely ignore those arguments for Sony. For example, you said it is necessary for MS to use a <256-bit bus so they can shrink. Why is it not necessary for Sony?



I think the balance of evidence shows it was a net win for the 360. You have talented developers flat out stating on this board that the edram is crucial to the 360 performing the way it does, and you have measurements showing that MS shrank the GPU way below the size demanded by a 256-bit bus (twice) in the time in which they were combating RROD and bringing the 360 into profitability. We also know that profitability through shrinking was MS's business plan from the beginning.

We have sebbi liking the EDRAM, and Humus (?) wrote some devastatingly critical posts against it, which at the time everybody seemed to accept without objection... we don't have a whole ton of devs commenting about such things here at all. I'm sure EDRAM is better than none in the 360 setup; the question is whether it was better than alternative setups, namely a PS3-style system with the EDRAM budget allocated to other things like more RAM or shaders.

We also know that MS could make the current XCGPU on IBM's 45nm process if they wanted (and Nintendo are doing this for WiiU), but they chose to keep making part of it on an older and undoubtedly cheaper node, at least for the moment.

And?

I never said that dual ram pools mean more chips - it doesn't have to. A 256-bit bus would have demanded double the minimum number of memory chips for the 360 though, which would have meant a larger and more complex motherboard and worse $/MB for the platform in the long run.

I wasn't arguing for a 256-bit bus in 360 and never have, rather a dual-pool 128-bit arrangement like PS3's, and crucially, the EDRAM budget being allocated elsewhere. (As always, one can never look at system design in a vacuum, just as I always point out the question is never "do you want backwards compatibility in a console?", to which everybody answers "yes!". The real question is "do you want BC, or that cost instead allocated to more RAM/CPU/GPU?".) Forgive me if that wasn't clear.
 
You basically said as long as Sony is going for a high-performance product, they won't have to worry about shrinks or reducing costs.

I didn't say that at all, you're just making that up.

What else would be your logic there? Your own quotes:

My quotes show I clearly said nothing of the sort.

- It's great that the PS4 APU needs lots of bandwidth, because it means it's high performance.
- If the benefits of EDRAM won't be there for Sony, they shouldn't use it.
- If they don't intend to get into a decade-long cost reduction slog against MS, then a bus limiting shrinking won't be the factor it would have been.

There is no contradiction.

I don't understand this at all; it's not reconcilable. How would the benefits not be there? Sony does not want to shrink but it's crucial for MS? Sony does not need to cost reduce but MS does? Sony can wave a magic wand and be above a "decade-long cost reduction slog" against their competitor, but it's crucial MS not use a 256-bit bus? You're not being consistent at all.

You're doing a funny mix of ignoring key points and re-interpreting others, then trying to create false dichotomies. That's not my fault and I can't fix it.

I don't know what MS's plans are. A different strategy might result in different factors being more or less important. This generation, cost reduction through shrinking was crucial to the success of MS, but not to the Wii, which cost reduced less aggressively. The magic wands are your idea.

I took you as saying "As long as Sony takes the graphics high road, they don't need to compete on price".

Then you dun messed up big son!

That's where I took the graphics statement from. If you are arguing from some other point, please feel free to explain it...

I've explained several times now, but each time you do that funny thing I pointed out above.

Again, none of these points apply to MS? Why so much double standard? Why does Sony not have time, BTW? They've had six years.

Oh wow! You said there wasn't an iota of difference between the 360 and PS4 so I responded with several possibly important differences and this is the response?

I also fail to see why it's an iota different from 360, since that was a very bleeding-edge performance device at the time. More so than PS4 will be, almost certainly (as Xenos was a top-tier GPU of the era, while Pitcairn is more mid-range)!

You can't see why it's an iota different from the 360? Off the top of my head I'd pick:

- The requirement of sampling from buffers making the 360's edram solution outdated
- Using a (larger) APU instead of separate (smaller) CPU and GPU
- Sony cutting back R&D
- Sony not having the time to work on a customised part like MS did
- No IBM involvement
- Process shrinks and their cost benefits slowing down

... as being notable differences. There are probably other differences too.
Again, none of these points apply to MS? Why so much double standard?

I can't have a discussion under these conditions - you keep switching the terms of the exchanges and dropping important points when they don't fit your narrative.

I'm just not getting your points at all. Why can you not understand the inconsistency in what you're writing? You're basically laying out giant arguments for why EDRAM is necessary for MS.

No I'm not! Oh my wow.

But then you turn around and completely ignore those arguments for Sony. For example, you said it is necessary for MS to use a <256-bit bus so they can shrink. Why is it not necessary for Sony?

It *was* necessary for MS with the 360, because that was their business plan! I don't know what MS are going to do this time! If they want smaller chips than are required for a 256-bit bus (either at launch, soon after or years later) then they're going to have to either limit performance or use EDRAM. I'd assumed another 360-like approach, but I don't know this.

Edit: I also pointed out why it may not be necessary for Sony to shrink beyond the limits of a 256-bit bus, but you have ignored this and simply restated the question. There is literally nothing I can do about you refusing to acknowledge an answer (at all, in any way), except to not keep answering it.

I wasn't arguing for a 256 bus in 360 and never have, rather a dual pool 128 bus like PS3 arrangement

Two 128-bit GDDR3 buses would require twice as many memory chips as the 360S uses, and all the other stuff that goes along with that.

We aren't getting anywhere, we should probably call it a day and agree to disagree about ... most things. ;)
 
It's really not.



I'm talking about performance per dollar rather than market penetration. Using your logic though, the edram-less Xbox lost the most money of all (biggest commercial failure of the gen) and the edram-less PS3 also lost the most money of its generation while managing to come last. PS2, Wii, 360 (eventually) have been great successes and even the GC was profitable.

But again, I was talking about performance / $. Historically a smaller main memory bus and a pool of faster embedded ram has offered a cost advantage to at least some players (coincidentally the most successful ones). That may of course change.



That is retail price, not manufacturing price. The 360S at $299 supposedly had a $115 profit margin for MS at the start of this year (analyst figure, not official). MS are laughing now, Sony not so much. A small edram chip on an old process seems preferable to separate memory pools using outdated memory that no one else uses any more (that would be both the PS3's memory pools, I guess).

I still can't find the confirmation of a 256-bit bus for PS4 btw. Linky?


What if the 8GB of ram is a fast cache connected to the GDDR5?
example:

APU===Gddr5 UMA===ddr3--HDD

The DDR3 is cheaper than an SSD and faster. If Microsoft/Sony fund a custom memory design you'll have a large pool of memory and fast ram.

Does that not solve the memory issue?

I'm no expert :rolleyes:

Sorry for the double post, I messed up the quote. Can't edit.
 
I don't quite understand what the issue is here... As of now, PS4 is going for 2GB of RAM (GDDR5). Low amount of memory, high bandwidth. Durango is 8GB of RAM (GDDR3 or DDR4). High amount of memory, low bandwidth. To work around the bandwidth limitation they will have to go for embedded memory. That's what they did with 360 already, so I don't know what the problem is. Although eDRAM in 360 is write-only, and that's the biggest problem devs had with it (along with too little memory). If it's read/write, with a respectable amount of memory, I would say go for it.

Point is, MS will be able to get a lot of memory (albeit slower) and work around the bandwidth problem, while Sony won't be able to work around the low amount of memory problem.

Option #2. Go with a low end GPU and bandwidth isn't an issue. e.g. Cape Verde has like 70GB/s.

Look at some of the rumors: (1) 1TFLOPs (2) less GPU than the PS4 (<1.8TFLOPs) (3) 6670 class GPU via IGN (4) MS leak showing 4-6x faster with a very low ALU count.

If these rumors are true, a wide DDR4 bus may be more than sufficient to feed such a GPU, especially if 720p is the target resolution.

Something to chew on.
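
One way to frame it is bandwidth per FLOP. A crude check (the Cape Verde figures are my recollection of the HD 7770, roughly 1.28TFLOPs and 72GB/s, so treat them as approximate):

```python
# Bytes of bandwidth available per FLOP -- a crude "is the GPU fed?" ratio.
def bytes_per_flop(gb_per_s, tflops):
    return gb_per_s / (tflops * 1000)

print(round(bytes_per_flop(72, 1.28), 3))  # ~0.056 -- Cape Verde as shipped
print(round(bytes_per_flop(77, 1.0), 3))   # ~0.077 -- rumoured 77GB/s + 1TFLOP
```

If the 1TFLOP rumour is right, 77GB/s would actually be a richer ratio than Cape Verde ships with.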
 
Option #2. Go with a low end GPU and bandwidth isn't an issue. e.g. Cape Verde has like 70GB/s.

Look at some of the rumors: (1) 1TFLOPs (2) less GPU than the PS4 (<1.8TFLOPs) (3) 6670 class GPU via IGN (4) MS leak showing 4-6x faster with a very low ALU count.

If these rumors are true, a wide DDR4 bus may be more than sufficient to feed such a GPU, especially if 720p is the target resolution.

Something to chew on.

Let's hope you are wrong... that sounds more like a good fit for the Wuu.
 
I wish WiiU had 8GB of DDR4 and a 1TF GPU; then its games wouldn't look below the likes of GOWA or TLOU. But all in all the GPU spec should be much higher than a 6670 for the 720.

You'd like to hope, wouldn't you? I guess if the leaked doc holds any merit, the proposed dual-APU setup (one for games and one for BC/services) would limit their GPU options given an overall fixed silicon budget and thus eventual retail price point.
 
Option #2. Go with a low end GPU and bandwidth isn't an issue. e.g. Cape Verde has like 70GB/s.

Look at some of the rumors: (1) 1TFLOPs (2) less GPU than the PS4 (<1.8TFLOPs) (3) 6670 class GPU via IGN (4) MS leak showing 4-6x faster with a very low ALU count.

If these rumors are true, a wide DDR4 bus may be more than sufficient to feed such a GPU, especially if 720p is the target resolution.

Something to chew on.

Aren't you the guy chewing me out for suggesting Mars/Cape Verde? Lol.

Now you're getting it...

And yeah, I've already brought up that 256-bit DDR4 could feed an HD7770. That would require MS to not waste money on EDRAM though, and I just can't see them doing that even if it's unnecessary, LOL. They'll put EDRAM in just to waste people's time.
 