Next gen RAM choices *spawn

It's really not.

I disagree. It's certainly a settled issue in dev kits. You'd have to argue that they then plan to change it for retail.

But again, I was talking about performance/$. Historically, a smaller main memory bus plus a pool of faster embedded RAM has offered a cost advantage to at least some players (coincidentally, the most successful ones). That may of course change.

The failure of the Gamecube shows EDRAM itself isn't the key; there are many factors. I can't speak to past architectures; all I know is that I've never been sold on it having been a beneficial decision in the 360's case.


That is retail price, not manufacturing price. The 360S at $299 supposedly had a $115 profit margin for MS at the start of this year (analyst figure, not official). MS are laughing now, Sony not so much. A small edram chip on an old process seems preferable to separate memory pools using outdated memory that no one else uses any more (that would be both the PS3's memory pools, I guess).

Very doubtful the $299 360 nets $115 in profit.

We don't really know the hardware profit for either. I was just pointing out that, if anything, Sony has dropped the price more aggressively this gen.

I don't see it as preferable; it would seem to me to be more expensive: a dedicated dollop of silicon (that is also preventing MS from combining everything into one SOC to date) vs just running some more buses. For that matter, the 360 is using outdated memory too; that's nothing to do with the EDRAM.


I still can't find the confirmation of a 256-bit bus for PS4 btw. Linky?

It's based on the commonly accepted and much-rehashed alleged PS4 dev kit specs, which can be found written up here. There's no official confirmation of course, nor will there likely ever be, but a known dev on NeoGAF confirmed the specs, although he said they were old. But the 192 GB/s spec is the key, along with the rumor of Sony considering upping the RAM to 4GB.
 
I disagree. It's certainly a settled issue in dev kits. You'd have to argue that they then plan to change it for retail.

You said it was impossible to have 8GB without a 256-bit bus - it's not though, and that was my only point. I don't know what's in the dev kits.

It's possible they could have DDR3 in the dev kits and DDR4 in the finished machines, or split memory pools (GDDR5 and DDR3/DDR4) possibly with different bus widths. I dunno.

We don't really know the hardware profit for either. I was just pointing out that, if anything, Sony has dropped the price more aggressively this gen.

That's true, although it's likely they bled more for it.

I don't see it as preferable; it would seem to me to be more expensive: a dedicated dollop of silicon (that is also preventing MS from combining everything into one SOC to date) vs just running some more buses. For that matter, the 360 is using outdated memory too; that's nothing to do with the EDRAM.

A SoC is at least an option for the 360 though, and IBM would appear to have the process; it just might not be worth it. That edram will be dirt cheap now after all. You can never shrink a bus though.


It's based on the commonly accepted and much-rehashed alleged PS4 dev kit specs, which can be found written up here. There's no official confirmation of course, nor will there likely ever be, but a known dev on NeoGAF confirmed the specs, although he said they were old. But the 192 GB/s spec is the key, along with the rumor of Sony considering upping the RAM to 4GB.

Thanks. The bandwidth indicates GDDR5 on a 256-bit bus, as you say. So yeah, no edram for Sony. Given an all-AMD solution that makes sense, and it points to GPU performance way above Trinity.
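
For what it's worth, the arithmetic behind that conclusion is simple enough to sanity-check; a minimal sketch, where the per-pin data rate is the inferred quantity:

```python
# Back-of-envelope check: what data rate does 192 GB/s imply on a 256-bit bus?
# Peak bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s)

bus_width_bits = 256
target_bandwidth_gbs = 192

data_rate_gtps = target_bandwidth_gbs / (bus_width_bits / 8)
print(f"Required data rate: {data_rate_gtps} GT/s per pin")  # -> 6.0 GT/s

# 6 GT/s is a bog-standard GDDR5 speed grade; DDR3/DDR4 can't get near it,
# so the 192 GB/s figure really does point at GDDR5 on a 256-bit bus.
```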

The low amount of L2 is interesting, especially if it's an indicator of what to expect from all Steamroller-based parts.
 
You said it was impossible to have 8GB without a 256-bit bus - it's not though, and that was my only point. I don't know what's in the dev kits.

It's possible they could have DDR3 in the dev kits and DDR4 in the finished machines, or split memory pools (GDDR5 and DDR3/DDR4) possibly with different bus widths. I dunno.



That's true, although it's likely they bled more for it.



A SoC is at least an option for the 360 though, and IBM would appear to have the process; it just might not be worth it. That edram will be dirt cheap now after all. You can never shrink a bus though.




Thanks. The bandwidth indicates GDDR5 on a 256-bit bus, as you say. So yeah, no edram for Sony. Given an all-AMD solution that makes sense, and it points to GPU performance way above Trinity.

The low amount of L2 is interesting, especially if it's an indicator of what to expect from all Steamroller-based parts.

If they can attach 4GB to that bus unified... then we are all in for a special treat!
 
If they can attach 4GB to that bus unified... then we are all in for a special treat!

Yeah, that does sound pretty good (especially considering the bandwidth).

Maybe a sacrifice for the larger GPU?

I thought that too, but maybe the same logic could drive them to implement a smaller L2 on all their APUs. Trinity with another 128 shaders and DDR4 could really start to make some impact as a cheap gaming APU.

Maybe the PS4 is kind of like a Pitcairn GPU with 2 Steamroller modules taped on. Wouldn't that come in at about 300 mm^2 on GF 28nm? Kind of big, but as your only chip it might make sense. And it would probably have room to shrink even with a 256-bit bus, so cost reductions and a decent shelf life might not be off the cards. It'd definitely justify the 4GB and 192 GB/s in the rumour.
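
A very rough back-of-envelope on that die size, if anyone fancies it; the Pitcairn figure is the known ~212 mm^2, everything else is my guesswork:

```python
# Rough die-size sketch for "Pitcairn plus 2 Steamroller modules".
# Pitcairn is ~212 mm^2 on TSMC 28nm; the Steamroller module size is a
# guess scaled from Piledriver (~31 mm^2 per module + L2 on 32nm SOI).

pitcairn_mm2 = 212            # known figure for the discrete GPU
steamroller_module_mm2 = 30   # assumption: roughly Piledriver-sized on 28nm bulk
modules = 2
uncore_io_mm2 = 30            # assumption: memory PHYs, glue, misc I/O

total = pitcairn_mm2 + modules * steamroller_module_mm2 + uncore_io_mm2
print(f"Estimated die: ~{total} mm^2")  # ~302 mm^2, in line with the guess above
```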
 
Thanks Al, so then it's just the actual DDR3 chips they currently use with GPUs (as opposed to the ones that go on DIMMs), like the GT640. I suppose there will eventually be a gDDR4 too.

Yes. It is basically a "part" that takes the margin that was sunk into long board traces, multiple connectors and multiple loads, and converts it all into frequency gain. One load per channel with no connectors is a much, much better electrical environment than 3-8 loads per channel with multiple connector stubs, plus a connector, all after a relatively long trace length.
 
Yeah, that does sound pretty good (especially considering the bandwidth).



I thought that too, but maybe the same logic could drive them to implement a smaller L2 on all their APUs. Trinity with another 128 shaders and DDR4 could really start to make some impact as a cheap gaming APU.

Maybe the PS4 is kind of like a Pitcairn GPU with 2 Steamroller modules taped on. Wouldn't that come in at about 300 mm^2 on GF 28nm? Kind of big, but as your only chip it might make sense. And it would probably have room to shrink even with a 256-bit bus, so cost reductions and a decent shelf life might not be off the cards. It'd definitely justify the 4GB and 192 GB/s in the rumour.

Good point: a similar process but from a different angle, and thus different tradeoffs/compromises. It makes you wonder about MS too. It seems they started out with their CPU specs/needs in mind and then looked at what GPU they could fit. However, they may have done a reload and are now looking at it from the opposite angle. It would explain some of the changes we've heard rumored, such as a switch to x86.
 
Steamroller and Kaveri are set for 28nm bulk, so yes, the proposed Orbis design would more than likely be an APU at over 300mm^2, which would still allow for a 256-bit bus even after a couple of shrinks. It sounds like a really good system to me; I do hope they go for 4GB though.
 
Maybe a sacrifice for the larger GPU?

It could be a completely new cache design. It's one of those things that could be a win-win for AMD: Sony pays for a new cache design for the PS4 which can then be used on APUs, and AMD can keep their current style on the big server SOCs.


Write-through L1, but without all the stuff needed for probing of the L2 from the L3; the reduced size brings it down from 20-odd cycles to 12-14 cycles, maybe lower?

You don't sacrifice memory performance for your CPU for a few more shaders; you're just asking for crappy performance. 4 ns (Llano L2, 14 cycles?) vs 70 ns (DDR3) access time: not having enough cache is a one-way trip to bad performance.
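
To put numbers on that, a toy average-memory-access-time calc using the latencies above; the hit rates are made up purely for illustration:

```python
# Why a starved L2 hurts: simplified average memory access time (AMAT),
# ignoring the L1, with the latencies quoted above. The hit rates are
# invented but plausible, just to show the sensitivity.

l2_ns, dram_ns = 4.0, 70.0

def amat(l2_hit_rate):
    # Every access pays the L2 lookup; misses pay DRAM on top.
    return l2_ns + (1.0 - l2_hit_rate) * dram_ns

for hit in (0.95, 0.90, 0.80):
    print(f"L2 hit rate {hit:.0%}: AMAT = {amat(hit):.1f} ns")
# 95% -> 7.5 ns, 90% -> 11.0 ns, 80% -> 18.0 ns: losing a few points of
# hit rate (i.e. a smaller cache) can double your effective latency.
```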
 
A SoC is at least an option for the 360 though, and IBM would appear to have the process; it just might not be worth it. That edram will be dirt cheap now after all. You can never shrink a bus though.

Thanks. The bandwidth indicates GDDR5 on a 256-bit bus, as you say. So yeah, no edram for Sony. Given an all-AMD solution that makes sense, and it points to GPU performance way above Trinity.

The low amount of L2 is interesting, especially if it's an indicator of what to expect from all Steamroller-based parts.

EDRAM is really great, but it's also really great if the PS4 doesn't have it?


That edram will be dirt cheap now after all. You can never shrink a bus though.

Dirt cheap compared to what? It costs about as much as the CPU and GPU. You can never shrink a bus... and buses are always much cheaper, I would assume; they are just traces, not silicon.

The 360 CPU and GPU are combined in one. It would already be an SOC if not for the EDRAM, which cannot be combined at this point.
 
You can never shrink a bus... and buses are always much cheaper, I would assume; they are just traces, not silicon

They can be surprisingly expensive, especially as they reserve a lot of room on the edges of silicon, possibly making it impossible to shrink past a certain point.

If they go for a very large APU design, one of the biggest advantages is probably that they can go for a really wide bus with a lot of cheap RAM and get great speed. IMHO 256-bit 8GB DDR4 would be the absolute ideal RAM config.

However, the cost of that approach is that they would never be able to shrink past 190mm² or so, at least without crazy 3d integration. So, if you want two full node shrinks during the lifetime of the chip, the chip should be 700mm² or so at release -- which is just plain unmanufacturable. So either you put faith in unproven 3d integration tech, or you accept that your cost-reduction shrinks will not be very useful.
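
The arithmetic behind that, for anyone following along; the ~190mm² pad-limit floor is taken from above, and I'm assuming the usual roughly-halved area per full node:

```python
# The pad-limit arithmetic: a full node shrink roughly halves die area,
# so work backwards from a ~190 mm^2 floor for a 256-bit PHY.

pad_limit_mm2 = 190   # rough minimum area that can still carry a 256-bit bus
full_node_shrinks = 2

launch_area = pad_limit_mm2 * (2 ** full_node_shrinks)
print(f"Launch die needed for {full_node_shrinks} useful shrinks: ~{launch_area} mm^2")
# ~760 mm^2, the same ballpark as the ~700 figure above -- well past
# reticle/yield sanity, hence "just plain unmanufacturable".
```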

(Also, since the chips will be pad-limited, even if they will have a lot of wasted silicon, they cannot bring the low-speed io stuff on-die, so it will always be a two-chip system).
 
EDRAM is really great, but it's also really great if the PS4 doesn't have it?

EDRAM sucks, because only the successful consoles have it?

Dirt cheap compared to what? It costs about as much as the CPU and GPU. You can never shrink a bus... and buses are always much cheaper, I would assume; they are just traces, not silicon.

Always much cheaper, yet the (profitable) PS2, GC, Wii and 360 used edram instead of fatter buses? Unlike the more expensive, highest-BOM, and unprofitable Xbox and PS3?

It's possibly not safe to assume that wider buses are always cheaper than embedded memory.

The 360 CPU and GPU are combined in one. It would already be an SOC if not for the EDRAM, which cannot be combined at this point.

The 360's CGPU is already a SoC. That the edram hasn't been included in the main die doesn't mean it's not possible.
 
They can be surprisingly expensive,

You can get a cheap motherboard for $30, with much more on it than traces.

Perimeter limits we should probably be discussing in relation to the PS4, since that's the one more or less confirmed at 256-bit, so all those troubles you expound upon will be Sony's for sure. Does Sony plan no shrinks, or what?
 
EDRAM sucks, because only the successful consoles have it?

I'm asking why you seemed to think it was a good idea for PS4 to not have EDRAM then, judging by your comment.

Always much cheaper, yet the (profitable) PS2, GC, Wii and 360 used edram instead of fatter buses? Unlike the more expensive, highest-BOM, and unprofitable Xbox and PS3?

PS3 is profitable. Xbox wasn't profitable because MS didn't own the hardware IPs.

It's possibly not safe to assume that wider buses are always cheaper than embedded memory.

Or vice versa.

The 360's CGPU is already a SoC. That the edram hasn't been included in the main die doesn't mean it's not possible.

As far as I know the EDRAM lags at least one node behind, along with probably other issues. Regardless, if the EDRAM wasn't there, the 360 would already have been a full SOC since 2010.
 
They can be surprisingly expensive, especially as they reserve a lot of room on the edges of silicon, possibly making it impossible to shrink past a certain point.

If they go for a very large APU design, one of the biggest advantages is probably that they can go for a really wide bus with a lot of cheap RAM and get great speed. IMHO 256-bit 8GB DDR4 would be the absolute ideal RAM config.

However, the cost of that approach is that they would never be able to shrink past 190mm² or so, at least without crazy 3d integration.

So 190mm^2 is the limit (more or less) for a 256-bit bus? Thanks.

I know it's proprietary MS stuff, but do you have any idea how the bus for the 360's edram would impact the minimum GPU size? The GPU got as small as 121 mm^2 with the Jasper revision, and was already well under 190mm^2 by the Falcon revision in 2007, so they appear to have been seeing real benefits from avoiding a 256-bit bus quite early on. The edram bus BW is quite high, but it seems physically small enough to work with a pretty tiny chip.

The 360S CGPU die is 168 mm^2, apparently.
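
For context on why dodging the 256-bit bus paid off, a quick tally from the publicly cited 360 figures (the ~32 GB/s daughter-die link and ~256 GB/s internal eDRAM bandwidth are the commonly quoted numbers, not insider info):

```python
# The 360's main memory is 128-bit GDDR3 at 1.4 GT/s; the eDRAM daughter
# die hangs off a separate ~32 GB/s link, with ~256 GB/s internal to it.

main_bus_bits, gddr3_gtps = 128, 1.4
main_bw = main_bus_bits / 8 * gddr3_gtps
print(f"Main memory bandwidth: {main_bw:.1f} GB/s")  # 22.4 GB/s

# So the ROP traffic that would otherwise demand a 256-bit external bus
# rides the eDRAM link instead, which is why the GPU die could keep
# shrinking well below the pad limit a 256-bit bus would impose.
```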

So, if you want two full node shrinks during the lifetime of the chip, the chip should be 700mm² or so at release -- which is just plain unmanufacturable. So either you put faith in unproven 3d integration tech, or you accept that your cost-reduction shrinks will not be very useful.

That would indicate either starting at a reasonably low BOM and not planning on as long a generation as this, or moving away from the current, tech-driven business model, I guess.

(Also, since the chips will be pad-limited, even if they will have a lot of wasted silicon, they cannot bring the low-speed io stuff on-die, so it will always be a two-chip system).

Room for another 360 shrink yet then, even without the edram, if the economics were ever to work out.
 
I don't quite understand what the issue is here... As of now, the PS4 is going for 2 GB of RAM (GDDR5): a low amount of memory, high bandwidth. Durango is 8 GB of RAM (GDDR3 or DDR4): a high amount of memory, low bandwidth. To work around the bandwidth limitation they will have to go for embedded memory. That's what they did with the 360 already, so I don't know what the problem is. Although the eDRAM in the 360 is write-only, and that's the biggest problem devs had with it (along with its small size). If it's read/write, with a respectable amount of memory, I would say go for it.

Point is, MS will be able to get a lot of memory (albeit slower) and work around the bandwidth problem, while Sony won't be able to work around the low-memory problem.
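
To make the trade-off concrete, a little sketch of the two rumoured configs; the PS4 numbers are from the alleged dev kit spec, while the Durango bandwidth is purely my guess for a 256-bit DDR3-2133 setup:

```python
# The two rumoured memory configs side by side.

configs = {
    "PS4 (rumour)":     {"capacity_gb": 2, "bandwidth_gbs": 192},
    "Durango (rumour)": {"capacity_gb": 8, "bandwidth_gbs": 256 / 8 * 2.133},  # assumed bus/speed
}

for name, c in configs.items():
    print(f"{name}: {c['capacity_gb']} GB @ {c['bandwidth_gbs']:.0f} GB/s")

# One has ~3x the bandwidth, the other 4x the capacity. Embedded memory can
# paper over a bandwidth gap for framebuffer traffic; nothing papers over
# a capacity gap, which is the point being made above.
```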
 
I'm asking why you seemed to think it was a good idea for PS4 to not have EDRAM then, judging by your comment.

I think it's great that the PS4 needs lots of bandwidth, as it means we're not looking at a Trinity-level APU. If the benefits of EDRAM won't be there for Sony, like they have been for Sony and other manufacturers in the past, then it makes sense to go with something else.

If AMD are rapidly developing a large, high-performance APU for Sony, and they don't intend to get into a decade-long cost-reduction slog against MS, then it's a very different situation than it was for the 360.

PS3 is profitable. Xbox wasn't profitable because MS didn't own the hardware IP's.

It may not have been profitable even owning the IP. The HDD, poor sales of memory cards, greater power and cooling requirements, the bigger case, more memory, the more complex motherboard etc., up against the super-cheap $129 PS2, might still have lost a bucket of money.

Not owning the IP was not the only issue working against the Xbox. Sony own the PS3 IP, iirc, and it's only the high retail price of the 360 that's letting them charge so much. If they were up against a $129 fully featured Xbox they'd be stuffed.

Or vice versa.

I never claimed that edram would always be cheaper, just that you can't say a wider bus is always cheaper than embedded memory. And it almost certainly isn't. Come to think of it, the WiiU has some kind of embedded memory too, doesn't it?

As far as I know the EDRAM lags at least one node behind, along with probably other issues. Regardless, if the EDRAM wasn't there, the 360 would already have been a full SOC since 2010.

And using a larger chip, a larger and more complex mobo, and more memory chips (with worse $/MB), too. Which was probably on MS's mind when they made their decision.
 