Next gen RAM choices *spawn

Didn't know that about Orbis - where has it been confirmed btw?

It's practically what this whole thread is on about!


With the 360 there were several stages between the original components and integrating the CPU and GPU into one. 2005 -> 2010 would have been quite the no-man's land if they hadn't been able to shrink.

Well, looks like PS4 is gonna have some problems then...

360 90>65>45...xb3=?

Node shrinks are increasingly less cost effective.

The alternative to a 256-bit bus is typically EDRAM, its own horror show of costs imo. Even now on the 360, it's apparently always lagging a node or two behind, and still a separate die in 2012.

Anyways, according to our best source BG, the Durango dev kits have 12GB of RAM for a retail target of 8GB, which, correct me if I'm wrong, would actually require a 256-bit bus.
 
It's practically what this whole thread is on about!

I'd missed that Orbis had been confirmed (or as confirmed as a "trusted" leak) as having a 256-bit bus though. I'll have to go and trawl back through the NGT thread. Google is giving me nuthin'.

Well, looks like PS4 is gonna have some problems then...

360 90>65>45...xb3=?

360 had 90, 80, 65 and 45nm (360S) for the GPU; 90, 80 and 65nm? (360S) for the edram; 90, 65 and 45nm (360S) for the CPU. Maybe we'll get a 32nm 360SS at some point with the edram included in the SoC, or maybe not if the 720 lands at mainstream price points...

Node shrinks are increasingly less cost effective.

Yeah, but one or two over the lifetime of the platform shouldn't be out of the question - assuming we aren't moving to a phone-style rapid upgrade cycle, that is.

The alternative to a 256-bit bus is typically EDRAM, its own horror show of costs imo. Even now on the 360, it's apparently always lagging a node or two behind, and still a separate die in 2012.

Nintendo have fared pretty well with embedded memory, as did Sony in the past. IBM claim to have a 32nm process ready and waiting for high-performance applications - they specifically mention games consoles - that incorporates edram. While it may have come with complications on the 360, it's pretty clear from Sebbi's breakdown that the 360 would have been up poop creek without it, despite using pretty much the fastest memory around at the time.

Anyways, according to our best source BG, the Durango dev kits have 12GB of RAM for a retail target of 8GB, which, correct me if I'm wrong, would actually require a 256-bit bus.

I think you could connect fewer lanes per chip (say 16 per chip pair in clamshell mode) or route the same 32 lanes to four chips (which might make board layout messier), but if you did things like the 360 then I guess you'd need a phatter bus.

How do PC DIMMs handle packing up to sixteen 16-bit memory chips and connecting them over a 64-bit memory channel?
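Not an answer, but here's a rough sketch of the rank arithmetic as I understand it (Python, illustrative numbers only, not a claim about any particular module): chips within a rank split the channel's data lines between them, and additional ranks reuse the same lines rather than widening the bus.

[code]
# Rough sketch of DIMM rank arithmetic (illustrative numbers, not a real part).
def dimm_config(chip_width_bits, chip_density_gbit, channel_width_bits, ranks):
    chips_per_rank = channel_width_bits // chip_width_bits  # chips sharing the data lines
    total_chips = chips_per_rank * ranks                     # extra ranks reuse the same lines
    capacity_gbyte = total_chips * chip_density_gbit / 8
    return total_chips, capacity_gbyte

# e.g. 8 x8 chips per rank, two ranks on one channel: 16 chips, still 64 data lines
print(dimm_config(chip_width_bits=8, chip_density_gbit=4, channel_width_bits=64, ranks=2))   # (16, 8.0)
# x16 chips would need four ranks to reach 16 chips on the same 64-bit channel
print(dimm_config(chip_width_bits=16, chip_density_gbit=4, channel_width_bits=64, ranks=4))  # (16, 8.0)
[/code]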
 
The alternative to a 256-bit bus is typically EDRAM, its own horror show of costs imo. Even now on the 360, it's apparently always lagging a node or two behind, and still a separate die in 2012.

The EDRAM die is separate because it is the most cost effective solution. It lags a node because it is the most cost effective solution.

The CPU+GPU die is fabricated using a process tuned for higher performance with many metal layers. The EDRAM is made on a process tuned for DRAM cells, with few metal layers. More metal layers mean more processing steps and poorer yields - thus higher cost. If they integrated the EDRAM they'd pay the higher per-mm² price for the EDRAM array, and the cells would have poorer performance (leak more charge).

Anyways, according to our best source BG, the Durango dev kits have 12GB of RAM for a retail target of 8GB, which, correct me if I'm wrong, would actually require a 256-bit bus.

Depends on the RAM technology used. DDR4 supports x8, x16 and x32; Micron already demoed a 4Gbit x8 chip. Sixteen of those (for a 128-bit bus) would yield 8GB.
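Quick sanity check of that arithmetic (Python, using only the numbers above):

[code]
chips = 16
density_gbit = 4    # 4Gbit per chip
width_bits = 8      # x8 interface per chip

print(chips * density_gbit / 8, "GB")    # 16 * 4 / 8 = 8.0 GB total
print(chips * width_bits, "bit bus")     # 16 * 8 = 128-bit aggregate bus
[/code]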

Cheers
 
Depends on the RAM technology used. DDR4 supports x8, x16 and x32; Micron already demoed a 4Gbit x8 chip. Sixteen of those (for a 128-bit bus) would yield 8GB.


If they do go with DDR4, I think it's a very cost-effective solution for the lifetime of the console, because I think the costs will plummet and density will go up drastically.

Starting out with 16 4Gb x8 chips on a 128-bit bus, they could move quickly to 8 8Gb x16 chips, and then 4 16Gb x32 chips. If DDR4 takes over the PC market, then this has the potential to really reduce the cost of the console vs using something like GDDR5, which will likely never see such cost reductions.
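To put rough numbers on that path (Python sketch; the chip densities and widths are the speculative parts above, not announced products), each step halves the chip count while doubling per-chip density and interface width, so capacity and bus width stay constant:

[code]
# (chips, Gbit per chip, interface width per chip) for each speculated step
steps = [(16, 4, 8), (8, 8, 16), (4, 16, 32)]
for chips, gbit, width in steps:
    print(chips, "chips ->", chips * gbit / 8, "GB on a", chips * width, "bit bus")
# every step prints: 8.0 GB on a 128 bit bus
[/code]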

I am sure that MS is well aware of the bandwidth issues that will arise from using only DDR4, so I'm expecting to see something innovative like a large "cache" on a silicon interposer.

Hypothetically, if you had 8 GB of DDR4 connected with a 50 GB/s channel to the CPU/GPU and a 256-512MB buffer accessible by both the GPU and CPU at 256GB/s, would that be good enough? (Ignore for a moment whether that's commercially feasible.)
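Just to give those figures a rough per-frame feel (Python sketch; the 50 GB/s and 256 GB/s are purely the hypothetical numbers above):

[code]
# GB of traffic available per rendered frame, ignoring contention and overheads.
for name, bw_gbs in [("DDR4 main pool", 50), ("fast shared buffer", 256)]:
    for fps in (30, 60):
        print(f"{name}: ~{bw_gbs / fps:.2f} GB per frame at {fps} fps")
[/code]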
 
Micron already demoed a 4Gbit x8 chip. Sixteen of those (for a 128-bit bus) would yield 8GB.

Cheers

"Micron already demoed" doesn't sound like a realistic solution for a dev kit shipping today to me. I think it's settled it's a 256 bus in Durango, 8GB is impossible without it.

Nintendo have fared pretty well with embedded memory, as did Sony in the past. IBM claim to have a 32nm process ready and waiting for high-performance applications - they specifically mention games consoles - that incorporates edram. While it may have come with complications on the 360, it's pretty clear from Sebbi's breakdown that the 360 would have been up poop creek without it, despite using pretty much the fastest memory around at the time.

What does "fared pretty well" mean? Maybe they'd have fared better without it. For Nintendo, Gamecube did not fare very well at all, so that's 50% failure rate, and that with Wii not competing graphically. Anyways, that was in past gens that I cant speak to. For the second part, yes because 360 was designed for EDRAM. I've always sort of thought using a PS3 style split bus and using the extra EDRAM $/transistors for something else (more RAM or shaders, etc) would have been better. But that's an old argument.

Heck, as far as costs go it means little, but the PS3 has actually cost-reduced better than the 360 imo, as the $249 PS3 (with hard drive) compares to the $299 360 (with hard drive).
 
GDDR5 is much more costly for lots of reasons ... but how many of those reasons hold when the purchaser is willing to sign a multiyear contract for truly vast quantities?
 
GDDR5 is much more costly for lots of reasons ... but how many of those reasons hold when the purchaser is willing to sign a multiyear contract for truly vast quantities?

Presumably a non-factor. The same discount theory should apply to DDR. Also, I'm guessing AMD/Nvidia have already shipped more GPUs with GDDR5 than the console makers will ship next-gen consoles. So volume isn't the issue.
 
Presumably a non-factor. The same discount theory should apply to DDR.
The margins on top of manufacturing costs on GDDR5 start off higher (of course, normally they have to be higher to earn back non-recurring costs on lower volume), so they can drop lower.
So volume isn't the issue.
There is the risk aspect too: Microsoft can throw in volume purchase guarantees that smaller players cannot ... even if in aggregate the GPU business is larger. In these hard times some guaranteed income can be worth quite a bit.
 
So volume isn't the issue.


Today it's not. But it might be in the future. How many GPUs (or any other devices) use GDDR3 today like the 360 does?

I think it's more important that MS and Sony correctly predict what the predominant, low-cost memory (and one that meets all the tech specs) will be in the future, not necessarily today. Likely they'll sell the majority of their consoles in year 3 and beyond, so it's important that costs can be reduced significantly by then. I'm not sure using GDDR5 would be a good bet for that.
 
Today it's not. But it might be in the future. How many GPUs (or any other devices) use GDDR3 today like the 360 does?

True and all, but I can't imagine Sony/MS are paying exorbitant amounts for their out-of-date RAM, nah mean? Especially with the last-penny cost focus on consoles.

I guess they have a special dedicated assembly line for it.
 
True and all, but I can't imagine Sony/MS are paying exorbitant amounts for their out-of-date RAM, nah mean? Especially with the last-penny cost focus on consoles.

I guess they have a special dedicated assembly line for it.

Considering it is at best boutique RAM, someone is either paying a lot or someone is soaking a lot of cost.
 
If you want to learn about g-spec DDR3, I would suggest you look it up.



DDR4 is sampling now. HMC isn't a capacity solution but a bandwidth solution, and it isn't even sampling currently; it is a lab technology without even a specification. DDR4 has a spec, is sampling from multiple vendors to multiple companies, and will ship for revenue at the end of this year or the beginning of 2013.

It takes sixteen 4Gb DRAMs to hit 8GB of memory, and that isn't a lot of chips. You can fit that on a single-sided DIMM.

DDR4 has severe patent problems right now, correct?

And as other posters have said, google is failing at finding this g-spec stuff.
 
And as other posters have said, google is failing at finding this g-spec stuff.

hm... So after some digging around, I found references to gDDR2/3 on Samsung's site (yay). The data sheet (2011) does mention up to 2400Mbps speed grade as opposed to the site only mentioning up to 2133Mbps, though I suppose maybe that has more to do with demand from graphics card manufacturers and the process node; the 1.5V suggests 40nm i.e. old stuff. The 1Gbit/x16 DRAMs seemed strange to me at first, but I guess it's probably just a symptom of the particular GPUs that use DDR3 - they're low end, don't need more than 1GB, have a less-than-128-bit bus, and need to be cheap. The data sheet itself even lists up to 8Gb configurations (page 10).
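Putting a rough bandwidth figure on those parts (Python sketch; the 2400Mbps, x16 and 1Gbit numbers are the ones from the data sheet above, while the 128-bit bus is just an illustrative card configuration):

[code]
data_rate_mbps = 2400    # per-pin rate from the 2400Mbps speed grade
chip_width_bits = 16     # x16 interface
chip_density_gbit = 1    # 1Gbit parts

per_chip_gbs = data_rate_mbps * chip_width_bits / 8 / 1000   # 4.8 GB/s per chip
chips = 128 // chip_width_bits                                # 8 chips to fill a 128-bit bus
print(per_chip_gbs, "GB/s per chip,", chips * per_chip_gbs, "GB/s total on 128-bit")
print(chips * chip_density_gbit / 8, "GB of memory")          # 1.0 GB with 1Gbit parts
[/code]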
 
Considering it is at best boutique RAM, someone is either paying a lot or someone is soaking a lot of cost.

Why? I would guess at some point it's like building a toaster. Toaster X may not be sold in stores anymore, but that doesn't mean it can't be manufactured as a commodity fairly easily.

Again, I highly doubt Sony, selling all the stuff in the PS3 for $249 at a profit, is eating a lot of RAM cost. And XDR would seem about the worst possible scenario as far as uncommon RAM goes.
 
Why? I would guess at some point it's like building a toaster. Toaster X may not be sold in stores anymore, but that doesn't mean it can't be manufactured as a commodity fairly easily.

The issue is whether there are manufacturers currently running GDDR3 as part of their normal production.

Again, I highly doubt Sony, selling all the stuff in the PS3 for $249 at a profit, is eating a lot of RAM cost. And XDR would seem about the worst possible scenario as far as uncommon RAM goes.

Depends. Expensive and boutique merely means something is no longer cost-competitive; it may or may not be individually hugely expensive.
 
"Micron already demoed" doesn't sound like a realistic solution for a dev kit shipping today to me. I think it's settled it's a 256 bus in Durango, 8GB is impossible without it.

It's really not.

What does "fared pretty well" mean? Maybe they'd have fared better without it. For Nintendo, Gamecube did not fare very well at all, so that's 50% failure rate, and that with Wii not competing graphically.

I'm talking about performance per dollar rather than market penetration. Using your logic though, the edram-less Xbox lost the most money of all (biggest commercial failure of the gen) and the edram-less PS3 also lost the most money of the generation while managing to come last. The PS2, Wii and 360 (eventually) have been great successes, and even the GC was profitable.

But again, I was talking about performance / $. Historically a smaller main memory bus and a pool of faster embedded RAM has offered a cost advantage to at least some players (coincidentally the most successful ones). That may of course change.

Heck, as far as costs go it means little, but the PS3 has actually cost-reduced better than the 360 imo, as the $249 PS3 (with hard drive) compares to the $299 360 (with hard drive).

That is retail price, not manufacturing price. The 360S at $299 supposedly had a $115 profit margin for MS at the start of this year (analyst figure, not official). MS are laughing now, Sony not so much. A small edram chip on an old process seems preferable to separate memory pools using outdated memory that no-one else uses any more (that would be both the PS3's memory pools, I guess).

I still can't find the confirmation of a 256-bit bus for PS4 btw. Linky?
 
But it isn't 8GB of memory at 1/4 the bandwidth. It is 8GB of memory at >50% of the bandwidth vs 2GB of memory at 100% of the bandwidth.

Then I would take the 8GB with half the bandwidth over 2GB with full bandwidth...

If it was a choice between 4GB with full bandwidth vs 8GB at half... then in that instance I would go with the former.

If you could use the 8GB unified setup and be ambitious and stick 32-64MB of edram in as a kind of L4 cache between the CPU and GPU... would that not solve the bandwidth AND memory capacity issues?

What would the expense be?

Edit: Of course the storage option would affect the decision, would it not?
 
hm... So after some digging around, I found references to gDDR2/3 on Samsung's site (yay). The data sheet (2011) does mention up to 2400Mbps speed grade as opposed to the site only mentioning up to 2133Mbps, though I suppose maybe that has more to do with demand from graphics card manufacturers and the process node; the 1.5V suggests 40nm i.e. old stuff. The 1Gbit/x16 DRAMs seemed strange to me at first, but I guess it's probably just a symptom of the particular GPUs that use DDR3 - they're low end, don't need more than 1GB, have a less-than-128-bit bus, and need to be cheap. The data sheet itself even lists up to 8Gb configurations (page 10).

Thanks Al, so then it's just the actual DDR3 chips they currently use with GPUs (as opposed to the ones that go on DIMMs), like the GT640. I suppose there will eventually be a gDDR4 too.
 