The ESRAM was there before the 8GB of RAM. The size of the RAM pool was not responsible for the decision to use ESRAM.
In the end the difference would at most be around 6 W versus 16 W. Whether that's a lot or not depends on your criteria. From a wattage-per-bandwidth perspective GDDR5 is probably equal or better, even when comparing 40nm modules to 20nm ones. From a wattage-per-capacity perspective GDDR5 probably only manages about half of what DDR3 does.
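To make those two views concrete, here is a back-of-the-envelope sketch in Python. The 6 W and 16 W totals are the figures from the post above; the bandwidth and capacity numbers are commonly cited launch-era values and should be treated as assumptions, not measurements.

# Illustrative only: assumed totals for an 8GB DDR3 pool vs an 8GB GDDR5 pool.
configs = {
    # name: (total_power_W, peak_bandwidth_GBps, capacity_GB)
    "DDR3-2133, 256-bit, 8GB":  (6.0,  68.3, 8),
    "GDDR5-5500, 256-bit, 8GB": (16.0, 176.0, 8),
}

for name, (power_w, bw_gbps, cap_gb) in configs.items():
    print(name)
    print(f"  {power_w / bw_gbps:.3f} W per GB/s")  # wattage/bandwidth view
    print(f"  {power_w / cap_gb:.2f} W per GB")     # wattage/capacity view

With these assumed numbers the two come out roughly even per GB/s, while DDR3 is clearly ahead per GB, which is the shape of the argument above.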
The ESRAM was there before the 8GB of RAM. The size of the RAM pool was not responsible for the decision to use ESRAM.
Even though it did not show in the launch price, I think that if MSFT were cornered and had to fight foremost on price, they have set themselves up to be in a good position. The cost of the eSRAM will fall considerably over the console lifetime as die revisions take advantage of process node shrinks and the SoC size also shrinks. The price of GDDR5 is relatively fixed over its lifetime.
Edram has no bandwidth advantage, only power and die area.
I wonder why they didn't go with edram, which would have offered much larger sizes and higher bandwidth in a smaller footprint. I would think size and bandwidth would trump any advantages lower latency would provide.
Cutting corners because it would be cheaper with esram in the long run?
Edram has no bandwidth advantage only power and die area
The issue with EDRAM is that it is pretty boutique and ties you to a couple of foundries, the roadmaps are not as aggressive as with standard lithography, etc. I suspect it is more expensive.
See what could happen to Nintendo if Renesas' troubles are mishandled.
Well, the ESRAM was tied more to the type of RAM used than to the capacity chosen; it was there for bandwidth in the first place anyway.
It was pretty clear that it was either a large pool of high speed GDDR5, or a huge pool of DDR3 + an eSRAM. 8GB of it was just a choice of "how large is enough."
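As a rough illustration of the trade-off between those two layouts, here is a small bandwidth sketch in Python; the bus widths, transfer rates, and the eSRAM figure are the commonly quoted launch-era specs and are assumptions here, not authoritative numbers.

def bus_bw_gbps(bus_bits, mtps):
    # Peak bandwidth = bus width in bytes * transfers per second.
    return bus_bits / 8 * mtps / 1000  # GB/s

ddr3_pool  = bus_bw_gbps(256, 2133)   # ~68 GB/s for the large DDR3 pool
esram_bw   = 109.0                    # often-quoted eSRAM figure, one direction
gddr5_pool = bus_bw_gbps(256, 5500)   # ~176 GB/s for a unified GDDR5 pool

print(f"DDR3 + eSRAM: {ddr3_pool:.0f} GB/s main RAM, plus ~{esram_bw:.0f} GB/s for a 32MB working set")
print(f"GDDR5:        {gddr5_pool:.0f} GB/s across the whole 8GB")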
History doesn't always repeat itself, but iSuppli estimated the XDR in the PS3 at $50 in 2006. 3 years later the slim had 4x512Mb for only $9.80, and the new super slim has only a pair of 1Gb chips which cost even less. That seems to contradict all the doom and gloom about XDR that supposedly wouldn't drop in cost. But I have no idea why. Could they have negotiated the contracts very long in advance? Could they do this for GDDR5?
That starts delving into questions about the costs of using RAM that doesn't get produced at the massive scale of bog-standard DDR3, and what happens to GDDR5's availability or production cost for an alleged 10-year lifespan for the console.
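For a sense of scale, those iSuppli numbers imply roughly a 40% per-year price decline; a trivial check using only the figures quoted above:

start_price, end_price, years = 50.0, 9.80, 3   # iSuppli 2006 estimate vs slim-era figure
annual_decline = 1 - (end_price / start_price) ** (1 / years)
print(f"Implied average decline: ~{annual_decline:.0%} per year")   # ~42%/yr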
I really loved the idea that was proposed earlier, upgrading 256bit DDR3-2133 to 128bit DDR4-4166, it looks so cool on paper :smile:
GDDR5 is nearer to the end of its life span, especially if graphics boards move to something new in the next decade.
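The appeal of that 256bit DDR3-2133 to 128bit DDR4-4166 swap is easy to show on paper: halve the bus width, roughly double the transfer rate, and peak bandwidth stays about the same. Illustrative arithmetic only:

def peak_bw_gbps(bus_bits, mtps):
    return bus_bits / 8 * mtps / 1000  # bytes per transfer * million transfers per second

print(f"256-bit DDR3-2133: {peak_bw_gbps(256, 2133):.1f} GB/s")
print(f"128-bit DDR4-4166: {peak_bw_gbps(128, 4166):.1f} GB/s")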
DDR3 is due to be replaced, but its consumption has a longer tail, and it starts out orders of magnitude higher.
I'm curious if there is consideration of moving to a new memory format, if only for cost savings.
There are some crazy-high bandwidth standards coming out in several years, and eventually using them for more mundane bandwidths could lead to savings in power and manufacturing.
No. We know that the original plans were to have 4GB of DDR4 along with 32MB of eSRAM. That's in the Yukon leak from mid-2010. This narrative about the inclusion of the eSRAM and how it related to the choice of DDR3 etc has been falsified since DF asserted it back in February.
You are making assumptions based on Sony's expressed considerations, then applying that to MS. Sony was the one who looked at embedded RAM pools to boost bandwidth and performance but instead opted for a simpler design as their priority. MS's considerations almost certainly would have been well informed by the utility of having eDRAM on 360 since 2005.
GDDR5 was 70nm in 2008 and it's 30nm right about now.
New nodes. Higher density. Fewer chips. Lower cost.
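Put concretely, the density point is mostly about chip count: the same 8GB needs half as many packages each time the die density doubles. A hypothetical example (the die densities are arbitrary):

capacity_gbit = 8 * 8  # 8GB expressed in gigabits
for die_gbit in (2, 4, 8):
    print(f"{die_gbit}Gb dies: {capacity_gbit // die_gbit} chips for 8GB")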
History doesn't always repeat itself, but iSuppli estimated the XDR in the PS3 at $50 in 2006. 3 years later the slim had a 4x512Mb for only $9.80, and the new super slim has only a pair of 1Gb which cost even less. That seems to contradict all the doom and gloom about XDR that supposedly wouldn't drop in cost. But I have no idea why. Could they have negotiated the contracts very long in advance? Could they do this for GDDR5?
It's one thing to be the main consumer of a given piece of silicon and have costs ramp down over time with new process nodes, etc. XDR.
The same goes for being one of many consumers of a given piece of silicon and having costs ramp down over time with new process nodes, etc. DDR3.
It's somewhat more complicated when you will for the time being be one of 3 major consumers of a given piece of silicon, but then midway through your product life cycle, 2 of the major players stop using it. Economies of scale suddenly diminish greatly.
DDR3 has a long tail life ahead as it is used in all computing devices from servers down to tablets. It doesn't have a relatively abrupt end of support as usually happens with GDDR.
In other words there's a much longer transition with a long period of overlap between product generations for standard DDR versus GDDR. 5 years from now, Sony may be the only consumer of GDDR5 chips. On the other hand there will still be a plethora of companies using DDR3, hence much greater volume, and lower margins required to still make a healthy profit for the memory manufacturer. For example DDR(1) is still being manufactured. GDDR(1)?
Regards,
SB
While you mistook my assumption, you also made a big assumption yourself.
You're making the assumption that they made the decision based on the eDRAM; I made the assumption that they opted for the best choice for as much RAM as possible to deal with the multimedia multitasking requirements (while keeping costs and availability in check), not for a "simpler design", which was Sony's choice.
We all seem to agree that Sony got lucky, with 4Gb GDDR5 chips becoming available just in time to make an 8GB GDDR5 configuration feasible to mass-produce.
Choosing 4GB of DDR4 back in 2010 may have seemed like a good idea, but looking at how DDR4 is doing now it isn't too hard to see why they switched to DDR3 rather than GDDR5.