The pros and cons of eDRAM/ESRAM in next-gen

It doesn't look like the choice was an economic one, unless they completely missed the mark with their projection of GDDR5 cost. They were targeting 4GB at some point ($14 difference between DDR3 and GDDR5) and ended up with 8GB ($28 difference).

All else being equal, the cost of die area for the ESRAM practically compensates for that difference (something like 1/4 of a $100 chip?), and it would have been a comparatively more expensive proposition had they gone with the original plan of 4GB. In short, their 4GB solution of DDR3+ESRAM would have been more expensive than GDDR5 and no ESRAM.

I don't know where you are getting those pricing numbers from (not doubting you, btw). The timing and how negotiations go behind the scenes all contribute to these design choices between companies. I don't know what the actual prices or the projected costs were at that time. But the decision to include the expensive sensor package with the console probably helped make the case for DDR3 as the system's main memory, to reduce risk.
 
I still have confidence that there will one day be a dev on this board explaining how they're taking advantage of the ESRAM on X1 in clever ways. That might not happen for a while, but I'm confident it will. I do think MS was backed into a corner and had to choose ESRAM over EDRAM. I was hoping for EDRAM with bandwidth comparable to the PS2's EDRAM (for its time), something like 512 GB/s. Oh well. They ended up selecting ESRAM, and I do like the way they've set it up, allowing reads and writes concurrently with main RAM. I anticipate reading developer comments on how they take advantage of it and how it enables things on X1 that the PS4 can't do, though I'm doubtful that will happen.
 
The 360's EDRAM was only an advantage early on, until deferred rendering became dominant and you had to tile. Really, the biggest asset for the 360 was its unified memory and a very good GPU in comparison to the RSX.
At 1152x720 you can fit two 32-bit render targets + a depth buffer into EDRAM. That's, for example, what Crytek (and we) did. Deferred rendering is possible on the X360 without tiling. Overdraw in deferred rendering is a big bandwidth hog, so the EDRAM definitely helped.
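For anyone who wants to sanity-check that, a quick back-of-envelope in Python (assuming 32 bits per pixel for each surface and ignoring any EDRAM tile alignment padding):

width, height = 1152, 720
bytes_per_pixel = 4                        # one 32-bit surface
surfaces = 3                               # two render targets + one depth buffer
footprint = width * height * bytes_per_pixel * surfaces
print(footprint / (1024 * 1024))           # ~9.49 MB, just under the 360's 10 MB of EDRAM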
 
Wow, that's amazing how they created a "system" that was originally presented years ago and had already been used in shipping games! So unique and novel!
 
I am not as knowledgeable as most of you guys here, but isn't tiling the same solution devs used for the 360's EDRAM due to its limited size? What are they doing differently in the XB1's case?
 
It doesn't look like the choice was an economic one, unless they completely missed the mark with their projection of GDDR5 cost. They were targeting 4GB at some point ($14 difference between DDR3 and GDDR5) and ended up with 8GB ($28 difference).

All else being equal, the cost of die area for the ESRAM practically compensates for that difference (something like 1/4 of a $100 chip?), and it would have been a comparatively more expensive proposition had they gone with the original plan of 4GB. In short, their 4GB solution of DDR3+ESRAM would have been more expensive than GDDR5 and no ESRAM.

14 USD over a million consoles would be 14 million USD. 28 USD would be 28 million USD. For MS that is a drop in the bucket. For many other companies, Sony included, that could determine whether they'll be in the red or in the black. With PS4 shipping over 10 million consoles in its first year, that's 280 million USD.

How relevant is that? Sony minus their financial services had a net loss of 117.8 million USD for 2013. A 280 million USD swing would theoretically have swung it into the black.

Now, that isn't to say the investment was bad. It led to a good-performing machine that is also performing well in the marketplace. As long as the software and hardware attach rate (where the profit actually is) is good, their decision should be an overall profitable one. Especially as time goes on and they, presumably, extend their market lead over the competition.

But the point here is that while it's easy as a layperson to look at 14 USD and 28 USD as being so low as to be meaningless, when it comes to the financial balance sheet of a corporation that plans on selling tens of millions of units, it represents a relatively huge investment.

It's why there has to be a large justification for including a component that costs even 1 USD, or even a fraction of a USD. Because that 1 USD part could be the difference between a profitable company and an unprofitable company. For example, at 1 USD, 10 million PS4s would be 10 million USD. Sony's top executives are going to return their yearly bonus for 2014 in an attempt to help the company become profitable. The total worth of their bonuses? 9.8 million USD.
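To make the scaling concrete, a trivial Python sketch; the $1/$14/$28 deltas and the 10 million unit volume are just the figures from this post:

def bom_impact(per_unit_delta_usd, units):
    # Total balance-sheet impact of a per-unit BOM difference.
    return per_unit_delta_usd * units

for delta in (1, 14, 28):
    print(f"${delta}/unit over 10M units = ${bom_impact(delta, 10_000_000):,}")
# $1/unit over 10M units = $10,000,000
# $14/unit over 10M units = $140,000,000
# $28/unit over 10M units = $280,000,000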

If Sony (minus their financial division) again posts a loss this year (quite likely, given the current financial news coming out of the company), it's entirely possible their board of directors could look at the cost incurred by the inclusion of GDDR5 as a point where they could have been profitable had they gone with a cheaper memory solution. Of course, that would be short-sighted, as the design of the PS4 requires GDDR5 to have good performance.

TLDR:

So, in short, yes. It is quite likely that DDR3 was a cost-conscious decision. Especially going forward, where after a few years they may be the only consumer of GDDR5 in the marketplace, versus DDR3, which will still have many corporate consumers. I.e., the cost differential between DDR3 and GDDR5 should widen over the next 5-10 years, assuming newer graphics memory solutions come to market.

Regards,
SB
 
Crytek Shares Secrets of Using Xbox One eSRAM’s Full Potential, Resulted In ‘Big’ Bandwidth Saves

http://gamingbolt.com/crytek-shares...ull-potential-resulted-in-big-bandwidth-saves
Wow, that's amazing how they created a "system" that was originally presented years ago and had already been used in shipping games! So unique and novel!
Yeah :). It was first used in Uncharted. This technique was really important for the PS3, as the SPU local memory was small. Tiled lighting allowed processing the lighting inside the SPU local memory. This technique was really popular in PS3 games. I think we were the first ones to use this method on Xbox 360 (Trials HD, 2009). Black Rock also used it in their Xbox 360 game Split/Second (2010). Battlefield 3 was the first AAA game to use a compute shader based tiled lighting method on PC (if I remember correctly).
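For readers who haven't seen the technique, here is a minimal CPU-side Python sketch of the core idea: binning lights into screen tiles so each tile only shades against its own short light list. The names are illustrative, not from any shipped engine, and a real implementation would also cull against per-tile depth bounds on the GPU (or SPU):

TILE = 16  # tile size in pixels

def bin_lights(lights, screen_w, screen_h):
    """lights: list of (cx, cy, radius) tuples in screen space."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    bins = [[] for _ in range(tiles_x * tiles_y)]
    for idx, (cx, cy, r) in enumerate(lights):
        # Conservative rectangle of tiles touched by the light's screen bounds.
        x0 = max(int((cx - r) // TILE), 0)
        x1 = min(int((cx + r) // TILE), tiles_x - 1)
        y0 = max(int((cy - r) // TILE), 0)
        y1 = min(int((cy + r) // TILE), tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty * tiles_x + tx].append(idx)
    return bins

bins = bin_lights([(100.0, 80.0, 24.0)], 1280, 720)  # light 0 lands in a handful of tiles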
 
Yeah :). It was first used in Uncharted. This technique was really important for the PS3, as the SPU local memory was small. Tiled lighting allowed processing the lighting inside the SPU local memory. This technique was really popular in PS3 games. I think we were the first ones to use this method on Xbox 360 (Trials HD, 2009). Black Rock also used it in their Xbox 360 game Split/Second (2010). Battlefield 3 was the first AAA game to use a compute shader based tiled lighting method on PC (if I remember correctly).

And iirc T.B. used it for the PS3 version of Sacred 2 already.
 
It doesn't look like the choice was an economic one, unless they completely missed the mark with their projection of GDDR5 cost. They were targeting 4GB at some point ($14 difference between DDR3 and GDDR5) and ended up with 8GB ($28 difference).

Do you have a source for GDDR5 pricing? 8GB DDR3 is currently around the $60 mark; I'd be surprised if there was only a 50% price premium on GDDR5.

All else being equal, the cost of die area for the ESRAM practically compensates for that difference (something like 1/4 of a $100 chip?), and it would have been a comparatively more expensive proposition had they gone with the original plan of 4GB. In short, their 4GB solution of DDR3+ESRAM would have been more expensive than GDDR5 and no ESRAM.

That $100 chip is going to be $25 by the end of this generation. DDR3 is going to be mainstream for a long time, and could in principle be replaced by a DDR4 interface running a narrower data bus at twice the speed (same command rate as DDR3). GDDR5 is already boutique, and will be even more so in five years' time.
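A quick Python check of the "narrower bus at twice the speed" equivalence; the 256-bit DDR3-2133 configuration matches the XB1, while the 128-bit DDR4-4266 one is hypothetical:

def bandwidth_gbs(bus_bits, mt_per_s):
    # bytes per transfer times transfers per second
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

print(bandwidth_gbs(256, 2133))   # 256-bit DDR3-2133 -> ~68.3 GB/s
print(bandwidth_gbs(128, 4266))   # 128-bit DDR4-4266 -> ~68.3 GB/s, the same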

Cheers
 
14 USD over a million consoles would be 14 million USD. 28 USD would be 28 million USD. For MS that is a drop in the bucket. For many other companies, Sony included, that could determine whether they'll be in the red or in the black. With PS4 shipping over 10 million consoles in its first year, that's 280 million USD.

Makes you wonder how much money they will lose fighting the damage to the brand itself over the lifetime of the console. A few million fewer machines, PR agencies/magazines/websites, fewer game sales and a lost fan base. That will be billions.
 
Do you have a source for GDDR5 pricing? 8GB DDR3 is currently around the $60 mark; I'd be surprised if there was only a 50% price premium on GDDR5.
Be surprised; it's from IHS iSuppli. I see no reason to doubt their BOM estimates on either DDR3 or GDDR5. $60 versus $88.
That $100 chip is going to be $25 by the end of this generation. DDR3 is going to be mainstream for a long time, and could in principle be replaced by a DDR4 interface running a narrower data bus at twice the speed (same command rate as DDR3). GDDR5 is already boutique, and will be even more so in five years' time.

Cheers
I agree there could be a long-term cost-reduction gamble: spend more now on ESRAM die area, save cost later if GDDR5 remains expensive while SoC cost falls with new nodes. But it remains just that, another gamble. DDR4 at 4266 isn't happening anytime soon; I would see them staying at 256-bit with DDR4/2133 instead. Long term, HBM could prove to be the lowest-cost solution. HBM is currently sampling, and the next SoC shrink could have been planned to move in tandem with the GPU industry.

http://wccftech.com/amd-feature-gen...0-graphics-cards-allegedly-launching-2h-2014/
http://www.jedec.org/standards-documents/docs/jesd235
 
Makes you wonder how much money they will lose fighting the damage to the brand itself over the lifetime of the console. A few million fewer machines, PR agencies/magazines/websites, fewer game sales and a lost fan base. That will be billions.

True that the PR could be handled far better, but that is completely irrelevant to the machine hardware.
 
Be surprised; it's from IHS iSuppli. I see no reason to doubt their BOM estimates on either DDR3 or GDDR5. $60 versus $88.

I agree there could be a long-term cost-reduction gamble: spend more now on ESRAM die area, save cost later if GDDR5 remains expensive while SoC cost falls with new nodes. But it remains just that, another gamble. DDR4 at 4266 isn't happening anytime soon; I would see them staying at 256-bit with DDR4/2133 instead. Long term, HBM could prove to be the lowest-cost solution. HBM is currently sampling, and the next SoC shrink could have been planned to move in tandem with the GPU industry.

http://wccftech.com/amd-feature-gen...0-graphics-cards-allegedly-launching-2h-2014/
http://www.jedec.org/standards-documents/docs/jesd235

In regard to HBM (interesting article):


"IP To Meet 2.5D Requirements"

- "What will be different about IP for 2.5D designs is that the pitch is much smaller and the interfaces are much more parallel, with a memory being 1,024 bits wide as opposed to 128 bits wide, DeLaCruz said."

- "Memory IP providers are all working on this, he said, because “everyone knows there will be no DDR5. DDR ends at DDR4"

http://semiengineering.com/ip-to-reflect-2-5d-requirements/
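The width difference matters because bandwidth scales with pin count times per-pin rate. A quick Python illustration; the 1 Gbps/pin HBM1 rate is the JESD235 first-generation figure, and the 7 Gbps GDDR5 device is a typical 2014-era part:

def bandwidth_gbs(bus_bits, gbps_per_pin):
    # pins times per-pin data rate, divided by 8 bits per byte
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(1024, 1.0))   # one 1024-bit HBM1 stack -> 128 GB/s
print(bandwidth_gbs(32, 7.0))     # one 32-bit GDDR5 device -> 28 GB/s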
 
Better than old techniques and stupid ways I guess!
xDDD Well, for you and me it's better than the sordid ways. You just have forebodings about E3, expecting free Skype. Maybe it will happen. Other people have a hunch that they will announce free Xbox Live Gold for everyone.

As you English would say, with your sense of humour: thanks for your input, dear pbjliverpool. :)
 