*spin-off* Choice of RAM configuration

That wasn't the uncertain factor. The question was whether 4Gbit density chips would be production ready in time.
Sony is a big company, and that segment is important to it. Relative miracles have been done with far less than billions of quarterly revenue.


Power, sure. 100% efficient? Not possible and they didn't think it was.


I'm not sure if this is a set of steps or just a numbered list. If the latter, 8 GB was not the original plan--even with DDR3 already decided, according to other posters with some insight into the process.
Yup, I don't think there's a single console that is 100% efficient. Most of the money on the Xbox One seems to have gone to R&D rather than sheer power, which is a twist on the original "more state-of-the-art, more power" tale. It's more like God or the Devil "is in the details" now. :eek: It's kinda interesting; it reminds me of the consoles of old.

Additionally, Sony are hardware experts, and I think their strategy with the PS4 shows they've been smart with their RAM choices, deliberately offering the most capable console at a great price for what it offers.

On a different note, with regard to GDDR5 and the worldwide production and availability of this kind of memory: a week ago there were huge chemical explosions at Hynix's fabs, and Hynix is the main provider of this type of memory in the world.

According to the article it seems to affect Nvidia the most, but the pricing of this type of memory could go up for GPUs in general.

http://www.kitguru.net/components/memory/faith/hynix-fabs-on-fire-after-chemical-explosion/

http://www.kitguru.net/components/m...ory-shipments-on-hold-after-hynix-explosions/
 

Huh? Buddha said that you don't go to a different node unless you need performance gains or a reduction in power. You just responded with a quote stating that the node change was due to the use of the memory in laptops because of its lower power consumption.

So you've proved his point. There's no cost benefit to going to a different node; if anything, there's a possible increase in cost, especially if you (Sony) are the only consumer.
 
Sony will probably be the single largest consumer of GDDR5, with orders for hundreds of millions of gigabytes of memory over the life of the console.

Sony's presence will probably make GDDR5 a lot cheaper to produce for the companies that hold Sony's RAM contracts. 80+ million gigabytes a year, based off a conservative 10 million consoles sold annually, will probably lead to a ton of cheap RAM that doesn't meet Sony's specification, as well as a ton of cherry-picked RAM that can operate at high speeds.

4 Gbit chips will probably be the cheapest configuration around.

Dedicated production lines with huge order volumes will allow Sony to purchase GDDR5 at prices that aren't reflective of today's market prices or availability.

And even if GDDR5 demand falls off the face of the earth, it will have little impact on Sony, because Sony's own volume is enough to make it worthwhile for a company to maintain production just for them.
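To put rough numbers on the volume claim above, here's a quick back-of-envelope in Python. The 8 GB per console and the conservative 10 million consoles/year come from the post itself; the 7-year lifetime is my own assumption for illustration.

```python
# Back-of-envelope on GDDR5 volume. Figures from the post above: 8 GB per PS4,
# a conservative 10 million consoles sold per year. The 7-year lifetime is an
# assumption for illustration only.
GB_PER_CONSOLE = 8
CONSOLES_PER_YEAR = 10_000_000
ASSUMED_LIFETIME_YEARS = 7

gb_per_year = GB_PER_CONSOLE * CONSOLES_PER_YEAR
gb_over_lifetime = gb_per_year * ASSUMED_LIFETIME_YEARS

print(f"GDDR5 per year:      {gb_per_year / 1e6:.0f} million GB")       # 80 million GB
print(f"GDDR5 over lifetime: {gb_over_lifetime / 1e6:.0f} million GB")  # 560 million GB
```

That lands in the "hundreds of millions of gigabytes over the life of the console" range the post describes.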
 
I just noticed that both Hynix's and Samsung's lowest bin is 5 Gbps at 1.5 V.
Considering there's no 5.5 step (it's 5, 6 and 7), is it possible Sony can take almost everything, if production from both suppliers has only a negligible amount that doesn't pass 5.5?
That way the RAM would be the cheapest possible and completely independent of market demand (they wouldn't need the rest of the market to buy the lower bins in order to get their required volume of higher bins).
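For context on why the 5.5 figure matters, here's a rough sketch. The 256-bit bus and 176 GB/s are the widely reported PS4 figures; the per-chip layout below is my own assumption for illustration, not confirmed detail.

```python
# Why the 5.5 Gbps bin matters: a rough sketch based on the widely reported
# PS4 figures (256-bit bus, 176 GB/s peak). The chip layout is an illustrative
# assumption, not confirmed detail.
BUS_WIDTH_BITS = 256
TARGET_BANDWIDTH_GB_S = 176   # GB/s
CHIP_DENSITY_GBIT = 4
TOTAL_MEMORY_GB = 8

# Per-pin data rate needed to reach the target bandwidth.
required_gbps_per_pin = TARGET_BANDWIDTH_GB_S * 8 / BUS_WIDTH_BITS
print(f"Required per-pin rate: {required_gbps_per_pin:.1f} Gbps")   # 5.5 Gbps

# Chip count at 4 Gbit (0.5 GB) per chip.
chips = TOTAL_MEMORY_GB / (CHIP_DENSITY_GBIT / 8)
print(f"Chips needed for {TOTAL_MEMORY_GB} GB: {chips:.0f}")        # 16 chips
```

So anything that bins at or above 5.5 Gbps would do, which is why a negligible fail rate at 5.5 would effectively let Sony use nearly the whole output of the lowest bin.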
 
It seems people keep repeating the theme that because Sony will possibly be the single largest consumer of GDDR5, maybe even the only one at some point, that will give Sony leverage, volume purchasing power, etc., leading to reduced prices.

While it is true that the initial negotiation may have had that dynamic, with Sony saying they would purchase a bazillion GB over the console's lifetime if the price was right (and presumably being ready to use a different technology if it wasn't), the leverage is gone once that deal is made. In fact, unless Sony very carefully negotiated a lifetime cost-reduction plan, the leverage is decidedly against them. If I were making the RAM that Sony was utterly reliant on for their console, I'd have very little reason to pass on any future cost savings to them. Think Nvidia and the original Xbox.

And I doubt there would be any real competition for Sony to rely on. After several years, if Sony is the only significant customer of that chip, it is likely that only one manufacturer will still be tooled appropriately to make it.

Let's hope Sony negotiated a smart lifetime contract. Otherwise, lifetime price drops for the PS4 may be more meager than my budget would like.
 
It seems people keep repeating the theme that because Sony will possibly be the single largest consumer of GDDR5, maybe even the only one at some point, that will give Sony leverage, volume purchasing power, etc., leading to reduced prices.

While it is true that the initial negotiation may have had that dynamic, with Sony saying they would purchase a bazillion GB over the console's lifetime if the price was right (and presumably being ready to use a different technology if it wasn't), the leverage is gone once that deal is made. In fact, unless Sony very carefully negotiated a lifetime cost-reduction plan, the leverage is decidedly against them. If I were making the RAM that Sony was utterly reliant on for their console, I'd have very little reason to pass on any future cost savings to them. Think Nvidia and the original Xbox.

And I doubt there would be any real competition for Sony to rely on. After several years, if Sony is the only significant customer of that chip, it is likely that only one manufacturer will still be tooled appropriately to make it.

Let's hope Sony negotiated a smart lifetime contract. Otherwise, lifetime price drops for the PS4 may be more meager than my budget would like.

You have to consider that Sony as a corporation is purchasing more than just GDDR5 across its entire product line. They could very well negotiate terms where their memory supplier gives them a favorable deal on GDDR5 for the PS4's lifecycle in exchange for the much larger contract of supplying Sony as a whole with memory chips.
 
It seems people keep repeating the theme that because Sony will possibly be the single largest consumer of GDDR5, maybe even the only one at some point, that will give Sony leverage, volume purchasing power, etc., leading to reduced prices.

While it is true that the initial negotiation may have had that dynamic, with Sony saying they would purchase a bazillion GB over the console's lifetime if the price was right (and presumably being ready to use a different technology if it wasn't), the leverage is gone once that deal is made. In fact, unless Sony very carefully negotiated a lifetime cost-reduction plan, the leverage is decidedly against them. If I were making the RAM that Sony was utterly reliant on for their console, I'd have very little reason to pass on any future cost savings to them. Think Nvidia and the original Xbox.

And I doubt there would be any real competition for Sony to rely on. After several years, if Sony is the only significant customer of that chip, it is likely that only one manufacturer will still be tooled appropriately to make it.

Let's hope Sony negotiated a smart lifetime contract. Otherwise, lifetime price drops for the PS4 may be more meager than my budget would like.

You could do that if you don't mind losing a volume customer when they take their business elsewhere mid-cycle or next cycle. And I doubt Sony will source just one RAM manufacturer over the life of the console; more likely several, which removes Sony's reliance on any single manufacturer and thereby reduces the leverage of any one component provider.

Sony has been sourcing components for decades. They are probably well versed enough not to fall into the issues that MS had with Nvidia and Intel. MS is practically inexperienced from a hardware-manufacturing standpoint, especially compared to Sony.
 
Sony will probably be the single largest consumer of GDDR5, with orders for hundreds of millions of gigabytes of memory over the life of the console.

Sony's presence will probably make GDDR5 a lot cheaper to produce for the companies that hold Sony's RAM contracts. 80+ million gigabytes a year, based off a conservative 10 million consoles sold annually, will probably lead to a ton of cheap RAM that doesn't meet Sony's specification, as well as a ton of cherry-picked RAM that can operate at high speeds.

4 Gbit chips will probably be the cheapest configuration around.

Dedicated production lines with huge order volumes will allow Sony to purchase GDDR5 at prices that aren't reflective of today's market prices or availability.

And even if GDDR5 demand falls off the face of the earth, it will have little impact on Sony, because Sony's own volume is enough to make it worthwhile for a company to maintain production just for them.

When Nvidia and AMD pull out of the GDDR5 market, Sony will be tops. And only then.

Last quarter alone, over 275 million discrete graphics cards (AIBs) were shipped. That encompasses the late spring/early summer months, so that isn't the high-volume back-to-school period or the holiday season.

One quarter. Now, at least half of those are likely using DDR3 at the lower end; the rest will use some form of GDDR5. That leaves ~135 million boards shipping in one of the slowest quarters. Boards with GDDR5 carry between 1 and 6 GB of memory; let's be really conservative and say it averages out to 2 GB. That's ~270 million GB worth of memory.

The PS4 would have to sell over 33 million units in one year to equal one slow quarter's worth of graphics cards' GDDR5 consumption.

Heck, if only 1/4 of those discrete graphics cards were using GDDR5, the PS4 would still have to sell over 16.5 million units in one year to equal one slow quarter, never mind a full year's worth of GDDR5-equipped discrete graphics cards. You're looking at over 60-120 million PS4s a year for the PS4 to become the major consumer of GDDR5.

And last quarter was a down quarter for PC discrete shipments on a YoY basis.

So, no. Until AMD and Nvidia pull out of the GDDR5 market, Sony won't be the single largest consumer of GDDR5. And even over the lifetime of the PS4, it isn't going to be historically the largest consumer of GDDR5. Although I guess if the PS4 sells 1 billion units by the end of its life, they might have a good shot at making that claim.
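Plugging the post's own figures into a quick Python check (the shipment count and GDDR5 share are the poster's estimates above, not verified data) gives the same ballpark as the ~270 million GB and ~33 million units quoted:

```python
# Reproducing the back-of-envelope above with the post's own figures;
# the shipment count and GDDR5 share are the poster's estimates, not verified data.
CARDS_PER_QUARTER = 275_000_000   # discrete AIBs shipped in one slow quarter (post's figure)
GDDR5_SHARE = 0.5                 # assumed fraction of boards carrying GDDR5
AVG_GB_PER_BOARD = 2              # conservative average GDDR5 per board
PS4_GB = 8                        # GDDR5 per PS4

gpu_gddr5_gb = CARDS_PER_QUARTER * GDDR5_SHARE * AVG_GB_PER_BOARD
ps4_units_to_match = gpu_gddr5_gb / PS4_GB

print(f"GPU GDDR5 in one quarter: {gpu_gddr5_gb / 1e6:.0f} million GB")           # ~275 million GB
print(f"PS4 units to match that:  {ps4_units_to_match / 1e6:.1f} million units")  # ~34 million
```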

Regards,
SB
 
That wasn't the uncertain factor. The question was whether 4Gbit density chips would be production ready in time.
Sony is a big company, and that segment is important to it. Relative miracles have been done with far less than billions of quarterly revenue.


Power, sure. 100% efficient? Not possible and they didn't think it was.


I'm not sure if this is a set of steps or just a numbered list. If the latter, 8 GB was not the original plan--even with DDR3 already decided, according to other posters with some insight into the process.

Oh wow, that's interesting about the 8 GB not being part of the original plan. How would they have gotten away with running 3 VMs and the allocations towards non-gaming software, then? If they had gone with 4 GB, the console would really have been gimped.
 
The original leaked wish-list diagram had 4 GB; I'm not sure what the app-switching method would have been.
As far as the VMs go, the lowest-level one that hosts the other two is very lightweight, and the game OS is itself stripped of non-essential functions that a dedicated game environment doesn't need.
 
That Microsoft went for 4 GB of DDR3 originally seems kind of irrelevant anyway, as at that point Sony only had 2 GB of GDDR5 on the cards as well.
 
If HMC or HBM really takes off, and becomes more cost effective than GDDR5, what's stopping Sony from using it with a new revision of their SoC?

AMD and Hynix have indicated they plan to use GDDR5M while waiting for stacked memory to reach the mainstream, which should be 2-3 years after FPGA and enterprise-market adoption. That means if HMC starts production in 2014, mainstream products could arrive in 2016-2017. Not bad timing for the "slim" revision.

It looks like both HMC and HBM have adequate bandwidth even with a single device: a quad-link HMC is 160 to 240 GB/s, and quad-channel HBM is 128 to 256 GB/s. So at 4 Gbit per die they would need 2 stacks of 8; 8 Gbit per die is expected for 2015, and then they could do it with a single stack of 8.
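A quick check of that stack math, assuming the 8 GB total and 8-die stacks mentioned above:

```python
# Checking the stack math above: 8 GB total, stacks of 8 dies,
# at the two die densities mentioned (4 Gbit today, 8 Gbit expected for 2015).
TOTAL_GB = 8
DIES_PER_STACK = 8

for density_gbit in (4, 8):
    gb_per_die = density_gbit / 8                 # 4 Gbit -> 0.5 GB, 8 Gbit -> 1 GB
    dies = TOTAL_GB / gb_per_die
    stacks = dies / DIES_PER_STACK
    print(f"{density_gbit} Gbit/die: {dies:.0f} dies -> {stacks:.0f} stack(s) of {DIES_PER_STACK}")
# 4 Gbit/die: 16 dies -> 2 stack(s) of 8
# 8 Gbit/die: 8 dies -> 1 stack(s) of 8
```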
 
If you guys remember, a year or two ago there was an interview with Sony's CTO about the PS4/next-gen (prior to its announcement). He mentioned that they were looking at interposers and stacking. Some assumed that meant the PS4 would use them, but perhaps it was more strategic, long-term research, as in: once the PS4 chip is below 20nm, we'll need a different RAM solution.

I think as long as the latencies, bandwidths, and costs are equal to or better than the existing tech, they can move to a new RAM technology. A new memory controller can make the new RAM look like the old RAM without breaking any compatibility.
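A minimal sketch of that idea, with names and interfaces that are entirely hypothetical: software only ever talks to the controller, so the physical memory behind it can change between hardware revisions without the software noticing.

```python
# Minimal sketch of "the controller hides the RAM": everything here is hypothetical
# and illustrative, not an actual console interface.
from abc import ABC, abstractmethod

class MemoryBackend(ABC):
    """Physical memory technology behind the controller (GDDR5, a stacked part, ...)."""
    @abstractmethod
    def read(self, addr: int, size: int) -> bytes: ...
    @abstractmethod
    def write(self, addr: int, data: bytes) -> None: ...

class Gddr5Backend(MemoryBackend):
    def __init__(self, size: int):
        self._cells = bytearray(size)
    def read(self, addr: int, size: int) -> bytes:
        return bytes(self._cells[addr:addr + size])
    def write(self, addr: int, data: bytes) -> None:
        self._cells[addr:addr + len(data)] = data

class StackedBackend(Gddr5Backend):
    """Stands in for an HMC/HBM-style part: different physically, same logical contract."""
    pass

class MemoryController:
    """The only thing software sees; swapping the backend is invisible to it."""
    def __init__(self, backend: MemoryBackend):
        self._backend = backend
    def read(self, addr: int, size: int) -> bytes:
        return self._backend.read(addr, size)
    def write(self, addr: int, data: bytes) -> None:
        self._backend.write(addr, data)

# Same software path regardless of which memory sits behind the controller.
for backend in (Gddr5Backend(1024), StackedBackend(1024)):
    mc = MemoryController(backend)
    mc.write(0x10, b"same behaviour")
    assert mc.read(0x10, 14) == b"same behaviour"
```

The catch, as the posts below note, is that "looks the same" has to include latency and other quirks, not just capacity and bandwidth.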
 
That Microsoft went for 4 GB of DDR3 originally seems kind of irrelevant anyway, as at that point Sony only had 2 GB of GDDR5 on the cards as well.

This is a good point, and I would add that when we were all discussing the VG leaks early this year, nearly everyone dismissed 8 GB for the PS4 because everyone knew it wasn't possible, and a couple of months later we got 8 GB. So I think it's a bit premature for some to write off Sony based on what they think the cost of memory will be in the future. Not to mention we don't even know whether other options might be viable...
 
When Nvidia and AMD pull out of the GDDR5 market, Sony will be tops. And only then.

Last quarter alone, over 275 million discrete graphics cards (AIBs) were shipped. That encompasses the late spring/early summer months, so that isn't the high-volume back-to-school period or the holiday season.

One quarter. Now, at least half of those are likely using DDR3 at the lower end; the rest will use some form of GDDR5. That leaves ~135 million boards shipping in one of the slowest quarters. Boards with GDDR5 carry between 1 and 6 GB of memory; let's be really conservative and say it averages out to 2 GB. That's ~270 million GB worth of memory.

The PS4 would have to sell over 33 million units in one year to equal one slow quarter's worth of graphics cards' GDDR5 consumption.

Heck, if only 1/4 of those discrete graphics cards were using GDDR5, the PS4 would still have to sell over 16.5 million units in one year to equal one slow quarter, never mind a full year's worth of GDDR5-equipped discrete graphics cards. You're looking at over 60-120 million PS4s a year for the PS4 to become the major consumer of GDDR5.

And last quarter was a down quarter for PC discrete shipments on a YoY basis.

So, no. Until AMD and Nvidia pull out of the GDDR5 market, Sony won't be the single largest consumer of GDDR5. And even over the lifetime of the PS4, it isn't going to be historically the largest consumer of GDDR5. Although I guess if the PS4 sells 1 billion units by the end of its life, they might have a good shot at making that claim.

Regards,
SB

Probably the vast majority are sold to OEMs, and I doubt half of that volume is GDDR5-based GPUs. The highest-volume cards are probably DDR3-based because of the cheap memory and the fact that most PC consumers aren't that demanding when it comes to graphics performance.
 
If HMC or HBM really takes off, and becomes more cost effective than GDDR5, what's stopping Sony from using it with a new revision of their SoC?
Technically, nothing stops them from doing what they want with the platform. I'm not aware of a console that has changed something like that.
There's some risk involved in either favoring the new revision of the console over the older one, or breaking some underlying assumption about the system's behavior that causes software coded prior to the revision to behave differently. If the next memory setup has what's needed to emulate the quirks of what came before, it might be easier.

I'm trying to find an old link to back up my recollection that something as low-level as differences in the HDD controller between PS3 revisions tripped up a development build of id's Rage.
For the Xbox 360, the desire to keep the platform as consistent as possible led to the creation of a bus unit that emulated the latency and performance penalties of the two-chip system in the single-chip SoC.
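A toy sketch of that "match the old timing" idea, with made-up numbers; the real mechanism lives in hardware, this is just to illustrate the principle of padding a faster path so older software sees the latency it was written against.

```python
# Toy illustration of latency matching: the faster integrated path pads its
# response so timing-sensitive software sees the latency it expects.
# All numbers and names are made up; the real 360 mechanism is in hardware.
import time

OLD_PATH_LATENCY_NS = 100_000   # hypothetical latency of the original two-chip hop

def padded_read(fast_read, addr):
    """Use the faster new path, then busy-wait out the difference."""
    start = time.perf_counter_ns()
    value = fast_read(addr)
    deadline = start + OLD_PATH_LATENCY_NS
    while time.perf_counter_ns() < deadline:
        pass
    return value

# Example with a dict standing in for the integrated memory path.
memory = {0x10: 42}
print(padded_read(memory.get, 0x10))   # 42, but never faster than the old path
```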
 
The next 1-2 years may see a new memory model emerge!

I read an interesting article today: "To DRAM or not to DRAM? That is one of (many) HPC questions"


http://www.theregister.co.uk/2013/09/14/intel_exascale_update_deepdive/

Intel: "There's many new technologies in flight. These are going to have a profound impact on how we build systems"

Intel: "I see it splitting into two directions: one is where we're stuck with DRAM and have to live with DRAM a long time,"

Intel: "the other is if one of these [new] memory technologies really does evolve, then things change dramatically".

Intel: "... over the next year and a half to two years ... . You're going to see them become real in that timeframe. They won't be what we want for a DRAM replacement at that point, [but] that's when you have to check."

...

Anyway I found that interesting ..... :)
 