Predict: The Next Generation Console Tech

Status
Not open for further replies.
Why haven't Sony and MS shifted to a smaller node yet? TSMC skipped 32nm. 28nm just came online late last year and for a non-premium line like xb360 and ps3, it doesn't make sense to pay a premium price for a premium node. When that price comes down (late this year, early next) I expect we will see new slim models.
But that's 8 years into the generation! PS2 had die shrinks every year. Sony could design PS2 to be a monster knowing it'd stop losing money in a year or two, and become profitable to varying degrees after that depending on where they priced it. Sony no doubt went into this generation expecting the same (we all were), only to find it wasn't happening, forcing their console to stay at a $300 price and keeping sales at half what they could be doing at $200.
 
But that's 8 years into the generation! PS2 had die shrinks every year. Sony could design PS2 to be a monster knowing it'd stop losing money in a year or two, and become profitable to varying degrees after that depending on where they priced it. Sony no doubt went into this generation expecting the same (we all were), only to find it wasn't happening, forcing their console to stay at a $300 price and keeping sales at half what they could be doing at $200.

Whoa wait a minute ... I don't want to go too far off-topic here Mr mod, but ps3 was hardly the poster boy for "Why isn't it cheap like ps2? Damn nodes!". The thing started with a BOM in the $700 range. And was never going to be as cheap as ps2 due to design decisions which forced a more expensive BOM for the lifecycle of the product. (It's one of the reasons I said MS made smarter design decisions at the outset of the gen)

Node shrinks have happened historically every 18-24 months. I don't recall a time when node shrinks were happening every year, but I'll take your word for it (unless you mean half nodes?).

Die shrinks at half node this gen were available pretty regularly. The following are from shipping amd GPUs:

90nm 2005 q4
80nm 2006 ??
65nm 2007 ??
55nm 2008 q1
40nm 2009 q4
32nm ---------
28nm 2011 q4

Now if Sony/MS decided the cost saving wasn't worth the engineering effort to target a node, that's their business.

The node shrink schedule I outlined above is full node shrinks. Every other year.

If a shrink hits a snag and is delayed, no big deal. The console stays at the current MSRP while being sold at cost. When cost reductions come (they will) the price drops. If not, the price stays the same (hello $300 xb360p for 4 years running!)
 
Last edited by a moderator:
Whoa wait a minute ... I don't want to go too far off-topic here Mr mod, but ps3 was hardly the poster boy for "Why isn't it cheap like ps2? Damn nodes!"
Sony have a history of aggressive cost cutting using process shrinks. That makes them an ideal reference.

http://ps3ultraslim.com/news/wp-content/uploads/2012/03/EE+GS.jpg

Node shrinks have happened historically every 18-24 months. I don't recall a time when node shrinks were happening every year, but I'll take your word for it (unless you mean half nodes?).
Yes, half node die-shrinks. You don't fuss over whether it's a full node or a half node shrink if it's reducing your die size and saving you money.

Now if Sony/MS decided the cost saving wasn't worth the engineering effort to target a node, that's their business.
Umm, that's fobbing off the argument. If the node shrinks yielded good returns, then they'd have used them. The fact that the node shrinks (and half-node shrinks if you want to make a distinction) didn't yield good returns meant MS and Sony were forced into using larger, more expensive chips, because there wasn't money to be saved. Unless you believe that both MS and Sony considered using smaller, cheaper chips but decided they just couldn't be bothered?
 
Why do you say that?


Well, IF Microsoft is having a many-core CPU designed similar to Larrabee, a company invested in tiling and ray tracing (Imagination Tech bought Caustics) would make sense. PowerVR might offer a better deal than AMD/ATI as well.
 
Yes, half node die-shrinks. You don't fuss over whether it's a full node or a half node shrink if it's reducing your die size and saving you money.

I can't speak for Sony/MS on why they chose to wait for a full node shrink (90 => 65 => 45 => 32/28?) of their processors, but one can assume Sony wasn't as aggressive as before because of the increased complexity of Cell and RSX which aren't as easy to redesign and shrink as the old EE+GS were. Perhaps licensing of external tech was the cause.

Maybe before, with Kutaragi at the helm, more importance was placed on shrinking chips than under Stringer ...

I dunno!

There are a ton of decisions at SonyHQ which I can't account for ...

Umm, that's fobbing off the argument. If the node shrinks yielded good returns, then they'd have used them. The fact that the node shrinks (and half-node shrinks if you want to make a distinction) didn't yield good returns meant MS and Sony were forced into using larger, more expensive chips, because there wasn't money to be saved. Unless you believe that both MS and Sony considered using smaller, cheaper chips but decided they just couldn't be bothered?

Full nodes cut the cost in half (in theory).
Half nodes cut ~25%

There is a distinction in gains/projected-savings to be had between full and half nodes.
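Those rough percentages follow from simple area scaling. A quick sketch, using idealized linear shrink factors (real layouts shrink somewhat less than the node names suggest):

```python
# Idealized die-area scaling: a linear shrink factor s reduces area by s^2.
# A full node step (e.g. 90nm -> 65nm) roughly halves the area; a half node
# step (e.g. 65nm -> 55nm) trims roughly 25-30%.
def area_fraction(old_nm, new_nm):
    """Fraction of the original die area remaining after the shrink."""
    return (new_nm / old_nm) ** 2

full_node = area_fraction(90, 65)   # ~0.52 of the old area
half_node = area_fraction(65, 55)   # ~0.72 of the old area
```

In practice analog blocks, I/O pads and yield issues eat into those numbers, but the 2x-vs-25% distinction between full and half nodes falls straight out of the squared term.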

Now if TSMC/insert-fab-co-here are charging more per new node and that savings is eaten up in increased wafer cost + engineering to redesign the smaller more complex chips, then I could see them waiting.

What the above node release schedule since 2005 shows though is that it was economically feasible for ATI/AMD/Nvidia to release on the latest node for their entire GPU line.

If there were no gains to be had, I'm sure AMD/ATI/Nvidia would still be on 45nm ... (or mirroring Sony/MS and only release on full node drops)

But that isn't the case.

FYI - FWIR, Nintendo has notoriously not used aggressive node shrinks and redesigns in their consoles...
 
Full nodes cut the cost in half (in theory).
Half nodes cut ~25%

Actual manufacturing indicates a doubling or more of design and engineering costs, capital expenditures, and wafer costs, in exchange for something less than double the number of chips that are something less than twice as efficient.

Now if TSMC/insert-fab-co-here are charging more per new node and that savings is eaten up in increased wafer cost + engineering to redesign the smaller more complex chips, then I could see them waiting.
This is what happened.

What the above node release schedule since 2005 shows though is that it was economically feasible for ATI/AMD to release on the latest node for their entire GPU line.
Have you noted the chorus of complaints about how expensive the latest line is? A discrete GPU has more pricing leeway than a console component.

If there were no gains to be had, I'm sure AMD/ATI/Nvidia would still be on 45nm ... (or mirroring Sony/MS and only release on full node drops)
The higher the price ceiling and the lower the volume requirement, the more rapidly a company can transition to the leading edge of an immature process.

If pricing is low or the market requires millions of units as opposed to tens of thousands, then a design has to wait.
 
I find this part of the aforementioned article pretty telling:
“Increasing cost due to the complexity of advanced technology is a concern for the future,” said Mark Liu, TSMC’s senior vice president of Advanced Technology Business. “Intel, Samsung, and TSMC believe the transition to 450mm wafers is a potential solution to maintain a reasonable cost structure for the industry.”

Doesn't sound that enthusiastic to me. And that's only the foundry's part. It says nothing about the other side of the coin, designing and implementing the chip; wafers by themselves are only sand that you can't comfortably sunbathe on... lol
 
The best-case scenario is that a design shouldn't be affected by the wafer size. There have been teething pains anyway, since the wafers themselves are harder to make, and the manufacturing process involves physical and chemical steps that need to be controlled for uniformity over a wider area. AMD's regression in clock speed after the transition to 300mm at the 65nm node may have been related to a gate oxide that didn't scale, and some blamed a confluence of SOI and uniformity problems over the larger area.

Stating that they want to start a transition is not quite the same as rolling out full production. 3-4 years went by before the transition started, and there were already several volume lines at 300mm.
Intel would probably be in the lead, and its 450mm wafer fab is going to start with 300mm with the option to shift to 450mm in later years.
 
some (new?) info on ps4: sony almost abandoned blu-ray for ps4:

http://www.gamesindustry.biz/articles/2012-05-30-playstation-4-almost-abandoned-a-disk-drive-report

"The Wall Street Journal has offered up an interesting tidbit, suggesting that Sony did consider completely abandoning physical game discs for PlayStation 4 but ultimately decided against such a move.

The Journal cites "people familiar with the matter" and notes that Sony is still targeting a 2013 release for the successor to PS3, opting to include optical disk drives rather than break with a decades-old model in the industry.

If we had to guess, Sony's next-gen console will take an approach similar to Vita, where much of the content is available digitally but will also be sold at brick-and-mortar. As for E3, according to analysts, Sony's focus will be on showcasing its core content on PS3, demonstrating that there are actually plenty of reasons to buy a Vita, and revealing the next phase of its digital plans on PSN."

if this prediction holds, that all ps4 games would be released both on disc and digitally a la Vita (vita being an experiment for ps4 of sorts), what would this mean for the size of ps4 games? would sony limit the size of ps4 games to 50 GB dual-layer blu-rays instead of the 100 GB quad-layer technology? and what would this mean for the prospects of john carmack's huge high-res 8k x 8k texture streaming technology?

could sony even ask developers not to exceed 25 GB per game?

if this is so, would a low-cost 40 GB SSD be a sufficient and realistic solution for caching ps4 games and getting away with less RAM than xboxnext?

here is another piece of info relating to ps4:
http://www.gamesindustry.biz/articl...compatibility-to-playstation-with-gaikai-deal

Sony brings ps2 backwards compatibility to PlayStation 3 with Gaikai deal.

this means sony doesn't need hardware solutions for ps4 compatibility with ps3 games anymore, so the idea of including SPEs in ps4 is now outdated.

Both of these pieces of info clearly show that sony is indeed trying to think outside the box to decrease costs and solve its hardware problems with ps4 (digital distribution and streaming game technology). what else is sony thinking about for ps4? we could have some surprises awaiting us in 2013.
 
Correct - but you have to pay for the streaming service separately (and of course deal with the slight input lag)

actually the streaming service would be part of the psn+ deal. it appears sony is shifting progressively away from a free psn service on ps3 to a paid psn service on ps4. maybe only some basic online gaming would be free on ps4, but if you want cloud saving, multi-game audio and video chat, streaming games... etc, you would subscribe to psn+ on ps4.

clearly sony doesn't like the idea of microsoft making money from online gaming while sony loses money on the service :LOL:
 
I don't think there's any question that Sony will offer all PS4 software as a download, but I'm annoyed by the way the story of Sony "considering" a download only system has been so widely reported as Sony "almost" choosing that path. Of course they considered that option. There are obvious benefits. It would be irresponsible not to evaluate those kinds of options, but it's not like it was a close call. The PSP Go was violently rejected, the internet infrastructure isn't ready and retail will go to war. They probably looked at it briefly and shelved it. At the same time, for people who do want it, PS4 can effectively be download only. It's in Sony's interest to accommodate those players.
 
I don't think there's any question that Sony will offer all PS4 software as a download, but I'm annoyed by the way the story of Sony "considering" a download only system has been so widely reported as Sony "almost" choosing that path. Of course they considered that option. There are obvious benefits. It would be irresponsible not to evaluate those kinds of options, but it's not like it was a close call. The PSP Go was violently rejected, the internet infrastructure isn't ready and retail will go to war. They probably looked at it briefly and shelved it. At the same time, for people who do want it, PS4 can effectively be download only. It's in Sony's interest to accommodate those players.

Exactly right. All of the console manufacturers are going to have that option. It will be a necessity for the upcoming infrastructure changes on game purchases.
 
This is what happened.

But that is not what happened with full node shrinks ...
The savings from a full node offset the engineering/redesign costs, which I'm sure were/are more expensive than in the ps2 days due to the complexity of the chipset.


Have you noted the chorus of complaints about how expensive the latest line is? A discrete GPU has more pricing leeway than a console component.

Indeed. I'd be surprised if they didn't balk at a 25% increase in wafer costs for 28nm. The real kicker is what TSMC were looking to charge for 20nm and below.

However what TSMC would like to charge and what they will settle for are two very different things.

At one point, the cost benefit goes away and even cell phone makers will stop filling their latest node production lines with product.

TSMC knows this and will adjust accordingly. 450mm wafers are part of this solution.

The higher the price ceiling and the lower the volume requirement, the more rapidly a company can transition to the leading edge of an immature process.

If pricing is low or the market requires millions of units as opposed to tens of thousands, then a design has to wait.

One also has to consider the potential savings to be had.

If the chip is cheap enough already, it may not be beneficial to invest in a redesign whose cost isn't offset by the savings.

example:
45nm chip costs $40
28nm expected cost $20 (+25% wafer increase) = $25

Savings $15 * 10 million chips /yr = $150 million

If redesign costs = $100 million
then savings = $50m

next node:
28nm chip costs $25
20nm expected cost $12.5 (+25% wafer) = $15.63

Savings $9.38 * 10 million chips /yr = $93.8 million

Savings < Redesign costs = Wait for next node (or cheaper wafer costs).
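The break-even arithmetic above can be condensed into one throwaway calculation (the dollar figures are the hypothetical ones from this post, not actual fab pricing):

```python
def net_savings(old_chip_cost, area_factor, wafer_premium, volume, redesign):
    """Yearly net savings from a shrink: per-chip saving times volume,
    minus the one-off redesign cost. All inputs are hypothetical."""
    new_chip_cost = old_chip_cost * area_factor * (1 + wafer_premium)
    return (old_chip_cost - new_chip_cost) * volume - redesign

# 45nm -> 28nm: $40 chip, area halved, +25% wafer cost, 10M chips/yr,
# $100M redesign -> $50M net savings, worth doing
shrink_now = net_savings(40, 0.5, 0.25, 10e6, 100e6)

# 28nm -> 20nm: $25 chip, same assumptions -> savings fall short of the
# redesign cost, so you wait for the next node or cheaper wafers
wait = net_savings(25, 0.5, 0.25, 10e6, 100e6)
```

The decision rule is just the sign of the result: positive, shrink now; negative, wait.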

_______________

As to the point on higher cost msrp items being more flexible to absorb higher cost new nodes... AMD is currently selling a 28nm 123mm2 chip based card for $100.
This chip size is roughly the same as a projected xb360/ps3 SoC @ 28nm.
Both consoles retail for more than double this cost.

So while I agree with the principle in general, I think at this point, Sony/MS are waiting for 28nm production capabilities more than anything.
 
File size isn't a barrier...

It is if MS/Sony are looking to limit costs on storage ...

A standard HDD is what has kept the ps3 at $250 and will keep it above a certain price due to its necessary inclusion.

Flash storage on the other hand could allow a local store while keeping costs under control.
 
They can pass storage costs on to consumers, like with Vita. In any case, an entry-level 500 GB HDD is enough for ten 50 GB games, and the profit potential from downloadable goods justifies their inclusion.
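The storage headroom is easy to sanity-check. A trivial sketch, using the nominal sizes from these posts (the 25 GB cap is the earlier poster's speculation, not anything confirmed):

```python
# How many full-size games fit on an entry-level drive vs a small flash cache
hdd_gb, flash_gb = 500, 40

games_on_hdd = hdd_gb // 50      # 10 games at a 50 GB dual-layer disc size
games_in_flash = flash_gb // 25  # 1 full game at the speculated 25 GB cap
```

Which also shows why a 40 GB flash cache only works if game sizes are capped well below a dual-layer Blu-ray.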
 
They already released Uncharted 1 and 2 for download, and the latter is something like 28GB, too... and (at least the European discs) include a multitude of languages which aren't really needed for a DL service and could be downloaded optionally.
 