Predict: The Next Generation Console Tech

Doubling this cost for a chip twice the size and we're looking at ~$80 for ~250mm2.
Hm, I'm far from an expert at these things, but chip cost ought to be more like quadruple rather than double for a chip twice the size. Perhaps not QUITE that high, but certainly way more than a strictly linear relationship.
 
Haswell on its own would make a pretty good next gen console. Full DX11 GPU coming in at 5-6x the power of Xenos/RSX plus a quad core CPU with another 500 GFLOPS to throw at graphics and 4-5x the general computing power of Xenon. All that in a single chip that will probably clock in at under 100W.
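As a rough sanity check on that ~500 GFLOPS figure, here's a minimal sketch; it assumes two 256-bit FMA issues per core per cycle, and the clock speed is purely my own guess:

```python
# Back-of-the-envelope peak AVX2 throughput for a quad core Haswell.
# The clock speed is an assumption on my part, not a known spec.
cores = 4
fma_issues_per_cycle = 2        # assumed: two 256-bit FMA pipes per core
sp_lanes = 256 // 32            # 8 single-precision lanes per 256-bit vector
flops_per_fma = 2               # a fused multiply-add counts as 2 FLOPs
clock_ghz = 3.9                 # assumed clock

peak_gflops = cores * fma_issues_per_cycle * sp_lanes * flops_per_fma * clock_ghz
print(f"Peak single-precision throughput: ~{peak_gflops:.0f} GFLOPS")  # ~499 GFLOPS
```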

The only issue I see with it would be memory bandwidth. I guess they'd need to do something custom with the memory controller to handle GDDR5.

The memory bandwidth could be mitigated by stacked memory (something Intel is actively working on). That being said: Haswell would make a HORRIBLE next gen console. Intel charges a premium because part of their business is aggressive fab technology, so it wouldn't be positioned well in terms of price; Haswell, while having a high IPC, is also quite large; and you would be one of the first people slamming the graphics, especially for a product aimed at an 8-10 year shelf life. Heck, you could probably walk out the door with a Pitcairn and a quad core AMD CPU for the same costs. From that perspective Haswell is a horrible next gen console suggestion unless there is some magical reason you are wanting to cut TDP to half of last gen's products which, unless you are Nintendo, I don't see why you would need to do that.

Hm, I'm far from an expert at these things, but chip cost ought to be more like quadruple rather than double for a chip twice the size. Perhaps not QUITE that high, but certainly way more than a strictly linear relationship.

It is really dependent on yields for the most part. You lose proportionally more dies at the wafer edge when going with a larger die (e.g. on a 300mm wafer a 250mm^2 die fits about 221 candidates and a 125mm^2 die fits about 473), so you gain some there, and then there is redundancy which, if you look at Chef's post, he actually accounts for by adding in a 50% extra margin and jumping it up from $80 to $120. And for all the hand wringing Nvidia did about TSMC's 28nm, I think people are skipping over the similar positive press notes by companies like AMD. Yes, new nodes are getting more expensive and slower to roll out. This has been true for a long time but I wouldn't take one company's political consternation as a doomsday prediction when other companies have adapted much better.
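To put a rough number on that super-linear relationship, here's a minimal sketch; the wafer cost, defect density, and yield model are illustrative assumptions on my part, and the die-per-wafer counts won't exactly match the 221/473 figures above since those depend on die aspect ratio and edge exclusion:

```python
import math

def dies_per_wafer(die_mm2, wafer_mm=300):
    # Common approximation: gross wafer area divided by die area, minus edge loss.
    r = wafer_mm / 2
    return math.pi * r**2 / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2)

def cost_per_good_die(die_mm2, wafer_cost=5000, defects_per_cm2=0.4):
    # Simple Poisson yield model; wafer cost and defect density are illustrative assumptions.
    yield_frac = math.exp(-defects_per_cm2 * die_mm2 / 100)
    return wafer_cost / (dies_per_wafer(die_mm2) * yield_frac)

small, large = cost_per_good_die(125), cost_per_good_die(250)
print(f"125mm2: ~${small:.0f}  250mm2: ~${large:.0f}  ratio: {large/small:.1f}x")
# With these assumptions, doubling the die area roughly 3.5x's the cost per good die.
```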
 
This has been true for a long time but I wouldn't take one company's political consternation as a doomsday prediction when other companies have adapted much better.
Can we just be clear that it isn't a 'doomsday prediction' but a design consideration. Shrinking of chips is becoming more costly, making it something to factor in when planning a launch configuration and planning price-points and long term scaling.
 
Can we just be clear that it isn't a 'doomsday prediction' but a design consideration. Shrinking of chips is becoming more costly, making it something to factor in when planning a launch configuration and planning price-points and long term scaling.

I thought I was already clear ;)

me said:
Yes, new nodes are getting more expensive and slower to roll out. This has been true for a long time but I wouldn't take one company's political consternation as a doomsday prediction when other companies have adapted much better.

It certainly is a design consideration--but so is the abandonment of 5 year life cycles and screaming down to the $99 price point as fast as possible. The market has expanded in number as well as revenue streams (console advertising, paid online, DLC, media streaming, retail price hikes, etc).

Those are all factors in predicting the hardware.

Another big factor not to overlook is that one company so completely @#$%@# up their lunch in terms of BOM costs and has essentially run their company so deep into the red they cannot be reliably leaned on to launch a traditional console (not impossible, but it is surely not certain). Another competitor, already seeing their green grow in the Disney-family-friendly alternative gaming-toy market, ditched the technologically centric design philosophy. Which leaves only one device maker who lost their shirt with their first console and, due to a major #@$@##$ up in cooling design (and probably metal layers ala bump gate so I wouldn't pin it all on cooling), ate serious bad PR crow and lost over $1B, which hurt platform profits and market positioning. That, in conjunction with diminishing returns on visuals, lack of competitive pressure, and PR folks wringing their hands with an eye on bigger prizes (broad market media distribution, alternative ubiquitous interfaces), has really taken the wind out of their sails, so the table is being set for traditional budgets being "impossible" instead of a more realistic "we really want to nickel and dime you to death with DLC" ;)

The market is completely different (I predict MS saw how the Waggle Wii was disruptive and are hoping, core gamers be damned, there is enough lightning in the Kinect bottle to repeat it) and there are *major* issues on the chip front, but what people are ignoring is that there are two sides to this coin of costs/revenues, and amazingly board makers are able to buy chips, boards, memory, etc. and consistently turn out new GPUs and whatnot at competitive prices from generation to generation (this last go around seems a little different for many reasons, one being NV essentially let AMD set the field and they have reaped at the high end and had little to compete in the middle markets; both seem to be enjoying a price war siesta.)

Anyways, I get the feeling there are those in certain people's ears (ahem, PM inboxes) setting the table, framing expectations, and directing the discussion in terms of what is possible and what should be expected. I think there is a big demarcation between such, as well as the balance of costs/revenues and big picture corporate goals. I understand the latter, and they are totally valid in a prediction, but they don't dictate the parameters of the technical side.
 
...and this was last year as soon as GF was ready to produce 32nm.

We're talking about 28nm ... late NEXT YEAR. This is the same process that was producing retail product at the tail end of 2011. So two years on from that point, I'm pretty sure they'll be able to come to grips with a reasonable price, production volume, and yield.
GF's 28nm shares some of its pedigree with the 32nm SOI process, but they are not the same.
Hopefully its entry into the market won't be marked by Llano's lack of success.
Llano is being quickly deprecated in favor of its redesigned successor, which is slightly larger but is targeting the same or higher price bands.
Apparently, there isn't much desire to produce Llano this year priced any cheaper.

And speaking of 28nm ... as I pointed out before the Llano post, HD7750 retails for $110 ... with a heatsink, 1GB GDDR5, a board, retail packaging, shipping, retail margin, and margin for AMD.

So out of that budget, how much could the 123mm2 28nm chip possibly cost for TSMC to make?
I don't have the sources to vet the BOM of AMD's low-end discrete components.
Some comments by those on this board who are better positioned when analyzing other GPU board breakdowns seemed to hint that some things like the PCB and RAM may be cheaper.

I don't have a breakdown for the consoles either, although some of the speculation from 2009 had the CPU and GPU being allocated about half of the $120 ceiling you posit as acceptable.
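For what it's worth, here is a minimal sketch of what that 123mm2 die might cost per good chip; the 28nm wafer price and yield below are assumptions on my part, not known TSMC figures:

```python
import math

def dies_per_wafer(die_mm2, wafer_mm=300):
    # Common approximation: gross wafer area divided by die area, minus edge loss.
    r = wafer_mm / 2
    return math.pi * r**2 / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2)

wafer_cost = 5000     # assumed 28nm wafer price (USD)
yield_frac = 0.8      # assumed yield for a small die on a maturing process
candidates = dies_per_wafer(123)
per_die = wafer_cost / (candidates * yield_frac)
print(f"~{candidates:.0f} candidates per wafer, ~${per_die:.0f} per good die")
# Roughly $10-15 per chip under these assumptions, leaving plenty of room in a $110 retail price.
```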
 
Which leaves only one device maker who lost their shirt with their first console and, due to a major #@$@##$ up in cooling design (and probably metal layers ala bump gate so I wouldn't pin it all on cooling)
Also, a board designed for leaded solder does not work as well with lead-free solder.
 
That's true lol. I think the 1 TFLOP mark is the new magic number for console hardware makers! "Our console can do more than 1 TFLOPS of calculations! It is a monster...."
The 1 TFLOP marketing battle happened already with the 360 and ps3. They both have over 1 TFLOP of processing power. It's just not all programmable.
 
Essentially the traditional gaming console is dead? Sounds fine to me. Mobile devices are taking over anyways. The PC can survive, yes, but as a gaming platform in the high end and mainstream sense, it may go with the gaming console. That all really depends on the actions and success of ARM, x86, and the OSs involved on both processing platforms.
 
Essentially the traditional gaming console is dead? Sounds fine to me. Mobile devices are taking over anyways. The PC can survive, yes, but as a gaming platform in the high end and mainstream sense, it may go with the gaming console. That all really depends on the actions and success of ARM, x86, and the OSs involved on both processing platforms.


Not really. Traditional consoles should still be successful because they will allow gaming experiences you cannot get anywhere else. And these will be great gaming experiences, and many of them, not some stupid Flash game meant to waste 10 minutes on. Plus consoles have adapted to taking on more tasks the living room requires.

I do not see mobile devices taking over, more like expanding the gaming market to new audiences. We will still have big screen HDTVs that only true console gaming will be able to do justice to.
 
The memory bandwidth could be mitigated by stacked memory (something Intel is actively working on). That being said: Haswell would make a HORRIBLE next gen console. Intel charges a premium because part of their business is aggressive fab technology, so it wouldn't be positioned well in terms of price; Haswell, while having a high IPC, is also quite large; and you would be one of the first people slamming the graphics, especially for a product aimed at an 8-10 year shelf life. Heck, you could probably walk out the door with a Pitcairn and a quad core AMD CPU for the same costs. From that perspective Haswell is a horrible next gen console suggestion unless there is some magical reason you are wanting to cut TDP to half of last gen's products which, unless you are Nintendo, I don't see why you would need to do that.

Haswell is quite large? Ivy is only 160mm2, which is tiny compared with the CPU+GPU footprint of the current generation consoles. I can't see Haswell changing that significantly.

That said, I do agree that Haswell wouldn't make an ideal single chip console, both from a cost and performance perspective. I was only saying that it would make a decent next gen console in a technical sense, i.e. given the rumors of 6x this gen (which I personally don't believe anyway), Haswell wouldn't be far off that by itself.

Obviously from a performance perspective there are better options that would still fit within console TDP limits.
 
You can't count the CPU+GPU for 360 separately from the eDRAM; since the rasterizers are located on the eDRAM die the GPU is effectively split into a two-chip solution...
 
You can't count the CPU+GPU for 360 separately from the eDRAM; since the rasterizers are located on the eDRAM die the GPU is effectively split into a two-chip solution...

Only the ROPs are on the EDRAM. The EDRAM was 64mm² in 80nm; it would be 20-ish mm² on 45nm. While the EDRAM adds to the total silicon real estate, you need to consider the advantages it brings: if you removed it, you'd need an extra 128 bit bus with more IO cells and more SRAM for buffering on-chip.
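A quick ideal-scaling check on that 64mm² to 20-ish mm² figure (a sketch only; real shrinks rarely achieve the full square-of-feature-size factor):

```python
# Ideal area scaling with the square of the linear feature size.
area_80nm = 64.0                          # mm^2 at 80nm, per the post above
area_45nm = area_80nm * (45 / 80) ** 2    # ideal shrink factor
print(f"Ideal 45nm area: ~{area_45nm:.1f} mm^2")  # ~20.3 mm^2, in line with "20-ish"
```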

Even with EDRAM the complete CPU+GPU+EDRAM should only be around 50-60 mm² on 22nm.

And while the CPU part of Haswell would be more than enough for a next gen. console, the GPU wouldn't be, IMO.

Cheers
 
Combined CPU+GPU in current X360 is apparently 168mm2 @45nm. The eDRAM is separate though.

http://forum.beyond3d.com/showpost.php?p=1447163&postcount=239

I was comparing it to the launch die sizes of both consoles, i.e. how viable it would be from a die size perspective to include in a modern console. Ivy Bridge is actually smaller today than Xenon was when it launched. And it probably has similar power draw. So purely as the CPU component of a new console it (or even more so Haswell) would make a lot of sense if not for the rip-off price Intel would likely charge for it. I agree with the others that the GPU would be a little weak for next gen consoles - although still powerful enough to make it a "next gen" platform compared to the current consoles - but just imagine what it could do if it only had to handle CPU type tasks. Imagine dedicating that entire GPU (potentially 3 or 4x Xenos) to physics and still having a quad Haswell with 500GFLOPS of AVX2 to crunch through the rest of your CPU workload.
 
Another graphics engine, Luminous, from Square-Enix.

- DX11 based
- for Nextgen hardware

http://www.youtube.com/watch?v=UVX0OUO9ptU

 
Only the ROPs are on the EDRAM. The EDRAM was 64mm² in 80nm; it would be 20-ish mm² on 45nm. While the EDRAM adds to the total silicon real estate, you need to consider the advantages it brings: if you removed it, you'd need an extra 128 bit bus with more IO cells and more SRAM for buffering on-chip.

Even with EDRAM the complete CPU+GPU+EDRAM should only be around 50-60 mm² on 22nm.

And while the CPU part of Haswell would be more than enough for a next gen. console, the GPU wouldn't be, IMO.

Cheers

EDRAM seems to lag at least a node behind (e.g. 80nm when the X360 CPU/GPU were 65nm, and so on), so an all-22nm die is a moot point, seemingly. Currently in Valhalla it's probably 65nm (while CPU/GPU are 45nm), still a long way off from 22nm. It also seems difficult to integrate (e.g. why it's still a separate die in the newest 360). Those are negatives in its cost structure.

I have to wonder if MS has run any numbers and decided EDRAM ended up more costly in practice than it was worth. The fact it's still a significant, separate die must be costly. But just the same, maybe those numbers show it was worth it, I don't know.

If the rumors of MS going with an extremely large amount of DDR3 for Durango are true, you'd have to assume EDRAM will be there again.
 
While the EDRAM adds to the total silicon real estate, you need to consider the advantages it brings: if you removed it, you'd need an extra 128 bit bus with more IO cells and more SRAM for buffering on-chip.
Oh, I'm more than aware of the advantages of eDRAM. I love eDRAM. Hearing Wuu would retain eDRAM from its predecessor totally made my day... :)
 
EDRAM seems to lag at least a node behind (e.g. 80nm when the X360 CPU/GPU were 65nm, and so on), so an all-22nm die is a moot point, seemingly. Currently in Valhalla it's probably 65nm (while CPU/GPU are 45nm), still a long way off from 22nm. It also seems difficult to integrate (e.g. why it's still a separate die in the newest 360). Those are negatives in its cost structure.

I was just trying to compare state of the art Intel CPUs with 360 core logic on a process normalized basis.

If the EDRAM is still 65nm, it will be super cheap. Process equipment is normally fully depreciated (and thus amortized) in five years. That's a huge chunk of capital costs gone from producing 65nm chips. I'd expect the EDRAM dies to be a few bucks a piece.

I have to wonder if MS has run any numbers and decided EDRAM ended up more costly in practice than it was worth. The fact it's still a significant, separate die must be costly.

Considering they saved an extra memory bus while maintaining fillrate superiority, it has been a huge success, IMO.

Cheers
 
If 65nm was cheaper, they would leave the CPU/GPU on it. Also, if it was not there they would be down to one die instead of two.

It seems to me EDRAM might have been more cost favorable early on than late in its life. That's why I wonder what the total picture looks like now that it's still on a separate die 7 years in.

I'm not convinced the EDRAM was a good deal; let's leave it at that.
 