Predict: The Next Generation Console Tech

Sure, the larger the wafer, the less wastage for larger chips, but at what cost? There's a simple reason why it's taking so long: it's simply not a trivial thing to manufacture. The whole issue surrounding 450mm adoption is the trade-off against the sheer difficulty of producing an ingot that large.

Ultimately, it just means it won't be cheap, and it'll be a very long time before the fabs pass on such cost savings, just like any new fab process (hell, TSMC won't have it available until 2013-2014, and that's not even close to replacing 300mm wafer usage). I wouldn't count on it being a "saviour" or cost-friendly for the time frames we are currently interested in.

As for it facilitating larger chips, I'd say you have other problems to deal with than worrying about more efficient use of silicon wafers. Big chips are still costly to make just by nature, and typically very power hungry given the incredible number of transistors you'd be packing in there. And who knows how much of the area goes to waste anyway, since you'll want to design in a lot of redundancy so that your huge chip doesn't fail. It just doesn't make sense for a console design.

450mm, again, is not some magical saviour.
 
Wasn't a formula given a while back that indicated a small drop in clocks would yield a drastic drop in energy consumption?

Recall
Power consumption scales with frequency cubed; a 20% reduction should result in around half the power consumption. I could see both MS and Sony go for largish dies but clocked lower than their PC counterparts. The cost of die area falls continuously; the cost of a given cooling solution does not.

An initial size of 250 mm² for the GPU shouldn't be a problem; over a console's lifetime (~7 years) it should see two shrinks. Even if new process nodes fail to materialize, the cost of producing at the same node will continue to fall as capital costs are amortized.

I expect around 500 mm² for CPU/GPU/eDRAM; the PS2, the PS3 and the 360 were all around that at launch.

Cheers

Data from the past...

Playstation 2
Emotion Engine size 225 mm2 .25micron/250nm process (wikipedia entry suggests EE may be slightly bigger, iirc)
Graphics Synthesizer 279 mm2 .25micron/250nm process
Power consumption
if the GH-003 based SCPH-18000 is the same hardware-wise as the 10k and 15k (other than that small extra board), why did Sony downgrade the rated wattage on the case? It went from 50 watts in the prior two versions to only 48 watts in the 18k. - Elijah

Playstation 3
Cell B.E. size 235.48 mm2 90nm process
Reality Synthesizer size 258 mm2 90nm process
Power supply 380 W
Consumption while gaming 195-209 Watts

While the cost benefits of 450mm wafers may take some time to be passed on, it is my understanding the industry has decided to move forward with them, and with 2013-2014 console launches it may affect the consoles later in their lifespan, during transitions to smaller processes.

500mm^2 total combined seems fair game at least.

EDIT:
An 800W power supply can carry the latest and greatest single-card PC hardware; if a 20% clock drop halves power consumption, a 400W power supply should suffice for a ~20% drop in performance.
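
To spell the arithmetic out, here's a minimal sketch (assuming dynamic power P ≈ C·V²·f with the voltage scaled along with frequency, so power goes roughly as f³; treating the 800W PSU rating as if it were actual draw is generous and purely for illustration):

```python
# Sketch only: assumes dynamic power P = C * V^2 * f and that voltage is
# scaled in proportion to frequency, so P ~ f^3. Real chips also have
# static/leakage power, so this is an upper bound on the savings.

def scaled_power(base_power_w, clock_scale, voltage_tracks_clock=True):
    """Estimate power after scaling clocks by clock_scale (0.8 = -20%)."""
    if voltage_tracks_clock:
        return base_power_w * clock_scale ** 3   # P ~ f * V^2 with V ~ f
    return base_power_w * clock_scale            # fixed voltage: only the f term

print(0.8 ** 3)                # 0.512 -> a 20% clock cut roughly halves power
print(scaled_power(800, 0.8))  # ~410 W: the "800 W rig in a 400 W budget" claim
```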
 

There's an interesting major factor in those stats. Power consumption from PS2 --> PS3 increased by ~8x to achieve its new performance. The first-gen PS3 was about as big and as hot as a console can really get; there's no 8x power difference available this time. All your new performance has to fit in the same or a smaller power envelope.
 

80% of the performance of a GeForce 580: how would that compare to current console performance?

And that is the worst case, as AMD tends to be more energy efficient, and the PowerVR thread hinted that 8-10x is viable within these energy constraints, given advances in mobile GPUs.
 

Thank you.

I've been saying this 500mm2 die size for a while now.

Though I disagree on the need for a 400watt power supply.

Current high end GPU HD 7970 draws 250w @ 925MHz.

Dropping the frequency by 20% (740MHz) gives us 125w.

Matching the Tahiti core with a Cell x2 or Xenon x2 @ 28nm (no increase in frequency aside from a "turbo core" mode) should do the trick. This chip at 28nm should draw <40 watts.

Another 10 watts for the optical drive, RAM, & misc brings the total draw to 175 watts.

125w GPU
40w CPU
10w misc

Factor in the power inefficiencies of the PSU, and this puts the need at around 220 watts. 250 watts to be safe.

Of course, the speed of these components may be adjusted to account for a specific power envelope, but the above is reasonable.

Component size:

352mm2 GPU
120mm2 CPU

472mm2 total
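
To show the working behind those numbers, a quick roll-up (all inputs are the assumptions above, plus an assumed ~80% PSU efficiency, which isn't stated anywhere):

```python
# Roll-up of the figures above (all assumptions, not measurements):
# 125 W GPU, 40 W CPU, 10 W misc, and an assumed ~80% efficient PSU.

gpu_w, cpu_w, misc_w = 125, 40, 10
dc_load_w = gpu_w + cpu_w + misc_w        # 175 W at the DC rails

psu_efficiency = 0.80                     # assumption; consoles aren't 90%+ units
wall_draw_w = dc_load_w / psu_efficiency  # ~219 W from the wall -> "~220 W, 250 W to be safe"

die_area_mm2 = 352 + 120                  # GPU + CPU -> 472 mm^2 combined

print(dc_load_w, round(wall_draw_w), die_area_mm2)  # 175 219 472
```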
 
I'm fairly certain you can safely at least double the "misc" power usage. A BD reader alone should take at least 10W, probably more. Also, I don't think that a fast interconnect between CPU and GPU is all that nice for power usage.
 
While the cost benefits of 450mm wafers may take some time to be passed on, it is my understanding the industry has decided to move forward with them, and with 2013-2014 console launches it may affect the consoles later in their lifespan, during transitions to smaller processes.

That makes it completely irrelevant for the design decisions. The upfront costs would be quite enormous, and there'd be huge uncertainty and risk as to when the cost savings would be passed down. Hoping for it to happen years later makes no business sense.
 
I'm fairly certain you can safely at least double the "misc" power usage. A BD reader alone should take at least 10W, probably more. Also, I don't think that a fast interconnect between CPU and GPU is all that nice for power usage.

As I pointed out pages back:

The ENTIRE power envelope for a netbook is 30 watts
(screen, CPU, GPU, RAM, north & south bridge, HDD, USB power for external optical drives): 19V x 1.58A = 30 watts.

For the discussion of a console, the TDP of "other" is fairly irrelevant.

Increase it from 10w to 20w. Doesn't change the thermal engineering of the box. Bump up the PSU a bit and case closed.
 
Current high end GPU HD 7970 draws 250w @ 925MHz.

Dropping the frequency by 20% (740MHz) gives us 125w.

Just wondering how you came to that conclusion.

The Kinect transformer outputs ~13W btw. Not that it'd put much heat inside a console that supplied power to the Kinect sensor itself, but it would add to the load on the main PSU.

I really hope the next Xbox doesn't have an actively cooled power brick. That's a bit lame on the 360 tbh.
 
Just wondering how you came to that conclusion.

Gubbi's quote above.

I don't care one way or the other about the power brick having a fan or not.

What's lame is gimping the hardware in the guts of the machine, which limits the onscreen experience.

If a fan in the PSU means a Tahiti instead of a Turks, mark me down in favor of the fan in the PSU.
 
Current high end GPU HD 7970 draws 250w @ 925MHz.

Dropping the frequency by 20% (740MHz) gives us 125w.
How about instead using the HD 7950? It's smaller (or can be, if you actually produce a chip with fewer units instead of locking some out), roughly 30% slower, and uses only a little bit more than half of its bigger brother's power.

Basically, I'd say in the end it's cheaper to make a smaller but faster-clocked chip than a big, lower-clocked one.
 
How about instead using the HD 7950? It's smaller (or can be, if you actually produce a chip with fewer units instead of locking some out), roughly 30% slower, and uses only a little bit more than half of its bigger brother's power.
It's the same chip, just clocked lower, undervolted, and with parts fused off.
 
Gubbi's quote above.

He says power consumption scales with frequency cubed, which fits pretty well with what I see on my laptop, and that would put a 250W chip at 80% speed at around 160W unless I dun goofed with my maths. If it's a full board you're looking at then power might not scale as it would for a processor alone.

If you're feeding the CPU from the same memory as the GPU then you might not be able to scale that GDDR5 speed back by 20%.
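
As a rough sketch of that chip-versus-board distinction, assume some of a 250W board's power (memory, VRM losses, fan) doesn't scale with GPU clocks at all; the 60W split below is illustrative, not a measured figure for any particular card:

```python
# Sketch: only the GPU's dynamic power follows the cube law; memory, VRM
# losses, fan, etc. stay roughly fixed. The 60 W fixed share is an assumed,
# illustrative split of a 250 W board, not data for any particular card.

board_power_w = 250.0
fixed_w = 60.0                       # assumed non-scaling share (memory, VRM, fan)
chip_w = board_power_w - fixed_w     # share that tracks clocks and voltage

scale = 0.8                          # -20% clocks, voltage scaled with them
scaled_board_w = fixed_w + chip_w * scale ** 3

print(round(scaled_board_w))         # ~157 W: well above the naive 250 * 0.8^3 = 128 W
```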

I don't care one way or the other about the power brick having a fan or not.

What's lame is gimping the hardware in the guts of the machine, which limits the onscreen experience.

If a fan in the PSU means a Tahiti instead of a Turks, mark me down in favor of the fan in the PSU.

Consoles are always gimped in a number of ways. Not being able to leave the power supply where it might get dusty is super lame and somewhat gimpy. Put the PSU in the system (and add the cooling requirements to the rest of the system) or use a passively cooled brick. And I don't want a console as big as the launch PS3.

A "gimped" console that you buy is infinitely more performant than the console you don't buy.
 
It's the same chip, just clocked lower, undervolted, and with parts fused off.
I know, but my point was that instead of taking a huge high-end chip and downclocking it, just make it a little bit smaller and run it somewhat faster. Yes, power usage won't drop as drastically, but you'll make up the cost difference from the increased chip count per wafer.
 
It's also better to not gimp clocks too much when taking triangle setup rate into account. :p

I really hope the next Xbox doesn't have an actively cooled power brick. That's a bit lame on the 360 tbh.

PSUs have to be cooled somehow. It won't get any better if people get their wish for 300W on day 1. :p

Put the PSU in the system (and add the cooling requirements to the rest of the system) or use a passively cooled brick. And I don't want a console as big as the launch PS3.
Seems rather contradictory to me. The whole point of the external brick is so that you can reduce the size of the console (despite the internal PSU being smaller than external), or at least not factor it into the cooling design as well.

Making the external brick passive would mean a pretty big jump in heatsink size too, no? i.e. just adds to the shipping weight. At any rate, it's no small concern when there are all sorts of costs to consider!

edit:

lol http://k8.dk/Projekter/XBOX360%20PSU%20Project.htm

This mod seems a bit extreme, and I'm sure with that much metal it ought to be cool to touch after a while, but I'd be curious what the TDP of the PSU actually is so that you could find a more suitable heatsink. Ultimately, it's going to be enclosed within a non-conductive shell, so passive cooling may simply not be feasible.
 
Sorry to interrupt the party, but power consumption in the real world doesn't seem to follow that formula.
Take a look either here or here.
Basically, ceteris paribus (same voltage), with ~19% lower frequency the HD 6570 consumes about the same as the HD 6670.
The HD 6850 does just a bit better, but it's interesting to look at the voltages for the XFX card: if the voltage used is the same, the power consumption is basically the same as the HD 6870.

I don't expect Gubbi to say BS; I believe the formula simply doesn't take leakage/static power into account at all (and by the way, I don't think Gubbi stated otherwise, nor that he ignores that fact).

Depending on how good the chips are, one could be in AMD's situation with the HD 6570 and keep the voltage too high for almost all parts, only to make sure that the worst 15% work properly (a random number; AMD won't communicate on this).
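
To illustrate with made-up numbers: split power into a static/leakage floor plus a dynamic part, P ≈ P_static + C·V²·f. Dropping only the frequency, with the voltage left where it is (as with those review cards), trims just the dynamic share, and only linearly:

```python
# Made-up numbers to illustrate the leakage point: split power into a static
# floor plus a dynamic part, P = P_static + P_dynamic * (V/V0)^2 * (f/f0).
# Dropping frequency ~19% at the SAME voltage barely moves the total.

def total_power(f_scale, v_scale, dynamic_w=80.0, static_w=20.0):
    """Total power for relative frequency f_scale and relative voltage v_scale."""
    return static_w + dynamic_w * (v_scale ** 2) * f_scale

print(round(total_power(0.81, 1.00)))  # ~85 W: same voltage, -19% clocks
print(round(total_power(0.81, 0.81)))  # ~63 W: voltage dropped along with clocks
```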
 

Those are low performance parts.

Have you noticed that the higher you try to clock (when overclocking a CPU/GPU), the power necessary to hit those clocks starts to ramp up far quicker?

With flagship GPUs, AMD/Nvidia are vying for the performance crown and pushing the chips to their limit (within reason considering warranty etc).

So there are a lot of power savings to be had by clocking these chips lower.

Tahiti might not be the chip, and 740MHz might not be the speed, but that's the ballpark.
________________________________

BTW, please don't ever mention a 6570 or 6670 in this thread ever again. :p
 

Not to mention that custom heatsinks are expensive. If you go with standard heatsinks, they don't cost too much, but if you get into custom designs with heat piping etc. then it will have a noticeable impact on your BOM. What we sell at work isn't at the same volume as these consumer devices, but I see heatsinks that range in price from $5-100. Put in more expensive hardware with a hotter thermal profile and you're forced to put in a more expensive cooling solution. Everyone wants a high-powered console, but nobody wants a repeat of the original 360 in terms of noise and thermal issues (red ring). I find it unlikely that they'd go much higher than 200W for the next-gen consoles because of price, stability and noise.
 
PSUs have to be cooled somehow. It won't get any better if people get their wish for 300W on day 1. :p

Anything that needs active cooling should be in the case IMO.

Seems rather contradictory to me. The whole point of the external brick is so that you can reduce the size of the console (despite the internal PSU being smaller than external), or at least not factor it into the cooling design as well.

Yeah, and as a result my Falcon's power supply sits next to it on the shelf under the telly, with its fat grey cable waving at everyone who comes into the room. So instead of an internal power supply making the system a little bigger I have a huge power shed sat next to the console taking up a PS3 sized amount of space when you take both the console and the PSU into account.

Keep the power requirement low enough to use a passive brick, Nintendo style, or build it in, Sony style. That beady amber eye always shining out makes it even harder to pretend the power brick isn't there.

edit:

lol http://k8.dk/Projekter/XBOX360%20PSU%20Project.htm

This mod seems a bit extreme, and I'm sure with that much metal it ought to be cool to touch after a while, but I'd be curious what the TDP of the PSU actually is so that you could find a more suitable heatsink. Ultimately, it's going to be enclosed within a non-conductive shell, so passive cooling may simply not be feasible.

Yeah, would be good to know how efficient the 360's PSU was (and for a couple of reasons actually). It could be that it's generating more heat than the entire Wii. Next time I'm on my 360 I'll try and make a note of how much air the PSU seems to move...
 