Predict: The Next Generation Console Tech

2.64 billion transistors for the Radeon HD 6950 nets a 200W TDP on 40nm.
~3 billion transistors for the GTX 560 nets a 210W TDP on 40nm.

TSMC and GF are claiming roughly a 50% reduction in TDP over 40nm.

That would put the power draw of these chips at roughly 100W on 28nm HKMG.
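
A quick back-of-the-envelope sketch in Python (assuming, optimistically, that the claimed ~50% reduction applies straight across at the same clocks):

[code]
# Rough TDP projection for a 40nm -> 28nm HKMG shrink,
# using the ~50% power reduction TSMC/GF are claiming.
SHRINK_FACTOR = 0.5  # claimed TDP reduction at unchanged clocks

chips_40nm = {
    "Radeon HD 6950": 200,  # watts TDP on 40nm
    "GTX 560": 210,
}

for name, tdp in chips_40nm.items():
    print(f"{name}: {tdp}W @ 40nm -> ~{tdp * SHRINK_FACTOR:.0f}W @ 28nm")
[/code]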



The launch Xbox 360 had a 203W power supply. Power draw of the launch PS3 was roughly 209W.

What power draw are you expecting for xb720 & ps4?



http://www.dailytech.com/Nextgeneration+28nm+GPUs+Could+Be+45+Percent+Faster/article23158.htm

http://www.globalfoundries.com/newsletter/28nm_frontier.aspx

Hi, I'm new here. Hope you don't mind me jumping in.

I am not positive on this, but I believe at PS3 launch the Cell was at 50-60W and the RSX at 60-70W, so roughly 110-130W for both. The rest was taken up by the HDD, wifi, bluetooth, memory, southbridge, EE+GS, etc.

I doubt Sony will go for a ~100W GPU alone. It is just too much IMO. But who knows!
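
For what it's worth, here is how those recollected figures add up (none of these are official numbers):

[code]
# Launch PS3 power breakdown from the recollected figures above.
cell_w = (50, 60)     # Cell CPU estimate, watts
rsx_w = (60, 70)      # RSX GPU estimate, watts
total_w = 209         # rough wall draw of a launch PS3

low = cell_w[0] + rsx_w[0]
high = cell_w[1] + rsx_w[1]
print(f"Cell + RSX: {low}-{high}W")
print(f"Everything else (HDD, wifi, BT, RAM, EE+GS...): "
      f"~{total_w - high}-{total_w - low}W, including PSU losses")
[/code]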
 
Remember though, when looking at a current GPU card, that its wattage also includes other things besides just the GPU chip. A Barts Pro 6850 board's 127W peak doesn't equal 127W in a console variant. With a node shrink, downclocking, and other things stripped out or not counted, you could certainly be in the 50-60W-or-less range.
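
A minimal sketch of that argument; the overhead and scaling factors here are my guesses, not measurements:

[code]
# Board power vs. bare-chip power for a Barts Pro 6850 (guessed overheads).
board_peak_w = 127    # watts, whole card at peak
memory_w = 12         # GDDR5 on the card, guess
board_misc_w = 15     # VRM losses, fan, misc board power, guess

chip_w = board_peak_w - memory_w - board_misc_w   # ~100W for the chip alone

node_shrink = 0.55    # 40nm -> 28nm power scaling, guess
downclock = 0.85      # mild console downclock, guess
print(f"Console variant estimate: ~{chip_w * node_shrink * downclock:.0f}W")
[/code]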
 
Sure, it is possible to do a ~200W console just like the PS3/360; I just don't think they will this time.

The cost to do the above is just too much in my opinion. Look at the launch PS3: it had a very impressive, and no doubt expensive, cooling system which rivalled high-end PC GPU solutions.

Another thing that isn't mentioned often enough is the rest of the system besides the GPU/CPU. I think it approaches 1/3 of the total power draw of the PS3 Slim? (I could be very wrong on this!)
 
Well, a CPU/GPU in the 110W range and "other" in the 40-50W range doesn't seem too unreasonable. The 40-50W of "other" doesn't need dedicated cooling per se, just removal of the ambient heat from the case.
 
1.1A × 12V ≈ 13W
(power draw of a 12X BRD burner)
http://www.geeks.com/details.asp?invtid=BDR-206BK-DO-R&cat=DVD

Netbook total system power draw (display, CPU, GPU, RAM, north & south bridge, HDD, USB power for external optical drives): 19V × 1.58A = 30W. Food for thought.

This notion of 50 watts for "other" ... is ridiculous.


And as upnorthsox pointed out, the cooling required for an optical drive isn't a huge concern.


Summary Power Budget:

210W × 80% (power supply efficiency) = 168W

So 100W for the GPU budget and 13W for the BRD leaves plenty (55W) for the CPU, RAM, a cooling fan, and a watt or two for USB.
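
The same budget, worked through step by step (all figures as assumed above):

[code]
# Power budget sketch using the figures above.
psu_rating_w = 210
efficiency = 0.80                      # assumed PSU efficiency
usable_w = psu_rating_w * efficiency   # 168W available to components

gpu_w = 100                 # GPU budget from the top of the thread
brd_w = round(1.1 * 12)     # 12X BRD burner: 1.1A x 12V ~= 13W

left_w = usable_w - gpu_w - brd_w
print(f"Usable: {usable_w:.0f}W; left for CPU/RAM/fan/USB: {left_w:.0f}W")
[/code]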
 
Yeah, the 50W is too high after thinking about it, but I don't see 30W as being unreasonable for "other". The high-clocked GDDR5 itself will be in the 10-15W range.

As to a 100W GPU, I'd disagree, mostly because the most likely variants are not in that range.

If they went with a VLIW GPU, then you'd be looking at a Barts 6850 variant. At 28nm that should be in the 60W range.

If they went with a GCN GPU, then you'd be looking at a Thames 7850 variant coming in around 80W?

A good discussion would be VLIW vs GCN.

As I've said before, I don't believe there'll be any significant difference between Sony's and MS's GPUs.

I don't believe any likely CPU will be under 40W, and it could go as high as 70W.

Those #'s should fit cozily in a 168W limit.
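
Tallying those ranges against the 168W ceiling (every figure is an estimate from this post, not a spec):

[code]
# Do these estimates fit in a 168W budget?
budget_w = 168

parts = {                # (low, high) watt estimates from above
    "GPU": (60, 80),     # VLIW Barts-class vs. GCN Thames-class
    "CPU": (40, 70),
    "other": (30, 30),   # RAM (10-15W GDDR5), drives, fans, USB
}

low = sum(lo for lo, hi in parts.values())
high = sum(hi for lo, hi in parts.values())
# 130-180W: fits comfortably unless both CPU and GPU land at the top end.
print(f"Total: {low}-{high}W against a {budget_w}W limit")
[/code]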
 
Given how forward-looking the last ATI GPU was for MS, and how it paid off for them, I would be surprised to see them not use AMD's latest and greatest.

GCN is almost a certainty.

With the GPU taking more of the workload off of the CPU, I expect the CPU budget to be even smaller than the last go-round, which was already relatively small at ~33%:

165M transistors: XCPU (Xenon)
332M (232M + 100M EDRAM) transistors: XGPU (Xenos)

This go-round, I'd expect roughly a 25% budget for the CPU and 75% for the GPU (~150W total).

~38W for the CPU
~112W for the GPU
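
Worked out (the last-gen figures are transistor counts in millions; the ~150W combined budget is the assumption above):

[code]
# Last gen: CPU share of the combined transistor budget (millions).
xcpu_mt = 165            # Xenon
xgpu_mt = 232 + 100      # Xenos parent die + EDRAM daughter die
print(f"Xenon share: {xcpu_mt / (xcpu_mt + xgpu_mt):.0%}")  # ~33%

# Next gen: apply a 25/75 split to an assumed ~150W CPU+GPU budget.
total_w = 150
print(f"CPU ~{total_w * 0.25:.0f}W, GPU ~{total_w * 0.75:.0f}W")
[/code]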

*****************************

Wattage comparison:
Sandy Bridge Core i7
2640M
2.8-3.5GHz
35Watts on 32nm

Granted, these are binned parts for the mobile market, but these are also 32nm chips, not 28nm.
 
The high-clocked GDDR5 itself will be in the 10-15W range.

Word is that if it's GCN, they have the option to use XDR2:

http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fnews.ati-forum.de%2Findex.php%2Fnews%2F34-amdati-grafikkarten%2F2170-rambus-amds-neue-geheimwaffe-gegen-nvidia

This would provide much needed bandwidth with fewer traces on the board.

Perfect for a console.

And the flexibility of the memory controller to use either GDDR5 or XDR2 is also right in line with MS/Sony not having a roadblock to production for their next-gen consoles.
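
To put rough numbers on the trace-count point (the per-pin rates are assumptions: ~5 Gbps for typical 2011 GDDR5, ~12.8 Gbps for Rambus's claimed XDR2 ceiling, and the 160 GB/s target is hypothetical):

[code]
# Data pins needed for a target bandwidth, GDDR5 vs. XDR2 (assumed rates).
target_gbs = 160          # GB/s, hypothetical console target

pin_rates_gbps = {
    "GDDR5": 5.0,         # typical per-pin rate circa 2011
    "XDR2": 12.8,         # Rambus's claimed top rate
}

for name, rate in pin_rates_gbps.items():
    pins = target_gbs * 8 / rate
    print(f"{name}: ~{pins:.0f}-bit bus for {target_gbs} GB/s")
[/code]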
 
We will never know.

It was merely meant as a benchmark, not an intention to predict that the next-gen consoles will have Sandy Bridge CPUs.

I think at least in terms of power efficiency, we certainly know. Intel has the magic that nobody else has. It has been this way for a while now.

You can't compare the power efficiency that Intel gets at 32nm to TSMC's 28nm process, because Intel is going to be way ahead.
 
Chef, Intel's geometry on mature processes is typically considered better than the competition's on the same process (earlier in this thread, as an example, someone pointed out how some "90nm" products had average geometry sizes much larger), and no one questions their fab leadership across the board. Intel's design and manufacturing > everyone else's by a healthy margin. They now have 22nm with FinFETs slated for April 2012 at retail, which is probably 2 years ahead of everyone else, if not better. By the time people get a reasonable release of 20nm product, Intel will already have a volume launch for their next process.

If history is any indicator, an Intel part at a given process (especially early on) is better than the competition in terms of density and TDP. They also ramp up new processes faster.
 
As an aside, I remember when Xenos was first released, wasn't it mentioned that the GPU itself (not memory) was 25W?
 
Chef, Intel's geometry on mature processes is typically considered better than the competition on the same process ...

Indeed.

I don't think anyone would argue otherwise.

However, we are talking about 32nm vs 28nm.

We (I) are also talking about a significantly less complex target CPU for the next-gen Xbox 720 than a Sandy Bridge.

I'd estimate 500M transistors at most for the XB720 CPU. Roughly half the size of an i7.
 
Wattage comparison:
Sandy Bridge Core i7
2640M
2.8-3.5GHz
35Watts on 32nm

That's a dual-core 2.8GHz mobile part. It's doubtful you'd even be able to run Xenon emulation on it.

A tad more representative:

Intel Core i7 2600K (3.4GHz)
idle 5W
load 86W

I expect MS to use the 28nm shrink for performance and go with 2x Xenon with OoOE cores. That's 6 cores and 12 OoO threads, with die size and power between a 65nm and a 45nm Xenon.
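
A rough area sanity check on that 2x Xenon idea (the 90nm die size is approximate and the scaling is idealized; real shrinks do worse):

[code]
# Idealized area check for a doubled Xenon at 28nm.
xenon_90nm_mm2 = 168           # approximate launch Xenon die size
scale = (90 / 28) ** 2         # ideal area scaling 90nm -> 28nm, ~10.3x

doubled_28nm_mm2 = 2 * xenon_90nm_mm2 / scale
print(f"2x Xenon at 28nm: ~{doubled_28nm_mm2:.0f} mm^2 (ideal scaling)")
[/code]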
 
Indeed.

I don't think anyone would argue otherwise.

However, we are talking about 32nm vs 28nm.

Intel 32nm is likely at least as dense as TSMC 28nm, if not denser. And Intel 32nm will certainly achieve better power-efficiency figures than TSMC 28nm. Just like Intel 45nm vs TSMC 40nm.
 