Predict: The Next Generation Console Tech

That's because the system had an overall power draw of 200 Watts under load. Assuming 80% PSU efficiency, that comes down to 160 Watts, and that's lowballing it, I think. I'd argue that 40 Watts for "misc" is a bit much: the HDD is a 2.5" laptop drive, which probably uses less than 2.5 Watts, the chipset isn't too big either, and the BD-ROM doesn't make up the rest. Something between 140 and 150 Watts overall for Cell, RSX and RAM is probably correct.
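To spell that arithmetic out, here is a quick back-of-envelope sketch in Python; the 200 W wall figure and 80% PSU efficiency come from the post, while the BD-ROM and chipset numbers are illustrative guesses, not measurements:

```python
# Rough power breakdown for a launch PS3 under load, using the post's
# own estimates (all figures are guesses, not measurements).
wall_power = 200.0        # W, measured at the wall under load
psu_efficiency = 0.80     # assumed PSU efficiency

dc_power = wall_power * psu_efficiency
print(f"DC power inside the box: {dc_power:.0f} W")   # 160 W

# Subtract the "misc" parts the post considers small (guessed values):
hdd = 2.5            # W, 2.5" laptop drive
bd_rom = 5.0         # W, guess for the Blu-ray drive under load
chipset_misc = 10.0  # W, guess for chipset, WiFi, Bluetooth, etc.

cell_rsx_ram = dc_power - hdd - bd_rom - chipset_misc
print(f"Left for Cell + RSX + RAM: ~{cell_rsx_ram:.0f} W")  # ~142 W
```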

Sure, I could be very wrong with my figures (in my last post I couldn't even add 70+50 correctly!) but I still say a lot of people underestimate the "Misc" column and overestimate the RSX+Cell column.

My guesses are mostly based on the figures IBM gave for Cell: the 45nm Cell is "below" 20 Watts, "less than" 40% that of the 90nm Cell.

Like I've said before, I would love a really powerful console with 4-8GB RAM, but to get that, I'd guess we'd have to wait until late 2014 or 2015.
 
Where did you find that picture? Flexible LCD displays?

I can hardly believe those are silicon interposers meant for 3D stacking.

I've seen one reference to the price here.

Is cost still an issue? Hmmm, it could be. SemiMD and Ultratech say a 300mm 65nm interposer wafer costs $10,000, which is several times the cost of a 28nm high-k/metal-gate wafer! The high cost is apparently because economies of scale haven't kicked in and few interposer suppliers exist. It's not surprising that Xilinx is targeting the 2.5D FPGA at cost-insensitive applications such as ASIC prototyping, military equipment, and high-end computing and communications.

The next question to ask is: will cost go down when economies of scale kick in? Yole Developpement, an analyst firm, claims interposers built with depreciated 300mm equipment could cost as little as $615 per wafer. They mention this is below the $700 per wafer (1 cent per sq. mm) required for large-scale adoption. Another analysis of interposer cost from eSilicon reached similar conclusions.

The high price is probably due to the limited production.

That said, the lower price aligns with what I had read earlier: there is a lot of older 65nm and 90nm fab space that is cheap, and interposers offer a good way to leverage that volume.

At $700, assuming a large SI (e.g. 20mm x 25mm, so 500mm^2) to accommodate a GPU slightly above 200mm^2 and 2 DRAMs (so a little more than 100 dies per 300mm wafer), that is about $7 per SI. Now if that allows a really wide I/O bus with your DRAMs right next to your GPU, and assuming stacked memory, it sounds like a major reduction in power draw, and the cost is less than, say, a dedicated bus for a large external eDRAM module. Sounds win-win to me, killing two birds (performance, power) with one stone.
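The die-count arithmetic can be sketched like so; this is a naive gross-die estimate that ignores edge losses, which is why "a little more than 100" usable dies per wafer is the realistic figure:

```python
import math

# Cost per silicon interposer at the $700/wafer volume price quoted
# above.  Gross die count only; dies lost at the wafer edge ignored.
wafer_diameter = 300.0                             # mm
wafer_area = math.pi * (wafer_diameter / 2) ** 2   # ~70,686 mm^2

interposer_area = 20 * 25    # mm^2, the 20mm x 25mm SI from the post
wafer_cost = 700.0           # $ per wafer (Yole's volume estimate)

gross_dies = wafer_area // interposer_area
print(f"{gross_dies:.0f} gross dies per wafer")                    # 141
print(f"${wafer_cost / gross_dies:.2f} per interposer, gross")     # ~$4.96
print(f"${wafer_cost / 100:.2f} per interposer at 100 good dies")  # $7.00
```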
 
It's larger than the original PS3 (it's huge!) and it has an Xbox 360 style external power brick! And according to the internets, the GPU cooler is noisy under load.

I hope the X51 doesn't reflect next-gen consoles. They don't need to be Wii-small, but I hope they aren't noisy.

The X360 already had a power brick, so that's no problem.

To say it's louder than a launch-era 360 or PS3 I'd need actual measurements of both, not "some guy on the interwebs said it was loud".

Also, it's built completely from off-the-shelf parts, and you can literally put a modular off-the-shelf video card in it. So it's extremely unoptimized compared to what a console design would be.


By far, though, the most impressive part is that GPU bay. By giving the lion's share of space in the case to the GPU and using a PCI-Express riser card to angle the socket parallel to the motherboard, Alienware's managed to fit a full-size, dual-slot graphics card up to nine inches long. Alienware told me the 330W power supply only officially supports graphics cards that draw up to 150W TDP (or 105W TDP for the entry-level 240W PSU), which initially made me think an upgrade would be a bit of a waste, as both the Nvidia GeForce GTX 555 and GT 545 that Alienware sells already max out their respective power supply's recommended wattage spec. Thankfully, there seems to be a good bit of wiggle room with the larger power supply, as I transplanted the 170W+ Gigabyte GeForce GTX 560 Ti OC from our $1,000 Verge Gaming Rig to the X51 with no trouble (just a driver reinstall) and enjoyed substantially higher framerates after doing so.

Update: We fired our infrared temperature gun at the rear vents of the Alienware X51 midway through an intensive session of Battlefield 3, and were pretty pleased with the result: around 106 degrees Fahrenheit. That compares pretty favorably to the Xbox 360 S, which heated up to 108 degrees in the same test even when idle.

http://cdn3.sbnation.com/entry_photo_images/2961377/alienware-x51-main_large_verge_super_wide.jpg

It is bigger than a launch PS3, but I'm not sure who out there would be saying "the launch PS3 was fine, but no way could I ever fit that gigantic Alienware under my TV!" Its shape would also be easier to deal with because it allows some stacking.
 
Sure, I could be very wrong with my figures (in my last post I couldn't even add 70+50 correctly!) but I still say a lot of people underestimate the "Misc" column and overestimate the RSX+Cell column.

...

The PS3 slim uses half the power of the original PS3, maybe that can be used to do math on the Misc?

In any case, I think the most important aspect of power is getting heat out of the console, making sure it doesn't melt itself to death, and being able to turn the power up and down depending on workload. I would have no problem with a 250 Watt or even higher-power console as long as it only uses that power when I play games. Movie playback, surfing, etc. should not eat that much.
 
The PS3 slim uses half the power of the original PS3, maybe that can be used to do math on the Misc?

...

I want to compare a launch PS3 with a launch PS4 to be fair.

Also, I imagine the "misc" components would also have had their power reduced on the latest slim PS3, with the possible exception of the HDD?

Anyway, all I'm asking is: if I'm anywhere close with my guess that RSX+Cell alone were ~120 Watts at 90nm, what could be achieved with that sort of power budget at 28nm?

I really do believe that 120 Watts is the limit for just the CPU/GPU for a 2013 launch of the PS4/720.
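As a very rough illustration of that question, here is a sketch under the optimistic assumption that the IBM Cell scaling quoted earlier (under 40% of the power over two shrinks) keeps holding per node; real power scaling slowed well below this after 90nm, so treat these numbers as an upper bound:

```python
# Hypothetical scaling, NOT a real law: IBM's "45nm Cell is <40% of
# 90nm Cell" figure spans two shrinks, i.e. roughly x0.63 per shrink.
power_at_90nm = 120.0   # W, the guessed launch CPU+GPU budget
per_node = 0.63         # assumed power factor per full node shrink
shrinks = 3             # roughly: 90nm -> 65nm -> 45/40nm -> 28nm

same_perf_power = power_at_90nm * per_node ** shrinks
print(f"Same chips at 28nm: ~{same_perf_power:.0f} W")    # ~30 W
print(f"Or ~{power_at_90nm / same_perf_power:.1f}x the "
      f"silicon in the same 120 W budget")                # ~4.0x
```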
 
At 120W you are looking at a Radeon HD 7850 GPU (about 105W at load in some tests IIRC, rated 135W or so; but remember that figure is for a full board using GDDR5 [8 chips at 1.2GHz on a 256-bit bus], and integrated into a console I would guess the total power cost could be reduced through design) plus something akin to the 4 cores in the higher-end Trinity APU (35W, strip out the GPU guts), or maybe getting away with 8 of the Trinity Mobile cores (17W for 4 cores + GPU). Or keep the GPU guts and really leverage compute?
 
I think a lot of people overstate how much power/heat the 360/PS3 chips used/produced.

Some people are expecting a 100-150 Watt 28nm GPU when even the PS3's RSX was probably only around ~70 Watts, and Cell ~50 Watts, at 90nm.

I would love MS and Sony to prove me wrong, but I won't hold my breath just yet!

I would like a more knowledgeable member to estimate what specs would be achievable for a late 2013 launch at 28nm with a ~110 Watt budget for just the GPU/CPU.

I think that would be a more reasonable expectation.

I think you are underestimating how power-hungry the first models of the PS3 and Xbox 360 were!

These are the real-life power consumption numbers for the PS3 according to Wikipedia (citation needed; I will try to find a source when I have time, maybe Digital Foundry):
http://en.wikipedia.org/wiki/PlayStation_3_hardware

First model of the PS3 (90nm GPU, 90nm CPU):
power supply (internal): 350 Watts
standby: 1 Watt
XMB menu: 171-176 Watts
playing a Blu-ray movie: 173-178 Watts
playing a game (FF13): 195-209 Watts


Last model (40nm GPU, 45nm CPU):
power supply (internal): 200 Watts
standby: 0.5 Watts
XMB menu: 61 Watts
playing a Blu-ray movie: 71-73 Watts
playing a game (FF13): 72-79 Watts


Now onto the Xbox 360 (I used different sources):
http://en.wikipedia.org/wiki/Xbox_360_hardware
http://wpweb2.tepper.cmu.edu/ceic/pdfs/CEIC_11_01.pdf


Xbox 360 first model (90nm GPU, 90nm CPU):
power supply (external): 203 Watts
standby: ???
navigating menu: ???
playing a DVD movie: ???
playing a game: 172 Watts

Last model (45nm Corona combined GPU+CPU):
power supply (external): 115 Watts
standby: 2 Watts
navigating menu: ???
playing a DVD movie: ???
playing a game: 88 Watts


So on average, the first PS3 and Xbox 360 models consumed roughly 200 Watts when playing games.

Now, according to this website:
http://www.sust-it.net/energy_saving.php?id=71

the power consumption of those consoles is:
PlayStation 1: 10 Watts
Dreamcast: 22 Watts
PS2: 50 Watts
GameCube: 21 Watts, Nintendo Wii: 17 Watts
Xbox 1: 70 Watts

And these may be the real numbers for even older consoles, according to a member in this forum topic:
http://s9.zetaboards.com/Nintendo_64_Forever/topic/7044025/1/
NES: 9 watts
Super NES: 10 watts
Nintendo 64: 19 watts

The data clearly shows a trend of increasing power usage with each new generation of consoles. Why would it be any different this time? This supports my view that the next Xbox and the PS4 will have at least the same power consumption (200 Watts) and silicon budget as their predecessors, if not more.
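For what it's worth, averaging the game-load figures above per generation does show the trend; the generation buckets are my grouping, and the Wii is the obvious counterexample:

```python
# Sanity check of the trend, using the launch-era "playing a game"
# figures collected above (watts; PS3 uses the 195-209 W midpoint).
launch_power = {
    "NES": 9, "SNES": 10,                                     # gen 3/4
    "N64": 19, "PS1": 10,                                     # gen 5
    "Dreamcast": 22, "PS2": 50, "GameCube": 21, "Xbox": 70,   # gen 6
    "Xbox 360": 172, "PS3": 202,  # gen 7; the Wii (17 W) is the outlier
}
gens = {
    "gen 3/4": ["NES", "SNES"],
    "gen 5": ["N64", "PS1"],
    "gen 6": ["Dreamcast", "PS2", "GameCube", "Xbox"],
    "gen 7": ["Xbox 360", "PS3"],
}
for gen, names in gens.items():
    avg = sum(launch_power[n] for n in names) / len(names)
    print(f"{gen}: {avg:.1f} W average")
# gen 3/4: 9.5 W, gen 5: 14.5 W, gen 6: 40.8 W, gen 7: 187.0 W
```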
 
Power could go up (I certainly hope it will, to 250-300), but I think it's pretty clear it's not going to double this generation. They are running into the limits of what's reasonable for a console-sized box in terms of power consumption, heat and noise.
 
It's larger than the original PS3 (it's huge!) and it has an Xbox 360 style external power brick!

...

It does not matter that it does not have an internal PSU; other cases that can fit the same components as the Alienware, and are very close to the same size, do have internal PSUs, e.g.:
http://www.silverstonetek.com/product.php?pid=277&area=en
It even comes with a 450 Watt PSU!

I am sure a console could cut down on the space it uses, since it does not have to support standard parts.
 
Acert, a 105 Watt HD 7850 would be nice, but that would only leave 15 Watts for a CPU in my imaginary 120 Watt CPU+GPU power budget!

It isn't just a 105W GPU. It is 105W for a GPU on a full board with 8 x 1.2GHz memory modules on a 256-bit bus. The board is quite long/complex (e.g. it also has to power a PCIe bus), very similar to what you would expect a console motherboard to look like. A 256-bit memory bus with 8 chips is about what you would expect for an entire console.

On the CPU side I would just ask: what CPU are you expecting? What do you gain, CPU-performance-wise, going from a 17W or 35W Trinity (which, if you strip out the GPU, is going to be smaller still) to a CPU drawing 50W+? Neither performance nor effective core count scales linearly as you add TDP or cores. Look at a 4-module/8-core Bulldozer and you are looking at a CPU, without memory or a board, with a TDP higher than a 7850 carrying eight 1.2GHz memory modules!

So it is easy to turn your nose up at a ~17W-35W CPU because it is so disproportionately small, in terms of budget, next to a 105W GPU. But moving to a higher-frequency 4-module Bulldozer nets you what in performance? The power cost versus the performance gain is totally backwards. And look at, say, Cell and how it is actually being used: for a lot of GPU-assisting work, in areas where GPGPU has since encroached. There is a reason why Intel and AMD, given the opportunity to inflate their CPU core counts, have instead diverted major CPU real estate to GPUs.

Splitting 120W evenly between the CPU and GPU, into a 60W CPU and 60W GPU system, would get crushed in real-world scenarios by a 30W CPU and a 90W GPU. And looking at the 7850: subtract the 8 memory chips at 2-3W each (let's say 20W, since I am guessing 1.2GHz runs on the higher side) and account for the fact that you are powering an entire PCIe board and not just the GPU, and a Pitcairn 7850 is well below that 90W. As someone else noted, memory controllers and the power needed to drive a bus are huge costs, so moving to, say, an interposer, which brings the memory closer at lower power cost, gives real potential for large GPU power savings, which could translate into a larger shader array, for example.
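Rough numbers for that decomposition; the per-chip memory power and the board overhead here are the post's guesses, not datasheet values:

```python
# Back-of-envelope split of a 7850 board's ~105 W load power, using
# the post's guesses (memory and board overhead are estimates).
board_power = 105.0      # W, whole board at load
memory = 8 * 2.5         # W, eight GDDR5 chips at ~2-3 W each
board_overhead = 10.0    # W, guess: VRMs, PCIe interface, fan

gpu_core = board_power - memory - board_overhead
print(f"GPU core alone: ~{gpu_core:.0f} W")   # ~75 W, well under 90 W
# So a 30 W CPU + 90 W GPU split of 120 W leaves headroom for a wider
# shader array, more so if an interposer cuts the memory-bus power.
```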
 
Thanks DrFouad.

Power could go up (I certainly hope it will, to 250-300), but I think it's pretty clear it's not going to double this generation. They are running into the limits of what's reasonable for a console-sized box in terms of power consumption, heat and noise.

People should think of it as sticking incandescent light bulbs in a box. We used to have a well house, and we put a bulb in there to keep the well pipes from freezing; it is amazing how much energy comes off a bulb. A 300W console is like three 100W bulbs in a little box. That is a LOT of heat to get OUT of the case, and it needs to be a fail-proof cooling solution.

It is a lot easier to get 200W out of a small enclosure like a PS3/360 case than 300W. I have not looked at the math, but I bet it isn't a linear problem; those extra 100W are going to take a much more robust and powerful cooling solution.
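One way to see the problem: in a crude convection model (with made-up but plausible coefficients), the temperature rise over ambient scales with wattage at fixed case size and airflow, so the extra 100W has to be bought back with a bigger heatsink or a louder fan:

```python
# Simplified convection model, Q = h * A * dT: with the case surface
# area A fixed, more watts force a higher temperature rise dT or a
# higher h (i.e. more airflow, a bigger heatsink, a louder fan).
# Both values below are illustrative assumptions.
h = 25.0      # W/(m^2*K), assumed effective heat-transfer coefficient
area = 0.2    # m^2, rough exposed area of a console-sized case

for watts in (200.0, 300.0):
    delta_t = watts / (h * area)
    print(f"{watts:.0f} W -> {delta_t:.0f} K rise at fixed airflow")
# 200 W -> 40 K, 300 W -> 60 K: keeping temperatures sane at 300 W
# demands a disproportionately beefier cooling solution.
```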

Maybe next gen will have HMDs with stereo headphones... then the console can be uber loud ;)
 
For 3D stacks of more than 2 chips with high I/O density you need TSVs through the ICs, and if you have TSVs through the ICs there is no need for an interposer: you can just build a stack of ICs connected directly to each other (potentially with organic substrates to bridge power and low-density I/O across a level of the stack).

Silicon interposers are intended for what the above link calls 2.5D, i.e. MCMs with higher I/O density. It's a stop-gap measure on the road to 3D stacking.

OK, we are basically saying the same thing then. I'd add, though, that in a 3D stack an interposer can be used for alignment between two disparate chips; it can be easier to just throw an interposer in than to redesign the chips to align. It can also be used as a thermal break.
 
It's a real shame that next-gen consoles will not come in standard AV rack enclosures. That would allow for the larger power envelopes people are dreaming of (250 Watts or higher).
 
8 geometry units?
AMD packs quite a punch with just 2 in GCN.
The GeForce 480/580 had 16 "polymorph engines", yet the rasterizer count seemed to define performance in theoretical tests, so it remains to be seen whether 8 "geometry units" mean much.
 
It's a real shame that next-gen consoles will not come in standard AV rack enclosures. That would allow for the larger power envelopes people are dreaming of (250 Watts or higher).

I dunno, we don't need a full rack; just a 21x14x9 AV receiver format like this Sony one would do. People could integrate it right into their nice hi-fi system and no one would even know they have a console. You could even market it as a set-top box that seamlessly integrates into your existing system ;)
 
That's exactly the sort of thing I was meaning... use the typical AV equipment sizing so it will blend nicely with existing equipment.
 
When you said AV rack, I thought you were talking about something like this or this. :p

I know some people will pass a brick if MS/Sony announce a standard large-size AV format. But after seeing my parents and grandmother make room for their set-top boxes and receivers from the cable company, it kind of hits home that a lot of people are used to, familiar with, and comfortable with that format. Space really isn't an issue for me... my 360 is still too big, but I make room. If it was 21x14x9 I would just make room.

If that kind of space, a negative, were used to justify more aggressive HW, I am OK with that. Not because more aggressive HW is necessarily a lot more expensive, but because process technology has dictated that to keep up the pace of performance, something has had to give, and heat has been the compromise for the most part.

That said, I don't think anyone has the boiled eggs to do it. Sony has the best angle (they offer AV equipment as it is), but I think they pride themselves on their design aesthetics too much to say, "Screw it, the PS4 has finally matured into a full-fledged part of the home AV system, this time as the central hub of digital connectivity and the final piece of the ultimate entertainment system." I think MS would go with a Wii form factor before going with a massive box... oh wait, there was the Xbox 1 ;)
 
I think you are underestimating how power-hungry the first models of the PS3 and Xbox 360 were!

...

The data clearly shows a trend of increasing power usage with each new generation of consoles. Why would it be any different this time? This supports my view that the next Xbox and the PS4 will have at least the same power consumption (200 Watts) and silicon budget as their predecessors, if not more.

When you say I'm underestimating how power-hungry the PS3/360 were at launch, are you talking about my estimate of ~70 Watts for RSX and ~50 Watts for Cell at 90nm?

If you are, I'll try to explain a bit further (though the following could have errors!):

RSX = ~70 Watts (a guess based on the power consumption of Cell and the Nvidia 7900 GTX)
Cell = ~50 Watts (based on IBM's own figures; link: http://realworldtech.com/page.cfm?ArticleID=RWT022508002434&p=1 )
BD drive = ~10 Watts
HDD = ~3 Watts
WiFi + Bluetooth + HDMI controller etc. = ~3-5 Watts
XDR RAM + GDDR3 VRAM = ~10-15 Watts?
EE+GS chip = ~15 Watts? (not sure how much this would use, or if it was "always on")

Add on PSU losses, assuming 80% efficiency, and that is ~200 Watts "at the wall". I think that is a reasonable guess at the breakdown?
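Summing that breakdown (taking midpoints of the ranges) lands right in the measured window:

```python
# Summing the guessed component breakdown above (all values are the
# post's own estimates in watts, midpoints taken for the ranges).
components = {
    "RSX": 70, "Cell": 50, "BD drive": 10, "HDD": 3,
    "WiFi/BT/HDMI etc.": 4, "XDR + GDDR3 RAM": 12.5, "EE+GS": 15,
}
psu_efficiency = 0.80   # assumed

dc_total = sum(components.values())
wall_total = dc_total / psu_efficiency
print(f"{dc_total} W DC -> ~{wall_total:.0f} W at the wall")
# 164.5 W DC -> ~206 W at the wall, consistent with the measured
# 195-209 W for a 90nm launch PS3 running a game.
```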
 
I think you are underestimating how power-hungry the first models of the PS3 and Xbox 360 were!

...

The data clearly shows a trend of increasing power usage with each new generation of consoles. Why would it be any different this time?
I already explained the reasoning in my previous reply to you.
 