Predict: The Next Generation Console Tech

Why does it need to be 20x better?
Is there an expectation that the next generation will need to radiate 20x more heat?

Better yet, is the 20x solution when scaled down to 1x as economical as current cooling tech?
If not, it may still be rejected.
 
AMD just announced the 7970M, based on a full-fledged Pitcairn core, though underclocked from 1000MHz to 850MHz.

In gaming, the 7870 consumes just 115W; underclocked, that should put it at 75W or less given mobile power constraints.

It would be cool to see a 7870-based APU paired with an asymmetrical 7970 for next-gen.
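As a rough sanity check on that 75W guess, dynamic power scales roughly with frequency times voltage squared. A quick sketch (the voltages here are illustrative guesses, not published figures for either part):

```python
# Rough estimate of power at a lower clock, assuming dynamic power
# scales as P ~ f * V^2. Voltages are illustrative assumptions, not
# published 7870/7970M figures.
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# 7870 gaming draw ~115 W at 1000 MHz; the 7970M runs the same die
# at 850 MHz. If the voltage also drops, say 1.20 V -> 1.05 V:
estimate = scaled_power(115, 1000, 850, 1.20, 1.05)
print(round(estimate, 1))  # -> 74.8
```

With those assumed voltages the estimate lands right around the 75W figure, though binning and leakage can move the real number quite a bit.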

The TDP of that card (7970M) is 100W, for reference.
 
According to the people who designed the Nanowick at Purdue, a laptop heat pipe cools 50 watts per square centimeter, while the Nanowick design cools 550 watts per square centimeter. In one article they mentioned using a hydrofluorocarbon and achieving 1,000 watts per square centimeter.
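To put those heat-flux figures in context, here's a quick back-of-the-envelope calculation of the wick area each would need for a hypothetical console heat load (the 200W load is an assumption for illustration):

```python
# Wick area needed to move a given heat load at each quoted heat flux.
# Flux figures are the ones quoted above, taken as W/cm^2.
def wick_area_cm2(watts, flux_w_per_cm2):
    return watts / flux_w_per_cm2

load = 200  # hypothetical console SoC heat load in watts
for name, flux in [("standard heat pipe", 50),
                   ("Nanowick", 550),
                   ("Nanowick + hydrofluorocarbon", 1000)]:
    print(f"{name}: {wick_area_cm2(load, flux):.2f} cm^2")
```

The order-of-magnitude jump in flux shrinks the required contact area from a few square centimeters to a fraction of one, which is why the tech is pitched at tightly packed laptop layouts.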


How much it would cost to make such a cooler for a laptop or console is the major question.


I suspect that just as the next-gen chip designs will achieve over 10x the performance of the XB360, the XB720 case and cooling solution will be 20x better.

This is very interesting. Great find.

May not be ready for next-gen consoles however.

Standard heat pipes, however, can be used for applications between 5W and 600W per single pipe. There's no real need for more exotic Nanowick variants; from the description, the addition of carbon nanotubes in the wick structure would greatly increase the manufacturing cost, I would imagine. Standard heat pipes should be more than sufficient.

I think that if Sony and MS want to design consoles with 300W GPU and 150W CPU TDPs then they'd certainly be able to design a cooling solution to efficiently cool them. The question then would be one of cost, which is the primary factor after all.
 
Are vapor chambers used only with the hottest GPU chips right now? I was thinking that if they were less expensive than heat pipes, they would have appeared in lower-end GPUs too, especially in the same product line. I know nothing about cost, but they'd obviously use something only if it costs less than the usual methods.

In a console with a much larger heatsink than a GPU card (low fan speed, low noise), we'd need heat pipes for reach anyway, so in this case would it only replace the heat spreader?

I saw simple copper heat spreaders for LEDs rated at 0.1°C/W, so at 100W it's only a 10°C delta to spread a 9mm slug to a more manageable 28mm. When a system allows a 60°C delta between the die temperature and the air temperature, even reducing that 0.1°C/W by a factor of 20, I doubt the heatsink would be much smaller, because the whole chain still allows 0.6°C/W and the bulk of the job remains the fin surface area. However, with a 250W GPU I guess they REALLY need this tech. Same for laptops, where heatsink area is ridiculously small (i.e. a large temp delta between the fins and the air, leaving little room for everything else).
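That argument can be sketched as a simple series thermal-resistance chain. The 0.1°C/W spreader figure is the one quoted above; the heat pipe and fin resistances are assumed values chosen to fill out the 0.6°C/W budget:

```python
# Die-to-air temperature rise = heat load * sum of series thermal
# resistances. Only the 0.1 C/W spreader figure comes from the post;
# the others are illustrative assumptions.
def die_to_air_delta(load_w, resistances_c_per_w):
    return load_w * sum(resistances_c_per_w)

chain = {
    "spreader": 0.10,   # quoted copper spreader figure
    "heatpipe": 0.05,   # assumed
    "fins+air": 0.45,   # assumed; dominated by fin surface area
}
print(round(die_to_air_delta(100, chain.values()), 1))  # -> 60.0

# Cut the spreader resistance 20x (to 0.005 C/W) and the delta
# barely moves, because the fins still dominate the chain:
chain["spreader"] = 0.005
print(round(die_to_air_delta(100, chain.values()), 1))  # -> 50.5
```

Which is the point being made: a 20x better spreader only buys back the spreader's own slice of the budget, so the heatsink stays roughly the same size unless the fin side improves too.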
 
AMD just announced the 7970M, based on a full-fledged Pitcairn core, though underclocked from 1000MHz to 850MHz.

In gaming, the 7870 consumes just 115W; underclocked, that should put it at 75W or less given mobile power constraints.

It would be cool to see a 7870-based APU paired with an asymmetrical 7970 for next-gen.

The TDP on that chip is supposedly 65W (according to Google).

So what exactly is Pitcairn? In a nutshell, take Cape Verde (7700) and double it, and you have Pitcairn. Pitcairn has twice the number of CUs, twice the number of ROPs, twice the memory bandwidth, and of particular importance twice as many geometry engines on the frontend. This works out to 1280 SPs among 20 CUs – organized as a doubling of Cape Verde's interesting 4/3/3 configuration – 80 texture units, 32 ROPs, 512KB L2 cache, and a 256-bit memory bus. Compared to Tahiti, Pitcairn still has 12 fewer CUs and as a result less shader and texturing performance along with the narrower memory bus, but it has the same number of ROPs and the same frontend as its bigger brother, which as we'll see creates some very interesting situations.
http://www.anandtech.com/show/5625/...-hd-7850-review-rounding-out-southern-islands

So how does that chip compare to a PS3/360?
 
But if you take a standard core and underclock it to achieve a certain TDP, then I would think it could be done. Cooling in a console can be significantly more efficient than in a laptop; there is just more space to build beefier HSF units. The original HSF for the 360 CPU is at least twice as thick as my current laptop, and mine isn't an ultrabook and has a GT 630M GPU in it.

If you wanted to, I am sure you could use it. There are many ways and criteria for binning, but since total TDP can be higher in a console, it's 100% possible, though maybe not desirable.
 
They can't bin for consoles, but that doesn't mean they can't use Pitcairn. Not that they are going to use Pitcairn in a console launching more than a year from now.
 
They can't bin for consoles, but that doesn't mean they can't use Pitcairn. Not that they are going to use Pitcairn in a console launching more than a year from now.

Yes, the rumors that the consoles coming out next fall at the earliest will use currently available GPUs irk me to no end, but that's all the media seems to want to report on. People forget the original Xbox was a hybrid Nvidia GeForce 3/GeForce 4 device before the GeForce 4 ever launched. Same with Xenos (unified shader architecture before R600 hit). Thus, I don't think a 10x+ FLOP increase is asking too much for this generation. Ideally we'd see a Sea Islands hybrid borrowing ideas from Sea Islands' successor, depending on debut date.
 
Why does it need to be 20x better?
Is there an expectation that the next generation will need to radiate 20x more heat?

Better yet, is the 20x solution when scaled down to 1x as economical as current cooling tech?
If not, it may still be rejected.

The iPad and tablets are becoming popular. They are encroaching on the consoles' realm.

The wall socket is an ally for future consoles. Just like in the SF novel Dune where "the spice must flow", in future console world "the wattage must flow". Tablets have physical limits as a result of being mobile and slim, so if I'm Microsoft I'd exploit the advantages a console has over a tablet: power consumption and a gamepad with advanced haptic feedback.
 
They can't bin for consoles, but that doesn't mean they can't use Pitcairn. Not that they are going to use Pitcairn in a console launching more than a year from now.

They need to clock conservatively and factor in a few faulty CUs, so you end up with a 7850 if you start out with a 7870.

They might bin GPUs and CPUs on power consumption and pair a GPU with higher power consumption with a CPU with lower power consumption (and vice versa), extending the performance envelope that way.

Cheers
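That pairing idea can be sketched as a simple greedy match: sort the GPUs hungriest-first and the CPUs thriftiest-first, then pair them off against a shared power budget. All wattages below are made-up illustration values:

```python
# Pair the hungriest GPUs with the thriftiest CPUs (and vice versa)
# so each combined unit stays under a shared power budget.
# All wattages are hypothetical illustration values.
def pair_by_power(gpu_watts, cpu_watts, budget):
    gpus = sorted(gpu_watts, reverse=True)  # hungriest first
    cpus = sorted(cpu_watts)                # thriftiest first
    return [(g, c) for g, c in zip(gpus, cpus) if g + c <= budget]

gpus = [130, 122, 118, 110]
cpus = [70, 66, 62, 55]
print(pair_by_power(gpus, cpus, 190))
# -> [(130, 55), (122, 62), (118, 66), (110, 70)]
```

Pairing high with high would blow the budget (130 + 70 = 200W), while the cross pairing keeps every unit at or under 185W, so no bin has to be discarded.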
 