How long till this is used to cool GPUs?

This is a tiny refrigerator made using standard practices today.

http://www.livescience.com/imageoftheday/siod_050427.html

The National Institute of Standards and Technology-designed refrigerators, each 25 by 15 micrometers, are sandwiches of a normal metal, an insulator and a superconducting metal. When a voltage is applied across the sandwich, the hottest electrons "tunnel" from the normal metal through the insulator to the superconductor. The temperature in the normal metal drops dramatically and drains extra heat energy from the objects being cooled.
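As a rough illustration of why this cools (a toy sketch, not NIST's actual junction physics; the exponential energy distribution and the 2 kT threshold are assumptions for illustration):

```python
import numpy as np

# Toy model: electrons in the normal metal have a thermal spread of
# energies. If a biased tunnel junction selectively removes only the
# hottest ones, the mean energy (effective temperature) of the
# remaining electrons drops. Distribution and threshold are assumed.
rng = np.random.default_rng(0)

kT = 1.0                                    # thermal energy scale (arbitrary units)
energies = rng.exponential(kT, 1_000_000)   # crude classical stand-in for the thermal tail
threshold = 2.0 * kT                        # assumed bias-set tunnelling threshold

remaining = energies[energies < threshold]  # the hottest electrons "tunnel" away

print(f"mean energy before: {energies.mean():.3f} kT")   # ~1.000
print(f"mean energy after:  {remaining.mean():.3f} kT")  # ~0.687 -> colder electron gas
```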

The researchers used four pairs of these sandwiches to cool the contents of a silicon nitride membrane that was 450 micrometers on a side and 0.4 micrometers thick. A cube of germanium 250 micrometers on a side, about 11,000 times larger than the combined volume of the fridges, was glued on top of the membrane. This is roughly equivalent to having a refrigerator the size of a person cool an object the size of the Statue of Liberty. Both objects were cooled down to about -459 °F.
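A quick sanity check on that ratio (back-of-the-envelope; the article doesn't give the fridges' thickness, so the membrane's 0.4 micrometers is assumed here):

```python
# Germanium cube vs. the combined volume of eight 25x15 um fridges.
# Fridge thickness is NOT stated in the article; 0.4 um is an assumption.
cube_volume = 250.0 ** 3               # um^3
fridge_volume = 8 * 25.0 * 15.0 * 0.4  # four pairs of junctions, um^3

print(f"volume ratio: {cube_volume / fridge_volume:,.0f}")  # ~13,000, same ballpark as the quoted 11,000
```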

The refrigerators are made using common chip-making lithography methods, which makes it easy to integrate them into the production of other microscale devices. These tiny fridges are much smaller and less expensive than conventional equipment. The fridges have applications such as cooling cryogenic sensors in highly sensitive instruments for semiconductor analysis and astronomical research.

Now why couldn't you have a single cooling layer at the top or bottom of a GPU (or CPU) that takes the place of these crazy cooling contraptions? It would also seem to lend itself to pretty extreme clock frequencies, since the chip could be kept below freezing.

Yo ATI... let's get some real innovation going on here. ;)

R700 maybe?
 
What conventional equipment are they referring to?

I'm still concerned about cost. I also wonder how much power the fridges consume.
 
I'm sure the electrons are drained from the metal plates outside of the insulating material, not from the chips this device would cool.

I wonder if the plates are also connected to the power supply, as the plates themselves are not an infinite source of electrons.
 
It doesn't resolve the main issue of GPU cooling: dumping excess heat into the atmosphere. Really, all it would do is make the GPU run a bit cooler. A big heatsink and fan will still be required because the same amount of heat is still being generated.

There are lots of other problems as well, like needing a superconductor that will work at room temperature.
 
All it is is a heat pump. The heat from whatever it's cooling has to go somewhere, and today's GPUs/VPUs produce way too much heat for this heat pump to get rid of.
 
The day superconductors can operate near/above room temperature, then this idea could be considered. Right now that is a long way away (at least it was several years ago, when I was more knowledgeable in the subject).
 
Heat would be carried by the electrons over the wires, and that is where the heat would be dissipated, no? Once the wires hit the electric grid, you basically have a SUPER heat sink, if I understand things correctly.

I do see the superconductive material being a problem at room temperature unless some major breakthrough has been made.
 
Bouncing Zabaglione Bros. said:
It's just a tiny version of a Peltier, with all the problems and limitations associated with it.

If physics and costs were not part of the equation, I'd like to see a cooling device using supercooled water.
 
gkar1 said:
If physics and costs were not part of the equation, I'd like to see a cooling device using supercooled water.
Is that like a supersaturated solution of ice in water :p
 
I read a while ago about integrating water cooling on the chip's surface. They used piezoelectric water pumps.

I hope this will be available sometime. :D
 
Nathan said:
It doesn't resolve the main issue of GPU cooling: dumping excess heat into the atmosphere. Really, all it would do is make the GPU run a bit cooler. A big heatsink and fan will still be required because the same amount of heat is still being generated.

Not really. There's a huge problem with semiconductors nowadays with heat _density_. 60-100 W of heat isn't all that troublesome in and of itself; ordinary light bulbs are in that range. But put that amount of power into a <1 cm^2 area and it means more W/area than a stove hotplate! That doesn't even take 'hot spots' into account, as different areas of the chip have different average usage patterns.

On top of that, you have to think in 3D with contemporary IC circuitry. The lower layers may even be insulated from the heatsink by the layers above them, since the chip itself isn't a stellar heat conductor. I've seen suggestions where you perforate the chip, as it were, to allow cooling to reach the entire chip in all dimensions.
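To put numbers on the hotplate comparison (typical assumed values, not measurements):

```python
import math

# Rough power-density comparison between a GPU die and a stove hotplate.
# All figures are assumed ballpark values for illustration.
gpu_watts, gpu_area_cm2 = 80.0, 1.0            # ~80 W through a ~1 cm^2 die
hotplate_watts = 1500.0                        # assumed typical stove element
hotplate_area_cm2 = math.pi * (18.0 / 2) ** 2  # assumed 18 cm diameter burner

print(f"GPU die:  {gpu_watts / gpu_area_cm2:5.1f} W/cm^2")            # ~80 W/cm^2
print(f"hotplate: {hotplate_watts / hotplate_area_cm2:5.1f} W/cm^2")  # ~6 W/cm^2
```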
 
Still going to have to cool the hot side of the TEC, so you wouldn't really be able to embed those things inside a chip. They'd have to be surface-mounted and have some means of cooling the hot side in order to maintain the cooling efficiency of the cold side.
 
MPI said:
Not really. There's a huge problem with semiconductors nowadays with heat _density_. 60-100 W of heat isn't all that troublesome in and of itself; ordinary light bulbs are in that range. But put that amount of power into a <1 cm^2 area and it means more W/area than a stove hotplate! That doesn't even take 'hot spots' into account, as different areas of the chip have different average usage patterns.

On top of that, you have to think in 3D with contemporary IC circuitry. The lower layers may even be insulated from the heatsink by the layers above them, since the chip itself isn't a stellar heat conductor. I've seen suggestions where you perforate the chip, as it were, to allow cooling to reach the entire chip in all dimensions.
A Peltier-style device will not help with hotspots or thermal gradients within the chip either.

The issue is not sucking the heat from the chips. Watercooling amply demonstrates that a copper conductor can efficiently deal with the power density requirements of modern ICs. The problem is where to put all that heat once it's out of the IC. Watercooling works well because it transfers the heat to a large radiator that can dispose of it efficiently. There really is no magic device that will remove the need for large GPU heatsinks/radiators. You can use various tricks to take the heat to a radiator that is further away, but the same amount of heat is still generated.

With regards to the original topic, a Peltier or mini-fridge will reduce the GPU's temperature while increasing the heatsink's temperature. As a result, the heatsink will be more efficient, but this will be offset by the extra heat load of the Peltier, so the net effect is limited. However, the GPU is running cooler. When power density becomes a greater issue, Peltiers may start making an appearance.
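The energy balance makes that trade-off concrete (the COP value below is an assumed ballpark for a Peltier working near room temperature, not a measured figure):

```python
# A heat pump moves the chip's heat to the hot side, but the heatsink
# then has to reject the chip's heat PLUS the pump's own input power.
gpu_heat = 80.0  # W generated by the GPU (assumed)
cop = 0.7        # assumed coefficient of performance for a Peltier

pump_power = gpu_heat / cop            # electrical input needed to move gpu_heat
heatsink_load = gpu_heat + pump_power  # total heat the heatsink must dissipate

print(f"pump input:    {pump_power:.0f} W")     # ~114 W
print(f"heatsink load: {heatsink_load:.0f} W")  # ~194 W vs. 80 W with no pump
```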
 
gkar1 said:
Not quite. It's water below its freezing point, but kept at such high pressure that it does not solidify.

More to do with high oscillation - but if you remember the light water / dark water theory tests, which had to reach -50 °C to see if the light/dark phase separation occurred: they stalled at -48 °C. -50 proved unreachable; the water always bulk-freezes before -50, no matter what exotic precautions were taken. It must have been amazing to see at large scale!
 
ERK said:
The day superconductors can operate near/above room temperature, then this idea could be considered. Right now that is a long way away (at least it was several years ago, when I was more knowledgeable in the subject).
Still is a long way away. We have some that work at liquid nitrogen temperatures (the next best thing to room-temperature superconductors), but they are very brittle, cannot be made into wires, and are thus useless for many applications.

This application, though, might well work okay for a ceramic-based superconductor, and could be a good way to take a liquid nitrogen cooling system to the next level.
 
Nathan said:
You can use various tricks to take the heat to a radiator that is further away, but the same amount of heat is still generated.
Well, actually, if you cool the chip down dramatically, the resistance of the materials also drops dramatically, reducing power consumption. I'm not sure of the full details of how this works with silicon processors, but I'm sure it would help significantly with exotic cooling techniques.
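For the metal interconnect at least, the effect is easy to estimate with the usual linear resistivity model (the die temperatures below are assumed; the silicon transistors themselves follow different physics):

```python
# rho(T) ~= rho0 * (1 + alpha * (T - T0)) for a metal such as copper.
ALPHA_CU = 0.0039  # copper's temperature coefficient of resistivity, per kelvin
T0 = 293.0         # reference temperature, K

def relative_resistivity(temp_k: float) -> float:
    """Copper resistivity at temp_k relative to its value at T0."""
    return 1.0 + ALPHA_CU * (temp_k - T0)

hot, cold = 350.0, 250.0  # assumed die temperatures, K
drop = 1 - relative_resistivity(cold) / relative_resistivity(hot)
print(f"interconnect resistance drop: {drop:.0%}")  # ~32% lower when cooled
```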
 