*spin-off* Feasibility of Water Cooling Radiator Setups

There's no guarantee that Microsoft wouldn't have specced the water cooler to permit the same load temps with a cheap, inconsistently installed block. Depending on the exact cause of the problem, it could have made things worse or not helped at all.

It was alleged that they were giving the Xenos chip and package very little thermal margin, and that Microsoft had taken the lead on designing the package used, an area where it had limited experience.
The temporary fix was to add the thermal margin back with a slightly better air cooler.
The final fix would wait until the chip and its substrate were redesigned, with help from AMD engineering teams that had more experience with issues such as thermal cycling.

Then they gave it an even cheaper air cooler.
 
It was poor cooling that caused RROD on the 360, due to heat-induced PCB warping.

In that case a cheap air cooler was not enough...

And the better engineered air cooling solution on the PS3 had a considerably lower failure rate. Either way, the launch 360 and PS3 systems simply had chips too big and too hot for their boxes.

I doubt MS & Sony will want to repeat that, and going with smaller chips that produce less heat would remove the need for an expensive water cooling solution or an expensive air cooling one, leaving a cheap and cheerful HS+F solution that doesn't break the bank as their best option.

Ultimately the cooling solution is dependent on the cores to be cooled.
 
Look, you are simply NEVER going to get the cooling-per-dollar out of water that you get out of air coolers. Can't be done. Plus you add a pump, which is roughly a thousand times more likely to fail than a fan - especially a cheap pump - and when it fails your cooling goes to ZERO, whereas a heat sink still does some good.

You'll argue pumps never fail and you can use a good pump, etc., and that's baloney: pumps fail. I've had two pumps fail. So now the repair is a water-loop pump and the labor to replace it, instead of a $3 fan and five minutes.

Like I said, go find a console development director willing to use water and help them with their resume.
 
OK, but by how much did your chipset temperature rise because of that? What about the PSU, the hard drives, and, unless the GPU is under water as well, that one too?
You can do that with an air cooler just as well.

They don't rise, they fall...

You water cool a console and the actual temp inside the console will drop a lot.

All the chips that cause heat build-up inside a tiny case would be under water and wouldn't be dumping heat into the case.

Anyway.... back to building my computer..
 
It's just hard to explain to a group of people that have little to no experience water cooling anything.
The problem with your empirical approach is that it's open to false positives, which is why empirical evidence only goes so far in science (which is just the pursuit of understanding how things really work instead of the guesswork of yesteryear). It's akin to people swearing by experience that astrology really works, while there's no scientific support and further, less-biased investigation shows it doesn't. It's also akin to people swearing acupuncture works when there's no scientific explanation but there is plenty of empirical evidence. The problem is you're preaching on the faith of experience when there is actually a science behind cooling which contradicts you. Either you disprove the science by showing a well-devised and executed experiment supporting your theory, or you find it doesn't work how you think it does.

That doesn't mean to say your ideas are wrong, only that you can't support them in a proper debate beyond your confidence in water cooling. Perhaps water cooling has some amazing special power, like cold fusion was supposed to, so far unexplained by science but provable with experiments that show it dumps heat energy more efficiently than other systems. In which case, go ahead and prove it! At the moment we only have loose empirical and anecdotal evidence that doesn't address the key concerns, and a body of thermodynamic research and experiments showing that water cooling cannot extract heat from a system more effectively simply because it uses a water-filled radiator.
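To put a rough equation behind that last point (my own sketch of standard steady-state heat transfer, not something from the posts above): in equilibrium every watt the chips dissipate still has to cross a metal-to-air surface somewhere, whether the fins sit directly on the die or on a radiator at the end of a water loop.

% Steady-state heat balance (illustrative sketch):
\[
P_{\text{chip}} = h \, A_{\text{fins}} \, (T_{\text{fins}} - T_{\text{air}})
\]
% With the same fin area, the same airflow (hence the same h) and the same
% ambient air temperature, the fin temperature, and the junction temperature
% sitting above it, ends up at the same level; the water loop only relocates
% where that metal-to-air surface sits.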
 
Unless installing water cooling somehow magically makes chips produce less heat, that doesn't make any sense. Or did you water-cool your HDDs as well?
If there is a fan blowing over the radiator, and there's cool water flowing through the system, it's possible a small decrease in internal temperature could happen, especially early on. I'd like to see the results of water cooling long term, once the water has hit thermal equilibrium, with all other components being equal (same radiator surface area as the heat sink, same fan and airflow, etc.).
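For a sense of how short "early on" is, here's a back-of-the-envelope sketch; the half-litre of coolant and the 100 W imbalance are my own assumed numbers, not figures from the thread.

Code:
# How quickly a water loop's "cool water" head start disappears.
WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), specific heat of water

def warmup_rate(net_watts_into_loop, coolant_kg):
    # K per second the coolant warms when the chips dump `net_watts_into_loop`
    # more heat into the loop than the radiator manages to shed.
    return net_watts_into_loop / (coolant_kg * WATER_SPECIFIC_HEAT)

rate = warmup_rate(100.0, 0.5)          # 0.5 kg of coolant, 100 W imbalance
print(f"{rate * 60:.1f} C per minute")  # ~2.9 C per minute: equilibrium in minutes, not hours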
 
I was actually thinking about it the other day, not necessarily for consoles but more so for PCs: why haven't I seen someone produce a full-blown refrigeration circuit for a PC tower?

I know compressors are considerably more expensive to run in terms of power consumption than pumps, and the cooling unit downstream of your compressor would be pretty huge (vapour/vapour-phase heat transfer, so you'd need considerably more surface area in the unit). But there are still lots of crazy people in the world and I'd be intrigued to know if anyone has ever tried it?

Just thought I'd share a bit of my normal Monday morning musings with you all :p
 
why haven't I seen someone produce a full-blown refrigeration circuit for a PC tower?
Water vapor in the air tends to condense on things that are colder than the environment. You'd pretty much need to create a full-blown fridge that won't allow external air inside. It's definitely doable but hardly worth the effort, especially considering that you'd need a considerably more powerful system than your average fridge to keep a high-end 400W+ machine at normal temperatures.
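To make the "more powerful than your average fridge" part concrete, here's a rough sizing sketch; the COP values are my own assumptions for a small vapour-compression unit, not figures from the thread.

Code:
# Electrical power a compressor would need to pump a PC's heat load.
def compressor_power(heat_load_w, cop):
    # COP = heat moved / electricity used, so input power = load / COP.
    return heat_load_w / cop

heat_load = 400.0  # W, the "high-end 400W+ machine" above
for cop in (2.0, 3.0, 4.0):
    extra = compressor_power(heat_load, cop)
    rejected = heat_load + extra  # the condenser dumps the load plus the compressor work
    print(f"COP {cop:.0f}: ~{extra:.0f} W extra at the wall, ~{rejected:.0f} W out of the condenser")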
 
I was actually thinking about it the other day, not necessarily for consoles but more so for PCs: why haven't I seen someone produce a full-blown refrigeration circuit for a PC tower?
There were a couple of manufacturers in the past, the problem is that power has grown a bit ... making it more expensive and the niche even smaller. It always had a problem with condensation too obviously, just not worth the effort for a small extra OC. The silicon just doesn't care that much until it gets really cold.

A closed cycle system which could keep the system at LN2 temperatures would be interesting, but then you're talking about >10K$ systems the size of an actual fridge.
 
There were a couple of manufacturers in the past, the problem is that power has grown a bit ... making it more expensive and the niche even smaller. It always had a problem with condensation too obviously, just not worth the effort for a small extra OC. The silicon just doesn't care that much until it gets really cold.

A closed cycle system which could keep the system at LN2 temperatures would be interesting, but then you're talking about >10K$ systems the size of an actual fridge.

Ooh, this bit intrigued me. How does temperature actually affect the performance of a processor chip? Do the two temperature extremes affect the actual electrical properties of the silicon in any significant way? I'm intrigued to know.

I can vaguely recall reading somewhere about possible silicon alternatives for microprocessor production that currently only run at about 3-5 kelvin. I've always wondered what role temperature plays exactly in silicon core performance.
 
AFAIK the effect on delay is mostly linear; it's just a question of magnitude ... from ambient to -10 °C is a different order of magnitude from ambient to -200 °C.

The liquid-helium-cooled circuits are generally not based on CMOS.
 
Doping concentrations will have some effect on the degree to which temperature matters (for the temperatures we're interested in). You can sort of see the indirect effects when you consider what happens when overclocking and altering voltages.

Regarding electron mobility, one of two different scattering mechanisms for the carriers/holes takes precedence depending on the temperature.
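For reference, the textbook way of writing those two regimes (my addition, standard semiconductor physics rather than anything stated in the post above): lattice (phonon) scattering dominates near operating temperatures and ionized-impurity scattering at very low temperatures, combined via Matthiessen's rule.

% Matthiessen's rule for combining the two scattering mechanisms:
\[
\frac{1}{\mu_{\text{total}}} = \frac{1}{\mu_{\text{lattice}}} + \frac{1}{\mu_{\text{impurity}}},
\qquad
\mu_{\text{lattice}} \propto T^{-3/2},
\qquad
\mu_{\text{impurity}} \propto T^{+3/2}
\]
% Cooling boosts mobility while lattice scattering dominates; once impurity
% scattering takes over, the doping concentration mentioned above sets the limit.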
 
I was actually thinking about it the other day, not necessarily for consoles but more so for PCs: why haven't I seen someone produce a full-blown refrigeration circuit for a PC tower?

I know compressors are considerably more expensive to run in terms of power consumption than pumps, and the cooling unit downstream of your compressor would be pretty huge (vapour/vapour-phase heat transfer, so you'd need considerably more surface area in the unit). But there are still lots of crazy people in the world and I'd be intrigued to know if anyone has ever tried it?

Condensation, noise, power consumption, and the need to insulate everything...

It always had a problem with condensation too obviously, just not worth the effort for a small extra OC. The silicon just doesn't care that much until it gets really cold.

Intel chips couldn't care less about the cold; AMD chips, on the other hand, LOVE the cold and give you more than just a 'small' overclock...

Vapochill

Whatever happened to them?

They went bust :(

Ooh, this bit intrigued me. How does temperature actually affect the perfomance of a processor chip?

As above. In overclocking terms, dropping the temp to -40 °C allowed me to overclock my Phenom II X6 1075T to 4.8 GHz with 1.4 V (which now happens to be the 9th fastest 1075T in the world, 4th fastest in Europe, and the fastest in the UK).

The most I ever got out of it when water cooling was 4.2 GHz with 1.55 V.

 
So what about cooling? How long until we have a reliable and efficient water-cooling system in our Microsoft console?
 
When I read about this display plane chip (whatever it is that renders to different resolutions), I think of the IllumiRoom demo from Microsoft at CES.
 