Nagorak said:
pcchen said:
For desktops, maybe. Many server processors have TDPs well over 100W; the Itanium 2, for example, has a 130W TDP.
So what? You realize that servers are stored in huge cabinets, right?
They're also generally not around anyone who is going to be annoyed by the noise... Obviously it's "technically possible" to build a hotter CPU/GPU; the question is whether it's practical. It's also "technically possible" to build a CPU that runs as hot as a nuclear reactor, but that doesn't mean it would be a smart idea...
Servers don't always run in cabinets. For one thing, people today are squeezing servers into 1U, 1/2U, and 1/4U form factors, much smaller than your desktop. For another, not every business has a data center or co-location; plenty of small businesses run their servers in a normal office suite. And Itanium and Opteron will eventually find their way into desktops (once they come down from $4,000 per chip).
Secondly, the Itanium has about the same number of transistors as today's GPUs. GPUs are not hot simply because of their clock speed, but also because they are massively parallel. One day there will be a 1-billion-transistor GPU (as well as CPU), and it's going to be hot.
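(A rough back-of-envelope, with purely illustrative numbers, shows why transistor count matters as much as clock: dynamic CMOS power scales roughly as P ≈ a·C·V²·f, where C is the total switched capacitance (which grows with transistor count), a is the activity factor, V the core voltage, and f the clock. Hold voltage and clock fixed and quadruple the transistors, and dynamic power roughly quadruples too; a 1-billion-transistor chip would need serious voltage reductions from process shrinks just to stay inside today's thermal envelopes, never mind a smaller one.)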
I frankly do not care if it is hot or loud. I pay a premium for power and progress. If you don't want a dragster, buy a Civic. The top-of-the-line future GPUs are going to put out heat and suck power like crazy. If you don't like it, buy a "cut-down" version tailored for heat, power, and sound. But just because some of us want muscle cars and you want a quiet luxury car, don't tell me to trade power for comfort.
I am personally in favor of the graphics companies packing the most transistors they can physically fit into a process at the highest clock rate possible, with as much RAM as possible at the highest memory clock and widest bus they can manage. I don't want them to waste silicon trying to cut power consumption or heat. Leave that to the "mobile" or "mainstream" versions of the chips.
Hell, I would love it if water cooling was a standard feature sold by OEMs for hardcore users, rather than a hack that you have to mod yourself.
In other words, Nvidia and ATI can create quiet, low-heat, low-power versions for average users, but when it comes to speed, I want them to *spare no expense*.