I don't, but many do.
LMAO, are you serious?
Uhh, why on Earth wouldn't SLI/Crossfire be banned? That's huge electric use.

No it isn't. Let's put things into perspective here: 100-400 W for a high-end graphics card is NOT a whole lot of electricity. A plasma TV alone, or a couple of light bulbs, can dissipate more than that, not to mention the AC and other household appliances like dishwashers, washing machines, dryers, etc.
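To put some very rough numbers on that (every wattage, usage figure, and the electricity price below is an assumption for illustration, not a measurement), a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope monthly energy comparison.
# All wattages, daily hours, and the price per kWh are assumed, illustrative values.
loads = {
    "high-end GPU under load": (300, 3),    # watts, assumed hours of gaming per day
    "plasma TV":               (350, 4),
    "window AC unit":          (1200, 6),
    "electric dryer":          (3000, 1),
}
price_per_kwh = 0.15  # assumed $/kWh

for name, (watts, hours_per_day) in loads.items():
    kwh_per_month = watts * hours_per_day * 30 / 1000
    print(f"{name:24s} ~{kwh_per_month:6.1f} kWh/month  (~${kwh_per_month * price_per_kwh:.2f})")
```

Even with generous gaming hours, the GPU ends up well behind the big household loads.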
A lot of people believe in global warming. I don't, but many do.

But what if the electricity doesn't come from a greenhouse-gas-generating source?
Don't buy SLI or Crossfire... you're preventing global warming...
Never thought of that. Good point.

So if you had two machines available that had the same quietness, the same features, the same reliability, the same performance, etc., but one was 25% less expensive, most people would choose the less expensive one. 12 months later, if they took their machine apart, examined it with a voltmeter, noticed a sophisticated heat exchanger mounted on the chip, and realized that it had a higher power consumption than the other machine they didn't buy, would they care much?
As a result, in the future you will see more and more chips primarily increase performance by dramatically increasing transistor count rather than frequency.

Hence SLI, no? At least, that seems as good a reason as its obvious marketing advantages ("faster than any other card!"). [Edit: I'm guessing] IHVs can only push die size so far before it becomes a cost liability for the majority of their sales, and SLI is the way out.
When chips were silicon constrained, the priority was minimizing die area and maximizing frequency, which maximizes the performance/die-area ratio and thus the performance/cost ratio. As chips become power constrained, you maximize the performance/cost ratio by maximizing the performance/power ratio instead. This leads to a rather interesting reversal of emphasis, since increasing power efficiency means reducing the frequency and increasing the die area (more transistors running at lower frequencies). This is because power scales up nonlinearly with frequency while performance scales only linearly with frequency, whereas both power and performance scale linearly with transistor count. As a result, in the future you will see more and more chips primarily increase performance by dramatically increasing transistor count rather than frequency.
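A minimal sketch of that argument, assuming a toy scaling model (dynamic power ~ C·V²·f with supply voltage rising roughly with frequency, so power grows about cubically with clock for a fixed design, while both power and throughput scale linearly with transistor count; all numbers are illustrative):

```python
# Toy model of the frequency-vs-width trade-off described above.
# Assumptions: performance ~ transistors * frequency, and dynamic power
# ~ transistors * frequency^3 (voltage assumed to track frequency).
def design(transistors, freq):
    performance = transistors * freq
    power = transistors * freq ** 3
    return performance, power

variants = {
    "baseline":   design(transistors=1.0, freq=1.0),
    "freq-first": design(transistors=1.0, freq=1.3),  # same die, 30% higher clock
    "wide/slow":  design(transistors=2.0, freq=0.8),  # double the die, 80% clock
}

for name, (perf, power) in variants.items():
    print(f"{name:10s} perf={perf:.2f}  power={power:.2f}  perf/W={perf/power:.2f}")
```

The wide/slow variant ends up with more performance at roughly the same power, which is the reversal of emphasis being described.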
Area can't keep on going up forever the way it's going now, so frequency has to go up to increase performance per area.

Area has a much better chance of increasing indefinitely than power does. After all, you can't extract much more than ~1 kW out of a wall socket!
It's all a big multivariable optimization, which is why it is fun.
- Static power consumption scales linearly with area and accounts for a large amount of overall power consumption, but it stays constant with frequency.
- Area can't keep on going up forever the way it's going now, so frequency has to go up to increase performance per area.
- Power scales linearly with frequency... all other things equal. Going beyond a certain point, additional logic is needed to allow higher frequencies, but how much depends heavily on the frequency increase you're aiming for. And there are regions along this curve where increasing frequency is a better deal.
In other words, I don't really see a reversal of emphasis at all.
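To see why the static-power point matters, here is the same kind of toy model as above, extended with an assumed leakage term that scales with transistor count and is independent of frequency (the leakage fraction is purely illustrative):

```python
# Toy model with a static-leakage term added.
# Assumptions: dynamic power ~ transistors * frequency^3, static power
# ~ transistors (it pays rent regardless of clock); leak is the assumed
# share of baseline power that is static.
def design(transistors, freq, leak):
    performance = transistors * freq
    dynamic = (1 - leak) * transistors * freq ** 3
    static = leak * transistors
    return performance, dynamic + static

for leak in (0.0, 0.4):
    base_perf, base_power = design(1.0, 1.0, leak)
    wide_perf, wide_power = design(2.0, 0.8, leak)  # double die, 80% clock
    print(f"leakage={leak:.0%}  baseline perf/W={base_perf / base_power:.2f}  "
          f"wide/slow perf/W={wide_perf / wide_power:.2f}")
```

With zero leakage, going wider and slower looks like a clear win; once a large static term is charged per transistor, much of that advantage evaporates, which is the argument against a wholesale reversal of emphasis.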
After all, you can't extract much more than ~1 kW out of a wall socket!
Well this bit isn't true, you can usually get several kW out of the wall socket.

Sure it is, in North America (where it matters).
Static power consumption scales linearly with area and accounts for a large amount of overall power consumption, but it stays constant with frequency.

But desktop GPUs tend to be manufactured with processes that don't optimize for static leakage. My understanding is that TSMC's low-power process (which is used for handhelds and generally laptops, iirc) has much lower static leakage.
_xxx_ said: Well I tend to severely doubt that. Otherwise no one could use any heaters, boilers, AC, ovens, etc., since they're all waaay over 1 kW.
Nobody plugs those into a standard outlet though. I can't think of any major appliance that plugs into a normal wall outlet. We've got big heavy duty outlets for whenever those show up. Also keep in mind that a typical household circuit is 20A and likely used to wire an entire room.
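Some rough numbers for common circuits (nominal ratings; the 80% figure is the usual North American rule of thumb for continuous loads):

```python
# Rough outlet/circuit capacity math. Voltages and breaker ratings are nominal;
# continuous loads are commonly derated to 80% of the breaker rating in North America.
circuits = [
    ("NA 15 A / 120 V outlet",       120, 15),
    ("NA 20 A / 120 V circuit",      120, 20),
    ("NA 30 A / 240 V dryer outlet", 240, 30),
    ("EU 16 A / 230 V outlet",       230, 16),
]

for name, volts, amps in circuits:
    nominal_kw = volts * amps / 1000
    continuous_kw = nominal_kw * 0.8
    print(f"{name:30s} nominal {nominal_kw:.1f} kW, ~{continuous_kw:.1f} kW continuous")
```

So a single ordinary North American outlet really is in the 1-2 kW range, the big appliances get their own dedicated circuits, and a European socket can indeed deliver several kW.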