Nvidia's biggest concern must be: how on earth are they going to sell these as HPC cards? They are clearly huge power hogs, and probably too hot for a full rack..?
What do you think? My bet is that there will be no Tesla card until B1.
I really can't fathom how these cards can be called a success. Not with all that heat, power, and noise at load.
Bit-tech's report was kinda negative on it too... it even called it a potential flop.

I skimmed through the reviews and found that only [H] had a disappointing take on the matter. The rest were saying that the 480 is kinda good.
The thing is that the cost adds up. If you're a big torrenter and keep your computer on 24/7, 25W higher idle draw works out to $20 per year. 120W higher peak power could mean a new power supply, so that's another $50.

Most consumers don't really shop based on efficiency, with the exception of mobile products where it affects battery life. If they did, there wouldn't be so many gas-guzzlers on the road or so many incandescents and halogens in people's homes. The customer mostly cares about performance and cost.
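For what it's worth, the $20/year idle figure above checks out on a back-of-the-envelope basis. A quick sketch, assuming an electricity rate of roughly $0.10/kWh (my assumption, not something stated in the thread):

```python
# Back-of-the-envelope cost of a 25 W idle-power gap on an always-on machine.
# The electricity price is an assumed ~$0.10/kWh; adjust for your local rate.
idle_delta_w = 25            # extra idle draw in watts
hours_per_year = 24 * 365    # always-on "big torrenter" box
price_per_kwh = 0.10         # assumed rate, $/kWh

extra_kwh = idle_delta_w * hours_per_year / 1000
print(f"{extra_kwh:.0f} kWh/year -> ${extra_kwh * price_per_kwh:.0f}/year")
# ~219 kWh/year -> ~$22/year, i.e. roughly the $20 quoted above
```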
Where are the consumer apps to bench these things? All I know is that there is a DXCS implementation of video transcoding in Windows 7. No benches yet.

Gaming is all well and good, but does anyone know if reviewers have compute benchmarks (OpenCL/CUDA) to look at? All I can find is gaming at the moment.
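In the absence of published compute numbers, a minimal OpenCL throughput probe is the sort of thing a reviewer could run. The sketch below assumes pyopencl and NumPy are available; the SAXPY kernel, array size, and bandwidth arithmetic are my own illustration, not any review site's methodology.

```python
# Rough sketch of a consumer-level compute test: a timed OpenCL SAXPY
# reporting effective memory bandwidth. Illustrative only.
import time
import numpy as np
import pyopencl as cl

n = 1 << 24                                   # 16M floats per array (~64 MB each)
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

ctx = cl.create_some_context(interactive=False)
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

prg = cl.Program(ctx, """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
""").build()

prg.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)  # warm-up launch
queue.finish()

reps = 50
t0 = time.perf_counter()
for _ in range(reps):
    prg.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)
queue.finish()
elapsed = (time.perf_counter() - t0) / reps

bytes_moved = 3 * n * 4                       # read x, read y, write y per iteration
print(f"SAXPY: {elapsed * 1e3:.2f} ms/iter, "
      f"{bytes_moved / elapsed / 1e9:.1f} GB/s effective bandwidth")
```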
The primary problem is that you don't get an objective measure of playability, and you only get one choice of settings.

Why not look at the [H] charts? The absolute speeds aren't as relevant, but you can still see where and by how much each architecture spikes.
It is the application/game that decides which DX level to use, right? If so, in what way can the driver influence which DX rendering path is preferred? If the reviews are comparing a DX9 path against the DX11 path, is it then simply not a case of the reviewers being clueless?

And why is it that ATI owns in DIRT2 DX11 on [H] but gets slammed on most every other site? DX9 path?
I agree with you that higher clocks and 512 SPs would have lessened the perf/mm2 gap, and consumers don't care about that when purchasing anyway, but as it is, ATI feels no pressure to reduce prices. It's still above launch MSRP after 6 months.
"We are currently keeping memory clock high to avoid some screen flicker when changing power states, so for now we are running higher idle power in dual-screen setups. Not sure when/if this will be changed. Also note we're trading off temps for acoustic quality at idle. We could ratchet down the temp, but need to turn up the fan to do so. Our fan control is set to not start increasing fan until we're up near the 80's, so the higher temp is actually by design to keep the acoustics lower." - NVIDIA PR