DavidGraham
Veteran
That IS bad. According to NVIDIA this should not happen. In their official reviewer driver (which I used), the NVIDIA power limit is designed to be active for all applications, not only Furmark.
That, or getting a Lucid chip inside the card, but that would be too much of a longshot.
Umm... 1.2 (voltage)^2 * 1.34 (clock increase factor over default) = 1.93, vs 0.97^2 * 1 = 0.94; 1.93/0.94 = 2.05X increase in power consumption. No wonder that card is boomed.
I tried 1.2 V to see how much could be gained here, at default clocks
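The back-of-the-envelope scaling in the post above (dynamic power roughly proportional to V² times clock) can be sketched as follows; the voltages and the 1.34× clock factor are the poster's figures, not official specs:

```python
# Back-of-envelope dynamic power scaling: P is roughly proportional to V^2 * f.
# Inputs below are the poster's numbers, not NVIDIA specifications.

def relative_power(voltage, clock_factor):
    """Relative dynamic power, proportional to V^2 * f."""
    return voltage ** 2 * clock_factor

overvolted = relative_power(1.2, 1.34)   # 1.2 V, clocks at 1.34x default
stock = relative_power(0.97, 1.0)        # 0.97 V, default clocks

print(round(overvolted / stock, 2))  # ~2.05x the stock power draw
```

This is only a first-order model (it ignores leakage and temperature effects), but it is enough to show why doubling the effective power budget kills the card.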
So where are all the people that were kvetching when the 6990 broke the PCIE spec, now with the 590?
Even with default clock it's still impressive: 1.44X the heat and power at stock cooler, over an already hot card with huge power consumption.

Nice calculation, now read the text again:
IMPORTANT: The supplied Geforce Drivers 267.52 for Geforce GTX 590 will not stop the card from overheating when overclocking. Please use newer versions from the Nvidia website and stay away from 267.52. Otherwise this may happen ...
Lucid in there would only act as a PEG switch.
Even with default clock it's still impressive: 1.44X the heat and power at stock cooler, over an already hot card with huge power consumption.
Rangers said:
So where are all the people that were kvetching when the 6990 broke the PCIE spec, now with the 590?

You mean rabid Nvidia fans like Kaotik?
Can't say I'm terribly surprised it doesn't survive that voltage - this is way over what even gtx 580 uses.

In my opinion a normal card should automatically regulate voltage and clocks in hardware if needed, and not blow up the card with bad drivers.
So, it turns out it actually does survive that, just not much more. I think that's really all you can ask for given the power consumption.

As long as you only increase clocks it should be fine, but there's probably not much headroom. If you increase voltage and clocks to GTX 580 levels, you're looking at 600W (unthrottled) Furmark power consumption, and even in games you will reach about 500W. I don't know if the card is built to really handle that (VRM) or if it just goes up in smoke.
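The 600W ballpark can be reproduced with the same V² × clock scaling. All the numbers below are illustrative assumptions for the sake of the arithmetic (a hypothetical stock baseline and GTX 580-level target voltage/clocks), not measured values:

```python
# Rough sketch of the ~600 W estimate using P proportional to V^2 * f.
# Every number here is an illustrative assumption, not a measurement.

def scaled_power(baseline_w, v_old, v_new, f_old_mhz, f_new_mhz):
    """Scale a baseline board power by the V^2 * f ratio."""
    return baseline_w * (v_new / v_old) ** 2 * (f_new_mhz / f_old_mhz)

# Assumed stock: 0.94 V at 607 MHz drawing 365 W unthrottled (illustrative).
# Assumed GTX 580-level target: 1.07 V at 772 MHz (illustrative).
print(round(scaled_power(365, 0.94, 1.07, 607, 772)))  # ~600 W under these assumptions
```

Under these assumed inputs the simple scaling lands right in the 600W region the post mentions, which is why the VRM concern is plausible.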
Does it matter? NV now talks about acoustics etc. when they simply can't win on performance.
techpowerup said:
Compared to AMD's Radeon HD 6990, NVIDIA's GTX 590 is a bit quieter in idle and load, but the difference is much smaller than what NVIDIA's presentation in the reviewer's guide suggests, where the graph starts at 40 dBA. Overall both cards should be considered noisy in 3D, but with temperatures in the 80°C range, there is not much that can be done to quieten down the cooler any further.
In my opinion a normal card should automatically regulate voltage and clocks in hardware if needed, and not blow up the card with bad drivers.

Of course it should, but in my opinion a normal man should think twice before doing such dumb things with voltage as TPU's author does in the review.
Of course it should, but in my opinion a normal man should think twice before doing such dumb things with voltage as TPU's author does in the review.

Actually it looks like it's also possible to blow up the card even without overvolting, which is a bit more worrying. From http://www.hardware.fr/articles/825-2/dossier-nvidia-repond-amd-avec-geforce-gtx-590.html:

A bug makes it possible for Windows to automatically install a previous driver that does not support the GTX 590 and does not enable OCP globally. A heavy load test thus made the power consumption explode... and the card along with it.

Though I'm wondering what part of the VRM blows up. Looks like it's a 2x5 phase design - GTX570 only used four phases, GTX580 six. Not sure why it wouldn't be able to handle the power draw without OCP - GTX570 certainly had no problems like that.
Some time ago I wrote:
Can't say I'm terribly surprised it doesn't survive that voltage - this is way over what even gtx 580 uses.

So, it turns out it actually does survive that, just not much more. I think that's really all you can ask for given the power consumption.
You're right though, a design which doesn't allow setting voltages that high would be preferable. I wouldn't quite say it's only the driver's fault, as imho setting it to that voltage is a deliberate attempt to kill the card, not something the driver does on its own.