There's no doubt the GPU itself can handle it - but the VRM is definitely different from the GTX 580's, so I wouldn't just expect it to be able to do so as well. Some sites (like Hexus) upped the voltage by 0.05V, power went up 60W, and they got the hint and stopped right there. Considering Nvidia endorses overclocking of their cards, they should at the very least have warned reviewers that the GTX 590 cannot even remotely handle the same voltages that GTX 580 cards can.
Though you're right, the VRM does seem to be designed without much headroom - hopefully that doesn't affect lifetime in "normal" situations. Actually, it seems Nvidia blames the driver now, since OCP should protect the card even in this case (though if it were my card, I wouldn't want to try it out...).
Interestingly, TPU actually got the best OC (by far, afaict) at default voltage (before they blew the card) - almost too good to be true (did they really reach GTX 580 clocks at that voltage?).
OlegSH said:
"Even with the default clock it's still an impressive 1.44x of heat and power on the stock cooler, over an already hot card with huge power consumption."

It's worse: their stock voltage was 0.938V, so 1.2V results in 1.64x the power (assuming the square scaling holds). Hexus got a 60W power increase for just a 50mV bump.
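For anyone wanting to check the arithmetic, here's a quick sketch of the scaling both numbers come from. The V² relation is just the usual CMOS dynamic-power approximation (P ∝ f·V² at fixed clock), not anything measured on the card; the voltages are the ones quoted above.

```python
def power_ratio(v_old: float, v_new: float) -> float:
    """Relative dynamic power at the same clock, assuming P ~ V^2."""
    return (v_new / v_old) ** 2

# TPU's GTX 590 sample: 0.938 V stock, pushed to 1.2 V
print(round(power_ratio(0.938, 1.2), 2))  # ~1.64x

# OlegSH's 1.44x figure corresponds to assuming a 1.0 V baseline
print(round(power_ratio(1.0, 1.2), 2))    # 1.44x
```

So the 1.44x estimate was actually optimistic - starting from 0.938V, the same 1.2V target implies roughly 64% more power, which lines up with how quickly Hexus's card drew an extra 60W from a mere 50mV step.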