NVIDIA GF100 & Friends speculation

I wonder if this card-killing issue had anything to do with the two-day hold-up?

It wasn't just TPU who killed their cards; it happened to at least three reviewers and probably a few more.
 
IMO, a card like GTX590 having 1.5GB to deal with is laughable, to say the least.

It's a card marketed (and priced accordingly) to play 3D games (which need twice the framebuffer memory) on multi-monitor setups, and they give it 1.5GB to play with?

IMO, they would gain by narrowing the memory bus to 256-bit and fitting 2+2GB, or even going all out (+$80? at that price point, who cares?) and fitting 3+3GB on the card.

That, or getting a Lucid chip onto the card, but that would be too much of a long shot.
 
So where are all the people who were kvetching when the 6990 broke the PCIe spec, now that the 590 does it too? :rolleyes:

I think that debate's already been settled. In both instances we have a boutique product with tiny volumes that is too much of everything to matter in the realm where the PCIe spec is useful.

I do like that Nvidia's cooler is quieter and the board is more compact. It is possible that its survivability and OC headroom have suffered as a result, which might explain why AMD's solution is what it is.

While it is quieter than the 6990, for my tastes it isn't really a quiet card, and in other ways it is as unappealing to me as the AMD board. I'm just not in the demographic that would fork over the kind of money for this type of card.

(There's an "It's not for you" comic from PA that probably applies.)

This is apparently what a missed process node looks like.
 
Sweclockers burned their card
IMPORTANT: The supplied GeForce driver 267.52 for the GeForce GTX 590 will not stop the card from overheating when overclocking. Please use a newer version from the NVIDIA website and stay away from 267.52. Otherwise this may happen ...
 
Lucid in there would only act as a PEG switch.

Not if they used Lucid's multi-GPU driver, which splits the rendering work so that each GPU handles separate 3D models (which would be the whole point).

But like I said, it would be a long shot.
 
Even at default clocks it's still an impressive 1.44x the heat and power on the stock cooler, on top of an already hot card with huge power consumption.

In my opinion a normal card should automatically regulate voltage and clocks in hardware if needed, and not let bad drivers blow it up. :???:
 
Heh, so with the 6990 they have the hardware to allow overclocks, but you may or may not be able to attain decent ones.

On GTX 590, you just blow up your card if you try? :D

Regards,
SB
 
In my opinion a normal card should automatically regulate voltage and clocks in hardware if needed, and not let bad drivers blow it up. :???:
Can't say I'm terribly surprised it doesn't survive that voltage - this is way over what even a GTX 580 uses.
Some time ago I wrote:
As long you only increase clocks it should be fine but probably not that much headroom. If you increase voltage and clocks to GTX 580 levels, you're looking at 600W (unthrottled) furmark power consumption, and even in games you will reach about 500W. I don't know if the card is built to really handle that (VRM) or just goes up in smoke.
So, it turns out it actually does survive that, just not much more. I think that's really all you can ask for given the power consumption.
You're right, a design which doesn't allow setting voltages that high would be preferable. I wouldn't quite say it's only the driver's fault though, as IMHO setting it to that voltage is a deliberate attempt to kill the card, not something the driver does on its own.
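
For reference, here's a quick back-of-the-envelope sketch of how an estimate like that can be reached with the usual dynamic-power rule of thumb (P roughly proportional to f * V^2). The baseline wattage, clocks and voltages below are ballpark assumptions for illustration, not measured values.

Code:
def scale_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    """Estimate new board power from a baseline using P ~ f * V^2."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

# Assumed baseline: GTX 590 at stock (~607 MHz, ~0.93 V) drawing on the
# order of 400 W in an unthrottled Furmark-style load (ballpark assumption).
p_oc = scale_power(400, 607, 0.93, 772, 1.0)  # GTX 580-level clocks/voltage
print(f"estimated unthrottled draw: ~{p_oc:.0f} W")  # roughly 590 W

Which lands right in the neighbourhood of the 600W figure above.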

Most reviews seem to think the GTX 590 is preferable, if only due to fan noise. I think this comes down to radial vs. axial fans again - put another card right next to these monsters and the results might be different. At the very least I'd expect the GTX 590 to "catch up" in the dB metric, if it can handle it at all (not sure it's a good idea for the HD 6990 either, but traditionally this is where those blower fans did comparatively better).
 
Does it matter? NV now talks about acoustics etc. when they quite simply can't win on performance.

Cool, that should win them back plenty of the AMD fans who mass-converted to acoustic Jehovah's back when Evergreen gave up on performance!
 
I'm not sure why people are talking about a win on acoustics, it's still loud.

techpowerup said:
Compared to AMD's Radeon HD 6990, NVIDIA's GTX 590 is a bit quieter in idle and load, but the difference is much smaller than what NVIDIA's presentation in the reviewer's guide suggests, where the graph starts at 40 dBA. Overall both cards should be considered noisy in 3D, but with temperatures in the 80°C range, there is not much that can be done to quieten down the cooler any further.

Doesn't seem like he thought noise was a big advantage.
 
In my opinion a normal card should automaticly regulate voltage and clocks in hardware if needed and not blow up the card with bad drivers. :???:
Of course it should, but in my opinion a normal person should think twice before doing the kind of dumb things with voltage that TPU's author did in the review.
 
Of course it should, but in my opinion a normal person should think twice before doing the kind of dumb things with voltage that TPU's author did in the review.
Actually it looks like it's also possible to blow up the card even without overvolting, which is a bit more worrying. From http://www.hardware.fr/articles/825-2/dossier-nvidia-repond-amd-avec-geforce-gtx-590.html:
A bug makes it possible for Windows to automatically install an older driver that does not support the GTX 590 and does not enable OCP globally. A heavy-load test thus made the power consumption explode… and the card with it.
Though I'm wondering which part of the VRM blows up. It looks like a 2x5-phase design - the GTX 570 only used four phases, the GTX 580 six. Not sure why it wouldn't be able to handle the power draw without OCP - the GTX 570 certainly had no problems like that.
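
To make the OCP point concrete, here is a purely illustrative sketch of what a software-enforced power limit amounts to. It is not NVIDIA's actual driver code; the names, thresholds and behaviour are all invented for the example.

Code:
import random
import time

# Purely illustrative sketch of a software-enforced over-current/over-power
# protection loop. Numbers and behaviour are made up for the example; the
# real GTX 590 mechanism is driver- and board-specific.

POWER_LIMIT_W = 365      # assumed board power limit
CLOCK_STEP_MHZ = 25      # how far to back the clock off per violation


def simulated_board_power_w(core_clock_mhz):
    """Stand-in for reading the 12V rail sensors: power grows with clock."""
    return 0.6 * core_clock_mhz + random.uniform(-10, 10)


def ocp_step(core_clock_mhz):
    """One iteration of the protection loop: throttle if over the limit."""
    if simulated_board_power_w(core_clock_mhz) > POWER_LIMIT_W:
        core_clock_mhz -= CLOCK_STEP_MHZ
    return core_clock_mhz


if __name__ == "__main__":
    clock = 700  # MHz; deliberately over the limit in this toy model
    for _ in range(20):
        clock = ocp_step(clock)
        time.sleep(0.01)
    print(f"settled at ~{clock} MHz")
    # With an old driver that never enables this loop, nothing stops a
    # Furmark-style load from pulling whatever current the VRM will supply.

The hardware.fr scenario is essentially this loop never running at all, because the fallback driver doesn't know about the card.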
 
Can't say I'm terribly surprised it doesn't survive that voltage - this is way over what even a GTX 580 uses.
Some time ago I wrote:

So, it turns out it actually does survive that, just not much more. I think that's really all you can ask for given the power consumption.
You're right, a design which doesn't allow setting voltages that high would be preferable. I wouldn't quite say it's only the driver's fault though, as IMHO setting it to that voltage is a deliberate attempt to kill the card, not something the driver does on its own.

It's higher than stock GTX 580, but not higher than normal voltage overclocks for GTX 580.

http://www.techreaction.net/2010/11/30/review-asus-geforce-gtx-580-voltage-tweak/2/

That's 1.21 volts there. MSI's utility goes even further, up to 1.3+ volts I believe, and hardware/BIOS mods go higher still. So, as a site that regularly overclocks Nvidia cards, why wouldn't they expect it to either work or simply not work? At the very least they would have had no idea it would outright fry the card, considering how GTX 580 cards manage it. I'm sure they considered 1.2 volts a safe stepping point given the voltages that get pushed on GTX 580 cards, and I'm sure they expected some kind of graceful failure (artifacts, a blue screen, a black screen, something) rather than sparks and smoke.

Considering Nvidia endorses overclocking of their cards, they should at the very least have warned reviewers that GTX 590 cannot even remotely handle the same voltages that GTX 580 cards can.

Regards,
SB
 