Babel-17:
It's still not AMD's problem if a case manufacturer makes an ATX case but doesn't correctly implement the ATX specifications. In other words, blame the case manufacturer for cutting corners and ignoring the specs.
http://www.nvidia.com/object/product-geforce-gt-440-us.html
GT 440 announced today, now with 1GB GDDR5 memory.
No, no and once again no.
There won't be an official 590, simple as that; they will not break the PCIe specs.
The "leaked pic" is from AIB custom project similar to ASUS MARS/ARES projects, and projects similar to those are the only possible "dual GF110" that might ever come
Silent_Buddha said:
"Whoa 2x 8 pin connectors. A ridonkulous cooling solution with 3 fans. If that's what Nvidia is planning, it certainly looks like they are getting ready to give PCIE certification a big middle finger."

I can't figure out why people seem to think it's important for a specialty product like this to stay within some power number written into a technical spec, especially when the connectors make it abundantly clear that more than the usual power is required.
Seems rumours are still not dying like you hoped.
http://www.xbitlabs.com/news/video/...ship_Graphics_Card_from_Nvidia_Resurrect.html
Quote:
"It's a custom EVGA design GTX460x2. Just compare how the area under the chips is populated: http://www.ixbt.com/video3/images/gf104/gtx460-scan-back.jpg"

Looks like 2x GTX570 (2x 320-bit) memory interfaces: 6x2 chips on the front and 4x2 on the back.
What should the next wattage limit be?
A quad-card setup of 300W cards, plus the attendant system (a few hundred watts for CPU and miscellaneous), is starting to nudge into the uncomfortable (out of code?) range for 15 amp house circuits.
The SIG may not have an interest in defining the electrician's certification needed for the next greatest GPU slot specification.
Quote:
"There is usually a safety margin built into the regulatory limit. I've seen 20% bandied about, but that isn't something I can state authoritatively. That would reduce the amount of margin significantly."

I figure this is probably tongue in cheek, but since it made me curious, let me share what I found.
First, code in the US at least specifies that a standard 110 volt outlet is rated for 15 amps, so 1650 watts of power are available (remember, this assumes the nominal 110 volts at the outlet).
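Out of curiosity, here's a back-of-the-envelope sketch of that budget in Python (every figure below is an assumption pulled from this thread, not a measurement):

# Rough household-circuit budget for a quad-card rig (assumed figures).
OUTLET_VOLTS = 110           # nominal US outlet voltage
CIRCUIT_AMPS = 15            # standard residential breaker rating
circuit_watts = OUTLET_VOLTS * CIRCUIT_AMPS   # 1650 W available

cards_watts = 4 * 300        # four cards at the 300 W PCIe ceiling
system_watts = 300           # rough guess for CPU, drives, fans, etc.
load_watts = cards_watts + system_watts       # 1500 W

print(f"{load_watts} W of {circuit_watts} W = {load_watts / circuit_watts:.0%}")
# -> 1500 W of 1650 W = 91%, and wall draw would be higher still
#    once PSU efficiency losses are included.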
Quote:
"Such a requirement is beyond what the SIG concerns itself with. There's no real interest in researching and ratifying a standard that requires home modification and contract work."

I think people who are really planning on running 4 dual GPU cards (which I would guess cost in the $500-600 range each) either have enough money to pay an electrician to change a fuse or enough technical know-how to do it themselves.
That dual GF110 board is a few months old too. Old news.
I don't think the time window needs to be that long. The safety margin only applies if you plan for 100% load - i.e., the video cards would have to be pushing their full 300 W 24/7, 365. Otherwise it's fine to go up to max load as long as you don't run it more than 80% of the time.
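For reference, the usual electrical-code rule of thumb is that a continuous load (roughly three hours or more) should stay under 80% of the circuit rating; a minimal sketch, reusing the assumed 110 V / 15 A figures from above:

# 80% continuous-load rule of thumb (assumed 110 V / 15 A circuit).
circuit_watts = 110 * 15                  # 1650 W rating
continuous_limit = 0.80 * circuit_watts   # 1320 W for sustained loads

rig_watts = 1500                          # quad-card estimate from earlier
print(rig_watts <= continuous_limit)      # False: fine in bursts, not 24/7/365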
Quote:
"The SIG cares enough to define a high-end specification that caters to GPU cards. I am not aware of other product lines that commonly approach the 300W limit. They would be aware of the design bullet points for these top end cards."

The SIG doesn't care how many cards the computer has. It currently allows 75W from the slot, one 8-pin connector (150W), and one 6-pin connector (75W), for a total of 300W. That is independent of whether you're running 1 card in a personal machine or 3,000 in a GPU farm.
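Since those numbers keep coming up, here's a toy sketch of the budget arithmetic (connector values as cited above; the helper function is my own, not anything from the spec):

# PCIe board power budget from the figures cited above.
SLOT_W = 75          # from the x16 slot itself
SIX_PIN_W = 75       # per 6-pin auxiliary connector
EIGHT_PIN_W = 150    # per 8-pin auxiliary connector

def board_budget(six_pins, eight_pins):
    """Maximum in-spec board power for a given connector loadout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(1, 1))   # 300 -> the current high-end ceiling
print(board_budget(0, 2))   # 375 -> why a card with 2x 8-pin reads as over-spec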
Quote:
"There's probably an evaluation of many factors, but I would imagine that expected deployments of a product on commercially available and in-spec platforms is a worthwhile thing to consider."

I really doubt they base the calculations for card wattage on a massively nonstandard setup (a machine with 4 PCIe slots instead of 2, running 4 dual GPU cards). I think it borders on absurd to suggest they would.
That board seems to put the limit at 3-way graphics.
Quote:
"Then the current 300W specification is not out of date and does not need increasing."

Like I said - red herring. This card is about getting the fastest single card on the market. I doubt NVIDIA cares what the PCI SIG thinks about their card, and I doubt the SIG will bother researching a specification based on one niche-market setup that might pull too much power if Joe Random sets one up and leaves a whole suite of benchmarks running nonstop all day for a year.
I'm waiting for the version with the external power cord.
Because the card (the Voodoo 5 6000) used more power than the AGP specification allowed for, a special power supply called Voodoo Volts was to have been included with it. This would have been an external device connecting to an AC outlet. Most of the prototype cards instead used a standard internal drive power connector from the PC's own power supply.