Profit of NV40 parts

There have been some unconfirmed rumours floating around various forums saying that neither Molex has to be connected as long as the computer is turned off!!

:p
 
First, only "enthusiasts" (and by that I assume he means people who play 3d games) would ever buy a 6800U in the first place, unless they were quite insane, of course...

Naw, I know a guy who bought the most amazing stick-shift, and yet he only knows how to drive an automatic. He gets his girlfriend to drive him around (damn lucky bastid BTW)

There are tons of people who buy expensive things that they have no idea how to set up, never mind use. Case in point: early Video Cassette Recorders.
 
I tend to believe the two molex connectors are needed because of the AWG rating of the wires, too.

It comes down to voltage (really amperage) drops due to wire loss. It is doubtful, but not totally impossible, that the current draw would get to the point where the wire would become a fire hazard. If it was touching a 70-Celsius CPU heatsink and at max current draw, well then yeah, it would be a fire hazard (but I would totally blame the guy who put it together that crappily first).
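
For a rough sense of scale, here's a minimal Python sketch of that wire-loss argument; the per-gauge resistances are standard copper figures, but the 0.45 m lead length and 5 A draw are just assumptions for illustration, not measurements of any particular PSU.

Code:
# Approximate copper resistance in ohms per metre at room temperature
OHMS_PER_METRE = {16: 0.0132, 18: 0.0210, 20: 0.0333, 22: 0.0530}

def voltage_drop(current_a, gauge_awg, length_m):
    # Round-trip drop: out along the supply wire and back along the ground wire
    return current_a * OHMS_PER_METRE[gauge_awg] * (2 * length_m)

drop = voltage_drop(5.0, 18, 0.45)   # 5 A over an assumed 0.45 m 18 AWG lead
print(f"Drop: {drop:.3f} V ({drop / 12 * 100:.1f}% of a 12 V rail)")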

Using only one connector would strain the power supply, and any better-quality power supply would automatically compensate for it.

However: a lot of cheap power supplies have the 3.3 and 5 volt rails "tied" together along with the rest of the rails; that is, they all fluctuate together, which can cause system instability. A really good power supply will have independent circuitry for every voltage and rail (although really it's about as rare as finding a motherboard with 6-phase power).
 
I would have to agree that this is related more to the AWG rating of the wire. I highly doubt the card pulls more current than one wire can support.

When some regular guy buys the card and has to plug in both connectors, however, they will probably use the same line from the PSU. Most lines I've seen coming off a PSU only have two molex connectors on them. If you plug them both into the card, you don't have to worry about some idiot plugging one into the video card and the other into something else that draws a large amount of current. It might be more of a safety precaution than anything else. If you make them use both molex connectors, it's unlikely they can hook anything else up to that wire, giving the card all the available current that the wire would have. It wouldn't necessarily matter if you only connected one and left the other hanging.

In the case where you used two separate wires, there should be plenty of room across both of them. I'd think it's more about isolating an entire rail to power the card.

I suppose it could be because of cheap wires used in PSUs, but who has a PSU that can put out enough power to run the kind of system this thing is likely to end up in, yet uses really cheap wires? If a company makes a PSU capable of putting out well over 400 W, it's highly likely they're going to use wires that can at least handle that much power.
 
So it will automatically clock down with only one connector plugged in? 'Cause what I think you're saying is that too much draw could overheat the wire and melt it. With the wire being that far below AWG spec on an average/below-average PSU, that 5 amp* pull could melt it?

But if you're running a split from the same single lead, it will still draw the same amps and melt anyway... unless it cycled the current between the two connectors of the split to stay safe (bah, no way, there can't be that kind of margin for fire and doom). Or else you wouldn't split the lead at all, you would use two leads - still cycled (for class-action protection), but with a steadier current? And if run on one lead/one connector, it would still clock down (or not clock up) for fear of fire, doom and death!? I don't think so.
*http://www.spodesabode.com/content/article/nv40r420/2
Can 22 AWG really carry 8 amps?
 
I am kind of thinking they should do what the old Voodoo cards did, with the external power brick to feed the card.

And I am thinking of the OEM side of things for a company like Dell. How many 6800U's do you think Dell will ever use? I am thinking not many, if any. With an external brick solution, that could be a different story.
 
I was thinking more along the lines that the card alone could pull, say, 85% of the rated current for a given line. Making them plug the second molex connector into the card would prevent them from hooking another device up to the same wire, which could in theory pull that remaining 15% or more. If you use two wires, then it would go half and half. It's doubtful any other device in a common system would be pulling more than 50% of one rail. You'd have to be powering a RAID array with a bunch of Y-connectors to get that kind of draw out of any other component I can think of offhand.

Allowing for a voltage drop because of the wire is a possibility, but there are regulators on the card itself to drop that voltage down even further, so I don't think the drop would be enough to be an issue.
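
A quick back-of-envelope sketch of that budget argument, in Python; the 8 A line rating and the 85% figure are purely illustrative assumptions, not measured values.

Code:
line_rating_a = 8.0                      # assumed rating of one molex line
card_draw_a = 0.85 * line_rating_a       # card pulling ~85% of that line
headroom_a = line_rating_a - card_draw_a
print(f"Card: {card_draw_a:.1f} A, headroom left on the line: {headroom_a:.1f} A")

# Split the card across two separate leads instead: roughly half per lead
per_lead_a = card_draw_a / 2
print(f"Per lead if split across two leads: {per_lead_a:.1f} A")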
 
PaulS said:
Just as I'm listening:

- The GPU is affected heavily by the compiler, which is shipping at 1.0 and is very early. They expect big performance boosts in later drivers

Y'know, there was a time when you could mostly accept that at face value from NV. Too bad they trashed their own rep.

Still, given the relative differences between the previous generations of the two cards (i.e. NV4x vs NV3x as opposed to R4xx vs R3xx), it would seem to make sense that NV would 1) have the most problems with legitimate bugs (eesh, what a concept) early on, and 2) have the biggest *legitimate* performance delta out there to capture from driver improvements as they learn the ins and outs of their new baby. Add in the expected late mass availability of NV4x to get enough of them out there for the enthusiast community to prod at its innards, and it is likely to be another three months before we have a solid idea of the comparative performance between NV4x and R4xx.
 
The maximum allowable current for common PSU wires is:

20 gauge - 11 amps
18 gauge - 16 amps
16 gauge - 22 amps

These values do have a safety margin built into them, so 12 amps is not likely to set a 20 gauge wire ablaze, but nvidia still has to design for less current than this.

How much current is the gf6 drawing? I dunno. If you assume 100 watts from the 5V line, though, you get 20 amps... which is too much for 18 and 20 gauge wire.
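
For what it's worth, a small Python sketch of that check (current = power / voltage, compared against the ampacity figures listed above); the 100 W figure is the same assumption as above, not a measurement.

Code:
AMPACITY = {20: 11, 18: 16, 16: 22}      # amps, from the figures above

def amps(power_w, rail_v):
    return power_w / rail_v

draw = amps(100, 5)                       # the assumed 100 W on the 5 V line
print(f"100 W on the 5 V rail = {draw:.0f} A")
for gauge, limit in AMPACITY.items():
    status = "ok" if draw <= limit else "over the limit"
    print(f"  {gauge} AWG rated {limit} A -> {status}")
print(f"100 W on the 12 V rail = {amps(100, 12):.1f} A")   # only ~8.3 A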

Keep in mind that the physical melting of the wire is not the only problem... the plastic connectors used on molex cables occasionally make pretty bad contact themselves... I've seen them blackened before from this while driving much smaller loads than a gf6.
 
As seen from here [ http://www.spodesabode.com/content/article/nv40r420/2 ]

From SpodesAbode GF6 measurements:
12v Rail: 5.0 Amps, 60 Watts
5v Rail: 3.5 Amps, 17.5 Watts
AGP Power: 46 Watts theoretical maximum

Both rails combine for 77.5 Watts. Maximum theoretical draw of GF6 is 123.5 Watts.

From SpodesAbode 9800XT measurements:
12v Rail: 2.2 Amps, 26.4 Watts
5v Rail: 3.5 Amps, 17.5 Watts
AGP Power: 46 Watts theoretical maximum

Both rails combine for 43.9 Watts. Maximum theoretical draw of 9800XT is 89.9 Watts.

The AGP power was not measured, but comes from the maximum the specification allows. The X800 XT draws less power than the 9800XT. 8)
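
For anyone wanting to re-run the arithmetic, a small Python sketch that recomputes those totals from the quoted per-rail currents (volts times amps per rail, plus the 46 W AGP ceiling from the spec):

Code:
AGP_MAX_W = 46.0                              # AGP slot ceiling from the spec

def card_power(amps_12v, amps_5v):
    rails_w = 12.0 * amps_12v + 5.0 * amps_5v
    return rails_w, rails_w + AGP_MAX_W

gf6_rails, gf6_max = card_power(5.0, 3.5)     # 60 + 17.5 W
xt_rails, xt_max = card_power(2.2, 3.5)       # 26.4 + 17.5 W
print(f"GF6:    rails {gf6_rails:.1f} W, theoretical max {gf6_max:.1f} W")
print(f"9800XT: rails {xt_rails:.1f} W, theoretical max {xt_max:.1f} W")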
 
BRiT said:
Both rails combine for 77.5 Watts. Maximum theoretical draw of GF6 is 123.5 Watts.

The only thing I can think of, then, is that the worst-case power draw calculated by nvidia is larger than what the website tested, and that they employ a generous safety margin. Which makes sense if you consider that other devices have to share the same wires (hard drives, CD drives and such).
 
ninelven said:
It is clear that Jen-Hsun was specifically talking about overclocking.

You'd have thought that would have been obvious from what I said on the very first page, when I rounded up what was said in the call:

PaulS said:
- 6800U designed specifically for enthusiasts, has lots of "frequency headroom", and comes with an optional second power connector, required only for overclocking

But going by some of the replies, I guess not ;)
 
BRiT said:
The AGP power was not measured, but comes from the maximum the specification allows. The X800 XT draws less power than the 9800XT. 8)

You know, that is really quite something. I can only assume, though, that it is because the X800XT is done on the low-k 0.13 micron process while the Radeon 9800XT is done on the 0.15 micron process. (Was it done with low-k, though?) But a further reduction in the card's power consumption would be attributed to the GDDR3 memory, wouldn't it?
 
Sabastian said:
You know, that is really quite something. I can only assume, though, that it is because the X800XT is done on the low-k 0.13 micron process while the Radeon 9800XT is done on the 0.15 micron process. (Was it done with low-k, though?) But a further reduction in the card's power consumption would be attributed to the GDDR3 memory, wouldn't it?
No. The only low-K to come out of the R3xx generation was in the 9600XT, I believe--testing the process for the results we see now in the high end.
 
cthellis42 said:
Sabastian said:
You know, that is really quite something. I can only assume, though, that it is because the X800XT is done on the low-k 0.13 micron process while the Radeon 9800XT is done on the 0.15 micron process. (Was it done with low-k, though?) But a further reduction in the card's power consumption would be attributed to the GDDR3 memory, wouldn't it?
No. The only low-K to come out of the R3xx generation was in the 9600XT, I believe--testing the process for the results we see now in the high end.

Yeah, that is what I thought... So what of the GDDR3 reducing the card's power draw? Is that an accurate assessment? I knew the 9600XT was a low-k 0.13 micron part; it was the first.
 
Sabastian said:
Yeah, that is what I thought... So what of the GDDR3 reducing the card's power draw? Is that an accurate assessment? I knew the 9600XT was a low-k 0.13 micron part; it was the first.

Well, GDDR3 is lower power, but it's also running at a significantly higher frequency: 365 vs 560 MHz (more than 50% higher).
 
AlphaWolf said:
Sabastian said:
Yeah, that is what I thought... So what of the GDDR3 reducing the card's power draw? Is that an accurate assessment? I knew the 9600XT was a low-k 0.13 micron part; it was the first.

Well, GDDR3 is lower power, but it's also running at a significantly higher frequency: 365 vs 560 MHz (more than 50% higher).

So the higher clock rate on the memory effectively negates that lower power draw. Thanks for that.
 
- AnandTech : 500/550MHz
- CHIP : 520/560 MHz
- HardOCP : 520/560 MHz
- Toms Hardware Guide : 525/575 MHz
- Extreme Tech : 520/600 MHz
- Beyond3D : 520/550 MHz

The stock speed is 520/560.

However, Toms ran their tests at stock speed, and the Extremetech number is incorrect - it's unclear what memory clock they got (550 or 560?).
 