Profit of NV40 parts

digitalwanderer said:
Atomahawk said:
What I was going to say has already been said: 1 molex, no go.
You actually tried and confirmed this? (Not doubting, just checking)

What happened?

Just like Kyle reported and the BFG results showed, and it will get worse if the PSU already has a significant load applied for other components.

There are electrical standards involved that regulate the sale of PSUs and other components in our computers. So if Nvidia says 2 molexes, there's a reason. Not all PSUs are made well, so they have no choice but to give those specs.
 
3DMark moves a lot of shaders. UT2004 doesn't. Since the GPU is doing more mathematical operations in the 3DMark test, that should be why it's showing artifacts there under those conditions.

If the above is true, expect it to happen in other shader-intensive titles if one molex is used.
 
So much for Far Cry and Doom III on one molex then. I'd like to see a list of titles that would work with one molex and the ones that don't.

Now as for the CEO lying to investors on the CC yesterday afternoon. Let's see him bail out of this one. He could get fired for what he said on the CC. :oops:

He stated "Normal operation can be done with one molex, and two would be needed for the gaming enthusiast"

What is normal operation....Microsoft Word=one molex?
What is a gaming enthusiast....anyone who runs a 3D app=two molex?

His cocky little mouth just got him in some serious poopoo with investors.
 
Just thought I'd come out of lurking for a second to point something out you guys seem to be missing...

I keep reading that the 2 molex connectors are there to supply more 'power'. This is not really correct.

It is highly likely that the dual molex connectors are there because the wire gauge used on some cheaper power supplies is too small to safely support the current draw... the wires will either heat up and pose a fire hazard, or their resistance is too high, which reduces the voltage the graphics card 'sees'. The connector itself is also a problem: molex connectors are all over the place when it comes to connection quality, and they can cause the same problems as the wire gauge. I've had a molex connector melt from a poor connection...

Using 2 lines from the PSU results in roughly half the current running through each set of conductors (equivalent to dropping the effective wire gauge by about 3 sizes). It does not supply more 'power', because both of those lines are still running from the same 'rail'; they are simply connected in parallel.
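To put rough numbers on that, here's a quick sketch of how a second parallel line cuts both the voltage drop and the heat in the wiring. The resistance and current figures are invented for illustration, not measured NV40 values:

```python
# Rough sketch of why two parallel supply lines help.
# Resistance and current figures are assumed, not measured NV40 values.
WIRE_RESISTANCE = 0.05   # ohms of wire + connector per line (assumed)
CARD_CURRENT = 10.0      # amps the card pulls over the 5V rail (assumed)

def drop_and_heat(num_lines: int) -> tuple[float, float]:
    """Voltage lost in the wiring and heat dissipated in it."""
    effective_r = WIRE_RESISTANCE / num_lines   # parallel lines divide the resistance
    v_drop = CARD_CURRENT * effective_r         # V = I * R
    heat = CARD_CURRENT ** 2 * effective_r      # P = I^2 * R
    return v_drop, heat

print(drop_and_heat(1))  # (0.5, 5.0)  -> half a volt lost, 5 W warming the wire
print(drop_and_heat(2))  # (0.25, 2.5) -> both are halved with a second line
```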

So running only one connector should not fry the video card, assuming the connection is good and the wire gauge is sufficient. It's just there for the sake of safety, in case someone tries to run the card off a PSU with rusty connectors and 30-gauge wire.
 
I don't seem to follow your logic. I was under the assumption that it's not the voltage that really does the damage, but the current. If you were to hook those up in parallel you are essentially reducing the resistance, creating a higher current. That then creates more heat. Hey, it's been a while since I've done any electrical work, so I could be wrong. I also seem to have read somewhere that you had to use molex connectors from different lines, not the same line, or you could pop the PSU.
 
I don't seem to follow your logic. I was under the assumption that it's not the voltage that really does the damage, but the current. If you were to hook those up in parallel you are essentially reducing the resistance, creating a higher current. That then creates more heat. Hey, it's been a while since I've done any electrical work, so I could be wrong. I also seem to have read somewhere that you had to use molex connectors from different lines, not the same line, or you could pop the PSU.

Reducing resistance does increase current, and yes, you would be reducing the resistance by connecting them in parallel, BUT it would be the resistance of the wires and the connectors you're reducing, not the graphics card. The graphics card has an internal resistance that is much greater than that of the wires, and connecting the wires in parallel does not change this. Effectively, the circuit is a resistor representing the wire in series with a resistor representing the graphics card.

If the resistance of the wire is high compared to the vid card, then the 5V will distribute itself proportionally between the wire and the vid card (say 4V to the card and 1V to the wire). This is bad.

If the resistance of the wire is negligible compared to the graphics card, then the card will receive 4.99V and the wire will get 0.01V. This is good.

As long as the graphics card does not somehow lose its internal resistance, more power and hence more heat will not be generated. Going to a thicker wire (a lower gauge number) also increases the heat capacity of the wire... it requires more power to heat it to a given temperature. This is why a really thin wire is more likely to melt under a given current than a thick one.
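Here's a minimal voltage-divider sketch of that wire-in-series-with-card picture. The resistance values are invented purely to reproduce the 4V / 4.99V examples above, not real card or wire figures:

```python
# Minimal voltage-divider sketch of "wire resistor in series with card resistor".
# The resistances are invented to reproduce the 4 V / 4.99 V examples, not real values.
SUPPLY_V = 5.0

def voltage_at_card(r_wire: float, r_card: float) -> float:
    """Share of the supply voltage that actually reaches the card."""
    return SUPPLY_V * r_card / (r_wire + r_card)

print(voltage_at_card(r_wire=0.125, r_card=0.5))   # ~4.0 V  -> bad wiring, card is starved
print(voltage_at_card(r_wire=0.001, r_card=0.5))   # ~4.99 V -> good wiring, card is happy
```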

And yes... they have to be different lines. That's where I got the 'parallel' bit from. Using 2 connectors from the same line is the same or worse than using a single connector. Everything I said above assumes you are using 2 separate 'lines'.
 
Stryyder said:
NVIDIA is reporting they expect a rise in margin of 1 to 1.5% by Q3 2004.

A lot of you have been saying that the NV40 is a part that they will be offering at a reduced margin because of manufacturing and yield issues. Anyone want to tackle this apparent discrepancy??

Sure. Very simply, what nVidia expects, and what nVidia gets, need not be congruent...;) Discrepancy eliminated.

For reference, check back to published '02 nVidia expectations as to their nV30 market penetration predictions for nV30 in '03, a few months prior to nVidia's cancellation of nV30. Some monster discrepancies there between what nVidia publicly stated it expected from nV30 and what nVidia got from nV30...;)

Also, you can obtain percentage increases in margins by cutting costs, as margin rates will not necessarily reflect downturns in profits or volume (i.e., you can increase your percentage margin and experience a drop in total revenue at the same time, etc.)
 
I was under the assumption that it's not the voltage that really does the damage, but the current.

This is true, current is the likely cause of physical damage. But where the voltage comes in is that if the card is only getting 4V... either you will get corruption because the transistors cannot switch properly, OR the power circuitry will attempt to maintain a constant power. Since Power = I (current) * V (voltage), lowering V while keeping P the same means you are increasing I (current). A poopy connection/supply wire can result in a lower V at the card.
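A quick sketch of that P = I * V point, assuming (purely for illustration, not an NV40 spec) the card tries to pull a constant 50W from the 5V rail:

```python
# If the regulator tries to hold power constant (P = I * V), a sagging supply
# voltage pushes the input current up. The 50 W figure is assumed, not an NV40 spec.
CARD_POWER = 50.0  # watts drawn from the 5V rail (assumed)

for volts in (5.0, 4.5, 4.0):
    amps = CARD_POWER / volts
    print(f"{volts:.1f} V -> {amps:.1f} A")
# 5.0 V -> 10.0 A
# 4.5 V -> 11.1 A
# 4.0 V -> 12.5 A
```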

Increasing the voltage too much can kill the card too... but with computer enthusiasts this tends to be the result of heat rather than the voltage itself. Not that voltage won't kill it by itself either... just keep your Tesla coil away from the card.
 
Dithium said:
So much for Far Cry and Doom III on one molex then. I'd like to see a list of titles that would work with one molex and the ones that don't.

Now as for the CEO lying to investors on the CC yesterday afternoon. Let's see him bail out of this one. He could get fired for what he said on the CC. :oops:

He stated "Normal operation can be done with one molex, and two would be needed for the gaming enthusiast"

What is normal operation....Microsoft Word=one molex?
What is a gaming enthusiast....anyone who runs a 3D app=two molex?

His cocky little mouth just got him in some serious poopoo with investors.

Actually he covered his rear quite effectively. Under normal non-gaming use the one plug is fine; it is only "enthusiast" use that requires both connectors. Games that aren't shader intensive run fine, while those bleeding-edge (i.e. enthusiast) games will require both connectors.

It probably has more to do with resistance within the board limiting the amount of power available to the GPU. A second power feed in a different location can deliver more voltage if it enters the circuit closer to the destination, because there is less resistance in the path.
 
Hey guess what! Jen-Hsun Huang speaks again!

"The NVidia line of cards is now so advanced that you no longer need to use either molex connector's. They're just there as a conveninece for computer enthusiasts" :rolleyes:


How many non-gamers are going to buy the 6800 Ultra?
 
ThumperZ said:
Actually he covered his rear quite effectively. Under normal non-gaming use the one plug is fine; it is only "enthusiast" use that requires both connectors. Games that aren't shader intensive run fine, while those bleeding-edge (i.e. enthusiast) games will require both connectors.

I'm glad you think he "covered his rear quite effectively," although I'd suggest wearing pants would have been more effective...;)

First, only "enthusiasts" (and by that I assume he means people who play 3d games) would ever buy a 6800U in the first place, unless they were quite insane, of course...;)

Second, what is the utility in even suggesting that mode of operation for the 6800U? Are we to presume that the CEO of nVidia recommends that people remove the side cover and unplug one of the connectors prior to booting up to browse the Internet, then power down, remove the side cover, plug the second connector back in, replace the side cover, and boot up prior to playing a 3d game? Rinse and repeat endlessly? This is what JHH terms an "advanced" product?

Please...tell me this is all a bizarre joke...;) I cannot believe a grown man would make such a statement with a straight face. Heh...;) Much better of course for anyone purchasing a 6800U to simply plug in both connectors and forget it. Hmmm....maybe JHH can persuade Michael Dell to install "Molex Connector Toggle Switches" on a line of his boxes 'specially for the "advanced" 6800U......Nah...forget that.
 
useless_engineer said:
If the resistance of the wire is high compared to the vid card, then the 5V will distribute itself proportionally between the wire and the vid card (say 4V to the card and 1V to the wire). This is bad.

If the resistance of the wire is negligible compared to the graphics card, then the card will receive 4.99V and the wire will get 0.01V. This is good.
I was under the impression, though, that these graphics cards mainly (or exclusively) use the 12V rail. In that case, the current will not be that high (below 9A), and so there is less voltage drop (as well as less "cable heat"). It would be nice if someone with a multimeter could measure the current on the 5V and 12V lines...
 
mczak said:
useless_engineer said:
If the resistance of the wire is high compared to the vid card, then the 5V will distribute itself proportionally between the wire and the vid card (say 4V to the card and 1V to the wire). This is bad.

If the resistance of the wire is negligible compared to the graphics card, then the card will receive 4.99V and the wire will get 0.01V. This is good.
I was under the impression, though, that these graphics cards mainly (or exclusively) use the 12V rail. In that case, the current will not be that high (below 9A), and so there is less voltage drop (as well as less "cable heat"). It would be nice if someone with a multimeter could measure the current on the 5V and 12V lines...

The 12V supply does have its uses, but delivering power isn't it. If the 12V is used at all, it's for other things. The chips run at a voltage between 1 and 3.3V; the NV40 core runs at something between 1.4 and 1.8V. They get that power from a switching regulator that is connected to the 5V line(s).
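If that's right, a rough estimate shows why the 5V current adds up fast. The core power and regulator efficiency below are assumptions for illustration, not NV40 specs:

```python
# Why the 5V current adds up fast if the core is fed from the 5V line through a
# switching regulator. Core power and efficiency are assumptions, not NV40 specs.
CORE_POWER = 60.0            # watts delivered to the GPU core (assumed)
REGULATOR_EFFICIENCY = 0.85  # plausible efficiency for a buck regulator (assumed)
INPUT_V = 5.0

input_power = CORE_POWER / REGULATOR_EFFICIENCY
input_current = input_power / INPUT_V
print(f"{input_power:.0f} W from the 5V rail -> {input_current:.1f} A")
# ~71 W -> ~14.1 A, more than you'd want through one cheap molex/wire pair
```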
 
The amp assumptions are only good if your MB is a good one and you don't have any other 12V devices... 13 amps on the 12V rail ranges from low to just plain bad.
 
Why is he a liar, specifically? Most executives spin things in their company's favor and speak negatively about their closest competitors. You think Dave Orton of ATI doesn't spin things in ATI's favor relative to NVIDIA? This is just reality. Neither ATI nor NVIDIA are saints; they are held accountable to their shareholders first and foremost.

I can't remember anybody from ATI ever saying anything bad about, or even revealing anything about, NV's products, but I have heard Jen say things about ATI's products multiple times, especially during the FX hype days.

And was it Jen or someone else at NV who said this board is filled with morons?
 
DiGuru said:
Yes! These are the numbers I was looking for.
:oops: :oops: :oops:

I wouldn't have believed it, if I hadn't seen that. Extremely extreme!!!
This can't be a surprise. For CPUs, exactly the same thing has happened - they used to draw from 3.3V/5V, but now have a separate 12V connector.
Though the NV40U numbers are a bit on the high side, maybe it doesn't use the 3.3V voltage from the AGP slot at all (for simpler voltage regulator design, thus limiting power draw from AGP to 26W max?).
5A should be no problem though IMHO for a single connector (it certainly shouldn't be a problem for the wire).
I'm really wondering why quite a few PSU manufacturers still build supplies which can provide 50A at 5V, but hardly 10A at 12V - or maybe they are doing it on purpose, so you have to buy a new one real soon...
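For comparison, the wattage those quoted ratings work out to (just P = V * I on the numbers above, taking the ratings as quoted rather than verified):

```python
# Wattage comparison for the PSU ratings quoted above (ratings as quoted, not verified).
rails = {"5V rail": (5.0, 50.0), "12V rail": (12.0, 10.0)}
for name, (volts, amps) in rails.items():
    print(f"{name}: {volts * amps:.0f} W available")
# 5V rail: 250 W
# 12V rail: 120 W
```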
 
Hi all,

I've already done some tests with one molex for my NV40U preview.

If you don't plug in any molex, the NV40U speaker goes crazy and the PC does not boot.

If you plug in only the lower molex, the NV40U speaker goes crazy and the PC does not boot.

If you plug in only the upper molex, the card works at 400/550 but there are some artifacts in GPU-intensive 3D apps like 3DMark.
 