Profit of NV40 parts

Discussion in 'Graphics and Semiconductor Industry' started by Stryyder, May 6, 2004.

  1. Dean

    Newcomer

    Joined:
    May 26, 2003
    Messages:
    34
    Likes Received:
    0
    Location:
    Charlottetown, Canada
    There have been some unconfirmed rumours floating around various forums saying that neither Molex has to be connected, as long as the computer is turned off!!

    :p
     
  2. ZenOps

    Newcomer

    Joined:
    May 5, 2004
    Messages:
    36
    Likes Received:
    0
    Naw, I know a guy who bought the most amazing stick-shift, and yet he only knows how to drive an automatic. He gets his girlfriend to drive him around (damn lucky bastid, BTW).

    There are tons of people who buy expensive things that they have no idea how to set up, never mind use. Case in point: early Video Cassette Recorders.
     
  3. ZenOps

    Newcomer

    Joined:
    May 5, 2004
    Messages:
    36
    Likes Received:
    0
    I tend to believe the two Molex connectors are needed because of the AWG rating of the wires too.

    It comes down to voltage (really amperage) drops due to resistive loss in the wire. It is doubtful, but not totally impossible, that the current draw would get to the point where the wire would become a fire hazard. If it was touching a 70 Celsius CPU heatsink and at max current draw, well then, yeah, it would be a fire hazard (but I would totally blame the guy who put it together that crappily first).

    Using only one connector would strain the power supply, and any better quality power supply would automatically compensate for it.

    However: a lot of cheap power supplies have the 3.3 V and 5 V rails "tied" together, as well as the 12 V rail. That is, they all fluctuate together, which can cause system instability. A really good power supply will have independent circuitry on every voltage rail (although really it's about as rare as finding a 6-phase power motherboard).
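
    To put rough numbers on that resistive-loss point, here is a minimal Python sketch. Every input is an assumption for illustration (typical 18 AWG copper at roughly 6.4 milliohms per foot, an 18-inch lead, 5 A of draw); none of these are measured figures from this thread:

    # Illustrative voltage-drop estimate for a single Molex lead.
    # Assumed values: 18 AWG copper (~6.4 mohm/ft), an 18-inch lead,
    # and a 5 A draw. None of these come from the thread.
    R_PER_FT = 0.0064      # ohms per foot, typical 18 AWG copper
    LEAD_FT = 1.5          # 18-inch lead
    AMPS = 5.0             # assumed current draw

    # Current flows out and back, so the round trip doubles the length.
    r_wire = 2 * LEAD_FT * R_PER_FT
    v_drop = AMPS * r_wire            # V = I * R
    p_heat = AMPS ** 2 * r_wire       # P = I^2 * R, dissipated in the wire

    print(f"drop: {v_drop * 1000:.0f} mV, heat: {p_heat:.2f} W")
    # ~96 mV and ~0.5 W of heat: noticeable, but nowhere near a fire
    # hazard on its own, which fits the "doubtful but not impossible" take.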
     
  4. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    I would have to agree that this is related more to the AWG rating of the wire. I highly doubt the card pulls more current than one wire can support.

    When some regular guy buys the card and has to plug in both connectors, however, he will probably use the same line from the PSU. Most lines I've seen off of PSUs only have two Molex connectors on them. If you plug them both in, you don't have to worry about some idiot plugging one into the video card and the other into something else that draws a large amount of current. It might be more of a safety precaution than anything else. If you make them use both Molex connectors, it's unlikely they can hook anything else up to that wire, giving the card all the available current that the wire can carry. It wouldn't necessarily matter if you only connected one and left the other hanging.

    In the case where you used two separate wires, there should be enough headroom across both. I'd think it's more about isolating an entire rail to power the card.

    I suppose it could be because of cheap wires used in PSUs, but who has a PSU capable of powering the kind of system this card is likely to end up in that also uses really cheap wires? If a company makes a PSU capable of putting out well over 400 W, it's highly likely they're going to use wires that can at least handle that much power.
     
  5. NitroX

    Newcomer

    Joined:
    Mar 4, 2003
    Messages:
    6
    Likes Received:
    0
    Location:
    My Basement
    So I guess the 124 watt power supply in my eMachine just will not cut it then.
    Damn :twisted:
     
  6. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
    So it will automatically clock down with only one connector plugged in? Because what I think you're saying is that too much draw could overheat the wire and melt it. The wire being so far below AWG spec on an average/below-average PSU that a 5 amp* pull could melt it?

    But if you're running a split from the same single lead, it will still draw the same amps and melt anyway... unless it cycled the current between the two connectors on a single lead to stay safe (bah, no way, there can't be that kind of margin for fire and doom). Or you would not split the lead, you would use two leads; still cycling (for class-action protection), but with a steadier current? But if run on one lead/one connector, would it still clock down (or not clock up) for fear of fire, doom and death!? I don't think so.
    * http://www.spodesabode.com/content/article/nv40r420/2
    Can a 22 AWG wire really carry 8 amps?
     
  7. Scott@bjorn3d

    Newcomer

    Joined:
    Mar 11, 2003
    Messages:
    5
    Likes Received:
    0
    Location:
    Sweet Home Alabama
    I am kind of thinking they should do what the old Voodoo cards did, with their external power brick feeding the card.

    And I am thinking of the OEM side of things for a company like Dell. How many 6800U's do you think Dell will ever use? I am thinking not many, if any. With an external brick solution, that could be a different story.
     
  8. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    I was thinking more along the lines that the card alone could pull, say, 85% of the rated current for a given line. Making them plug the second Molex connector into the card would prevent them from hooking another device up to the same wire, which could in theory pull that remaining 15% or more. If you use two wires, then the draw would be split half and half. It's doubtful any other device in a common system would be pulling more than 50% of one rail; you'd have to be powering a RAID array with a bunch of Y connectors to get that kind of draw out of any other component I can think of offhand. A rough sketch of this budget is below.

    Allowing for a voltage drop because of the wire is a possibility, but there are regulators on the card itself to drop that voltage down even further, so I don't think the drop would be enough to be an issue.
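
    Here is a minimal Python sketch of that current-budget argument. The 85% figure is the post's own hypothetical; the 11 A per-lead rating (typical for 20 gauge PSU wire) and the 20% second device are assumptions used purely for illustration:

    # Hedged sketch of the current-budget argument above. All numbers are
    # assumptions for illustration: an 11 A rating for one 20 AWG lead,
    # the card at 85% of that, and a hard drive sharing the lead at 20%.
    LEAD_RATING_A = 11.0
    card = 0.85 * LEAD_RATING_A          # "say 85% of the rated current"
    other = 0.20 * LEAD_RATING_A         # another device on the same lead

    # One shared lead: card + other device exceeds the rating.
    print(card + other > LEAD_RATING_A)  # True -> over budget

    # Two leads with the draw split half and half: comfortably in spec.
    per_lead = card / 2
    print(per_lead / LEAD_RATING_A)      # ~0.43 of the rating per lead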
     
  9. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    Y'know, there was a time when you could mostly accept that at face value from NV. Too bad they trashed their own rep.

    Still, given the relative differences between the previous generations of the two cards (i.e. NV4x vs NV3x, as opposed to R4xx vs R3xx), it would seem to make sense that NV would 1) have the most problems with legitimate bugs (eesh, what a concept) early on, and 2) have the biggest *legitimate* performance delta out there to capture from driver improvements as they learn the ins and outs of their new baby. Add in the expected late mass availability of NV4x before enough of them are out there for the enthusiast community to prod at their innards, and it is likely to be another three months before we have a solid idea of the comparative performance between NV4x and R4xx.
     
  10. useless_engineer

    Newcomer

    Joined:
    May 7, 2004
    Messages:
    6
    Likes Received:
    0
    Location:
    ON, Canada
    The maximum allowable current for common PSU wires is:

    20 gauge - 11 amps
    18 gauge - 16 amps
    16 gauge - 22 amps

    These values do have a safety margin built into them, so 12 amps is not likely to set a 20 gauge wire ablaze, but Nvidia still has to design for less current than this.

    How much current is the GF6 drawing? I dunno. If you assume 100 watts from the 5 V line, though, you get 20 amps... which is too much for 18 and 20 gauge wire.

    Keep in mind that the physical melting of the wire is not the only problem... the plastic connectors used on Molex cables occasionally make pretty bad contact themselves. I've seen them blackened from this while driving much smaller loads than a GF6.
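
    To double-check the 100 W / 5 V arithmetic against the ampacity table above, here is a quick Python sketch; the 100 W figure is the post's own hypothetical, not a measurement:

    # Checking the post's numbers: I = P / V versus the ampacity table.
    AMPACITY = {20: 11, 18: 16, 16: 22}   # gauge -> max amps, per the post

    draw = 100 / 5                        # 20.0 A from the 5 V line
    for gauge, limit in AMPACITY.items():
        print(gauge, "AWG:", "OK" if draw <= limit else "too much")
    # 20 and 18 AWG both fail at 20 A; only 16 AWG carries it. Pull the
    # same 100 W from the 12 V rail instead and the current drops to
    # 100 / 12, about 8.3 A, which any of the three gauges can handle.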
     
  11. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,278
    Likes Received:
    8,478
    Location:
    Cleveland
    As seen from here [ http://www.spodesabode.com/content/article/nv40r420/2 ]

    From SpodesAbode GF6 measurements:
    12v Rail: 5.0 Amps, 60 Watts
    5v Rail: 3.5 Amps, 17.5 Watts
    AGP Power: 46 Watts theoretical maximum

    Both rails combine for 77.5 Watts. Maximum theoretical draw of GF6 is 123.5 Watts.

    From SpodesAbode 9800XT measurements:
    12v Rail: 2.2 Amps, 26.4 Watts
    5v Rail: 3.5 Amps, 17.5 Watts
    AGP Power: 46 Watts theoretical maximum

    Both rails combine for 43.9 Watts. Maximum theoretical draw of 9800XT is 89.9 Watts.

    The AGP power was not measured, but comes from the maximum the specification allows. The X800 XT draws less power than the 9800XT. 8)
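
    Those totals follow directly from P = V x I per rail. A small Python sketch reproducing them (the rail amps are SpodesAbode's measurements; the 46 W AGP figure is the spec ceiling, as noted above):

    # Reproduce the totals above from the per-rail figures (P = V * I).
    AGP_MAX_W = 46.0   # spec maximum, not measured

    def molex_power(rails):
        """rails: list of (volts, amps) pairs for the Molex feeds."""
        return sum(v * a for v, a in rails)

    gf6 = molex_power([(12, 5.0), (5, 3.5)])      # 60.0 + 17.5 = 77.5 W
    r9800 = molex_power([(12, 2.2), (5, 3.5)])    # 26.4 + 17.5 = 43.9 W

    print(gf6, gf6 + AGP_MAX_W)        # 77.5, 123.5 W theoretical max
    print(r9800, r9800 + AGP_MAX_W)    # 43.9, 89.9 W theoretical max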
     
  12. useless_engineer

    Newcomer

    Joined:
    May 7, 2004
    Messages:
    6
    Likes Received:
    0
    Location:
    ON, Canada
    The only thing I can think of, then, is that the worst-case power draw calculated by Nvidia is larger than what the website tested, and they employ a generous safety margin. Which makes sense if you consider that other devices have to share the same wires (hard drives, CD drives and such).
     
  13. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,699
    Likes Received:
    117
  14. PaulS

    Regular

    Joined:
    May 12, 2003
    Messages:
    481
    Likes Received:
    1
    Location:
    UK
    You'd have thought that would have been obvious from what I said on the very first page, when I rounded up what was said in the call:

    But going by some of the replies, I guess not ;)
     
  15. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    1
    Location:
    Canada
    You know, that is really quite something. I can only assume, though, that it is because the X800 XT is done on the low-K 0.13 micron process while the Radeon 9800XT is done on the 0.15 micron process. (Was it done in low-K, though?) But a further reduction in the card's power consumption would be attributed to the GDDR3 memory, wouldn't it?
     
  16. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    No. The only low-K part to come out of the R3xx generation was the 9600XT, I believe; it was testing the process for the results we see now in the high end.
     
  17. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    1
    Location:
    Canada
    Yeah, that is what I thought... So what of the GDDR3 reducing the card's power draw? Is that an accurate assessment? I knew the 9600XT was a low-K 0.13 micron part; it was the first.
     
  18. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    8,396
    Likes Received:
    247
    Location:
    Treading Water
    Well, GDDR3 is lower power, but it's also running at a significantly higher frequency: 365 MHz vs 560 MHz (more than 50% higher).
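
    As a very rough first-order illustration, dynamic power scales roughly with f x V^2. In the Python sketch below, only the 365 and 560 MHz clocks come from this thread; the I/O voltages are assumed typical values (about 2.8 V for the older DDR, about 2.0 V for GDDR3), so treat the result as a ballpark, not a measurement:

    # First-order scaling sketch: dynamic power ~ f * V^2. The clocks are
    # from the post; the voltages are assumed typical values, not sourced.
    def rel_power(freq_mhz, volts):
        return freq_mhz * volts ** 2

    old = rel_power(365, 2.8)   # older DDR at an assumed ~2.8 V
    new = rel_power(560, 2.0)   # GDDR3 at an assumed ~2.0 V

    print(new / old)            # ~0.78 under these assumptions: the lower
    # voltage offsets most of the higher clock, so the net memory draw
    # lands in the same ballpark rather than rising 50%.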
     
  19. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    1
    Location:
    Canada
    So the higher clock rate on the memory effectively negates that lower power draw. Thanks for that.
     
  20. thegrommit

    Newcomer

    Joined:
    Oct 5, 2003
    Messages:
    192
    Likes Received:
    1
    The stock speed is 520/560.

    However, Tom's ran their tests at stock speed, and the ExtremeTech number is incorrect; it's unclear what memory clock they got (550 or 560?).
     