Profit of NV40 parts

reever said:
I can't remember anybody from ATI ever saying anything bad, or really anything at all, about NV's products, but I have heard Jen say things about ATI's products multiple times, especially during the FX hype days.

And was it Jen or someone else at NV who said this board is filled with morons?

Dave Orton said:
We were expecting Nvidia to push ahead. But I would say that there were surprises in some areas. The die size is bigger than we thought and the power consumption is higher than we expected. They also followed up architecturally to 16 pipes. We expected them to end up expanding internal processing instead. I am also surprised that they have not been able to reach a higher core frequency....

Who is he talking about there? It seems to be none other than Nvidia.
http://www.tomshardware.com/hardnews/20040507_060001.html
 
Sabastian said:
So the higher clock rate on the memory effectively negates that lower power draw. Thanks for that.

Well, I don't know that it does; I'm merely speculating. It was reported that GDDR3 uses half the power at up to double the frequency. I doubt it does both of those at the same time.
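A rough way to see why both claims probably don't hold at once: dynamic power scales roughly with C·V²·f, so the lower signalling voltage is what buys the power saving, and raising the clock gives most of it back. This is only a sketch, and the nominal voltages (about 2.5 V for older graphics DDR, about 1.8 V for GDDR3) are assumptions on my part:

Code:
# Back-of-envelope sketch: dynamic power ~ C * V^2 * f.
# The ~2.5 V (older graphics DDR) and ~1.8 V (GDDR3) figures are assumed
# nominal values; real memory power also has static and termination parts.

def relative_power(v_new, v_old, freq_ratio):
    """Dynamic power of the new memory relative to the old, same capacitance."""
    return (v_new / v_old) ** 2 * freq_ratio

print(relative_power(1.8, 2.5, freq_ratio=1.0))  # ~0.52 -> "half the power" at the same clock
print(relative_power(1.8, 2.5, freq_ratio=2.0))  # ~1.04 -> roughly back to parity at double the clock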
 
Sxotty said:
Dave Orton said:
We were expecting Nvidia to push ahead. But I would say that there were surprises in some areas. The die size is bigger than we thought and the power consumption is higher than we expected. They also followed up architecturally to 16 pipes. We expected them to end up expanding internal processing instead. I am also surprised that they have not been able to reach a higher core frequency....

Who is he talking about there? It seems to be none other than Nvidia.
http://www.tomshardware.com/hardnews/20040507_060001.html

The die size is bigger than we thought and the power consumption is higher than we expected.
For most people this was true.


I am also surprised that they have not been able to reach a higher core frequency
After seeing the 5800 and 5900 playing with the 500 MHz mark, this was kind of expected.

The only FUD-like comment was this:
We expected them to end up expanding internal processing instead.
Since NV obviously improved shader performance, I guess it's a given fact.
 
Power Connector

I've seen the guts of plenty of power supplies, and I seriously doubt that the 6800U can tell the difference between two discrete 4-pin Molex connectors and two that have been daisy-chained together. My guess (and it's only a guess) is that the 6800 would probably not glitch if you connected both power connectors from a single power supply lead.
 
BetrayerX said:
The only FUD-like comment was this:
We expected them to end up expanding internal processing instead.
Since NV obviously improved shader performance, I guess it's a given fact.
I think you misunderstood Dave Orton there. This just means ATI expected Nvidia to have fewer, more complex pixel pipelines (something like 8 uber-pipes with maybe 2 texturing units each, and lots of shader ALUs) instead of 16 "simple" pipes. That would have been more of a traditional evolution of Nvidia's design; NV40 is quite a radical change in design.
 
useless_engineer said:
The maximum allowable current for common PSU wires is:

20 gauge - 11 amps
18 gauge - 16 amps
16 gauge - 22 amps
Where did you get these numbers?

Most PSUs that I've used have either 20 gauge or 22 gauge wire, but let's go with 20 gauge. Here they say that 1.5 amps is the max for 20 gauge wire, although they use a conservative 700 circular mils per amp rule. In fact, your numbers match the "Maximum amps for chassis wiring" values on that chart, which the page says is for wiring in air.

Generally, the lowest rule I've seen is 300 circular mils per amp, with most places saying 500 is the minimum (although one guy on a forum said 200 is an absolute minimum). At 300 circular mils per amp, 20 gauge wire maxes out at about 3.4 amps.

In other words, most electricians suggest keeping current below 3.5 amps in a 20 gauge wire.

We see here that the 12V line can have 5 amps running through it. When games become more intense on the GPU, with more dual-issued pixel shaders, you could see an even higher current draw.
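For what it's worth, here is a minimal sketch of the circular-mils-per-amp arithmetic behind those figures (the AWG areas are standard values; the 700/500/300 rules are the ones quoted above):

Code:
# Max current for 20 AWG under the circular-mils-per-amp rules quoted above.
CIRCULAR_MILS = {22: 640, 20: 1020, 18: 1620, 16: 2580}  # AWG -> area in circular mils

def max_current(awg, cmil_per_amp):
    """Maximum current (A) for a gauge under a given circular-mils-per-amp rule."""
    return CIRCULAR_MILS[awg] / cmil_per_amp

for rule in (700, 500, 300):  # conservative -> aggressive rules
    print(f"20 AWG @ {rule} cmil/A: {max_current(20, rule):.1f} A")

# Prints roughly 1.5 A, 2.0 A and 3.4 A -- all below the ~5 A the 12 V lead
# reportedly carries, which is the point being made here.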

The second connector is not as "optional" as NVIDIA is saying, and you'd better use two separate power supply leads, as opposed to what HardOCP did, if you want to be safe. Hardcore gamers can play for hours at a time.
 
I was simply responding to someone who stated that ATI did not discuss their competitor, and that since NV did, they were evil.
 
6800 Power supply woes

Although there are a lot of recommendations for the maximum current a wire can safely handle, most power supplies I have seen, even back to the AT era, have used at least 20 ga wire, and most of the 300 W-plus models out there are equipped with 18 ga wire. If yours isn't, you should probably be looking for a new p/s anyway. Recommendations are just that, and experience has shown that 18 ga wire can take a lot of juice. I've seen plenty of Peltier setups that drew more than 70 watts off the 12 volt line without issue.

This is going to turn out to be a distinction between cheap power supplies and good ones, regardless of the per-rail output specifications that are often quoted. I did a test yesterday with my Antec 480W power supply (a very nice one, I might add; I reviewed it on my site) and found that, with a multimeter, you can get continuity between hot leads from separate power supply output cables (turn the p/s off first, please). In other words, the hots all meet up inside the power supply. No big revelation here; it has been known for some time. So the only remaining question is whether a single power supply lead can deliver the juice without a significant voltage drop occurring.

As long as the power supply can deliver the juice, and the wiring is done to decent tolerances, I don't see a problem with daisy chaining two connectors together at the power levels we are discussing here.
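To put a rough number on the voltage-drop question, here is a back-of-envelope sketch. The ~6 A draw on the 12 V rail and the 2 ft lead length are assumptions on my part; the ohms-per-foot figures are standard copper-wire values:

Code:
# Rough voltage drop across a single daisy-chained PSU lead.
OHMS_PER_FT = {20: 0.01015, 18: 0.006385}  # AWG -> resistance per foot of copper

def voltage_drop(awg, current_a, one_way_ft):
    """Drop across the 12 V conductor plus its ground return."""
    return current_a * OHMS_PER_FT[awg] * 2 * one_way_ft

for awg in (20, 18):
    v = voltage_drop(awg, current_a=6.0, one_way_ft=2.0)  # ~70 W on the 12 V rail, assumed
    print(f"{awg} AWG: {v:.2f} V drop ({100 * v / 12:.1f}% of 12 V)")

# Roughly 0.24 V (20 AWG) or 0.15 V (18 AWG) -- small, which is why daisy
# chaining looks fine as long as the supply itself can source the current.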

Now, I don't actually have a card to test. I'm not rich :) My somewhat educated guess is that the power distribution channels within the card itself are "expecting" that both sockets are receiving 12 volts. The card can't possibly know whether you have daisy-chained the cables together, and as long as it is getting a steady stream of electrons from the power supply, everything *should* be O.K.
 
Slappi said:
PaulS said:
Just as I'm listening:

- Margins to improve throughout the year, far beyond NV3x levels
- Profits to be flat to slightly up next quarter, with XBox finally rising after 2 declining quarters in a row
- NV4x biggest leap NV have ever taken
- Shader 3.0 "dramatically extends programmability"
- A dozen games and big engines are going to support 3.0
- 6800U designed specifically for enthusiasts, has lots of "frequency headroom", and comes with an optional second power connector, required only for overclocking :!:
- The GPU is affected heavily by the compiler, which is shipping at 1.0 and is very early. They expect big performance boosts in later drivers
- Quadro FX4000 is showing 38x performance in a specific app (didn't pick up which one)
- 3% Market share gain came from mid to low end boards, as opposed to the enthusiast segment.
- Shader 3.0, Superscalar, and programmable video processor made up the transistor count over and above ATi.
- 40% more transistors, yet only 9% bigger die. Costs 10-15% lower, with better process capacity.
- Redundancy and re-configurability in the NV4x, and "immunity from defects", to achieve great yields moving forward.
- Multiple NV4x chips taped out, a couple of them are back from an unspecified fab, and by Q4 everything will be based on NV4x. They refused to be any more specific than "by the end of the year".
- Talked about R420 being effectively a 3-year-old architecture, and said customers will pick NV4x over it based on features such as Shader 3.0, since theirs is the only true next-gen GPU
- Game Developers "clamouring" for their Shader 3.0 and FP Filtering tech, building lots of boards specifically for devs. Easier to write, with better performance. Conditional branches are heavily touted.
- ATi have hand-picked the boards sent to reviewers, each at a different speed (?), potentially confusing consumers. NVIDIA don't do this, and their boards are much better overclockers than the XT (due to the hand picking mentioned previously, apparently).
- PS2.0 vs PS3.0 will be a "glaring" difference, but HSI vs. native PCI-E is identical (including bi-directional bandwidth). Competitors are doing much the same, so it's costing them money as well.
- Competitors are going to hurt badly with PCI-E, since they'll have to build 4 or 5 extra boards.
- NV4x can operate in 3 modes: AGP8x, PCI-E, or PCI-E with HSI.
- Started arguing with one caller, who pushed them about whether PS3.0 will actually make any difference NOW. They went back and forth several times, and it got reasonably heated and amusing. They said they'd call the guy back after the conference call :LOL:


That guy is a piece of work.

I still own a fair number of NVDA shares, and after hearing that load of BS I am thinking of selling.

The guy lies. He is a liar. I have really lost faith in this company.

What a dum-dum. :D
 
Scarlet said:
Evildeus said:
The part comparing the NV die to the R420 is interesting. Can anyone give some more insight on this?
Sure. If you look at the various Intel events of the last several months, it's pretty clear ATI has a true native PCI-E interface and nV does not.

I will say it out loud:
Jen-Hsun Huang is an honorless bald-faced liar who will state anything to get what he wants.

What he said about the ATI product is materially false and was made in front of analysts, not an off-hand comment in a materially non-relevant setting. If ATI had half a mind to, they could make a legal issue of it. I believe the legal term is commercial misrepresentation - http://www.cojk.com/resources/article-detail.php?id=5

He didn't get what he wanted with NV40, which was uncontested performance leadership, and now he is resorting to outright fabrication.

Who is smoking hallucinogenic substances, Mr. Huang?

You didn't, by any chance, come from Driverheaven or Rage3D, did you? :LOL:
 