Tech-Report blasts GeForce FX

DemoCoder said:
Nvidia needed .13um to squeeze 125M transistors into the process. Their die simply may not have been possible at .15um. Thus, .13um gave them a side benefit of increased clock scaling potential.

That seems like an odd way of looking at it. One could just as easily argue the equally odd view that nVidia needed .13u to meet their clock speed goals, and the ability to pack more transistors on the die was a side benefit.

Absurd, right? Singling out any one reason for moving to a .13u process will always sound absurd. In reality, they did it for a variety of reasons, and their clock speed target was certainly one of them. It allowed them to build the "package" they desired, and that "package" had a cost.
 
BoddoZerg said:
$200 more than 9700pro? :eek:

People always exaggerate the price of nVidia cards... when GeForce3 was launched, people quoted numbers as extravagant as $580; by the time it hit store shelves, prices were well within the reasonable sub-$400 range.

I seriously doubt that NV30 will end up costing $150-$200 more than R300.

Actually, I believe nVidia originally was trying to sell them for around $500+, but it didn't work out because of all the outrage over the price (either that, or it was just rumored). The truth is, nVidia has only itself to blame for this "exaggeration", because the GF2 Ultra DID sell for over $500. If the GF FX comes out at $500, it won't even be their first $500 card, which is pretty damn sad IMO. I've always considered the R9700 kind of expensive, but even its MSRP of $400 looks like a bargain compared to N-rip-off-ya pricing.

Fortunately, since they're no longer head and shoulders above the rest, Nvidia can't afford to charge $500 for their parts. If they do, they'll end up losing market share, even if the card is the best. You can only charge a premium price for a premium product, not one that's marginally better.
 
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.
 
DemoCoder said:
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.
Not for everybody on this board ;)
 
DemoCoder said:
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.

Isn't that what I just said? ;)
 
DemoCoder said:
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.

Not that simple... they do need to make money off it as well... they aren't gonna give 'em away. But then again, it's Xmas time, and if the market can't bear it, they won't mind spending $300 to $400 million on R&D and then giving the fruits of their labour away for a dollar.

;)
 
No, they don't necessarily need to make money off the NV30 GeForce FX specifically. They could make the majority of their money off the NV31 or a cut-down equivalent and subsidize the high end by accepting very low margins. Also, they will make a huge chunk of change off the DCC/workstation market, which seems to delight in paying more for a card.

NVidia could very well accept losses until yields improve to maintain their market position.
 
DemoCoder said:
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.

Absolutely correct. However, I don't doubt that nVidia will price the NV30 at a retail price higher than the 9700's, if for no other reason than that they feel their fans will pay more for it, and they are probably right. UNLESS it's a real dog of a card... and I don't believe that for a moment. Also, as the "new" card on the block, it will be a while till prices drop...

As far as which process size is better... well, at the moment, it doesn't matter. Right now, the Radeon 9700 is by far the best card you can buy. This may change in a few months... no, let me say it WILL change. No card stays on top for very long! We all know that everyone will move to smaller processes eventually, just for cost reasons. But this is an old argument: mature process vs. pushing the envelope. Three years ago, pushing the envelope was the right thing to do. Four months ago, a mature process was the right thing to do. Face it, ATI was right on this one.
 
DemoCoder said:
NVidia will sell their cards for whatever price the market will bear. If the market won't bear a $500 NV30, then they will sell at $400. If no one buys @ $400 vs a $300 R300, then they will sell at $300. It's that simple.

Technically... nV does not sell video cards to us PC consumers :) Sorry, it's a short week and I'm trying to have fun as one of the few who had to work this short week... ggggrrr

I think we will see the GFX cards priced slightly higher than the R9700's MSRP, to match the GFX's slightly higher performance.
 
DemoCoder said:
No, they don't necessarily need to make money off the NV30 GeForce FX specifically. They could make the majority of their money off the NV31 or a cut-down equivalent and subsidize the high end by accepting very low margins. Also, they will make a huge chunk of change off the DCC/workstation market, which seems to delight in paying more for a card.

NVidia could very well accept losses until yields improve to maintain their market position.

nVidia only sells chips. You can't just assume that a card can retail for $500 if the market accepts it, or $300 if it doesn't.

Can nVidia reduce chip prices by $200? nVidia may be willing to take a loss on the GF-FX Ultra in order to gain recognition and subsidize costs for their lower lines, but board manufacturers won't be.

There is some minimum cost associated with putting the board together. I don't know what that is, but it establishes some bottom-line price that the board can sell for. The price will only drop below that point after the market has shifted to make the GF-FX a mid- or low-range product, and it is no longer possible to make a profit given the initial cost of manufacturing the boards (essentially leftover stock).

Board makers will only enter into the fabrication and marketing of a product if they feel it will be profitable. It will be hard to convince ASUS for example that they should sell the GF-FX at or near cost in order to increase their likelihood of making greater profits on lower end products. In fact, were nV to pressure them into this by manipulating contracts, it would probably break some law somewhere.

If the board makers are forced to sell at a loss due to market forces, they will either drop nV like a skunky beer, or go bankrupt.
 
But the major costs are the core and the RAM, and nVidia sells both to the OEMs. nVidia could sell the core and DDR2 to the OEM at or below cost. The main remaining cost for the OEM would be assembly, which shouldn't be much different than for any other card.
 
NVIDIA isn't Microsoft; they can't sell NV30 chips below cost and still expect to be in business the next day. They've got hundreds of millions of dollars in R&D they need to recover from this line of products, and it's not going to be recovered from budget chips alone, especially if they're losing more money every time they sell a high performance chip.
 
Does anyone know if the exotic cooling solution is required by the memory-- in other words, would a more conventional heatsink work if the GPU were clocked slower with the memory at 500 MHz, or does the memory require very aggressive cooling on its own?

I don't think nVidia would sell NV30s at a loss. The chip's marketing function (as a performance-leading chip that is the flagship of the NV3x line) is accomplished even if they only sell a few of them, so there is no reason to price them below cost. If they can't recoup development expenses by selling them at consumer-level prices, there is no point throwing good money after bad and subsidizing them to maintain market share in the relatively small high-end market.
 
I don't agree. BOM costs for initial production boards often go beyond the OEM/AIB cost until yields are under control. Sometimes this can even take an extra spin of the ASIC. The real money maker for nVidia is not the NV30, but the NV31 and NV31M variants. They may initially lose money on NV30, but that's not important because the high end is strictly for mindshare. What will dictate the success/failure of this product line is penetration into the consumer lineup with NV31[M]. 4x1, 128bit bus, DX9 shader capable. Main competition is RV350/M10.
 
Crusher said:
NVIDIA isn't Microsoft; they can't sell NV30 chips below cost and still expect to be in business the next day. They've got hundreds of millions of dollars in R&D they need to recover from this line of products, and it's not going to be recovered from budget chips alone, especially if they're losing more money every time they sell a high performance chip.

Yes, it's commonly understood that software companies have little in common with hardware companies... especially because with software, once you recover your development costs, it's all gross profit from that point forward--pretty difficult to sell software at a loss even if you wanted to (unless it's brand-new software that hasn't yet sold enough to recoup development costs). But otherwise, good point...

...if nVidia's OEMs can't make a profit on nVidia's reference designs, either nVidia lowers its prices so that a profit for the OEM becomes possible, or the OEMs stop selling the product and move to something else, and nVidia gets nothing. This is why it's so terribly important that nVidia be cognizant of the competition, but also seek to best the competition whenever possible--being able to do that in the past is what allowed nVidia to pass ATI in company size. Since ATI appears to have awakened and has effectively one-upped nVidia in the second half of '02, and likely for much of the first calendar quarter of '03 at least, we look to have some exciting competition on our hands as nVidia battles to hold on to what it's got and ATI battles to get back what it lost.
 
While the cost of copying and distributing software falls to near zero, software certainly does still cost money even after you've recouped the development costs. There's this little thing called customer support.

Ask Derek Smart how much time and money was spent on developing all those BC3k patches. :)
 
CMKRNL said:
...What will dictate the success/failure of this product line is penetration into the consumer lineup with NV31[M]. 4x1, 128bit bus, DX9 shader capable. Main competition is RV350/M10.

If that's the case, I don't know whether to be disappointed in the delay of the NV31 or happy that the RV350 0.13 LP ASIC is coming relatively quickly... I anticipated NV31 SKU competition with the 9500/Pro series products. The mobile revisions will indeed be most interesting. The price squeeze will definitely be on...
 
Does anyone know the details of ATI R300's 96-bit precision format?

96 bits is a 4-component vector, so:

24 + 24 + 24 + 24 = 96 bits

For each 24-bit component, my guess is:

sign bit: 1 bit
exponent bits: 8 bits?
mantissa bits: 15 bits?
 
My guess would be

sign bit: 1 bit
exponent bits: 7 bits
mantissa bits: 16 bits

since that won't cause 16-bit unsigned integer components to lose accuracy.
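
For what it's worth, here's a quick C sketch of that accuracy argument. The rule of thumb: with n significand bits, every unsigned integer up to 2^n is exactly representable. The layouts in the comments are just the guesses from this thread, not anything ATI has published:

#include <stdio.h>

/* With sig_bits significand bits, every unsigned integer up to
   2^sig_bits is exactly representable; above that, the spacing
   between adjacent representable values grows past 1. */
static unsigned long max_exact_uint(int sig_bits)
{
    return 1UL << sig_bits;
}

int main(void)
{
    /* 16 stored mantissa bits, no implicit leading 1: 16-bit significand */
    printf("m16 explicit: exact up to %lu\n", max_exact_uint(16));
    /* 15 stored bits + implicit leading 1: also a 16-bit significand */
    printf("m15 implicit: exact up to %lu\n", max_exact_uint(16));
    /* 16 stored bits + implicit leading 1: 17-bit significand */
    printf("m16 implicit: exact up to %lu\n", max_exact_uint(17));
    return 0;
}

Either way, all 16-bit unsigned values (0..65535) come through exactly, since 65535 < 2^16; the extra mantissa bit just buys more headroom if there's an implicit leading 1.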
 
I would guess:

sign: 1 bit
exponent: 8 bits (or else IEEE 754 single-precision numbers can cause overflow)
mantissa: 16 bits (15 stored)

where the most significant bit of the mantissa is never stored explicitly, because we know that its value is always 1. I seem to remember actually reading these numbers somewhere, but cannot figure out where ...
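
To make that layout concrete, here's a small C sketch of how one 24-bit component would decode under this guess (1 sign bit, 8 exponent bits with an IEEE-style bias of 127, 15 stored mantissa bits plus the implicit leading 1). Every field width and the bias here are my assumptions, not anything from ATI documentation:

#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Hypothetical FP24 layout (assumed, not a published spec):
   bit  23     : sign
   bits 22..15 : exponent, 8 bits, bias 127 (as in IEEE 754 single)
   bits 14..0  : mantissa, 15 stored bits; the leading 1 is implicit */
double fp24_decode(uint32_t bits)
{
    int      sign = (bits >> 23) & 0x1;
    int      exp  = (bits >> 15) & 0xFF;
    uint32_t man  =  bits        & 0x7FFF;
    double   s    = sign ? -1.0 : 1.0;

    if (exp == 0)   /* denormal: no implicit leading 1 */
        return s * ldexp((double)man, 1 - 127 - 15);

    /* (exp == 255 would be Inf/NaN in IEEE style; skipped here) */

    /* normal number: (1.mantissa) * 2^(exp - 127) */
    return s * ldexp(1.0 + (double)man / 32768.0, exp - 127); /* 2^15 = 32768 */
}

int main(void)
{
    printf("%g\n", fp24_decode(127u << 15));            /* prints 1    */
    printf("%g\n", fp24_decode((126u << 15) | 0x4000)); /* prints 0.75 */
    return 0;
}

The overflow argument above falls out of the same arithmetic: with only a 7-bit exponent (bias 63), the largest representable magnitude would be around 2^64, so a lot of perfectly valid IEEE 754 single-precision values (which range up to about 2^128) couldn't be converted without overflowing.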
 