My take on the disaster that is the GFFX. (long post)

BOM = Bill Of Materials -- All the parts that are needed.

That is to say, the DDR2 memory makes up half the cost of an NV30 card.
 
martrox said:
Well, yes it can and will. Remember that the GFFX was the first in a series of new products off the same design. A design which was born of a flawed view of the world, a world that nVidia ruled without competition. We have already seen that a GFFX down clocked to the speed of a 9500Pro is barely competitive with it, and just what does that say about the other products that are going to be derived from the NV30? It may be years before nVidia can catch up to ATI. Is this the end of nVidia? Probably not – let's hope not! We need nVidia to be competitive with ATI, as competition only helps us all. But it is the dawn of a new world in the graphics market, one that nVidia no longer rules with impunity.

Oh, good post...! Excellent! I agree in all the particulars.

I think your take on the psychology driving nVidia is right on target and a very astute observation. They are chock full of hubris, and just like any character with a tragic flaw--it may well do them in in the end. nVidia's going to proceed with this abomination, and commit more of its shareholders' money to this black hole--unbelievable really. Frankly, I don't know how grown men could look at this thing and HEAR it---whi-i-i-i-i-i-i-i-i-i-i-i-i-i-i-ine!--and keep a straight face...! If I were an nVidia employee right now I'd either be sobbing hysterically or laughing maniacally. Whew!

I saw a recent interview with an nVidian in which the guy was falling all over himself unconsciously apologizing for the fan, saying things like, "...in normal use the fan doesn't even have to come on." Poor fellow, it was obvious that he was saying these things and quite unconsciously betraying his real feelings about it--for instance that 3D gaming wasn't "normal use" (which it of course is for people who would buy it) and indicating that the fan "not coming on" was the preferred condition to experience with the product. He might as well have bluntly said--"Hey, this is a cool product as long as you don't play 3D games with it!"

This is really quite something. I'm not sure that 3dfx ever blundered to this extent--I mean, even though they cancelled the V5 6K at the end, citing it was just too impractical, they were still shipping the V5 5.5K the whole time and it was a very good product.

You see this phenomenon all the time with large companies that have some bucks to throw away--look at the hundreds of millions of dollars Intel threw after RDRAM and Rambus--thinking it could spend its way into dominating the market--it was proven wrong, and is right now pushing JEDEC to ratify a DDR 400 standard. If the market is bigger than Intel and can resist Intel, resisting nVidia won't be any trick at all.

The funny thing is that nVidia's not even a flea on Intel's back but it sure is acting like it's got the clout to "force" the market into something it doesn't want. I think that like a bull battering its head against a wall nVidia is going to proceed with this thing even if the whole world itself signs a collective telegram: "Earth to nVidia, earth to nVidia! We don't want your hairdryer-cum-leafblower experience! Do something else!" I don't think nVidia is listening; and if the company is listening to anything, all it's hearing is "whi-i-i-i-i-i-i-i-i-i-i-i-i-i-i-ine!"....;) Poor buggering sods...;)
 
RussSchultz said:
So the only people interested in DDR2 are NVIDIA?

Not router manufacturers like Cisco and Lucent. Not high end server manufacturers. Nobody?

You keep calling it "NV30 memory subsystem". DDR2 is simply a high speed memory, not designed specifically for the NV30.

For the most part, no. The only current customers for the super high speed DDR2 & DDR parts that Samsung has out are Nvidia, ATI, and whoever else is left. That's why you'll find it in the 'Memory->Graphics Memory' section of their web site. It's also why Infineon lists their high end DDR parts under 'Specialty DRAMs->Graphics RAM'.

Router makers generally don't use DDR because of the latency involved in page misses. They tend to prefer a lower overall bandwidth combined with a fixed, predictable latency. High-end server manufacturers wouldn't use the high speed bins of DDR2 because of the limitations in the implementation and the difficulty of scaling up the total amount of memory present.
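
To put rough numbers on that trade-off, here's a minimal back-of-the-envelope sketch in Python (all timing figures are hypothetical, purely for illustration) of why a low page-hit rate hurts DDR's average and worst-case latency, while a fixed-latency part stays flat:

```python
# Back-of-the-envelope model: why predictable latency can matter more
# than raw bandwidth. All timing figures below are hypothetical.

def ddr_avg_latency_ns(page_hit_rate, hit_ns=20.0, miss_ns=60.0):
    """Average DDR access latency: fast on a page hit, slow on a
    page miss (precharge + activate + read)."""
    return page_hit_rate * hit_ns + (1.0 - page_hit_rate) * miss_ns

# Random lookups (e.g. router tables) mean a low page-hit rate, so both
# the average and the worst case blow up; a fixed-latency part does not.
for hit_rate in (0.9, 0.5, 0.1):
    avg = ddr_avg_latency_ns(hit_rate)
    print(f"page-hit rate {hit_rate:.0%}: avg {avg:.0f} ns, worst case 60 ns")
```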
 
Sorry to sound a bit off-topic, but I just remembered this quote and thought it was particularly entertaining now :) (not that it wasn't when it was first published..)

Originally from here

Essentially all modes of antialiasing are available at all resolutions without any performance hit. Greatly improved image quality, with no drop in frame rate!

:LOL:
 
sumdumyunguy said:
Because why should Samsung, as the seller, do nvidia, the buyer, any favors? Samsung should realize that they have almost complete pricing power in the short run. If Samsung is not charging nvidia (up front too, by the way) an arm, leg, & the other arm, then Samsung shareholders should bring suit and/or heads should roll. It's all about maximizing profit. In other words, sell the least you can sell for the most profit.
Well, I think you should go back to your economics lessons ;). The point you make is maximising profit in the short run, while Russ's point is maximising profit in the long run. And trying to get as much as possible in the short run is not maximising profit, it's mismanagement!!!
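
To put toy numbers on that short-run vs long-run distinction (every figure below is made up, purely to illustrate the mechanism):

```python
# Toy illustration of short-run vs long-run profit maximisation.
# Every number here is hypothetical, purely for the sake of the argument.

# Short run: gouge on a single design win (one chip generation).
sr_price, sr_units = 40.0, 1_000_000            # $/chip, chips sold
sr_profit = sr_price * sr_units

# Long run: a lower price keeps the buyer on your memory for several
# generations instead of driving them to a competitor.
lr_price, lr_units_per_gen, generations = 25.0, 1_000_000, 4
lr_profit = lr_price * lr_units_per_gen * generations

print(f"short-run gouging : ${sr_profit:,.0f}")   # $40,000,000
print(f"long-run contract : ${lr_profit:,.0f}")   # $100,000,000
```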

Actually, as CMKRNL pointed out, the price of DDR2 seems to be 50% more expensive than its 325MHz counterpart. But that's public information; we don't know how much ATI pays for its RAM, or what Nvidia pays.

There are many ways for Samsung and Nvidia to reach an agreement that is cost-effective and profitable for both parties. Which means: DDR2 is not necessarily more expensive than DDR.

We don't know the state of production, of demand, the expectations for the future, or the global agreement between Samsung and Nvidia, and I think it's far-fetched to state as fact something you can't prove--and can't even know, as long as no-one knows the real contract between Nvidia and Samsung.

PS: Sorry, I'm doing my PhD on economics and finance ATM ;) and hope to finish it this year... or the next one :D
 
Samsung recently announced GDDR3 production... so the long run of GDDR2 may be hampered...

Here's the tidbit from VR-Zone...
Samsung announced the development of what the company calls the world's fastest SRAM, based on DDR3 SRAM technology. Reportedly built around 90-nm (0.09-micron) process technology, the 72-Mbit DDR3 SRAM from Samsung was manufactured using conventional 248-nm krypton-fluoride (KrF) lithography tools. Targeted for high-end servers and workstations, the new DDR3 SRAM operates at speeds of 1.5 gigabits per second and requires only 1.2 volts for low-power consumption. Based on a "breakthrough" cell technology, the cell measures 0.79 square microns. The company is already sampling a 32-Mbit DDR3 SRAM. Mass production of the 72-Mbit DDR3 SRAM is expected in the second half of 2003.
 
Evildeus said:
...The point you make is maximising profit in the short run, while Russ's point is maximising profit in the long run. And trying to get as much as possible in the short run is not maximising profit, it's mismanagement!!!...

What is SR? What is LR? (Given this static context.)

PS: Sorry, I'm doing my PhD on economics and finance ATM ;) and hope to finish it this year... or the next one :D

Cool. Which school, thesis topic? PM me if you prefer.
 
stevem said:
Evildeus said:
...The point you make is maximising profit in the short run, while Russ's point is maximising profit in the long run. And trying to get as much as possible in the short run is not maximising profit, it's mismanagement!!!...

What is SR? What is LR? (Given this static context.)

PS: Sorry, I'm doing my PhD on economics and finance ATM ;) and hope to finish it this year... or the next one :D

Cool. Which school, thesis topic? PM me if you prefer.
SR: I would say NV30.
LR: The length of the contract with Nvidia. Could be 5, 10, 25, etc. years :) More precisely, I would say as long as Nvidia uses DDR2.

Well, it's public (in France at least ;)), so:
University: University Paris IX Dauphine
Subject: Money, business cycles and the free banking school in Austrian economic thought ;) (not the exact title, but not far from it :D)
 
Sorry, wasn't meant to be a pop quiz. More like a commentary on institutional use of SR point estimates as proxies for dynamics...

Good luck in successfully completing your studies. I played around with business cycle harmonics in equities (derivatives) a while ago. Mainly Black-Scholes vs stochastic option pricing models, etc.
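
For anyone curious, the closed-form Black-Scholes price for a European call is short enough to sketch in a few lines of Python (standard textbook formula; the example inputs at the bottom are arbitrary):

```python
# Minimal Black-Scholes European call pricer (standard textbook formula).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call.
    S: spot, K: strike, T: years to expiry, r: risk-free rate,
    sigma: annualised volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Example: spot 100, strike 100, 1 year, 5% rate, 20% vol -> ~10.45
print(f"{bs_call(100, 100, 1.0, 0.05, 0.20):.2f}")
```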

Edit: Mods - Sorry for this OT post.
 
gokickrocks,

That's DDR3 SRAM that Samsung is producing - don't confuse it with the DRAM used on graphics cards. They are entirely different beasts and have different uses.

The only mainstream graphics device currently available which uses SRAM is the GameCube, I believe?
 
stevem said:
Sorry, wasn't meant to be a pop quiz. More like a commentary on institutional use of SR point estimates as proxies for dynamics...

Good luck in successfully completing your studies. I played around with business cycle harmonics in equities (derivatives) a while ago. Mainly Black-Scholes vs stochastic option pricing models, etc.

Edit: Mods - Sorry for this OT post.
Ok :)

Sorry mods back on topic :)
 
A more interesting aspect of our little debate is that you believe nvidia will not accept 3 to 5% margins at rollout. Are you not aware of the 'loss leader' marketing concept? Whereby the seller will take little or no margin, or sometimes even a slight loss, in order to (mostly) entice you to purchase said discounted product now in the hopes of higher-margin purchases later, & also (what I think in this case) to gain marketshare. AMD has been successful enough at this to be a valid competitor against a behemoth such as Intel. I see no reason (at this point in time) why nvidia would be immune to such a strategy vis-a-vis other graphics board distributors.

That was me... and yes, I am aware of this concept you talk so highly of, and I am also aware of the fact that NVIDIA will not take that route.

So unless you can back this speculation up, I suggest you drop it... and I will drop my speculation of NVIDIA not accepting 0-5% margins.

Your whole argument is flawed: you see no reason, but I see several. NVIDIA doesn't need to take marketshare that urgently, for one; they are the market leaders.
 
There's also the concern of not losing *as much* money, though. If they didn't release it at 500/500 for $399, and instead released it as a slower chip while charging as much or more, they might not sell any. Then they would be out their entire R&D budget with no possibility of recouping those expenses. At least by selling the chips/boards/etc. at competitive performance for very little profit, they would be more likely to sell them. They'd still be losing money in the long run, just not as much as if they tried to sell cards with a good profit margin that nobody buys. It would also help preserve their image, which is crucial to NVIDIA's success right now.
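
Roughly, the arithmetic looks like this (all figures below are hypothetical): the R&D spend is sunk either way, so any scenario with positive contribution margin beats shelving the part.

```python
# Toy sunk-cost arithmetic: sell at a thin margin vs. don't sell at all.
# All figures are hypothetical.

rnd_sunk = 150_000_000          # R&D already spent either way ($)

def net(units, price, unit_cost):
    """Net position after sunk R&D for a given pricing scenario."""
    return units * (price - unit_cost) - rnd_sunk

thin_margin = net(units=300_000, price=399.0, unit_cost=385.0)   # sells
fat_margin  = net(units=10_000,  price=499.0, unit_cost=385.0)   # barely sells
no_release  = net(units=0,       price=0.0,   unit_cost=0.0)

print(f"thin margin : ${thin_margin:,.0f}")
print(f"fat margin  : ${fat_margin:,.0f}")
print(f"no release  : ${no_release:,.0f}")
# Every scenario loses money, but the thin-margin one loses least --
# which is the point above.
```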
 
NVIDIA have already informed AIB partners that they would like them not to mark up so highly... but I think the partners will ignore that, which is why you are seeing 650 euros being quoted...
 
NVIDIA have said pre-order prices are $399. Why specify only pre-order prices? Are they subsidizing the pre-orders? Will prices actually go up after the pre-orders?
 
Since the FX architecture needs developer support, I think NVidia might be spending a little money subsidizing boards so that the NV30 appears to be a viable product and can capture some developer attention.
 
Evildeus said:
sumdumyunguy said:
Because why should Samsung, as the seller, do nvidia, the buyer, any favors? Samsung should realize that they have almost complete pricing power in the short run. If Samsung is not charging nvidia (up front too, by the way) an arm, leg, & the other arm, then Samsung shareholders should bring suit and/or heads should roll. It's all about maximizing profit. In other words, sell the least you can sell for the most profit.
Well, I think you should go back to your economics lessons ;). The point you make is maximising profit in the short run, while Russ's point is maximising profit in the long run. And trying to get as much as possible in the short run is not maximising profit, it's mismanagement!!!

Actually, as CMKRNL pointed out, the price of DDR2 seems to be 50% more expensive than its 325MHz counterpart. But that's public information; we don't know how much ATI pays for its RAM, or what Nvidia pays.

There are many ways for Samsung and Nvidia to reach an agreement that is cost-effective and profitable for both parties. Which means: DDR2 is not necessarily more expensive than DDR.

We don't know the state of production, of demand, the expectations for the future, or the global agreement between Samsung and Nvidia, and I think it's far-fetched to state as fact something you can't prove--and can't even know, as long as no-one knows the real contract between Nvidia and Samsung.

PS: Sorry, I'm doing my PhD on economics and finance ATM ;) and hope to finish it this year... or the next one :D

Yes, Evildeus, you are correct... in rational markets. The recent dot-com bust & current market status indicate that the markets are anything but rational.
 
Ilfirin said:
Sorry to sound a bit off-topic, but I just remembered this quote and thought it was particularly entertaining now :) (not that it wasn't when it was first published..)

Originally from here

Essentially all modes of antialiasing are available at all resolutions without any performance hit. Greatly improved image quality, with no drop in frame rate!

:LOL:

They're perfectly correct. They were benchmarking Quake I and Starcraft. All available resolutions (up to the very high 640*480!), all AA modes... no performance hit! Woohoo!
 