My take on the disaster that is the GFFX. (long post)

Althornin said:
Tahir said:
If Peltiers aren't any good then LN2 cooling should have been considered.
Well, personally, I think the card should just have used phase-change cooling.
Come with its own mini-VapoChill unit.

No, I must disagree. The best method would be to explicitly state that FX Flow can only be used on the dark side of the moon. This has the big advantage of producing zero noise, and the card will remain dust free, since there is no atmosphere to carry sound or stir up any moon dust.

Now some of you may think that in a vacuum there is no noise anyway but Star Trek and Star Wars have dispelled that myth.

Anyway, ignore me, please do carry on fellows.
 
MuFu said:
It really *has* to be considerably faster than R350 and I'm sure it will be - in fact, if nVidia has its wicked way with developers then I can see this nightmare scenario developing in which nVidia-centric games really steal ATi's thunder.

MuFu.

People always say this, and yet I feel it is total nonsense. Unless you play Code Creatures, ATi cards perform just as well in games as Nvidia's. The only exception, where Nvidia may have had the lead, was back when they were the only ones who supported T&L.

Uttar said:
The R400, based on current rumors, would be godly. But it would surprise me if ATI could pull it off on 0.13; it sounds a lot more like something for 0.09.

People said the same thing about the R300, so I wouldn't make any assumptions. ATi are obviously better at creating power saving/efficient designs, no doubt due to their experience in the mobile market (and Nvidia's offerings there are still quite pathetic, really).

Fuz said:
This whole thing about the NV35 coming out soon and putting Nv back in line just doesn't make sense. I am not saying they are not capable, but it just doesn't make good business sense.

Nv have spent a lot of money developing the NV3x core, and they need to recoup that cost. They need to sell as many NV30's as possible.

This is another thing people always say that has no logical basis. The NV35 is a derivative of NV30 technology, so any money that went into NV30 R&D can equally be seen as having gone into the NV35. NV35 sales offset that initial investment just as much as NV30 sales would (although there may be some minor additional cost overhead).

Also, the bottom line it all comes down to is whether profits across the board are greater than expenditure; if they are, you are doing fine. The whole "gotta make up the R&D costs with sales" idea is a nice general way of looking at things so that you don't start hemorrhaging money on a single project. But if a project ends up going bust, then you're better off just swallowing the loss and moving on to something newer and better.

As far as I can tell, Nvidia just threw the NV30 on the market to make up a little money and regain the performance crown. They'll probably pump the NV35 out ASAP to really recoup the project costs. All the same, the earliest I see the NV35 is six months from now; it would be really strange to have it come out just 2-3 months after the NV30. Then again, if it has better margins, runs cooler, performs better and is all around more competitive, then it may happen.

The same applies to the current favourite of speculation, the R400. Some people here seem to be expecting it as early as July; well, I don't think so! If the R350 hits in late March/April as currently expected, then you can also say goodbye to the possibility of seeing the R400 before late fall...

This is a no brainer, IMO. New ATi architectures always come out in the fall. :rolleyes: ;)
 
martrox said:
While there are excellent posts here disagreeing with my conclusion about the dire straits that nVidia is in, how do you guys feel about my statements on the causes of this? Did nVidia get blindsided by the R300? Did they push the GFFX beyond its intended speeds to make up for the lack of performance as compared to the R300?

When I first saw R300 vs GF4 numbers in AA+AF, I really wondered (there might be a post around here somewhere, or on Rage3D) whether ATI had managed to pull off a discontinuity, and whether we really appreciated that fact. I think the GFFX numbers (six months late, and with extreme cooling measures) confirm that original opinion. Which is another way of saying I agree with you.
 
All that R&D Nvidia spent wasn't just on the GFFX; they'll use everything they've learnt on the GFFX to bring out whatever card they need to come out on top again. And I'm sure they've been working overtime, not just getting the NV30 up to scratch but also on the NV35/40. Carmack did say there are cards coming out soon that will be faster with D3. So yes, I do expect the NV35 to come out by mid-year or just after.
 
Kind of premature to make that statement, though; the FX isn't even available yet...
 
borntosoul said:
All that R&D Nvidia spent wasn't just on the GFFX; they'll use everything they've learnt on the GFFX to bring out whatever card they need to come out on top again. And I'm sure they've been working overtime, not just getting the NV30 up to scratch but also on the NV35/40. Carmack did say there are cards coming out soon that will be faster with D3. So yes, I do expect the NV35 to come out by mid-year or just after.

nVidia has been averaging roughly an 11-month span between core changes for the past few years, and you think now they are going to magically trim that down to less than half of that?
 
RussSchultz said:
Fuz said:
antlers4, that is why I asked about the pricing. At $399, the margins must be damn low.

Really? How do you know this?

Please, give us a cost break down.

Without BOM costs, we obviously can't. But we can make some suppositions about the perceived high cost/low margin per SKU @ $399:

1. 120M+ transistor 0.13u ASIC
2. 128MB 500MHz DDRII RAM
3. 12 layer PCB
4. Dustbuster (Sorry, I couldn't resist...)

Conversely, the leverage Nvidia can extract is in bundling the ASIC/DDRII/FX Flow to AIBs/OEMs. In addition to their stated rationale of maintaining high QC, it may be that placing early production orders through one or two AIBs results in economies of scale, making initial pricing @ $399 possible.

BTW, talk of recouping NV30 costs via the sale of NV30 ASICs totally ignores corporate (re)investment & R&D practice. The NV30 project expenditures (as with all R&D) will be amortised over successive products & deflated over financial years. Let alone the tax benefits...
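To make the amortisation point concrete, here is a toy sketch (all figures are invented for illustration; actual NV3x budgets and volumes are unknown) of how a shared R&D budget spreads across every product built on the core:

```python
# Toy illustration of amortising shared R&D across derivative products.
# All figures below are invented for the example, not real Nvidia numbers.

def amortised_cost_per_unit(rd_budget, unit_sales_by_product):
    """Spread one R&D budget over every unit the core technology ships in."""
    total_units = sum(unit_sales_by_product.values())
    return rd_budget / total_units

# Hypothetical: $120M of NV3x R&D, recouped across the whole family.
sales = {"NV30": 300_000, "NV35": 1_200_000}
family_share = amortised_cost_per_unit(120_000_000, sales)
nv30_alone = 120_000_000 / sales["NV30"]

print(family_share)  # 80.0  -> $80 of R&D per board, family-wide
print(nv30_alone)    # 400.0 -> $400 per board if NV30 alone carried it
```

The point being: whether the overhead gets recouped depends on family-wide volume, not on NV30 sales alone.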

Edit: typo
 
RussSchultz said:
Fuz said:
antlers4, that is why I asked about the pricing. At $399, the margins must be damn low.

Really? How do you know this?

Please, give us a cost break down.

Actually, you are correct. I cannot give you a cost breakdown, so I should not have said "margins must be damn low", because I can't really back it up.

What I was getting at is that margins must be lower on the FX at $399 than on a 9700 Pro at $399. Of course, JMHO. I say this because the FX has a 12-layer PCB compared to 10 on the R300, 500MHz DDR-II compared to 325MHz DDR, and then you factor in the cost of adding that exotic cooling device. I may be wrong; if so, please educate me.
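For what it's worth, the raw bandwidth arithmetic cuts the same way. A quick sketch using the clocks quoted above and the two cards' published bus widths (128-bit on the GFFX, 256-bit on the R300); treat it as back-of-envelope only:

```python
# Peak memory bandwidth = effective transfers/sec * bus width in bytes.
# DDR and DDR2 both transfer twice per clock; 1 GB/s taken as 1e9 bytes/s.

def peak_bandwidth_gbps(clock_mhz, bus_bits):
    return clock_mhz * 1e6 * 2 * (bus_bits // 8) / 1e9

gffx = peak_bandwidth_gbps(500, 128)  # 500MHz DDR2 on a 128-bit bus
r300 = peak_bandwidth_gbps(325, 256)  # 325MHz DDR on a 256-bit bus

print(gffx)  # 16.0 GB/s
print(r300)  # 20.8 GB/s
```

So the dearer memory doesn't even buy more raw bandwidth, which only sharpens the margin question.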
 
For the AIB manufacturer, the primary cost driver is the memory.

I have no earthly clue how much DDR2 costs, but I _suspect_ it is cheaper clock for clock compared to DDR. It MAY be that 500MHz DDR2 is cheaper than 325MHz DDR. Why? Because DDR2 was designed to be cheaper.

A 12-layer PCB is only marginally more expensive than an 8-layer board.

None of us has any idea what the chipset itself costs, or the yield, so we can't make valid comparisons.
 
RussSchultz said:
I have no earthly clue how much DDR2 costs, but I _suspect_ it is cheaper clock for clock compared to DDR. It MAY be that 500MHz DDR2 is cheaper than 325MHz DDR. Why? Because DDR2 was designed to be cheaper.

I doubt it. DDR2 at 500MHz may be cheaper than DDR1 at 500MHz, but I doubt it's cheaper clock for clock across the board. Maybe it will be eventually, but right now, with Nvidia being the only company even using it?
 
RussSchultz said:
For the AIB manufacturer, the primary cost driver is the memory.

I have no earthly clue how much DDR2 costs, but I _suspect_ it is cheaper clock for clock compared to DDR. It MAY be that 500MHz DDR2 is cheaper than 325MHz DDR. Why? Because DDR2 was designed to be cheaper.

The last I heard, Samsung was selling 500MHz DDR2 parts for about twice the cost of their 350MHz DDR parts of the same size. They may be cheaper at the same clock speed, but high-end memory still commands quite a premium.
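Taking those quoted figures at face value, the premium is easy to put a number on: twice the price for only 500/350 of the clock.

```python
# Rough cost-per-clock comparison from the Samsung figures quoted above:
# 500MHz DDR2 at ~2x the price of 350MHz DDR parts of the same size.
price_ratio = 2.0
clock_ratio = 500 / 350
premium = price_ratio / clock_ratio
print(round(premium, 2))  # ~1.4x the cost per effective MHz
```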
 
RGB said:
The last I heard, Samsung was selling 500MHz DDR2 parts for about twice the cost of their 350MHz DDR parts of the same size. They may be cheaper at the same clock speed, but high-end memory still commands quite a premium.

And anything else would defy logic, I might add. That 500MHz DDR2 is cutting edge, and the bleeding edge as far as cost goes. The fact that the chips need cooling should be taken as a hint, IMHO.
 
LeStoffer said:
And anything else would defy logic, I might add. That 500MHz DDR2 is cutting edge, and the bleeding edge as far as cost goes. The fact that the chips need cooling should be taken as a hint, IMHO.
Well, it seems there are process issues with DDR2, meaning it runs really hot ;)

On the price, well, does anybody know the agreement with Samsung? :)
 
LeStoffer said:
And anything else would defy logic, I might add. That 500MHz DDR2 is cutting edge, and the bleeding edge as far as cost goes. The fact that the chips need cooling should be taken as a hint, IMHO.

And Rambus runs at 1GHz and up. So what?

Just because it's faster doesn't mean it's cutting edge, or more expensive to make. There are all sorts of tricks you can play (like having four banks running at 1/4 the total speed) that let you reach an external signalling speed that high.

Like I said, I don't know the cost, but neither do you. :p
 
Like I said, I don't know the cost, but neither do you.

But the simple supply-and-demand "laws" of economics tell us that 500MHz DDR-II is more expensive.

How many manufacturers are there of 300-350MHz DDR? In what quantities are those RAM chips available?

How many manufacturers of 500MHz DDR-II are there, and in what quantities are their chips available?

No, I don't have numbers on these myself, other than knowing of only one manufacturer (Samsung) offering 500MHz DDR-II. If we're going to speculate, it seems very nonsensical to guess anything other than the Radeon's memory being cheaper than the GFFX Ultra's.

As you know, Russ, even if it did "cost less" to make 500MHz DDR-II vs. 300MHz DDR (and I doubt that's true at the moment), that doesn't mean much when availability is much more limited.

Edit...

ATI is probably going to run into similar cost issues with GDDR-III, at least initially. Though ATI seems to have several DRAM manufacturers supporting the GDDR-III standard, so competition might keep the price of those chips down a bit. As usual on the bleeding edge, though, there will probably only be one initial manufacturer of the RAM, making the first products based on it costly.
 
If you'll note, I said "MAY" in capital letters. I doubt it's less expensive too, but I also doubt it's much more bleeding edge or expensive than 375MHz DDR.
 
RussSchultz said:
If you'll note, I said "MAY" in capital letters. I doubt it's less expensive too, but I also doubt it's much more bleeding edge or expensive than 375MHz DDR.

Actually, I would wager that DDR2 right now is much more expensive than normal DDR. If it is not, then Samsung's chaebol (board of directors) should fire all top management immediately, and then Samsung shareholders should sue said management for improper investment of capital.

These disparate companies care nothing for each other (except to the extent that predatory costs would cause a customer to go out of business, thus negating future sales). Also, because Nvidia is essentially launching the card themselves, with the board vendors as proxies, I would wager that they are eating the initial round. I fully expect that they will take a charge against earnings this quarter for that very reason (whether they explicitly state that specific reason or not).

Also, it seems that many here are assuming that Nvidia is "suddenly" going to experience some quantum design leap vis-a-vis the NV35. Remember, Nvidia said that the R300 could not be manufactured on 0.15. They also stated a 256-bit bus was not necessary. Mercedes-Benz can mass-manufacture cars with 10mm tolerances on all external seams; GM cannot. I fully expect that some management heads will roll within the next six months. Nvidia's task is not merely to catch up (in respect of product cycles) but to win back the six months they lost. In other words, they need to get approximately 12 months ahead of ATI in process technology. I find that highly unlikely within the next 24 months.
 
sumdumyunguy said:
Actually, I would wager that DDR2 right now is much more expensive than normal DDR. If it is not, then Samsung's chaebol (board of directors) should fire all top management immediately, and then Samsung shareholders should sue said management for improper investment of capital.

You're speaking Greek. Why would DDR2 being less expensive than DDR make Samsung fire people?
 