3dfx's curse?

misae said:
How would a GFFX at 400/400 be faster than an R300? :rolleyes:

I didn't say an FX at 400/400 would be faster (in general) than an R300. However, Anand includes 400/400 benchmarks in his review, and in some it does beat the 9700 Pro.

In any case, if a 400/400 FX had come out before the 9700 Pro, people would have said the 9700 was not an FX killer, that its slight speed advantage in some benchmarks would not make up for its lower raw fill rate and more limited shaders--people would have talked about it using brute-force bandwidth to make up for its older technology. It's just an example of how being first to market gives you an advantage when people form perceptions.
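For what it's worth, the fill-rate half of that argument is just multiplication. A rough sketch in Python, with the pipeline counts and clocks as illustrative assumptions rather than confirmed specs:

```python
# Theoretical pixel fill rate = core clock (MHz) x pixel pipelines.
# Pipeline counts and clocks below are assumptions for illustration.

def fill_rate_mpix_s(core_mhz, pixel_pipes):
    """Peak fill rate in Mpixels/s."""
    return core_mhz * pixel_pipes

gffx_400 = fill_rate_mpix_s(400, 8)   # hypothetical 400MHz FX -> 3200 Mpix/s
r9700pro = fill_rate_mpix_s(325, 8)   # 9700 Pro at its shipping clock -> 2600 Mpix/s
```

On paper the hypothetical 400MHz part leads on raw fill rate, which is exactly the perception argument being made above.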
 
antlers4 said:
I didn't say an FX at 400/400 would be faster (in general) than an R300. However, Anand includes 400/400 benchmarks in his review, and in some it does beat the 9700 Pro.
Only in Codecreatures and Comanche 4. I'm not sure that means anything. But at least in UT2k3 it's very, very close... I'd say the non-Ultra is just slightly slower than the 9700 Pro.
 
misae said:
They still have to recoup the money spent on the design of the Radeon 9700 Pro and also need to look at their inventory. There's no point in ATI shooting itself in the foot, and honestly these first few numbers only go to show that the R350 is not needed so soon.
However, if ATI relaxes, you can expect NVIDIA to pull the rug from under their feet before you can say "Dustbuster!"

The cost of additional engineering on a refresh part is probably pretty small. Since the R350 will still be based on the R300 technology, any sales of it will still be used to recoup the money spent on the design of the R300.

antlers4 said:
If they could've got this same card out in June at 400/400 or so (which I suspect was the plan at some point) nobody would have given a damn about the 9700 (it would have been a little faster on some benchmarks, but using "old" technology and more limited shaders) when it came out in July.

Shows the difference 7 months can make--a card that would've been a masterstroke solidifying NVidia's position at the top of the heap is now something they have to sell overclocked, and even so it is perceived as a disappointment, leaving them vulnerable to ATI.

If the NV30 had come out in June at 400/400, I think the R9700 Pro would still have given it a run for its money. It would have stolen some of ATi's thunder, for sure, but at 400/400, it would lose to the R9700 in benchmarks. Especially if ATI had pushed the envelope a little bit and clocked the R9700 at 350/350.

Whoever comes out first always has the advantage, and in a tie Nvidia would win due to their mindshare.
 
antlers4 said:
If they could've got this same card out in June at 400/400 or so (which I suspect was the plan at some point) nobody would have given a damn about the 9700 (it would have been a little faster on some benchmarks, but using "old" technology and more limited shaders) when it came out in July.
I don't think the FX was ever supposed to come out in June. At the soonest, it was to have come out around September-October.
 
Going back to the Xbox comment, I can't quite believe that the GPU for the Xbox is the reason for such a delay. ATi is sitting inside the Gamecube right now, and heck, it didn't slow ATi down any. I just think that the NV30 was finished; it's just that they didn't expect the competition they got from ATi. When the R300 was announced, nvidia maybe had to modify and tweak the heck out of the thing to make it comparable. The R300 came out of nowhere to blow away everything, so nvidia was caught way off guard. My guess is that nvidia thought ATi was going to come out with a GF4-equivalent card, not a card that was a generation past it. The 8500 was initially targeted at the GF3, but turned out to be an excellent performer against the GF4 line, so maybe ATi just decided to let the 8500 be the direct competition in that segment and go right for the glory.

Of course, this is just my point of view.
 
Performance is pretty good - it's just too late and comes at too much of a cost in terms of heat, noise and price. Of course, this is in the gaming domain, where memory bandwidth currently plays a massive role. GeForceFX will really come into its own in DX9 titles that use complex shaders. Once this card has dev support it will really shine... as it will in non-realtime scenarios.

NV35 should be a lot faster. Low-k fab will yield a 20-25% raw clockspeed increase, and a 256-bit memory bus should help the bandwidth issues. The ASIC will incorporate many fixes/optimisations of the NV30 design, and I expect nVidia will once again oversee PCB fabrication to cater for the *very* tight manufacturing QC such a board will require. If Cg has taken off by the summer, NV35 should put nVidia back in the game - perhaps riding on the back of some crazy tech demos.
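To put rough numbers on those two claims - the 20-25% gain and the bus widths are from the post above, but the baseline clocks are my assumptions for illustration, not confirmed figures:

```python
# Peak memory bandwidth = bus width (bytes/transfer) x effective transfer rate.
# Baseline clocks here are assumptions for illustration only.

def peak_bw_gbs(bus_bits, eff_mem_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective memory clock."""
    return bus_bits / 8 * eff_mem_mhz / 1000

# Doubling the bus from 128-bit to 256-bit at the same memory clock
# doubles peak bandwidth (assuming 1000MHz effective DDR):
nv30_bw = peak_bw_gbs(128, 1000)   # 16.0 GB/s
nv35_bw = peak_bw_gbs(256, 1000)   # 32.0 GB/s

# A 20-25% low-k clock gain on an assumed 500MHz baseline:
nv35_core = [round(500 * g) for g in (1.20, 1.25)]   # [600, 625] MHz
```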

The big problem nVidia have now is NV31 - they wanted a 400-450MHz part to compete with the Radeon 9500P and possibly the 9700NP. Current yields are struggling above 300MHz (I heard this independently and CMKRNL seems to think the same thing), leaving a huge "hole" in nVidia's fall lineup. The silicon is extremely buggy, being a bastardised NV30 architecture with functionality patched up using fastex software shader hacks. Not quite sure what they are going to do about that one. NV34 is similar, but a lot of the buggy NV31 functional blocks are simply not there.

A note for the camp that still insists the Xbox project was not one of the main reasons nVidia are currently behind in terms of product cycles - I am playing my "insider info card" here and saying YES - IT WAS, BELIEEEEEEEVE!!! Unless at SC they tell the troops that M$ is to blame in order to keep morale up and cover for shoddy management... :LOL:

Flipper development was pretty much done when ATi acquired ArtX, so it didn't constrain them at all in terms of R&D resources.

Xbox 2 will be different - this time M$ will have to earn a contract with nVidia if they want one, not the other way round. I'll stop now because I am starting to paraphrase from something sitting in my inbox... ;)

MuFu.
 
Velocity said:
Going back to the Xbox comment, I can't seem to believe that the GPU for the Xbox is the reason for such a delay. ATi is sitting inside of the Gamecube right now, and heck it didn't slow ATi down any.
But ATI didn't produce the Flipper chip. They purchased the company that did.

It's reasonably certain that the #1 reason for all the delays is problems with the .13 micron process. Also remember that nVidia was committed to "no compromises" on the FX. I believe they were specifically referring to full support for 16-bit/32-bit floating-point color formats, whereas the R300 only supports 24-bit precision (and, consequently, 16-bit, but it will probably not be anywhere near the performance of the FX when using half-floats).
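To illustrate the precision trade-off being described: Python's `struct` module can round-trip a value through IEEE half precision (a stand-in for the FX's 16-bit format). There's no standard 24-bit float type to demo, but R300's format sits between the two in mantissa width.

```python
import struct

# IEEE half (FP16) has a 10-bit mantissa; single (FP32) has 23 bits.
# R300's 24-bit format falls between them in precision.

def round_trip_fp16(x):
    """Round a Python float through IEEE half precision ('e' format)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def round_trip_fp32(x):
    """Round a Python float through IEEE single precision."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

x = 1.0001
print(round_trip_fp32(x))  # ~1.0001 - the increment survives single precision
print(round_trip_fp16(x))  # 1.0 - below FP16's spacing (~0.001) near 1.0
```

The half-float result collapses back to 1.0 because FP16's representable values near 1.0 are about 0.001 apart - a concrete picture of why precision mattered in this format debate.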
 
All, imho.

The comparison to 3dfx holds in some respects, but in the big picture the similarities are quite small:

nVidia still has a huge market share; 3dfx didn't at the time.

nVidia still has a strong name-brand recognition; 3dfx's was tarnished.

nVidia brings in large revenues and is financially strong; 3dfx wasn't.

nVidia has very strong partners; 3dfx was alone.

nVidia is a powerful force with developers; 3dfx's Glide was dying a slow death.

nVidia is very diversified; 3dfx wasn't.

It is the mainstream FX products that will probably fuel the revenues for some time, and getting a head start here at .13 may pay huge rewards. The only problem in my mind is their flagship product suffering in some respects here. Flagship products that hold the performance crown do bring in nice revenues -- but also mindshare that helps drive sales of all products and gives them instant credibility.

I don't know how anyone could expect nVidia to dominate like in the past with a 128-bit bus. All I expected was for it to be competitive, and I think the part certainly offers that. Again, IMHO.
 
MuFu said:
A note for the camp that still insists the Xbox project was not one of the main reasons nVidia are currently behind in terms of product cycles - I am playing my "insider info card" here and saying YES - IT WAS, BELIEEEEEEEVE!!! Unless at SC they tell the troops that M$ is to blame in order to keep morale up and cover for shoddy management... :LOL:

Xbox 2 will be different - this time M$ will have to earn a contract with nVidia if they want one, not the other way round. I'll stop now because I am starting to paraphrase from something sitting in my inbox... ;)

MuFu.

Just curious, but what areas of focus were led astray by MS? As I understand it, the card itself was basically what they were designing anyway for the desktop, so design seems an unlikely candidate. Management focus, production, support, money?
 
MuFu said:
NV35 should be a lot faster. Low-K fab will yield a 20-25% raw clockspeed increase and a 256-bit memory should help bandwidth issues.

MuFu.

So the NV35 runs at 500MHz (400MHz x 1.25) then?

I hope they don't use the dustbuster in the NV35 too. That would be a disaster. After hearing the mp3s, I would say this beast will never be in any of my computers. I'm not insane enough to ever work with such a loud graphics card in my computer.
 
I'm not insane enough to ever work with such a loud graphics card in my computer.

I heard the fan is not as noisy when you are running 2D apps. Is that true?
Also, it only runs @ 500MHz for 3D apps and slows to 300MHz for 2D, because it can't sustain that speed for a prolonged period.

These benches are really disappointing, but if ATI doesn't release an R350 that outperforms the FX around the time the FX ships, I think I'll pick the FX up.
 
MuFu said:
Xbox 2 will be different - this time M$ will have to earn a contract with nVidia if they want one, not the other way round. I'll stop now because I am starting to paraphrase from something sitting in my inbox... ;)

MuFu.

Earn a contract? If ATI wins the X-Box contract and nVidia doesn't have some great alternate revenue source in the pipeline (Gamecube 2 maybe, but I doubt it would have the potential to be as lucrative as an MS X-Box 2 contract), things could start to get a bit ugly. Also, the perception of being a leader would be somewhat tarnished.
 