GeFX canned?

WaltC said:
Ratchet--

Happened to see your post as I was finishing this one. They haven't killed the nv30--they've only killed the nv30 Ultra products--they won't be making anything with a Dustbuster or running at 500MHz with 1GHz DDR-II RAM onboard. But they will be making the 400/800MHz nv30 Dustbuster-less, non-R300-competitive products on schedule, apparently.
I realize that, but you'd have to be a pretty hard-core fanboy to choose a GFFX non-Ultra over a 9700 Pro, wouldn't you? I mean, the non-Ultra has literally nothing going for it compared to the 9700 Pro - definitely no speed advantage, no image quality advantage, and (for a guesstimate) no price advantage... I call it like I see it, and what I see is a DOA nV30 non-Ultra. As far as I'm concerned, the nV30 is no more.

Having said that, nVidia must also have realized that the non-Ultra was in absolutely no position to go head-to-head with the 9700P when they decided to scrap the GFFX Ultra. I can't possibly see a company cutting its own throat like that - logic says they have to have something waiting in the wings... what it is, I have no clue...
 
Ratchet said:
I realize that, but you'd have to be a pretty hard-core fanboy to choose a GFFX non-Ultra over a 9700 Pro, wouldn't you? I mean, the non-Ultra has literally nothing going for it compared to the 9700 Pro - definitely no speed advantage, no image quality advantage, and (for a guesstimate) no price advantage... I call it like I see it, and what I see is a DOA nV30 non-Ultra. As far as I'm concerned, the nV30 is no more.

Having said that, nVidia must also have realized that the non-Ultra was in absolutely no position to go head-to-head with the 9700P when they decided to scrap the GFFX Ultra. I can't possibly see a company cutting its own throat like that - logic says they have to have something waiting in the wings... what it is, I have no clue...

I have to tell you I think that's a rather illogical assumption you make at the end there...the nv30 Ultra was the "something" they had hoped to make competitive with the 9700P. Isn't that obvious? If there had been anything else, do you really think they'd have gone to the extremes of even proposing something like the Dustbuster solution in the first place?

I don't. I believe if they had anything else they would've announced *that* instead, and scrapped the dustbuster from the beginning. Besides, for the past six months nVidia has had nothing remotely competitive with the 9700P to offer--nothing. I can't see what makes it so imperative "now" versus then. The simple truth is they couldn't do it then and they can't do it now.

Let's see...how many years would you say the ATI architectures ran behind the nVidia architectures? Do you think somehow that nVidia is "immune" to being behind, or has some sort of supernatural dispensation to protect the company from it? If so, it's already failed, because nVidia has been behind for the past six months. Categorically behind.

I see no "logical reason" why nVidia even has to catch up anytime soon, much less start winning the race anytime soon.
 
What a waste...think of all those countless engineering hours, all the cunning PR hype, and giving a big bonus to the guy who came up with the dustbuster solution...all down the drain.
 
WaltC said:
Ratchet said:
I realize that, but you'd have to be a pretty hard-core fanboy to choose a GFFX non-Ultra over a 9700 Pro, wouldn't you? I mean, the non-Ultra has literally nothing going for it compared to the 9700 Pro - definitely no speed advantage, no image quality advantage, and (for a guesstimate) no price advantage... I call it like I see it, and what I see is a DOA nV30 non-Ultra. As far as I'm concerned, the nV30 is no more.

Having said that, nVidia must also have realized that the non-Ultra was in absolutely no position to go head-to-head with the 9700P when they decided to scrap the GFFX Ultra. I can't possibly see a company cutting its own throat like that - logic says they have to have something waiting in the wings... what it is, I have no clue...

I have to tell you I think that's a rather illogical assumption you make at the end there...the nv30 Ultra was the "something" they had hoped to make competitive with the 9700P. Isn't that obvious? If there had been anything else, do you really think they'd have gone to the extremes of even proposing something like the Dustbuster solution in the first place?

I don't. I believe if they had anything else they would've announced *that* instead, and scrapped the dustbuster from the beginning. Besides, for the past six months nVidia has had nothing remotely competitive with the 9700P to offer--nothing. I can't see what makes it so imperative "now" versus then. The simple truth is they couldn't do it then and they can't do it now.

Let's see...how many years would you say the ATI architectures ran behind the nVidia architectures? Do you think somehow that nVidia is "immune" to being behind, or has some sort of supernatural dispensation to protect the company from it? If so, it's already failed, because nVidia has been behind for the past six months. Categorically behind.

I see no "logical reason" why nVidia even has to catch up anytime soon, much less start winning the race anytime soon.
... but would they have ditched the Ultra if they didn't have something to take its place? I doubt it. Like I said, they'd be cutting their own throat to simply drop the Ultra with nothing to carry them through to the NV35. That's my point. Whatever it is, they might not be ready to announce it yet, which is why we haven't heard anything...

Then again, maybe it's like BRiT said - maybe they can't produce the GFFX Ultra reliably - in which case they'd have no choice but to scrap the design... woe is nVidia...
 
Ratchet said:
... but would they have ditched the Ultra if they didn't have something to take its place? I doubt it. Like I said, they'd be cutting their own throat to simply drop the Ultra with nothing to carry them through to the NV35.
If they would be losing money on each Ultra produced, then yes, it would be smart (maybe) to ditch it - even without another product coming soon.
 
Althornin said:
Ratchet said:
... but would they have ditched the Ultra if they didn't have something to take its place? I doubt it. Like I said, they'd be cutting their own throat to simply drop the Ultra with nothing to carry them through to the NV35.
If they would be losing money on each Ultra produced, then yes, it would be smart (maybe) to ditch it - even without another product coming soon.
That's a good point. It's not like they are selling consoles and have game sales to prop them up.
 
It's sort of funny seeing how this news was posted on the NvNews front page but was recently removed. Seems like someone is in denial. :rolleyes:
 
Ratchet said:
That's a good point. It's not like they are selling consoles and have game sales to prop them up.

Right...and if nVidia was forced to sell it to the OEMs, make guarantees to them, and do other things whose sum total meant the OEMs weren't willing to pay what nVidia needed them to pay to turn a profit--that would be it.

That's the thing--it's the pressure of the competition from ATI. The OEMs know they do not have to pay whatever nVidia demands in order to sell a successful, competitive, in-demand high-end 3D product; if they have to, they can go to ATI and get a better deal. You can also be sure that ATI's OEM pricing is affecting what nVidia's OEMs are having to pay for the plain nv30 chips--it's certainly not what it would be if nVidia had no competition from ATI in this market. All of this in turn drives down the price to consumers.

What I've been trying to get across here is the idea that nVidia has been beaten--last product cycle, this product cycle too, and it may continue this way for some time in the future. ATI has a fantastic architecture in the R300--a position very similar to the one nVidia held with its TNT2-based architecture, which it enlarged and expanded virtually all the way through the GF4. When companies move to new architectures the rule books fly out the window--current performance is no longer based on past successes, because the company is no longer standing on the bulwark of the older, successful architecture. Now it's ATI with a fantastic architecture to build on for the next few years, and it will be interesting indeed to see how nVidia responds.

All of it, though, will be good for consumers. I for one am delighted that ATI "woke up" and decided to become competitive in the 3D chip marketplace--not just because I think the 9700P is the best 3D card I've ever owned, but because when a single company gains control of a market segment unchallenged, innovation is often the first thing to go. I think innovation is something we can look forward to now for a long time to come. *chuckle* The "3D Wars" haven't been this much fun since 3dfx, Lord rest their pudgy souls...;)
 
BRiT said:
It's sort of funny seeing how this news was posted on the NvNews front page but was recently removed. Seems like someone is in denial. :rolleyes:

*chuckle* Maybe [H] is having a good time pulling a fast one on all of us...;) Ah, well, if so....it's certainly kept things interesting.... :D
 
WaltC said:
demalion said:
1) There is no reason I've heard so far to believe the nv30 effort has delayed the nv35, and I'd tend to think nVidia would have had to be rather foolish to change their plans to have that result when they hit the nv30 snags. I tend to believe that the nv35 delay would primarily be determined by the marketplace, given the original intended launch schedule...as we've guessed during the nv30 delay period, focus on preparing the nv35 was likely a priority.
In this light, killing the 5800 Ultra seems very sane if they've recently reached the conclusion that they can bring it to market soon enough. I think it lends validity to the May(?)/June launch rumors with the nv35 (as much before the launch of the R400 as possible), and indicates that the issues (I think) Mufu has hinted at are on-track for resolution.

Aren't you just about sick and tired of "rumors" when it comes to nVidia and its products?

Well, I'm in a fair bit of amazement that such rumors are the majority of what they've managed to deliver successfully, and that feeling has served to numb any annoyance, I guess. :-?

That's about all it's been for the past 8-9 months out of nVidia--rumors and speculation galore--and at the end the stark and naked truth:

nVidia has no chip with which to compete with the R300 from ATI.

Eh? No, they do have a chip that competes fairly well; it just isn't feasible to release it as a product. Also, the non-Ultra still competes, just not very successfully from the standpoint of those who picture nVidia as the performance leader. The Ultra has served its purpose of providing benchmarks that can be used to show the "GeForce" at an advantage to the "Radeon".

There were plenty of good, solid reasons to cancel the GF FX Ultra which have absolutely nothing whatever to do with "nv35" (*chuckle* That's so funny--after hearing "nv30,nv30,nv30,nv30".......ad infinitum for the past six months--and how it would "wipe the floor" with the puny R300 from ATI.)

Hmm...well, yes there are, but why are you telling me? I'm the guy who was lambasting people after E3 for assuming it was preposterous that the R300 could be faster than the nv30.

But on to your reasons...

How about:

Walt's list of GF FX Ultra flaws

Do you think any of this is news? Well, I'm not sure what you mean by "too many heatsinks", but I'm not particularly curious... ;)
This isn't the nvnews forums, Walt, nor Rage3D; you don't have to keep pointing out things like this when no one is contesting them (at least not at such length). All I was commenting on (read the text again) was specifically that I don't see any reason at all to assume the nv35 is necessarily delayed just because the nv30 was so late...hence terms like "lends validity to the rumors".

Specifically, I think this leaves room for the nv35 to come out before fall (and presumably the R400...I don't think ATi has a great deal of reason to hurry the launch of that even if they could...I think the R350 is likely to compete well enough with the nv35), and opportunity to re-associate nVidia and the GeForce, for whatever amount of time, with the concept of "performance leadership".

This does leave technical issues to be worked out, and I don't have the confidence in nVidia's engineers that I would have in ATI's at this point, but even just adding a 256-bit bus would help the GF FX catch up quite a bit, even before considering the other ideas the engineers may have in mind.
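
To put rough numbers on the bus-width point, here's a back-of-the-envelope sketch; the clock figures are the commonly cited retail specs and the 256-bit case is hypothetical, so treat all of them as assumptions for illustration rather than anything stated in this thread:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# Clock figures are the commonly cited specs, assumed here for illustration only.

def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Theoretical peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

cards = {
    "GF FX 5800 (non-Ultra), 128-bit @ 800 MHz effective": (128, 800),
    "GF FX 5800 Ultra, 128-bit @ 1000 MHz effective":      (128, 1000),
    "Radeon 9700 Pro, 256-bit @ 620 MHz effective":        (256, 620),
    "Hypothetical 256-bit part @ 800 MHz effective":       (256, 800),
}

for name, (bits, mhz) in cards.items():
    print(f"{name}: {peak_bandwidth_gb_s(bits, mhz):.1f} GB/s")
```

Doubling the bus width at the same memory clock doubles the peak figure (12.8 GB/s becomes 25.6 GB/s), which is the sense in which "just adding a 256-bit bus" would close most of the raw-bandwidth gap to the 9700 Pro's roughly 19.8 GB/s.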

Repeating myself...given the prior hints of the nv35 being the focus of intensive "debugging", it seems likely, in my opinion, that this info about cancelling the 5800 Ultra parts strengthens the likelihood of rumors of a May/June launch schedule. If you disagree with this, a brief reply like that at the end could have sufficed....

Nowhere do I indicate that I disagree that the 5800 Ultra is a flawed part, and I've mentioned the flaws prior. I don't mention them again because they've been mentioned quite a few times already....

Those are just in-your-face, undeniable observations each of which by itself might be reason enough to cancel the product. Taken in total they are overwhelming.

Yes, yes....similar outlooks have been well established. For my part, that is why I was using terms like "sane"...the Ultra just strikes me as a computer OEM dud.

more reasons that I thought were discussed adequately, and which I don't see my post contesting.

That's enough. As you can see there are plenty of reasons to cancel this product, none of which are remotely connected to the mythical "nv35." (Ever heard the story about the boy who cried 'wolf'?)

? Yes, I saw it before too...? This first half was a waste of time, IMO. :-?

2) 500/800 doesn't sound very sane to me...but 400/"1000" does.
(a) I'd expect heat issues with "DDR II" RAM to be less severe than those for the GPU, and yields at 400 MHz sound profitable.
(b) Bandwidth is the GF FX's primary problem.
(c) Not having read Kyle's comment yet, this seems to me like a plausible minor miscommunication.

400/800 is the real deal. Maybe nVidia will up the spec for the non-Ultras. But at a selling price of $300, don't count on it. nVidia already knows the non-Ultras won't compete with R300, and certainly not with R350. Why throw good money after bad? Let it compete with the $300 ATI cards....which *chuckle* will probably be the R300 anyway--just as soon as ATI ships the R350.....:D

I think you make a good point, and I tend to agree. See above with my later post about the memory clock speed.

BTW, I disagree about your bandwidth statement. Even with virtually *the same* raw physical bandwidth as a 9700P, the Ultra was much slower per clock, and that had nothing to do with bandwidth at all.

Well, I've discussed this before...clocking the RAM at near the same frequency as the core with a 128-bit bus is more limiting than doing so with a 256-bit bus. Each card having roughly the same fillrate, this indicates a situation where the GF FX architecture is much more likely to "choke", as I termed it, and I think more likely to see greater returns from increasing RAM clock frequency (assuming there are no issues with such a memory clock disparity between core and RAM...I assume nVidia has their interface well in order).
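
To illustrate the "choke" framing, here's a quick sketch comparing how many bytes of peak memory traffic each design has available per core clock; the core and memory clocks are the commonly cited specs and are assumptions here, not figures from this thread:

```python
# Peak memory bytes available per core clock - a rough proxy for how easily a
# design runs out of bandwidth when both cards push a similar fillrate.
# Clock figures are the commonly cited specs, assumed here for illustration only.

def bytes_per_core_clock(bus_bits: int, mem_effective_mhz: float, core_mhz: float) -> float:
    bytes_per_second = bus_bits / 8 * mem_effective_mhz * 1e6
    return bytes_per_second / (core_mhz * 1e6)

print("GF FX 5800 Ultra:", bytes_per_core_clock(128, 1000, 500))  # 32.0 bytes/core clock
print("Radeon 9700 Pro :", bytes_per_core_clock(256, 620, 325))   # ~61.0 bytes/core clock
```

If the fillrates really are roughly comparable, the 256-bit card has close to twice the peak memory traffic to spend per clock of work, which is why the narrower bus looks like the tighter constraint even with faster RAM.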

But I agree the returns in performance are likely not to be deemed worth the increased cost, though I don't have any definite idea of the cost difference.

I will say, however, that as the latencies in DDR-II seem to be a fair amount higher than those in DDR-I, and as nVidia's nv30 bus didn't seem optimal to me in the sense of even being very good at 128 bits wide, you might get by with a broad interpretation of "bandwidth" here--certainly not anything that putting slightly faster DDR-II RAM on the non-Ultra would solve. I think the GF FX non-Ultra might make good competition for the 9500 Pro, maybe.

I'm also not convinced that the RAM on the GF FX is best considered to be "DDR-II" in regards to latencies. But that's another discussion (no, really... we've had that discussion in another thread...).

I think you'll have a wait on the nv35 though, I really do, as I think it will take significantly more than a 256-bit bus to make it competitive with R350/R400, specifically R400--and that will be a much greater challenge than R300 was, I believe.

Now this is a brief statement of disagreement. I still don't know why you felt the majority of the first half of your reply was necessary.

To reply briefly in turn, I also don't think the nv35 will successfully compete with the R400, and I think nVidia has been focusing on getting the nv35 ready as soon as possible for quite a while (since the 9700 launch at least). I think it is the best glass of lemonade they can make from the situation, and I think they are preparing it as fast as they can.

I'm not going to try to dissuade you about your belief on when they are going to deliver it (because I don't have any strong opinion that it is wrong), but I do take issue with your idea that "the nv35 can't arrive soon because the nv30 just arrived."
 
X-Reaper said:
It's really looking more to be true now. PNY has updated their website.
They removed the ULTRA name and lowered the specs.

http://www.pny.com/home/products/Vcard_fx.cfm

GeForce FX Specifications
CineFX Engine providing cinematic-quality special effects
400 MHz core clock
400 MHz memory clock (800 MHz effective memory clock)

Yes, you're right--I find this particularly persuasive since I had visited the page earlier and saw the Ultra FX specs. I'm sold. Thanks for the link.
 
WaltC said:
One small thing I resent out of all of this is nVidia making a big deal last year about how the nv30 was the first chip co-designed by 3dfx and nVidia engineers. The one technology I expected to see carried over from 3dfx in the nv30 was some kind of major FSAA improvement along the lines of the T-buffer and 3dfx's hardware RGSS. Certainly it wouldn't have been the same thing, but frankly, judging by the poor FSAA in the sample products that were reviewed, it didn't look like nVidia had any interest in 3dfx's ideas on FSAA at all--which were certainly always years ahead of nVidia's.

In fact, the only thing at all that even remotely smacked of a 3dfx approach was using the post filter to apply a sort of pseudo-FSAA to the nv30's 2x and QC modes--3dfx made good use of the post filter in the V3 and the V5, but never for FSAA, as I recall. Did anyone else notice anything about this chip that might be remotely connected to a 3dfx approach or technology?

I mean, we'll still have the mid-range non-Dustbuster 5800s to play with, presumably, so it isn't like we won't find out. But if any of you guys review the 5800 later on, I'd appreciate it if you remembered what nVidia said about the design of this chip and looked for anything 3dfx-related in it. It'd be great, too, if nVidia would provide some info on this--maybe let a couple of the ex-3dfx guys talk about it and their role in its development...? That'd be nice.

I think you're mystifying some complicated engineering quite a bit there, aren't you, Walt? *chuckle* This topic and the term "3dfx technology" are getting quite old. The engineers from the former 3dfx were absorbed by a larger body of engineers, and the buck stops there. Anything they had been working on two years ago is now obsolete. Asking if "their approach" was used in the design of the NV30 is just silly; of course it was, since nVidia and 3dfx both used immediate-mode rendering designs.

As far as RGAA goes, nVidia apparently couldn't make it work without a huge tradeoff in performance. It did work quite well in the VSA architecture with two chips, but consumers expect better than a 50% drop in performance at 2x AA these days. Either they couldn't make it work, or they thought they had something better and failed. Anything a T-buffer could have done can now be done with multiple buffers.

Exactly what kind of information are you expecting them to provide? Some sweeping statement like "this particular transistor was designed with 'Mofo tech' in mind"? That's not going to happen.
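
As a footnote on the 50% figure: in a purely fillrate- or bandwidth-limited scene, 2x supersampling renders two samples per output pixel, so throughput roughly halves. A minimal sketch of that arithmetic (it ignores geometry and CPU limits, so it's an upper bound on the cost, not a benchmark):

```python
# Naive cost model for supersampling AA in a fill/bandwidth-limited scene:
# frame rate scales inversely with samples rendered per output pixel.
# Ignores geometry/CPU limits - an illustration of the ~50% claim, not a measurement.

def supersampled_fps(base_fps: float, samples_per_pixel: int) -> float:
    return base_fps / samples_per_pixel

print(supersampled_fps(100.0, 2))  # 2x SSAA: 50.0 fps, i.e. the ~50% drop mentioned above
print(supersampled_fps(100.0, 4))  # 4x SSAA: 25.0 fps
```

The two-chip VSA setup could absorb that cost because the second chip effectively doubled the available fillrate, which is presumably why it "worked quite well" there.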

I found this particularly amusing

WaltC said:
(6) 75W of heat displacement

(7) Too many heatsinks

Next time you go into a frothing tirade (though no one was disagreeing with you), you might note that heat dissipates, it doesn't displace. And how many heatsinks is too many, exactly? I see 2 on the GFFX, but I recall there being 3 on the GF2 Ultra: 1 for the chip and 2 for the memory.
 
Anyone have a working and complete Voodoo5 6000 version 3 or better they want to trade for a working v1.1 GFFX 5800 Ultra? Drop me a line.

Damn you Kyle... damn you....

I *almost* want to make this trade....

But after much consideration, I decided there is no chance. I just will not downgrade my FSAA quality 'that' much.
 
demalion
Eh? No, they do have a chip that competes fairly well; it just isn't feasible to release it as a product. Also, the non-Ultra still competes, just not very successfully from the standpoint of those who picture nVidia as the performance leader. The Ultra has served its purpose of providing benchmarks that can be used to show the "GeForce" at an advantage to the "Radeon".

I simply can't believe I'm reading stuff like this. Seriously.

The Ultra has *served its purpose*??? What a bunch of nonsense. So now it's an accepted practice for a company to overclock a product so far that they can't release it like that, and somehow it counts as a real product? Well, guess what, that crap does not fly. ATI could have done the same freaking thing to the R300 any time they wanted.
 
Hellbinder[CE] said:
demalion
Eh? No, they do have a chip that competes fairly well; it just isn't feasible to release it as a product. Also, the non-Ultra still competes, just not very successfully from the standpoint of those who picture nVidia as the performance leader. The Ultra has served its purpose of providing benchmarks that can be used to show the "GeForce" at an advantage to the "Radeon".

I simply can't believe I'm reading stuff like this. Seriously.

The Ultra has *served its purpose*??? What a bunch of unethical <bleep> nonsense. Give me a break. And you guys wonder why I think there is general favoritism among developers and internet sites. :rolleyes:

So now it's an accepted practice for a company to overclock a product so far that they can't release it like that, and somehow it counts as a real product? Well, guess what, that crap does not fly. ATI could have done the same freaking thing to the R300 any time they wanted.

No one said it was ethical.... :rolleyes:
 
I know, I edited my comment...

But it continually irritates me that people will pass it off and not get upset about this kind of conduct.
 