GeFX canned?

Dio said:
S3 were in a good part responsible for this, because between the excellent texture unit design and free trilinear on the Savage3D and Savage4 and the limitations of a 64-bit VRAM bus, it was actually a bit faster to texture out of AGP than out of VRAM in most cases, and they demoed that quite convincingly.

I remember S3 for doing the first viable texture compression--the original UT even included a special S3-compressed texture pack. AGP texturing, however, never proved itself in any 3D game I am aware of, and today is completely overshadowed by 3D cards with their own much faster onboard local ram buses. I do remember a lot of silly people running synthetic benchmarks with 50MB and 100MB single files, which they then said "proved" the value of AGP texturing for 3D gaming. Problem is, at the time you could barely find a single file in a 3D game over 1 megabyte, let alone 50MB....*chuckle* And of course today that nonsense is at last behind us forever (hopefully.)

Although, there are still some slow folks around who actually expected 3D games to show a performance spike when they moved to AGP x8 from AGP x4--I even read one site which was "disappointed" (and obviously baffled) because there was no difference in performance that they could chart. It seems like you could explain to people over and over again how the onboard texturing on a 9700P is 20x faster than AGP x4 and 10x faster than AGP x8, and they *still* expected to see a performance difference when moving to AGP x8.

AGP texturing was clearly nothing but a marketing gimmick spread by ignorant people at the time who couldn't grasp the point 3dfx tried to communicate: why use AGP texturing at all when your onboard ram bus is several times faster? Always seemed pretty simple to me--even at the time of Savage 3D. There were those of us who got it then, and those who didn't. I would hope that today the vast superiority of the local ram bus for texturing in 3D games over that of the AGP bus is abundantly apparent.

Sigh...even at the time a person could expound on how AGP texturing was developed by Intel at a time in which vram for video cards was $50-$75 a megabyte, if not higher, and for a 3D scenario in which the most onboard ram you'd see on a 3D card was 8 megabytes. The only real AGP-texturing 3D card ever made that I'm aware of was the Intel i7xx series, which featured no more than 8 megs of onboard ram and a heavy dependence on AGP texturing. The card was beaten soundly by 3dfx products of the time which didn't use AGP texturing at all, and eventually Intel folded up shop and got out of the 3D graphics chip business completely. I.e., the only "real" AGP-texturing 3D card ever made was a dog and a complete flop--utterly non-competitive with the 3D solutions of its day which did not rely on AGP texturing at all.
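For what it's worth, here's the back-of-the-envelope arithmetic behind those "20x / 10x" ratios, sketched in Python. These are my own numbers using commonly quoted peak figures (32-bit AGP at a 66MHz base clock, and the 9700 Pro's 256-bit bus at 310MHz DDR), not anything measured:

```python
# Rough peak-bandwidth comparison: AGP vs. local memory on a Radeon 9700 Pro.
# Figures are the usual quoted specs; real-world numbers would be lower.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak bandwidth in GB/s for a bus of the given width and effective clock."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

agp_4x = bandwidth_gb_s(32, 66.67 * 4)   # ~1.07 GB/s
agp_8x = bandwidth_gb_s(32, 66.67 * 8)   # ~2.13 GB/s
r9700p = bandwidth_gb_s(256, 310 * 2)    # ~19.8 GB/s (256-bit DDR @ 310MHz)

print(f"9700P local memory vs AGP 4x: {r9700p / agp_4x:.1f}x")  # ~18.6x
print(f"9700P local memory vs AGP 8x: {r9700p / agp_8x:.1f}x")  # ~9.3x
```

Call it roughly 20x and 10x--either way, a couple of extra GB/s over AGP is a rounding error next to the local bus.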
 
X-Reaper said:
ben6 said:
Hrm, just a silly point, but the Ultra is still available for pre-order

http://www.pny.com/home/products/verto.cfm

Not really. Did you notice they removed the Ultra from the name and lowered the specs?

GeForce FX Specifications
CineFX Engine providing cinematic-quality special effects
400MHz core clock
400MHz memory clock (800MHz effective memory clock)
Well, I think you should click on the pic.... and then you'll find that it's a GF FX ULTRA in pre-buy ;)
http://www.compusa.com/promos/geforcefx/default.asp?cm_ven=pny&cm_cat=link&cm_pla=ros&cm_ite=gfx

500MHz core clock
500MHz memory clock (1GHz effective memory clock)
 
THe_KELRaTH said:
The only place I've seen in the UK where you could pre-order the Ultra is at Special Reserve - and right now it seems the cancel button is getting a lot of attention.

I remember a recent review where the Ultra was overclocked slightly to 530MHz and after a short period it was overheating and dropping back to 300/600. Is it possible that the reason for the cancellation was just that NV couldn't stop it from overheating at 500MHz in a typical home environment, i.e. "closed case + extended use"?

You can get it from

http://www.komplett.co.uk/k/kl.asp?AvdID=1&CatID=24&GrpID=1&t=278&l=2

as well
 
Mulciber said:
Well, as an American auto consumer, I always remember the saying "no replacement for displacement". I rather enjoy gargantuan heatsinks ;)

Ah, my sweet 427 Chevy--427 cubic inches of *displacement* with which I could burn a swath of rubber for a quarter mile, and which got all of 8 miles to the gallon...;) Hopefully, it's now returned to some junkyard somewhere and is finally and forever at peace...;) (Probably recycled for the metal 20 years ago...;))

I won't understand the mindset of anyone purchasing the 5800 Ultra. I can't even stand my measly 3k rpm 80mm Panaflo, which is towards the bottom end of the list as far as noise output goes.

Yes, the hairdryer sound was quite something...the heat, the size...it's just amazing to think that, with the 9700P having shipped back in September, the 5800 Ultra was what nVidia threw most of its resources into and actually presented to the market as a "competitor."

Moving to the 5800 non-Ultra is a decision that nVidia should have made back in November, before word of the idea even leaked. My guess is they were hoping the performance would warrant that solution, but the fact that it doesn't has made them just lower their face and admit defeat (which of course they never will do in more than symbolic fashion).

I really think they found the market nowhere near as gullible as it used to be. And of course it didn't hurt at all that the standard to which it was compared was the 9700P.

As to admitting defeat, nVidia already has. Here's the link:

http://www.bayarea.com/mld/mercurynews/business/5093187.htm

Here's the appropriate text:

Huang is confident that Nvidia will end up back on top. ``Tiger Woods doesn't win every day. We don't deny that ATI has a wonderful product and it took the performance lead from us. But if they think they're going to hold onto it, they're smoking something hallucinogenic.''

I applaud Huang for being honest--it's so much nicer than trying to maintain a fiction everyone sees through.
 
Fuz said:
demalion said:
This decision does, in my estimation, seem connected with this enhanced chip design and rumors that it could be released earlier than some had thought (i.e., in May/June), though I don't equate this (May, June) with the nv35 being "ready to go". Reasons include:
  • We've had hints of the "reliable sort" that nv35 has been in "debugging" for a while.
  • This decision came after this "debugging" process has had time to be evaluated.
  • The simple addition of a 256-bit bus is a pretty conservative baseline for the expected performance of such a chip, and easily achieved in the time allowed (and, also indicated by "hints of the reliable sort").
  • IMO, getting a product that is less unreasonable than the GF FX 5800 Ultra (as based on the nv30) is a relatively low target.

Excuse my ignorance, but I was under the impression that adding a 256-bit bus was anything but "simple", relatively speaking.
When I say "time allowed", I'm talking about "when they recognized that the nv30's 128-bit bus was not enough".
IMO:
The shortest time this would be is the time before the R300 launch when they realized it would have a 256-bit bus (I'm fairly sure that would have been early last year at the latest).
The longest time is from the beginning of the design of the nv35, if they would have banked on it being a good way to improve performance (this seems pretty likely to me).

When they started focusing efforts on "overclocking" the GF FX to ultra levels might be an indicator as to the latest date possible for the former.

I'm not talking about all of a sudden just now deciding to "tack on" a 256-bit bus.

Also, wouldn't the NV3x core need to be designed with support for both 128 and 256-bit bus from the get go? Similar to how the R300 supports both DDR and DDR-II.
I try to stay away from the term "need to be designed" without any real idea for nVidia's capability to adapt. I'd also say the board design for such a part is something else they would have started at the same time as they recognized a 256-bit bus might be a good idea (which could conceivably have been much earlier than the above dates).

It would certainly seem to me that considering such possibilities as early as possible is the most efficient approach to execution.
 
Dave H said:
...

Frankly I have a difficult time seeing how this was a smart move for Nvidia. They've already taken the credibility hit/ridicule for the FXFlow, so where's the benefit in pulling the part now?

Benchmarks are still showing the GF FX Ultra "beating" the 9700 Pro.

This gives customers fond of the brand name reason to assume the nv35 will "beat" the R350, and it is enough to "contest" ATI's performance crown.

This still leaves the most dedicated customers (pre-order customers) with cards to brag about to consumers who would not have purchased one in any case, and leaves reviewers able to benchmark them for those same consumers.

Finally, this prevents the negative impact (having power supply problems, being annoyed by the noise) from being driven home to a large number of consumers, and keeps them as abstract as "something they heard about" when looking at benchmark results that will still be associated with a purchasable product (the "GeForce FX" brand name). It remains to be seen whether print magazines will facilitate this (well, if your opinion of print magazine reviews leaves room for doubt).

Perhaps yields in the 500 MHz bin were too low, or there are cost problems for 500 MHz DDR-II, but if that's the case, better to quietly scale back volume of the Ultra part rather than cancel it altogether. After all, "Ultra" parts are allowed to be rare.

Hmm...I see what has happened as a "'terminal' volume scale back", not a "cancel it altogether". The Ultra parts will be rare, not non-existent.

Problems with the reliability of the Dustbuster?

We've covered the list of problems enough, I think. :LOL:

:idea:
Perhaps concerns they'd need to swallow too many returned units from irate retail customers??? Perhaps indicated by some recent consumer testing with the new and slightly improved fan version...?

I'm not so sure it would be nVidia being concerned with that, but more like nVidia being concerned about OEMs being concerned or dealing with such. Not being the only big kid on the block means you get away with a lot less.
 
Well, let's hope some reviewers hang onto the NV30 Ultra so that it can be tested in comparison with the R300 core under 3DMark2003.
If it follows the same trend as the ShaderMark tests then just switching to 256-bit won't help much.
 
Dave H said:
My guess is that this won't really have a huge effect on R350 release date. For one thing, ATI will still want it to hog the spotlight for as long as possible before NV35 comes out. For another, let's not forget that even if this news is exactly as Kyle reported it, the Ultra will still exist, albeit as a very rare part (pre-orders only). This is different from e.g. V5 6000. Point is, don't expect websites to retract their Ultra reviews. (Kyle already said he won't be.)

Seriously, though...now that the news is out that the product has been cancelled, who'd want to buy the noisy, hot thing...? I mean, if nVidia has such little confidence in it that they are withdrawing it from the market before it ships, why should a consumer pre-order it?

On the other hand, this might give ATI reason to hold back the review NDA on R350 until a little closer to shipping than they might otherwise have done. (Presumably they'll still "launch" at CeBit, but that could mean anything.) And presumably the price of the 9700 Pro will be a bit higher than it otherwise would have been.

I think ATI would be wise to press ahead while it has the advantage. The more distance it can put between its products and nVidia's, the better. The worst thing a company can do, as has been proven by more than one company, is to withhold its viable technology from the market when it is able to ship it. For instance, it would have been far better for nVidia to have shipped the nv30 non-Ultras last year ASAP, and to plunge immediately into nv35 development, than to do what it did. ATI needs to do exactly as nVidia did and pace itself by itself, and let the "chips" fall where they may. nVidia didn't "wait" on 3dfx or ATI to ship products prior to nv30--it shipped them according to its own internal time table and didn't much worry about what the other guys were shipping. This is the attitude ATI should adopt, I think. I hope they do.

Frankly I have a difficult time seeing how this was a smart move for Nvidia. They've already taken the credibility hit/ridicule for the FXFlow, so where's the benefit in pulling the part now? Perhaps yields in the 500 MHz bin were too low, or there are cost problems for 500 MHz DDR-II, but if that's the case, better to quietly scale back volume of the Ultra part rather than cancel it altogether. After all, "Ultra" parts are allowed to be rare.

Problems with the reliability of the Dustbuster?

:idea:
Perhaps concerns they'd need to swallow too many returned units from irate retail customers??? Perhaps indicated by some recent consumer testing with the new and slightly improved fan version...?

I think the "slightly improved" fan version is nothing but the original fan tuned to turn completely off when not running a 3D game, and at a slightly slower speed when running on high.

No doubt the fan would prove the Achilles' heel of the design--not only from a marketing standpoint, as many would find it distasteful enough to warrant overlooking the product entirely despite any other assumed superlatives it might have had, but also from a warranty standpoint, as the fan would be a likely candidate for failure (compared to the other components) and would provoke an immediate RMA. This is no ordinary fan which a consumer himself could easily and cheaply replace if needed.

A concern I had about the fan from the start is how an end user would keep it clean. Kyle at [H] so far has been the only one I've seen comment on this practical aspect, and he mentioned the use of canned air. My opinion is that you'd have to remove the card from your system, unscrew the plastic fan housing, blow or wipe the accumulated dust and dirt off, then reassemble and reinsert it into your system. The question would be how often this would have to be done, of course. The heat-pipe baffling inside the fan housing makes a perfect trap for lint, dust, and dirt particles. If it gets stopped up, the card overheats, the clock-throttle comes on, and your MHz drop to 300--until you clean the fan.

Even with a clean and properly operating fan there were certain objections to the clock-throttling mechanism by people who looked at it. Anand said that when he tried a modest overclock, the clock-throttle activated and dropped the core to ~300MHz right in the middle of the game. So, as predicted, the overclocking potential of the Ultra is nil, since the card was already overvolted and overclocked right from the factory--the clock-throttle was necessary in case of fan failure, or in case of general overheating of the chip. Some of the artifacts I saw in the [H] review which were attributed to the drivers looked suspiciously like artifacts I have seen in the past when overclocking a 3D chip too high, resulting from overheating.
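Just to illustrate what that fail-safe behaviour looks like, here's a minimal sketch of a clock throttle of the sort described--the 500MHz and ~300MHz clocks are the figures reported for the Ultra, but the temperature thresholds and the hysteresis are purely my invention for the example, since the real trip points aren't public:

```python
# Illustrative only: the actual GeForce FX throttle logic is not public.
# Clocks are the reported 500MHz (3D) and ~300MHz (throttled) figures;
# the temperature thresholds below are made up for the sketch.

NORMAL_MHZ = 500
THROTTLED_MHZ = 300
THROTTLE_TEMP_C = 95   # hypothetical trip point
RECOVER_TEMP_C = 80    # hypothetical recovery point (hysteresis)

def next_clock(current_mhz, core_temp_c):
    """Drop the core clock when the chip overheats; restore it once it has cooled."""
    if core_temp_c >= THROTTLE_TEMP_C:
        return THROTTLED_MHZ
    if current_mhz == THROTTLED_MHZ and core_temp_c <= RECOVER_TEMP_C:
        return NORMAL_MHZ
    return current_mhz
```

The point being that any headroom you might have hoped to gain by overclocking is exactly the headroom the throttle is there to protect.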

At any rate, the thermal and electrical problems inherent in the design undoubtedly complicated manufacturing of the product and added to the expense, and I think in the end someone at nVidia just simply "woke up" and realized this was a choice of cutting your losses now, or continuing on ahead and risking millions of dollars more in losses. The company did the right thing, IMO, both for consumers and for its shareholders.
 
THe_KELRaTH said:
Well, let's hope some reviewers hang onto the NV30 Ultra so that it can be tested in comparison with the R300 core under 3DMark2003.
If it follows the same trend as the ShaderMark tests then just switching to 256-bit won't help much.

Heh-Heh....personally, I keep hoping reviewers will drop 3D Mark entirely and simply stick to benchmarking games....I think that's much more valuable information.
 
WaltC said:
Always seemed pretty simple to me--even at the time of Savage 3D. There were those of us who got it then, and those who didn't. I would hope that today the vast superiority of the local ram bus for texturing in 3D games over that of the AGP bus is abundantly apparent.
No arguments that today the balance is hugely tilted against AGP texturing.

Back in the Savage3D days it was a big bonus. Since it only had 64-bit SDRAM it could saturate the VRAM bus completely with just 16-bit Z + colour at 1 pixel/clock.

Therefore, since it had enough latency comp to absorb completely the AGP cache fetch, it was almost always faster to texture from AGP because it was 'free' extra bandwidth.

Now it is very different, because with DDR and 256-bit buses there is 8x more bandwidth available on the VRAM bus and HyperZ and colour compression crank the effective bandwidth up even more. In contrast, AGP has got only 4X more bandwidth and the latency compensation requirements are way up so getting good AGP performance costs more $$$ than it used to.
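To put rough per-clock numbers on that (my arithmetic, not Dio's, and assuming the Savage3D-era baseline was AGP 2x):

```python
# Back-of-the-envelope sketch of why AGP texturing made sense on a Savage3D
# and doesn't now. Numbers are nominal peak figures, not measurements.

# Savage3D: 64-bit SDR bus = 8 bytes/clock. One pixel with a 16-bit Z read,
# a 16-bit Z write and a 16-bit colour write needs 6 bytes (8 with a colour
# read for blending), so framebuffer traffic alone roughly saturates the bus.
savage_bus_bytes_per_clock = 64 // 8
pixel_framebuffer_bytes = 2 + 2 + 2
print(pixel_framebuffer_bytes / savage_bus_bytes_per_clock)    # 0.75 of the bus, before any texturing

# Today: a 256-bit DDR bus moves 64 bytes/clock, 8x more than 64-bit SDR
# (before HyperZ and colour compression), while AGP only went from 2x to 8x.
modern_bus_bytes_per_clock = (256 // 8) * 2
print(modern_bus_bytes_per_clock / savage_bus_bytes_per_clock)  # 8.0
print(8 / 2)                                                    # 4.0 (AGP 8x vs AGP 2x)
```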
 
Colourless said:
Anyone have a working and complete Voodoo5 6000 version 3 or better they want to trade for a working v1.1 GFFX 5800 Ultra? Drop me a line.

Damn you Kyle... damn you....

I *almost* want to make this trade....

But after much consideration, I decided there is no chance. I just will not downgrade my FSAA quality 'that' much.

I can respect that. :)
 
Heh-Heh....personally, I keep hoping reviewers will drop 3D Mark entirely and simply stick to benchmarking games....I think that's much more valuable information.

I believe it's totally unfair to Futuremark that you are talking so much trash about 3DMark2003 like that. If it's anything like it's supposed to be, it'll be more stressful on the GPU instead of a combination of things like 3DMark2001... It should give a good indication of cards' strengths and weaknesses in GPU-limited situations.
 
MikeC said:
BRiT said:
It's sort of funny seeing how this news was posted on the NvNews front page but was recently removed. Seems like someone is in denial. :rolleyes:

I'd like to get a confirmation from NVIDIA in regards to the cancellation of the GeForce FX Ultra before posting it. In addition, HardOCP asked us last August not to link to any of their stories.

Going to be hard to get confirmation from NVIDIA PR when they do not know it themselves.

And yes, I asked TypeDef to stop linking us as he was copying and pasting the entire contents of our news posts. Certainly you have posted many links of ours in the past weeks; I don't know why all of a sudden you would stop posting links now.

Maybe you're trying to be nice to NVIDIA and get that fansite card once again?
 
FrgMstr said:
Going to be hard to get confirmation from NVIDIA PR when they do not know it themselves.

Huh? Are you saying that it's the OEMs that are going to dump the Ultra? Or are you saying that nVidia PR won't come forward about this until later (on Monday)?
 
LeStoffer said:
FrgMstr said:
Going to be hard to get confirmation from NVIDIA PR when they do not know it themselves.

Huh? Are you saying that it's the OEMs that are going to dump the Ultra? Or are you saying that nVidia PR won't come forward about this until later (on Monday)?

We got our information on this through some VERY ODD, but still reliable, channels. I do not think NV PR was aware of this issue when we posted it.
 