GeFX canned?

My guess is that this won't really have a huge effect on R350 release date. For one thing, ATI will still want it to hog the spotlight for as long as possible before NV35 comes out. For another, let's not forget that even if this news is exactly as Kyle reported it, the Ultra will still exist, albeit as a very rare part (pre-orders only). This is different from e.g. V5 6000. Point is, don't expect websites to retract their Ultra reviews. (Kyle already said he won't be.)

On the other hand, this might give ATI reason to hold back the review NDA on R350 until a little closer to shipping than they might otherwise have done. (Presumably they'll still "launch" at CeBit, but that could mean anything.) And presumably the price of the 9700 Pro will be a bit higher than it otherwise would have been.

Frankly I have a difficult time seeing how this was a smart move for Nvidia. They've already taken the credibility hit/ridicule for the FXFlow, so where's the benefit in pulling the part now? Perhaps yields in the 500 MHz bin were too low, or there are cost problems for 500 MHz DDR-II, but if that's the case, better to quietly scale back volume of the Ultra part rather than cancel it altogether. After all, "Ultra" parts are allowed to be rare.

Problems with the reliability of the Dustbuster?

:idea:
Perhaps concerns they'd need to swallow too many returned units from irate retail customers??? Perhaps indicated by some recent consumer testing with the new and slightly improved fan version...?
 
MikeC said:
Mulciber said:
Why would they ask something like that? :?:

Heh. It's a long story that Typedef can explain to all of you one day. Basically, they said we were stealing their news and riding their coattails.

But we're going off topic, and I promise not to say anything else about it in this thread (unless I get provoked :LOL: ).

Having just seen a claim yesterday that xbitlabs is supposedly copying/pasting benchmark results, you've got me curious there.

I wonder what it takes to provoke you. Since nothing really interesting is going on lately, I could use a nice soap opera to have a good laugh at (no insult intended). :D

Anyway back on topic: I couldn't care less about the possible cancellation of the FXUltra. I'm curious as to what they'll do beyond that for the ultra high end and when. I don't see from recent rumours NV35 or anything similar being that close to tapeout at the moment. Anyone got any reasonable ideas?
 
Just saw this by Warp2search (apologies if it was confirmed earlier than 3 pages ago)

Kyle Bennett from [H]ardOCP reports that sources have been telling the [H] staff since Tuesday this week that only pre-ordered Ultra cards will be sold and shipped to customers. After that, only non-Ultra cards will be available. Guess Nvidia is going straight for the NV35 reference design in fear of ATI's as-yet-unannounced R350. Whoever would have thought that half a year ago. Here is Kyle's wording:

As we noted here earlier this week, the GeForceFX 5800 Ultra will never make it to retail. Those of you that PreBuy the cards will still get an Ultra model with the FX Flow cooling unit. Those who don't will have the opportunity to get the non-Ultra version (400/800) off the retail shelves for a price of US$300.00. This information is unconfirmed at this time, but has been what we have been told repeatedly by different sources since Tuesday of this week.
 
Hellbinder[CE] said:
I know, I edited my comment...

But it continually irritates me that people will pass it off, and not get upset about this kind of conduct.

No one said it was ethical, or even acceptable. We're just speculating that maybe Nvidia just released the Ultra to reviewers to score higher in the benchmarks... It may or may not be true, but, in a way, the Ultra has served its purpose if that was Nvidia's intent.
 
nggalai said:
Well, I wonder. I had an edifying encounter yesterday night: I went to have a couple of G&Ts with some old friends, who also happen to consider themselves "hardcore gamers".

Hardcore gamers... :LOL: I bet they play Counter-Strike? ;)
 
I feel so vindicated!!!!!!! :LOL: :LOL: :LOL:

http://www.beyond3d.com/forum/viewt...p;postdays=0&postorder=asc&highlight=

Ok, here's my take. What we'll see with the non-Ultra is a totally different card - cheaper PCB & 400MHz DDR I, for cost considerations. In the original timeframe (last August/September), 400MHz DDR wasn't available, and that's why nVidia went with DDR II. At 400MHz, there is absolutely no reason to use DDR II - more expensive/hotter/greater latency/12-layer PCB - so why use it? With DDR I running at 400MHz & a 400MHz .13 FX, I can see it landing somewhere between a 9500 Pro and a 9700... not a best buy, but at least a decent, usable DX9 card... at least this will help keep nVidia's customers from jumping ship...

Ok, how about NV35? Well, do you guys think there's any chance it will be anything more than an NV30 without the bugs and with a 256-bit memory bus? Maybe low-k will help it run at a decent temperature at 500MHz, but just how are you going to see 600+ MHz on these things without extensive & expensive cooling solutions? (This applies to ATI, too.) I'm sure there may be a few "extras" thrown in - hopefully nVidia will rethink the IQ & FSAA for the NV35. nVidia has a hard row to hoe, and I can see little chance of them taking the performance crown from ATI with the NV35. If an NV30 at 500/500 is no better - with IQ taken into consideration - than an R300 at 325/310, then what makes you think an NV35 at 500/500 with a 256-bit memory bus will be that much better than a 9900 Pro at 425/400? Not to mention, unless nVidia has the NV35 "waiting in the wings" - which I personally don't believe - I think the June/July timetable is pretty close - then the NV35 will be blown away by the R400 within two months... I can see a paper launch for the R400 sometime by, say, July.
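For what it's worth, the raw memory-bandwidth arithmetic behind those clock comparisons is easy to check. A minimal sketch (the clocks are the ones quoted above; the 256-bit NV35 figure is hypothetical, as in the post):

```python
def mem_bandwidth_gbs(bus_bits: int, mem_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a DDR bus:
    bytes per clock = bus width / 8, times two transfers per clock (DDR)."""
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000

# NV30 (GeForce FX 5800 Ultra): 128-bit bus, 500MHz DDR-II
print(mem_bandwidth_gbs(128, 500))   # 16.0 GB/s
# R300 (Radeon 9700 Pro): 256-bit bus, 310MHz DDR
print(mem_bandwidth_gbs(256, 310))   # 19.84 GB/s
# Hypothetical NV35: 256-bit bus at the same 500MHz
print(mem_bandwidth_gbs(256, 500))   # 32.0 GB/s
```

Which is the whole point: even at 310MHz, the R300's 256-bit bus out-delivers the NV30's 128-bit bus at 500MHz.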
 
martrox said:
I feel so vindicated!!!!!!! :LOL: :LOL: :LOL:

http://www.beyond3d.com/forum/viewt...p;postdays=0&postorder=asc&highlight=

Ok, here's my take. What we'll see with the non-Ultra is a totally different card - cheaper PCB & 400MHz DDR I, for cost considerations. In the original timeframe (last August/September), 400MHz DDR wasn't available, and that's why nVidia went with DDR II. At 400MHz, there is absolutely no reason to use DDR II - more expensive/hotter/greater latency/12-layer PCB - so why use it? With DDR I running at 400MHz & a 400MHz .13 FX, I can see it landing somewhere between a 9500 Pro and a 9700... not a best buy, but at least a decent, usable DX9 card... at least this will help keep nVidia's customers from jumping ship...

Jebus jumped up, man! I cannot believe you're spreading this to another thread. What makes you think that DDR-I at 400MHz is going to dissipate less heat and be less susceptible to signal noise than DDR-II at the same frequency?
You keep spouting this crap off in every thread... where's the proof? :?:
 
I have to say that there is a small bunch of people out there, me included, who are weird enough to just like new technology and overclocking rather than games and IQ, and who might just buy a non-Ultra.

People like me saw the FX Ultra as something with a "novel heatsink/fan" that was to be removed as soon as possible and something better fitted in its place, whether that be extreme (LN2), medium (Peltier cooler + water) or modest (SK7 plus 80cfm fan).

Now nVidia, apparently, have done this for us. And, seemingly, they have reduced the price to $399. And I assume that all the cores now going at 400MHz without the dustbuster would have been going at 500MHz with the dustbuster, so fitting an extreme, medium or modest cooler as above might make this a bit of an overclocker's paradise :D

Or am I just seeing the silver lining ?

Personally, even though it's been pulled, I was excited to see the first 0.13-micron GPU/VPU and to see it hit 500MHz (a massive jump from 300 or so at 0.15), and I've been disappointed reading fellow geeks (a term of praise for once!) slag something off just because it comes from a certain company.

It's the way of the future. Like the Wright brothers' first plane: it didn't fly as high as a balloon, but it was the way forward.
 
As we noted here earlier this week, the GeForceFX 5800 Ultra will never make it to retail. Those of you that PreBuy the cards will still get an Ultra model with the FX Flow cooling unit. Those who don't will have the opportunity to get the non-Ultra version (400/800) off the retail shelves for a price of US$300.00. This information is unconfirmed at this time, but has been what we have been told repeatedly by different sources since Tuesday of this week.

Amazing....the similarities to the 3dfx VSA-100 launch continue!

3dfx: cancelled the Voodoo5 6000... only the lower models (4500 and 5500) making it to market.
nVidia: cancelled the GeForce FX "Ultra" model... only the lower non-Ultra making it to market.

The only slight difference in this case is that some Ultra models will apparently make it into consumers' hands. (Those who, for whatever reason, pre-ordered one.)

This stinks of more blatantly unethical marketing from nVidia.

"Hey, let's give out FX ULTRA boards for benchmarks...but then not make them available for purchase!!" At least 3dfx didn't give out V5 6000s for benchmarks.

I can only hope that if this rumor is indeed true, webmasters stop benchmarking the Ultra and stick to the 400/400 speeds.
 
Mulciber said:
Jebus jumped up, man! I cannot believe you're spreading this to another thread. What makes you think that DDR-I at 400MHz is going to dissipate less heat and be less susceptible to signal noise than DDR-II at the same frequency?
You keep spouting this crap off in every thread... where's the proof? :?:

Listen, you are the one that doesn't listen. I've now posted in 2 different threads on this. I've given you far more proof than you have given in disproving what I have said, so quit busting my cojones. And stop getting personal; I haven't jumped on you, even though you have followed me to every thread I've posted in.
Again, PLEASE read:

http://www.beyond3d.com/forum/viewt...p;postdays=0&postorder=asc&highlight=

http://www.beyond3d.com/forum/viewt...p;postdays=0&postorder=asc&highlight=
 
Joe DeFuria said:
This stinks of more blatantly unethical marketing from nVidia.

"Hey, lets give out FX ULTRA boards for benchmarks...but then not make them available for purchase!!" At least 3dfx didn't give out V5 6000s for benchmarks.

I can only hope that if this rumor is indeed true, that web-masters stop benchmarking the Ultra, and stick to the 400/400 speeds.

Agreed, though I think we should wait for confirmation on this from Nvidia before taking it too far.
 
This stinks of more blatantly unethical marketing from nVidia.

"Hey, let's give out FX ULTRA boards for benchmarks...but then not make them available for purchase!!" At least 3dfx didn't give out V5 6000s for benchmarks.

They are still available for purchase, as all are pre-orders at present.

To be honest I hardly think you can say "unethical marketing" from nVidia. Surely all marketing is unethical to some degree?
 
Evildeus said:
LeStoffer said:
Typedef Enum said:
Words cannot even begin to describe...

I have to say that I'm in disbelief myself. When I think about the fact that we have heard about CineFX since July 2002, this is just turning out to be a joke. :devilish:
I don't think anybody is laughing at Santa Clara :oops:

Perhaps the ATI West Coast Development team is :p
 
Dave H said:
My guess is that this won't really have a huge effect on R350 release date. For one thing, ATI will still want it to hog the spotlight for as long as possible before NV35 comes out. For another, let's not forget that even if this news is exactly as Kyle reported it, the Ultra will still exist, albeit as a very rare part (pre-orders only). This is different from e.g. V5 6000. Point is, don't expect websites to retract their Ultra reviews. (Kyle already said he won't be.)

On the other hand, this might give ATI reason to hold back the review NDA on R350 until a little closer to shipping than they might otherwise have done. (Presumably they'll still "launch" at CeBit, but that could mean anything.) And presumably the price of the 9700 Pro will be a bit higher than it otherwise would have been.

<snip>

I agree with this. nVidia has now removed the pressure that would have pushed ATI to release the 9900 Pro. The non-Ultra is a non-concern.

That said, I believe it is to ATI's advantage to release the 9900 Pro and keep ahead with *their* development cycle. If the rumors from the Inq are true, then the 9900 is already well on its way to production, so there's no reason to stop now. The RV350 chips are probably finished as well, so they might as well keep it up. Plus, they'll have something that'll keep them in the spotlight and probably keep them competitive against the NV35, depending on how powerful the NV35 is.

0wn1ng the performance crown for almost an entire year is well worth whatever associated costs there might be in releasing another set of cards. They need to make sure the names "Radeon" and "ATI" are in the minds of everyone that considers a card in the future.
 
Shady Marketing:

What's more, the GeForce FX's innovative new architecture includes an advanced and completely transparent form of lossless depth Z-buffer and color compression technology. The result? Essentially all modes of antialiasing are available at all resolutions without any performance hit. Greatly improved image quality, with no drop in frame rate!
[Image: Antialiasing: Jaggy vs. Smooth]

http://www.nvidia.com/view.asp?IO=feature_intellisample


That appears to be a lie. Now ATI's PR:

At-a-glance:
• Fastest* 3D gaming performance with next-generation VPU architecture
• Complete DirectX® 9.0 support for unprecedented realism and sophisticated visual effects
• SMOOTHVISION™ 2.0 technology provides new levels of image quality with advanced full-scene anti-aliasing (FSAA) and anisotropic filtering
• Revolutionary new video features including VIDEOSHADER™ and FULLSTREAM™ technologies
• Featuring CATALYST™ - ATI's industry-leading software suite with frequently scheduled free updates providing additional features and performance over the product's lifetime

Features:

Fastest* 3D Gaming Performance
• 128MB DDR memory accelerates the latest 3D games
• 256-bit memory interface removes hardware performance bottleneck and provides end users with faster 3D graphics
• Industry's first 8-pixel pipeline architecture, providing twice the rendering power of any currently competing product
• Supports the new AGP 8X standard, providing a high-speed link between the graphics board and the rest of the PC (2.0 GB/sec)

Highest Level of Realism
• First to fully support DirectX® 9.0 and the latest OpenGL® functionality
• New SMARTSHADER™ 2.0 technology allows users to experience complex, movie-quality effects in next-generation 3D games and applications
• SMOOTHVISION™ 2.0 technology enhances image quality by removing jagged edges and bringing out fine texture detail, without compromising performance
• 128-bit floating-point color precision allows for a greater range of colors and brightness

Revolutionary New Video Features
• Unique VIDEOSHADER™ engine uses programmable pixel shaders to accelerate video processing and provide better-looking visuals
• ATI's new FULLSTREAM™ technology removes blocky artifacts from streaming and Internet video and provides sharper image quality

Only Products Designed and Built By ATI Offer:



I see no "free FSAA" claims here... and there are many more examples.
 
Hmm, the more I think about it, the more 400MHz DDR-I might be logical.

DDR-II *is* more expensive than DDR-I at the same clock rate.
DDR-II has higher burst lengths and other differences.

The problem with using 400MHz DDR-I, then, is that the NV30 only has a DDR-II memory controller.
But AFAIK, you can *force* DDR-I to use higher burst lengths. Basically, you'd put it in a DDR-II compatibility mode.

DDR-II, clock for clock, is not really better or worse. It's simply different. But it's also different in a way, AFAIK, which needs more transistors for an efficient memory controller.
Thus, it makes NO sense at all to use a memory controller based on DDR-II if you're using DDR-I. The NV30, however, is already designed with a DDR-II memory controller, and it would cost a lot more to create yet another ASIC.

Thus, it would make sense to use DDR-I at 400MHz instead of DDR-II at 400MHz, but you'd have to force it to run at DDR-II burst lengths.
AFAIK, it is possible to force DDR-I to run at DDR-II burst lengths. But...

Is it really possible to run DDR-I at DDR-II burst lengths? And are there other factors besides burst lengths which might prevent DDR-I from being emulated as DDR-II? I'd really like some info on that. Anyone got a link or know such information?


Uttar
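One concrete effect of burst length is minimum transfer granularity: each burst moves (bus width / 8) × burst-length bytes. A quick sketch of the arithmetic (the JEDEC burst-length sets are standard; whether NV30's controller can actually drive DDR-I this way is, as above, pure speculation):

```python
def bytes_per_burst(bus_bits: int, burst_length: int) -> int:
    """Minimum bytes moved per burst: bus width in bytes times burst length."""
    return bus_bits // 8 * burst_length

# DDR-I allows burst lengths of 2, 4 or 8; DDR-II drops BL=2 (only 4 and 8).
# So on NV30's 128-bit bus, DDR-I forced to a DDR-II burst length of 4 or 8
# means every memory access moves at least:
print(bytes_per_burst(128, 4))  # 64 bytes
print(bytes_per_burst(128, 8))  # 128 bytes
```

That larger minimum granularity is one reason "DDR-II compatibility mode" isn't free: small scattered accesses waste more of each burst.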
 
demalion said:
This decision does, in my estimation, seem connected with this enhanced chip design and rumors that it could be released earlier than some had thought (i.e., in May/June), though I don't equate this (May, June) with the nv35 being "ready to go". Reasons include:
  • We've had hints of the "reliable sort" that nv35 has been in "debugging" for a while.
  • This decision came after this "debugging" process has had time to be evaluated.
  • The simple addition of a 256-bit bus is a pretty conservative baseline for the expected performance of such a chip, and easily achieved in the time allowed (and, also indicated by "hints of the reliable sort").
  • IMO, getting a product that is less unreasonable than the GF FX 5800 Ultra (as based on the nv30) is a relatively low target.

Excuse my ignorance, but I was under the impression that adding a 256-bit bus was anything but "simple", relatively speaking. Also, wouldn't the NV3x core need to be designed with support for both 128 and 256-bit bus from the get go? Similar to how the R300 supports both DDR and DDR-II.
 
Interesting that PNY uses a cybernetic organism for the graphics. In Star Trek parlance such an organism is often referred to as a "drone", a being whose thoughts are subsumed by and subject to the "collective" hive mind.
 
The only place I've seen in the UK where you could pre-order the Ultra is Special Reserve - and right now it seems the cancel button is getting a lot of attention.

I remember a recent review where the Ultra was overclocked slightly to 530 and after a short period it was overheating and dropping back to 300/600. Is it possible that the reason for the cancellation was just that NV couldn't stop it from overheating at 500MHz in a typical home environment, i.e. "closed case + extended use"?
 