nVidia closing gap with GFFX 5900XT?

Zvekan

Newcomer
http://techreport.com/reviews/2003q4/geforcefx-5900xt/index.x?pg=1

Is nVidia really going to sell this chip in mass quantities, with the card priced at only 200 USD and a CoD bundle?

If so, it could really give the Radeon 9600XT a run for its money, as the 5900XT offers great performance in older titles and only lacks some advanced DX9 functionality and pixel shader speed (though not by such a large margin).

Also, won't it seriously damage FX5700 Ultra sales? The two are almost equally priced, and the 5900XT offers substantially better performance and a better bundle.

How can nVidia price it so competitively when it is based on a full NV35 (maybe NV38) core, which has quite a large transistor count and a 256-bit bus that requires a more complex PCB?

Zvekan
 
Zvekan said:
How can nVidia price it so competitively when it is based on a full NV35 (maybe NV38) core, which has quite a large transistor count and a 256-bit bus that requires a more complex PCB?

The same way ATI can sell Radeon 9800 Non-pros for $200 too.

Companies can sell products for whatever they want...depending on how much profit they're willing to give up for market share.
 
It looks like the 5700 ultra will have to move down in price or perish.

9600XTs are significantly less than $200, so I don't know that the 5900SE is a competitor yet.

The 5900 XT is quite a bit faster than the 5700 Ultra in Tomb Raider, and could potentially be faster than the 9600 XT. All that with an unoptimized DirectX 9 code path, too.

Could someone tell me what the hell that is supposed to mean? It's a friggin' TWIMTBP game.
 
That review is a complete and utter joke.

1) They benchmark 3DM2K3 with unapproved drivers and publish the results, which I believe is forbidden by FutureMark. Note the old trick of "I can't see the cheating". Also note how they refer to "aggressive driver optimizations" without mentioning that those *cheats* are explicitly forbidden by FutureMark.

2) The TR: AoD part is highly humorous, of course, especially the bit about the V49 patch:
The V49 benchmark apparently fails to correctly load Tomb Raider's GeForce FX-optimized code path, so it's more a reflection of how the GeForce FX cards perform with default DirectX 9 code than anything else.

I suppose V49 for TR: AoD "disabled the Nvidia unified compiler" too?

3) The biggest joke is in the high dynamic range test, though:
None of our GeForce FX cards renders the "rthdribl" demo perfectly, possibly because of their lack of support for floating point texture formats, but the 5900 XT still performs well.

And they still publish the benchmark results... "It doesn't do what it's supposed to do, but damn, it does it fast." I suggest that for the next set of Cheatonators, Nvidia simply disables all rendering. After all, it optimizes more than a benchmark, and image quality doesn't change when compared to running benchmarks with your monitor turned off... (See the sketch at the end of this post for the kind of floating-point format check involved.)

4) They disable Overdrive on the 9600XT, but of course post overclocking results of the FX...

Tech Report is going directly to the bookmark cemetery...
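For context on the "floating point texture formats" point above: rthdribl's HDR rendering relies on floating-point surfaces, and an application can query whether a Direct3D 9 adapter exposes them before trying to use them. The snippet below is only a rough illustrative sketch of such a capability check (it is not code from the demo or the review), assuming a Windows build linked against d3d9.lib:

```cpp
// Minimal sketch: ask Direct3D 9 whether the default adapter can create an
// FP16 (A16B16G16R16F) texture that is also usable as a render target, with a
// plain X8R8G8B8 desktop mode. HDR demos of the rthdribl kind typically want
// something along these lines.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 not available\n");
        return 1;
    }

    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,
                                        D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_A16B16G16R16F);

    std::printf("FP16 render-target textures: %s\n",
                SUCCEEDED(hr) ? "supported" : "NOT supported");

    d3d->Release();
    return 0;
}
```

On hardware that fails a check like this, an HDR demo either falls back to a lower-precision path or renders incorrectly, which is exactly why publishing its framerates without an image-quality comparison is questionable.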
 
CorwinB said:
That review is a complete and utter joke.

1) They benchmark 3DM2K3 with unapproved drivers and publish the results, which I believe is forbidden by FutureMark.

(This is exactly what I was afraid of in another thread, and why I think FM should keep issuing patches if new drivers continue to break their optimization rules.)
 
Well, I wasn't really thinking about the quality of the review, but about the GeForce FX 5900XT, or SE, whatever you want to call it.

It seems to make the FX5700 Ultra unnecessary, as it should theoretically be faster in every segment (4x2 pipeline configuration and a 256-bit bus).

Now, as nVidia is happily writing in lots of places that the FX5700 Ultra is selling great and that they cannot produce enough cards (or the IBM process isn't that great :) ), why would they bring out the 5900SE? It just doesn't make sense, especially as margins on the slower NV35 are certainly worse than on the NV36, or at least should be by common logic.

Is 700 MHz DDR that much cheaper than the 900 MHz DDR2 found on the FX5700?
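As a side note on the memory question, a back-of-envelope calculation (a sketch, assuming the commonly quoted 256-bit / 700 MHz effective figures for the 5900XT and 128-bit / 900 MHz effective for the 5700 Ultra) shows the slower DDR still delivers far more bandwidth thanks to the wider bus:

```cpp
#include <cstdio>

// Rough theoretical memory bandwidth: (bus width in bits / 8) bytes per
// transfer * effective data rate in MHz = MB/s; divide by 1000 for GB/s.
static double bandwidth_gb_s(int bus_bits, int effective_mhz) {
    return (bus_bits / 8.0) * effective_mhz / 1000.0;
}

int main() {
    // Assumed specs, not measurements.
    std::printf("FX 5900 XT    (256-bit @ 700 MHz DDR) : %.1f GB/s\n",
                bandwidth_gb_s(256, 700));   // ~22.4 GB/s
    std::printf("FX 5700 Ultra (128-bit @ 900 MHz DDR2): %.1f GB/s\n",
                bandwidth_gb_s(128, 900));   // ~14.4 GB/s
    return 0;
}
```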

Zvekan
 
It's also quite possible that they have many 5900 cores from TSMC that could not clock to high enough frequencies for the Ultra and vanilla models, and now they figure that with NV40 and its derivatives coming out, they might as well sell off the older stock. I'm not sure of nVidia's plan for next year, but my best guess would be that they will follow a path of:
NV40 for high end ($500)
NV41? ($300ish)
NV36 for mid range ($200)
and an NV34 (or a respin) for low end ($100)

Thus they would eliminate their stocks of NV35s and NV38s as soon as possible. The cost of making such a complex chip/PCB alone means they cannot lower the price all that much, and the NV4x cards should outperform the high-end NV3x no problem.
 
If you are in the UK the price is pretty cheap for a 5900 if you know where to look (I am desperately searching for a DX9 card and, believe it or not, I don't care if it is NVIDIA or ATI... my preference is ATI, but if I can get a cheap NV model that performs well I will go for it):


Ebuyer Leadtek 5900 400/700
 
"(I am desparately searching for a DX9 card and believe it or not I dont care if it is NVIDIA or ATI... perference is ATI but if I get a cheap NV model that performs well I will go for it):"
------------------------------------------------------------------------------------
nVidia has gotten a lot of bad press lately. ATI is definitely leading in the gaming arena, but IMO nVidia has the better non-gaming features. 2D seems better (snappier, more vibrant colors), and a lot of cool features have been integrated into the drivers: you can assign task switching to your middle mouse button, enable OS-wide mouse gestures, and there's a built-in IE pop-up stopper (worthless for me, but of value to others). Better multi-monitor support (spanned desktop support, for one), better TV-out support, better support for playing video full screen on a multi-monitor system, and generally better OpenGL support in games.
c:
 
I have to pay £164 for a pop up stopper?
Heh, why? I use Avant Browser when I need IE, and my main browser at the moment is the Firebird Black Diamond edition. If NVIDIA release a spam killer next with their card, then I already have Thunderbird, which does that for me (and is pretty darn accurate too).

I know you were just saying, but NVIDIA deserves the bad press because of the bad decisions it made regarding its PR, marketing, and driver implementations in games. I was also pretty satisfied with my 9700 Pro and its 2D ability; Hydravision was cool and the TV-out was second to none. Unfortunately I had to sell that card a while back and now I miss it...

Like I said though, if NVIDIA release a good product at a competitive price I would consider it. I would even consider an XGI or a DeltaChrome, but I am wary of becoming a beta tester until they (XGI) get their driver problems sorted out.

If I had my way I would buy an ATI Radeon 9800, but I don't have £180+ to spend... and I am not interested in the 9600XT, as I would like to keep my next card for a fair amount of time (at least 12 months).

I am only speaking for myself, and as a consumer. My opinion of NVIDIA is that they suck - but still make reasonable products which are now selling for a good price (see the Beyond3D preview of the 5950 Ultra for comparison).
 
A 5900XT/SE/non-Ultra seems great for the price point, better than a 9600XT for sure. Those 9800 non-Pros are good cards too :). Now we need a fully functional AA/AF Volari V8 Duo and a great-performing S3 DeltaChrome.


Then of course there's the next generation....
 
Offhand, I'm mainly wondering just WHAT they made the 5700U for, as they seem to be proving its pointlessness in comparison at every turn. :p ;) I would've liked to see the 5900 non-Ultra numbers in there as well, since the XT/SE/whatever seems to be attempting to keep that performance as best it can while replacing that card with one that has better margins.
 
The V49 benchmark apparently fails to correctly load Tomb Raider's GeForce FX-optimized code path, so it's more a reflection of how the GeForce FX cards perform with default DirectX 9 code than anything else. Running the normal Tomb Raider game executable, without the benchmark mode enabled, loads the correct GeForce FX code path and promises better performance.

The way the words sound to me, the Tech Report reviewer repeated what (perhaps) NVIDIA told him. Are those statements of fact backed up by the reviewer's own investigation, or are they statements provided by NVIDIA?

Subsequent to Beyond3D making known the fact that TRAOD can be benchmarked, NVIDIA have released drivers that improve their cards' performance when benchmarking this game. Can anyone guess why?

Perhaps the scenario is that NVIDIA didn't know the game could be benchmarked with v49. Perhaps NVIDIA had no such "GeForce FX-optimized" drivers (whatever that may mean... anyone know?) for the game at the time Beyond3D debuted it as a benchmark. After Beyond3D made known that the game can be benchmarked with the v49 patch, we have had NVIDIA drivers that demonstrably improve NVIDIA cards' performance when benchmarking it. This can easily be investigated by comparing the NVIDIA drivers used when Beyond3D first presented the TRAOD benchmarks against every NVIDIA driver released since.

To further investigate the Tech Report reviewer's words, FRAPS can be used to compare the performance of actual gameplay against the performance of the game's benchmark mode. From the reviewer's words, it appears there is a difference between actual gameplay performance and benchmark performance regardless of the driver version used, all because benchmarking the game involves adding a "-benchmark=" command-line parameter that apparently changes performance relative to normal gameplay. But keep the above paragraph in mind: compare actual gameplay performance against benchmark performance, as the Tech Report reviewer described it, using FRAPS with the drivers from before and after Beyond3D first presented the TRAOD benchmarks.
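If anyone wants to run that comparison, the sketch below shows one way the FRAPS numbers could be crunched. It assumes you have already captured two runs with FRAPS (one during normal gameplay, one with the "-benchmark=" parameter) and exported the per-second FPS samples to plain text files with one number per line; the file names are made up for the example, and the real FRAPS log layout may need extra parsing:

```cpp
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Read one FPS sample per line from a text file exported/cleaned from a
// FRAPS log. Treat the format as an assumption, not the exact FRAPS output.
static std::vector<double> read_fps(const std::string& path) {
    std::vector<double> samples;
    std::ifstream in(path);
    double fps;
    while (in >> fps)
        samples.push_back(fps);
    return samples;
}

static double average(const std::vector<double>& v) {
    double sum = 0.0;
    for (double x : v) sum += x;
    return v.empty() ? 0.0 : sum / v.size();
}

int main() {
    // Hypothetical file names: one run captured during normal gameplay,
    // one captured with the "-benchmark=" command-line parameter.
    double gameplay  = average(read_fps("traod_gameplay_fps.txt"));
    double benchmark = average(read_fps("traod_benchmark_fps.txt"));

    std::cout << "Average FPS, normal gameplay: " << gameplay  << "\n"
              << "Average FPS, -benchmark mode: " << benchmark << "\n"
              << "Difference:                   " << (gameplay - benchmark) << "\n";
    return 0;
}
```

A gap between the two averages that persists across driver versions would support the reviewer's claim; a gap that only appears with certain drivers would not.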

FYI, this is what the TRAOD PC programmer (from Core Design) told me regarding the v49 patch, on a general basis:

But no matter what anybody says about the benchmark, the better the scores it reports then the better the game plays.

This directly contradicts what the Tech Report reviewer states, updated video card drivers excepted.

I am not satisfied with what the Tech Report reviewer stated about this game in his review. He should say whether what he "reported" about the benchmarking of this game is undeniable fact based on his own investigation... or whether it is simply what NVIDIA or Core/EIDOS told him. There's a huge difference between the two, and my impression, based on the wording the Tech Report reviewer used, is that it is the latter. Which, of course, he shouldn't have used if that is the case.

I really wouldn't have any reason to post any of this if this Tech Report "review" were clearly labelled "Considering an NVIDIA video card? Read this!". If that were the case, then the reviewer would have a right to say what he did, which was to comment (the way he did) on whether TRAOD was "GeForce FX-optimized" or not... anyone intending to buy an NVIDIA card and nothing else would love to read about any NVIDIA optimizations in any game. But this "review" is a shootout, with ATI cards present. Why would/should this Tech Report reviewer state what he did regarding the benchmarking of this game without any apparent effort to prove the legitimacy of his statements, nor provide any IQ investigations/comparisons among the various cards in the shootout when he mentions "GeForce FX-optimized"?

This is exactly the sort of thing that can be labelled "spin". I'm not questioning the Tech Report reviewer's ethics... I'm just wondering why he felt any need to state what he did, given that he provided no backing for his statements, especially in a "review" that is a shootout.
 
Zvekan said:
Well, I wasn't really thinking about the quality of the review, but about the GeForce FX 5900XT, or SE, whatever you want to call it.

It seems to make the FX5700 Ultra unnecessary, as it should theoretically be faster in every segment (4x2 pipeline configuration and a 256-bit bus).

Now, as nVidia is happily writing in lots of places that the FX5700 Ultra is selling great and that they cannot produce enough cards (or the IBM process isn't that great :) ), why would they bring out the 5900SE? It just doesn't make sense, especially as margins on the slower NV35 are certainly worse than on the NV36, or at least should be by common logic.

Is 700 MHz DDR that much cheaper than the 900 MHz DDR2 found on the FX5700?

Zvekan

What has this to do with the FX5700 Ultra? NV36 costs the same as RV360.
From the AIBs' perspective, of course.
The price in the retail store is a different story.
The card manufacturers and the stores know that they can charge more for a Nvidia card than for an ATI card.
But that has nothing to do with the price board makers are paying for an NV36 or RV360.
You also need to consider that NV36 is a pretty new part. It takes some time for the price to come down.
Most customers don't even realize that there is a 5900SE or XT available at the moment for a very competitive price.
This whole thing will even out as time goes by.
So I expect that the price of NV36 products will come down a little bit.

On the NV35/38 side there is no problem at all. The NV35/38 chips on 5900SE or XT cards mostly weren't able to achieve the clock speeds needed to put them on an FX5900 or FX5950 Ultra.
So you can either throw them away or sell them at a lower price with lower clock speeds.
Both IHVs have chosen the second path.
We don't even know how many NV35 chips Nvidia has collected since the NV35 launch because they couldn't be clocked high enough.

Those SE and XT cards look like very good deals.
 
The card manufacturers and the stores know that they can charge more for a Nvidia card than for an ATI card.

This is a sorely misguided statement these days, and the arrival of the 5900 XT is proof positive that that's not the case any longer.

Pricing is dependent on the costs that are charged to the AIB - NVIDIA's problem throughout the entire year has been one of margins, and this is also passed on to the board vendor to some extent. If the vendor is already tight on their margins in relation to the price NVIDIA are announcing at, they don't have room to manoeuvre in terms of their own pricing either. It's not just down to performance that we have seen a large number of board vendors (and still more to follow) opt to take the route of dual supply these days; it's also a factor of costs, and at present ATI's products offer them that little more leeway.
 
dan2097 said:
Dabs are selling an FX5900XT for £149.00 inc Vat
http://www.dabs.com/uk/productview?quicklinx=2WJR

The Ti4200 of the FX range. I wonder what ATI will offer - a lower clocked 9800np perhaps?

You can get a 9700 pro for £150 now:
http://www.retekdirect.co.uk/acatalog/Video_Cards.html

That should beat the 5900XT :devilish:

That's more of an exception though, as in that price range all there really is is the 9800SE at the moment... :?

2nd hand, 20 day warranty... not comparable, I'm afraid.

A 9800 can be had for about £200 though (brand new).
 
"I have to pay £164 for a pop up stopper?
Heh, why? I use Avant Browser...."
-----------------------------------------------------
And your point? Not everyone uses "rebel" browsers, and some people get pretty annoyed by pop-ups. The fact is, IE users are constantly being bombarded with pop-up ads advertising pop-up stoppers and "member" enlargers, and a free way to stop that (if you already own an nVidia card) is just an added bonus.



"I know you were just saying, NVIDIA deserves the bad press because of the bad decisions it made regarding its PR, marketing and driver implementations in games."
------------------------------------------------------
I was not saying that. I was saying they have been, IMO, getting more bad press than they deserve. Some people like to make it appear that ATI is leading in every way, and the fact is they are not. For pure, single-monitor, shader-intensive 3D gaming ATI is currently king. nVidia tends to do very well in older games, and in non-gaming applications and features, compared to ATI.



" I also was pretty satisfied with my 9700 Pro and its 2D ability, Hydravision was cool and the TV out was second to none. Unfortunately I had to sell that card a while back and now I am missing it..."
------------------------------------------------------
I, too, am satisfied with the 2D quality of the 9700, but I'm impressed with the 2D quality of the FX series. It just seems sharper, with better color. TV-out? Try configuring TV-out as your primary display. Also, in my experience, the FX cards do a better job of autodetecting when a TV is plugged into the card.



"Like I said though, if NVIDIA release a good prosduct at a competitive price I would consider it."
+
" My opinion of NVIDIA is that they suck - but still make reasonable products which are now selling for a good price"
------------------------------------------------------
So, you would buy nVidia cards if they were competitively priced, and then you admit that the price is good, but nVidia suck... OK, you lost me in there somewhere.


c:

And just so you know, I use Opera as a web browser, my gaming rig has an R9700 in it, and I've been an ATI user since the Rage IIc. All I'm trying to say is that although the FX series might be rubbish for shaders compared to the R300, it excels in other areas.
 
I was not saying that. I was saying they have been, IMO, getting more bad press than they deserve.

Err, no... Not even close. Nvidia deserves every piece of bad press they get, and then some. As I see it, most of the flak Nvidia is getting is related to:
- subpar performance of the FX line in shader-intensive titles (be they synthetic benchmarks or actual shipping titles that didn't get "optimized")
- atrocious PR mostly made of lies, damned lies, and some more lying to cover yesterday's lies. And we are not talking your average positive spin here, but things like trying to pass a chip off as having twice as many pipelines as it really does, or insisting that hand-coded shader replacement is part of a "unified compiler technology"

nVidia tends to do very well in older games

I'm sure you will be kind enough to point us to benchmarks showing that the current ATI line exhibits poor performance in "older games"... You may want to select data where image quality is consistent across the testing (i.e. testing with drivers where you can get full trilinear, for example), and of course pay attention to actual AA quality (ATI advantage) and AF quality (Nvidia advantage at equal AF settings in older drivers, although ATI can go up to 16x AF, and the newer Cheatonators botch filtering quality in order to gain a few precious FPS)...

Of course, it's highly humorous to see Nvidia communicate about its good results in older games, when they were presenting the NV3x line as "The Dawn of cinematic rendering" or some similar nonsense... Brilinear filtering and botched shaders sound more like a Dusk to me...
 