poor performance on FX 5200?

Status
Not open for further replies.
Are those figures with the card clocked at 250/200?

Not too surprising that the performance is poor in this case. You can see why NVidia only wanted the Ultra version to be previewed.
 
Christ, I see that it is even slower than the GeForce MX!! lol, DX9 for $79... yeah, maybe that is a high price for that card. Ahh, Nvidia will fool a lot of people with that one. That explains why Nvidia didn't want the regular FX5200 benched: it performs worse than the MX series. Given how poorly it performs, I would certainly take a fully compliant DX8.1 card over the FX5200 any day. More marketing crap from Nvidia.

EDIT: Note how badly the Radeon 9200 beats the FX5200, even in DX9 benches.
 
Sabastian said:
EDIT: Note how badly the Radeon 9200 beats the FX5200, even in DX9 benches.

And that is a standard 9200, not a 9200 Pro, which explains its slowness compared to 9000 Pro or 8500... the whole Pro/Non-Pro thing is kind of annoying since folks are not always clear about it when putting up benches like that... At first I couldn't understand why 9200 was so slow compared to the other two.
 
Ichneumon said:
At first I couldn't understand why 9200 was so slow compared to the other two.
*cough* "Ultras"? ;) Yeah, the 9200 Pro ought to be a rather good-performing DX8-class card for a relatively low price.
 
Remember, the FX5200 is sold at $79; the Radeon 9200 Regular will certainly cost at least $99.
But yeah, FX5200 performance indeed looks bad...


Uttar
 
Like it or not, I'll be adding "DX9 FOR 99 DOLLARS!" to my list of legendary insider jokes.

What other sentences are there??
Just one before that one:
"OY!"
 
Uttar said:
Remember the FX5200 is sold at $79

Yup, so it is.

http://www.xbitlabs.com/news/video/display/20030326093901.html

Japanese stores offer Sparkle and Prolink graphics cards that feature D-Sub, DVI-I, TV-Out and 128MB of DDR SDRAM memory. The cards run at 250MHz for the chip and 400MHz for the memory. The products cost about $125 now, which is $45 more than NVIDIA's MSRP of $79 for the GeForce FX 5200.

the Radeon 9200 Regular will certainly cost at least $99

By what reckoning?
 
A couple of things to consider for the 9200 in relation to the 5200: they are both on the same process, but the 9200's die size is smaller. ATI shifted 9200 production over to UMC purposely because they thought they could knock about 20% off the bottom line, and bog-standard 9000s are going for as little as $60 now. Given that ATI are probably paying much less per 9200 chip, it's really only the memory and board costs that make up the rest. IMO, I don't think you are going to see 9200s priced higher than equivalently specced 5200s (in fact, that can be said for the entirety of ATI's new range).
 
OMG ! and I thought NVIDIA had learnt a lesson from touting their NV30 too much :p

But they were honest about one thing: the dawn of cinematic rendering has started with those NVIDIA chips. 25~30 fps average should be cinematic enough for everyone :p

So embarrassing... that much slower than the GF4 MX440-8X??? I think it might be about the same as the GF4 MX420, or... slower? :p
 
Heh, imagine the average gamer who originally had a Geforce3. When GF4-cards came, he "upgraded" to a GF4MX. Now time has come to "upgrade" his videocard again :LOL: :LOL: :LOL:
 
You wanted DX8 in the low end; now you've got DX9 there. What the hell are you complaining about? Or did you think that NV34 would have the performance of NV30 and cost only $79? :)

It's a trade-off: if you want speed, you can forget about features, and vice versa.

BTW, don't forget that NV34's image quality is better than NV17/8's...
 
Yes, but you don't normally expect a next generation card to perform even slower than the previous generation at the same price point...
 
It's like Nvidia has two guys trying to dig themselves out of a hole, but three more guys shoveling dirt right back in. It's becoming downright comical.
 
galperi1 said:
Can you say Geforce 4MX part deux :)

I don't think this is comparable. The 4MX (440) did very well performance-wise for a budget part (about the level of a GF2 Ultra or Radeon 9000, and it can sometimes even beat a GeForce3 Ti 200). A very efficient, simple 2x2 design. However, it got (rightly) bashed for its complete lack of pixel shaders (and therefore the unjustified 4 in its name).
This GFFX 5200, however, is exactly the opposite: all the latest features, but dead-slow performance. What's frightening is that it appears slower than the previous low end even if you factor in the different clock speeds (275/250 for the MX440-8X, 250/200 for the 5200), which would imply a less efficient design (keep in mind also that the 5200 runs one more game test in 3DMark and is still slower, plus it might not even use trilinear - I don't know the driver settings).
But at least the 5200 will have decent AA/AF options (for a low-end part) - though if it's hardly playable without AA/AF, it doesn't help much that it's almost as fast with AA/AF...
Still, IMHO it's a bit too early to bash the 5200. I'd want some more tests apart from UT2k3 and 3DMark before drawing any conclusions about its performance vs. a GF4MX 440-8X.
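The clock-speed comparison above can be sketched with some back-of-the-envelope math. Everything here is an assumption for illustration only (a 4x1 pixel-pipe layout for NV34, the 2x2 layout for the GF4 MX, and 128-bit DDR memory buses on both cards), not confirmed specs:

```python
# Rough peak-throughput comparison of the FX 5200 vs. GF4 MX440-8X.
# Pipeline counts and bus widths are assumptions for illustration.

def bandwidth_gb_s(mem_clock_mhz, bus_bits=128, ddr=True):
    """Peak memory bandwidth in GB/s: effective clock * bus width in bytes."""
    effective_mhz = mem_clock_mhz * (2 if ddr else 1)
    return effective_mhz * (bus_bits / 8) / 1000

def fillrate_mpix(core_clock_mhz, pixel_pipes):
    """Peak pixel fillrate in Mpixels/s: core clock * pipelines."""
    return core_clock_mhz * pixel_pipes

cards = {
    "FX 5200":   {"core": 250, "mem": 200, "pipes": 4},  # 250/200, assumed 4x1
    "MX440-8X":  {"core": 275, "mem": 250, "pipes": 2},  # 275/250, 2x2 design
}

for name, c in cards.items():
    print(f"{name}: {fillrate_mpix(c['core'], c['pipes'])} Mpix/s, "
          f"{bandwidth_gb_s(c['mem'])} GB/s")
```

Under these assumptions the 5200 would actually have more raw pixel fillrate (1000 vs. 550 Mpix/s) but less memory bandwidth (6.4 vs. 8.0 GB/s) than the MX440-8X, which fits the suspicion above that it is either less efficient per clock or simply bandwidth-starved.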
 
Yes, anyone who upgrades from a GF 4 MX is going to get a rather significant shaft from the 5200 in UT 2k3. Since that is a popular game, that is pretty significant.

What I wonder is about Doom III performance, since that will be a popular game as well.

They do get better quality, even in some older games, thanks to support for higher levels of aniso than the GF4 MX - provided performance doesn't get hit too hard by turning it on at a setting that improves quality noticeably. And AA performance should be good for it (depending on where it starts out performance-wise for the game in question).

In any case, shader-enabled games will make it a better value than the GF4 MX, but people upgrading from GF3 and (real) GF4 cards will be majorly shafted there as well (I suspect the GF4 Ti 4200 -> GFFX 5200 upgrade path will be the next common forum topic in this regard :-?).
 
demalion said:
What I wonder is about Doom III performance, since that will be a popular game as well.
You can use the results in 3D Mark 2003 GT2 and GT3 to get a hint, I believe...
And AA performance should be good for it (depending on where it starts out performance wise for the game in question).
What makes you think this? According to nvidia, there's no color compression on the 5200...
In any case, shader enabled games will make it a better value than the GF 4 MX,
Except that it's too slow to enjoy any shader enabled game.
 