poor performance on FX 5200?

OpenGL guy said:
demalion said:
What I wonder is about Doom III performance, since that will be a popular game as well.
You can use the results in 3D Mark 2003 GT2 and GT3 to get a hint, I believe...

Well, compared to the 9000, yep. What I was looking for was a comparison with the GF 4 MX.

And AA performance should be good for it (depending on where it starts out performance-wise for the game in question).
What makes you think this? According to nvidia, there's no color compression on the 5200...

My mistake; I guess I'm confused by conflicting reports of what got removed. All bandwidth-saving features were mentioned as being removed initially, and I forgot which ones were later stated to still be included.

In any case, shader enabled games will make it a better value than the GF 4 MX,
Except that it's too slow to enjoy any shader enabled game.

I played Half-Life on my PII 300 Rage Pro laptop. :LOL: I did enjoy it quite a bit, too, and that's one card I think the 5200 non ultra should be able to outperform even with shaders. So I know I could enjoy shader enabled games on it, though there is zero chance that I wouldn't simply buy something better (I'm actually planning on the refreshed refresh 256MB card and waiting for an M10 tablet, so get hopping, people!).

So I have to disagree; it isn't too slow to enjoy any shader enabled game, though other shader capable parts do seem like they will embarrass it (except the Xabre and Trident XP, I think) in shader related tasks.
 
Chris123234 said:
those frame rates are still playable, though the 28 is pushing it.
If you look at the results of 3D Mark 2003... The 5200 runs all four tests yet still gets crushed by the 9200, which can only run the first three. This implies, to me, very poor results on GT2 and GT3. GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.
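To spell out what I mean by a similar computation: both render the volumes into the stencil buffer with color and depth writes disabled, so stencil fill rate dominates. A rough sketch of the depth-fail variant in OpenGL (drawShadowVolumes() is just a placeholder here, and the exact state setup Doom 3 uses may differ):
Code:
/* Sketch of the depth-fail stencil shadow passes. Every pixel a
 * volume covers costs stencil fill, which is why these tests are
 * stencil-limited. drawShadowVolumes() is a placeholder for the
 * application's volume geometry. */
#include <GL/gl.h>

extern void drawShadowVolumes(void); /* hypothetical helper */

void stencilShadowPass(void)
{
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); /* no color writes */
    glDepthMask(GL_FALSE);                               /* no depth writes */
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glEnable(GL_CULL_FACE);

    /* Back faces: increment stencil where the volume fails the depth test. */
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
    drawShadowVolumes();

    /* Front faces: decrement on depth fail. Non-zero stencil = in shadow. */
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    drawShadowVolumes();
}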

If you look at the results in UT2003, the 5200 is getting about half the performance of a GeForce 4 MX... :LOL:

Maybe someone confused "cinematic" with "slideshow" :D
 
OpenGL guy said:
Chris123234 said:
those frame rates are still playable, though the 28 is pushing it.
If you look at the results of 3D Mark 2003... The 5200 runs all four tests yet still gets crushed by the 9200, which can only run the first three. This implies, to me, very poor results on GT2 and GT3. GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.

If you look at the results in UT2003, the 5200 is getting about half the performance of a GeForce 4 MX... :LOL:

Maybe someone confused "cinematic" with "slideshow" :D

What if someone pops that baby into the ARB2 path (I believe this is the one) in Doom 3 and has it run at half the speed of what it's getting now :)
 
Something is wrong with this preview:

Look at the 3DMark2001 results
http://www.darkcrow.co.kr/image/preview/2003/0303/Prolink FX5200/mark.gif, and compare with the results of the 5200 Ultra (these are from the digit-life review):
Code:
Test                     5200 Preview    5200 Ultra
Fill rate, single tex       481             820      (MTexels/s)
Fill rate, multi tex        315             908      (MTexels/s)
Pixel Shaders                53.6            97.5    (FPS)
Adv. Pixel Shaders           24              40      (FPS)
Vertex Shaders               48              58      (FPS)

IMHO, obviously something is wrong with the NV34 & 43.03 drivers compared to 42.72 (used in digit-life's tests). While 42.72 is known to have bugs, 315 MT/s in multi-texturing is just impossible.
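For perspective, even a back-of-the-envelope theoretical peak is more than three times that figure. The clock and pipeline counts below are my assumptions from commonly reported NV34 numbers, not confirmed specs:
Code:
#include <stdio.h>

int main(void)
{
    /* Assumed 5200 non-Ultra figures (commonly reported, unconfirmed):
     * 250 MHz core, 4 pixel pipelines, 1 TMU each. */
    const double core_mhz  = 250.0;
    const int    pipelines = 4;
    const int    tmus      = 1;

    double peak = core_mhz * pipelines * tmus; /* MTexels/s */
    printf("Theoretical peak: %.0f MTexels/s\n", peak);
    printf("Measured 315 MT/s is %.0f%% of that\n", 315.0 / peak * 100.0);
    return 0;
}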
 
Re: Something is wrong with this preview:

chavvdarrr said:
IMHO, obviously something is wrong with the NV34 & 43.03 drivers compared to 42.72 (used in digit-life's tests). While 42.72 is known to have bugs, 315 MT/s in multi-texturing is just impossible.

Agreed, the poor performance of the FX 5200 cannot be explained by the lower GPU and memory clocks alone. Someone said there's going to be a 64-bit version of the FX 5200; maybe the card being tested falls into that category, if there's no driver issue.
 
The 64-bit memory bus boards are easy to spot. The only justification for these is the low price/low profile paradigm. Performance will be "ultra weak"...
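Halving the bus width halves the raw bandwidth, which is exactly what texture-heavy tests feel first. Rough numbers, assuming a 400 MHz effective (DDR) memory clock for a non-Ultra board (an assumption, not a confirmed spec):
Code:
#include <stdio.h>

int main(void)
{
    const double mem_mhz_eff = 400.0; /* assumed effective (DDR) clock */

    /* bandwidth = effective clock * bus width in bytes */
    double gbps_128 = mem_mhz_eff * 1e6 * (128.0 / 8.0) / 1e9;
    double gbps_64  = mem_mhz_eff * 1e6 * (64.0 / 8.0) / 1e9;
    printf("128-bit bus: %.1f GB/s\n", gbps_128); /* 6.4 GB/s */
    printf(" 64-bit bus: %.1f GB/s\n", gbps_64);  /* 3.2 GB/s */
    return 0;
}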
 
But what is actually being tested? It clearly says "Ultra" on the box, but the numbers seem to be far too low.
Something appears to be very wrong.
 
OpenGL guy said:
Chris123234 said:
those frame rates are still playable, though the 28 is pushing it.
If you look at the results of 3D Mark 2003... The 5200 runs all four tests yet still gets crushed by the 9200, which can only run the first three. This implies, to me, very poor results on GT2 and GT3. GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.

If you look at the results in UT2003, the 5200 is getting about half the performance of a GeForce 4 MX... :LOL:

Maybe someone confused "cinematic" with "slideshow" :D

Maybe someone can stfu. How is 28 fps a "slideshow"? And did I say anything about cinematic? NO. And was I saying that it was faster than those cards? NO. So stop trying to prove me wrong on stuff I never said, you moron. I said playable framerates. America's Army played fine on my old GeForce 2 MX @ 27-31 fps.
 
Chris123234 said:
OpenGL guy said:
Maybe someone confused "cinematic" with "slideshow" :D
Maybe someone can stfu. How is 28 fps a "slideshow"? And did I say anything about cinematic? NO.
Did I claim you said anything about cinematic? No. Did it occur to you that the "someone" I referred to above wasn't you? No. If I were referring to you, Chris123234, I would have said "you" or used your name, since I was replying directly to your post.

Just so you know, the "someone" I was referring to was nvidia's marketing.
And was I saying that it was faster than those cards? NO. So stop trying to prove me wrong on stuff I never said, you moron. I said playable framerates. America's Army played fine on my old GeForce 2 MX @ 27-31 fps.
If you think that's playable, great, but there are many people who would disagree with you. You can search the Beyond 3D archives for long discussions on the topic.

And, just to clear things up, the "slideshow" comment was in reference to Doom 3 and 3D Mark 2003 GT2 and GT3 performance, which I expect to be low, as I mentioned above.

Lastly, don't you think it's bad for the consumer for a vendor to release a low end card slower than previous low end cards from the same vendor? (Yes, I am talking to you this time, Chris123234.) What does it matter if it supports DX9 when it can't play current games as well as the old cards and DX9 games aren't here yet? (You, Chris123234, can answer this one, too, if you like.)

(Edited to fix quoting format problem.)
 
I would also say that 28 fps is playable; however, it would not be surprising if Doom 3 ran only half as fast, if not worse...
 
Chris123234
You better learn to respect the established members of this board.
None of your f@nboy musings hide the fact that the 5200 is crap and Nvidia is trying to deceive its customers AGAIN by "upgrading" to a new generation with significantly slower performance. Disgusting, really.

I really don't get why some people are happy to keep the blinders on.
 
OpenGL guy said:
GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.
I knew it wasn't going to be long before somebody was going to make this comparison. It's totally, completely, and utterly baseless.

3DMark03 is a synthetic benchmark using Direct3D.

DOOM3 is a game (in development) using OpenGL.

The goals are different, the APIs are different. Quite simply, there is no reason to make any comparison between 3DMark03 and DOOM3 performance.

I still contend that paying attention to 3DMark scores when comparing different graphics cards/systems is stupid. It's a synthetic benchmark. The fully-synthetic portions have some use, but the game tests have no use, except perhaps to optimize performance on a particular system (i.e. tweaking motherboard settings and the like).
 
Yeah, going off like that on OpenGL Guy does seem a bit blind, though I don't think he is due a modicum of respect because he is an established member of the board, but rather for other reasons entirely. Anyways, OpenGL Guy seems to know how to handle that on his own.

I don't think it is settled that the 5200 non ultra is slower than a GF 4 MX in Doom 3. But...things do look pretty bad for the card. I'm just not in any rush to condemn the card as a stinker (the fill rate results seem rather amazing) as I think it will be painfully obvious if it is. The 5200 Ultra/non Ultra bait and switch has already disgusted me long ago, so I'm just looking at the card by itself.

Now, for someone owning a GF 4 MX, I have to agree with OpenGL guy that the 5200 is a severe disaster for current games that can be played without shader functionality. I'm even more worried by the blind upgraders who go from "GF 4 Ti 4200" -> "GF FX 5200" (*shudder*), but I think nvidia should be even more worried.

For someone owning or considering any other shader enabled card, it looks like it will also be a disaster (we don't know the final picture yet, however, to be sure), though I don't keep a clear enough picture of the Xabre and Trident XP performance figures to be sure that the card might not be a worthy alternative for someone who ended up with some models of those.

For someone who has no intention of upgrading, or no ability to critically evaluate a card for upgrading, I still maintain that the card can be a good option ...remember, 3DMark is representative of comparative shading power, not final performance. Something half as fast as a 9000 Pro in shading performance isn't unusable...the 9000 Pro is a pretty durned good card! We shouldn't be throwing around absolutes like that, though terms like "slideshow" seem perfectly accurate to me.

I think the 9000 Pro's "DX 8.1" feature set also allows it to compete in the same playground as the 5200 non ultra, and then proceed to bully it around pretty unmercifully when doing so, based on the benchmarks we've seen, but that is a different discussion, and it would have to be recognized in it that the 5200 is capable of putting out better imagery than the 9000 Pro, unlike the GF 4 MX.

Of course, nvidia would have you concentrate on that possibility and ignore whether you could actually play a game with it when it was putting out imagery better than the 9000/9200 cards, but it's too early to be sure that you won't be able to (IMO).

OpenGL guy, it might help you get over nvidia's tactics with the 5200 if you thought of the 5200 as a card that encouraged people to upgrade to a Radeon 9x00. It seems to me that it would do a better job of that than the GF 4 MX would...the GF 4 MX seems to me to be a card that seduces people into sticking with DX 7 level features.

I'm not actually joking...putting out a card like this may very well be a double-edged sword: giving people a taste of advanced features while convincing them that you aren't capable of delivering them. I think nvidia's intent for avoiding this is to dilute the label "DX 9" and replace it with "CineFX" so that the upgrade path people think of doesn't include competitors, though that assumes people will upgrade.

EDIT: clarity
 
Chalnoth said:
OpenGL guy said:
GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.
I knew it wasn't going to be long before somebody was going to make this comparison. It's totally, completely, and utterly baseless.
How do you know? Did I claim that if you got x fps in GT2 of 3D Mark 2003 that you would get x fps in Doom 3? No.

GT2 and GT3 are mainly limited by stencil performance, as is Doom 3. If GT2 and GT3 run very slowly, my conclusion is that it's because of stencil performance. Remember, the 5200 doesn't multipass on these tests like the GeForce 4 does, so you can't use that as an excuse.
3DMark03 is a synthetic benchmark using Direct3D.

DOOM3 is a game (in development) using OpenGL.
OpenGL and Direct3D can be made to do the same things...
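For instance, the depth-fail stencil work I sketched earlier maps directly onto Direct3D 9 render states (again only a sketch; device setup is omitted and drawShadowVolumes() is a placeholder):
Code:
#include <d3d9.h>

extern void drawShadowVolumes(IDirect3DDevice9 *dev); /* hypothetical helper */

void stencilShadowPassD3D(IDirect3DDevice9 *dev)
{
    /* Stencil only: no color or depth writes. */
    IDirect3DDevice9_SetRenderState(dev, D3DRS_COLORWRITEENABLE, 0);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_ZWRITEENABLE, FALSE);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILENABLE, TRUE);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILFAIL, D3DSTENCILOP_KEEP);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);

    /* Back faces: increment stencil where the volume fails the depth test. */
    IDirect3DDevice9_SetRenderState(dev, D3DRS_CULLMODE, D3DCULL_CW);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILZFAIL, D3DSTENCILOP_INCR);
    drawShadowVolumes(dev);

    /* Front faces: decrement on depth fail. */
    IDirect3DDevice9_SetRenderState(dev, D3DRS_CULLMODE, D3DCULL_CCW);
    IDirect3DDevice9_SetRenderState(dev, D3DRS_STENCILZFAIL, D3DSTENCILOP_DECR);
    drawShadowVolumes(dev);
}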
The goals are different, the APIs are different. Quite simply, there is no reason to make any comparison between 3DMark03 and DOOM3 performance.
Why? I've based my thoughts on how each application renders its respective scene. Since both are limited by the same thing (stencil filling), I believe I'm not being unreasonable.
I still contend that paying attention to 3DMark scores when comparing different graphics cards/systems is stupid. It's a synthetic benchmark.
A synthetic benchmark that does real work. If your card can't handle the work, then maybe that should tell you something.
The fully-synthetic portions have some use, but the game tests have no use, except perhaps to optimize performance on a particular system (i.e. tweaking motherboard settings and the like).
I really don't think GT2, GT3 and GT4 results are going to vary much based on motherboard settings... Maybe you should do some investigation as to where the bottlenecks are.
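If anyone wants to look for themselves, a crude stencil fill probe is easy to write. Here's a rough, untested outline using GLUT; the window size and quad count are arbitrary choices:
Code:
/* Crude stencil fill rate probe: draw many full-screen quads with
 * color and depth writes off so the stencil update rate dominates. */
#include <GL/glut.h>
#include <stdio.h>

enum { W = 640, H = 480, QUADS = 500 };

static void draw(void)
{
    int i, t0, t1;

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);

    t0 = glutGet(GLUT_ELAPSED_TIME);
    for (i = 0; i < QUADS; i++) {
        glBegin(GL_QUADS); /* full-screen quad in default clip coords */
        glVertex2f(-1, -1); glVertex2f(1, -1);
        glVertex2f(1, 1);   glVertex2f(-1, 1);
        glEnd();
    }
    glFinish(); /* wait for the GPU before reading the clock */
    t1 = glutGet(GLUT_ELAPSED_TIME);

    printf("~%.0f Mpixels/s stencil fill\n",
           (double)QUADS * W * H / ((t1 - t0) / 1000.0) / 1e6);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STENCIL);
    glutInitWindowSize(W, H);
    glutCreateWindow("stencil fill");
    glutDisplayFunc(draw);
    glutMainLoop();
    return 0;
}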
 
Chalnoth said:
OpenGL guy said:
GT2 and GT3 use a shadow volume computation similar to Doom 3's, so I think we can conclude that the 5200 will be rather slow in Doom 3.
I knew it wasn't going to be long before somebody was going to make this comparison. It's totally, completely, and utterly baseless.

Well, we're not talking about comparing the GF FX 5800 to the 9700 in PS 2.0 tests and the applicability of that to Doom 3; we're talking about comparing the GF FX 5200 to the 9000/9200 in the DX 8.1 tests. The GF FX 5200's performance in those tests should be indicative of its nv30 path performance in Doom 3 relative to the R200 path performance of the 9000/9200.
The functionality cross-sections of the workloads are similar enough that one can be expected to give an indication of the other. They are both DX 8.1 level workloads performing similar tasks...if you hadn't insisted on being so absolute, I think you'd have made more sense.

3DMark03 is a synthetic benchmark using Direct3D.

DOOM3 is a game (in development) using OpenGL.

The goals are different,

The goals for GPU workload are pretty similar, but you're right, there are some differences. That doesn't mean it is "totally, completely, and utterly baseless", however.

the APIs are different. Quite simply, there is no reason to make any comparison between 3DMark03 and DOOM3 performance.

You can tell when an argument is splitting into sides when the absolutes are launched so frequently. Is it too late to avoid the full-scale conflict, or can we back off on the FlameCon status? <- that's an equal-opportunity dig.

I still contend that paying attention to 3DMark scores when comparing different graphics cards/systems is stupid.

Hmm...not a very helpful sentiment. Anyways, has your contention addressed the counter-arguments made to it?

It's a synthetic benchmark. The fully-synthetic portions have some use, but the game tests have no use, except perhaps to optimize performance on a particular system (i.e. tweaking motherboard settings and the like).

You're thinking of 3dmark2001...? 3dmark03 compares the ability to handle shader workloads between cards pretty well. I do agree that its primary usefulness is for standardized shader workloads, but that's what APIs are supposed to be, and while that might not be directly indicative of Doom 3 in the form and driver state last discussed by Carmack, I don't see your basis for extending that to DX games as well.

Looking forward, the only cross-vendor standardized shader workload it doesn't seem likely to be very representative of is GLSLang. The current cross-vendor OpenGL functionality and DX seem to reflect its results pretty strongly. We do have to wait for final drivers from all parties to be sure, though.
 
demalion said:
Yeah, going off like that on OpenGL Guy does seem a bit blind, though I don't think he is due a modicum of respect because he is an established member of the board, but rather for other reasons entirely.


His contributions to the board were implied in the "established" term. I thought I made that obvious, but I could have worded it better.
 
gkar1 said:
None of your f@nboy musings hide the fact that the 5200 is crap and Nvidia is trying to deceive its customers AGAIN by "upgrading" to a new generation with significantly slower performance. Disgusting, really.

I really don't get why some people are happy to keep the blinders on.

Pot. kettle. black.
 