UT2003 GPU Shootout

Not to mention availability. In Norway I can't find anyone selling ATI's own Radeon boards (not that I have looked really, really hard; I've just casually browsed the places I usually buy stuff from), and the only Radeon boards available are the Hercules ones. A Radeon isn't even near 1/3 of a 4600's price here; more like 80% of a 4600. For specific prices I refer to my post in this previous thread: http://www.beyond3d.com/forum/viewtopic.php?t=1374
 
Actually that's £158.62 if you can't claim the VAT back for the Optimus 8500 Express. And how do you know it is 300/300? I thought 8500s were 275/275? Just curious, because that's not a bad price at all :D
 
It is 300/300. It has a Crystal Orb-like HSF applied with Arctic Silver 3, and it comes with a special version of Rage3D Tweak written specially for it.

Yes, it's a great buy at £158 compared to other 8500 pricing, but still priced too near the 4200.
 
I can understand why the GF4 beats the GF3 by so much if it comes down to polygon throughput, since the GF4 has the extra vertex shader. The GF4 4200's scores in the MO database for the polygon and vertex speed tests are almost exactly the same as my 8500's. The GF4 is just faster at DOT3 and EMBM.

Hmm, was it ever established that the 8500 has a second vertex engine for TRUFORM? Could it be used elsewhere to help poly throughput if there is no TRUFORM option in the game? Could this be the difference here? But the 8500 is where you would expect it to be in relation to the GF3, isn't it?
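
To put some rough numbers on the poly throughput idea, here's a quick back-of-envelope sketch in Python (the clocks-per-vertex figure is purely my assumption for illustration; only the relative ratios matter):

Code:
# Back-of-envelope peak vertex throughput from clock speed and unit count.
# clocks_per_vertex = 4 is an illustrative guess, not a measured figure.
def peak_vertices_per_sec(core_mhz, vertex_units, clocks_per_vertex=4):
    return core_mhz * 1e6 * vertex_units / clocks_per_vertex

# Stock core clocks; the GF4 Ti has two vertex units, the others one.
for name, mhz, units in [("GF3", 200, 1), ("GF4 Ti 4200", 250, 2), ("Radeon 8500", 275, 1)]:
    print(f"{name}: ~{peak_vertices_per_sec(mhz, units) / 1e6:.0f}M vertices/s")

In theory the second unit roughly doubles the GF4's peak, even if the measured MO database scores don't show anywhere near that gap.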
 
Randell said:
I can understand why the GF4 beats the GF3 by so much if it comes down to polygon throughput, since the GF4 has the extra vertex shader.

If things were so dependent on poly throughput then you'd expect Parhelia to be romping away. I think you need to be thinking more about overall efficiency.
 
DaveBaumann said:
Randell said:
I can understand why the GF4 beats the GF3 by so much if it comes down to polygon throughput, since the GF4 has the extra vertex shader.

If things were so dependent on poly throughput then you'd expect Parhelia to be romping away. I think you need to be thinking more about overall efficiency.


Or specific codepath optimizations: as stated way back at the GeForce 3's debut, that was the card they were developing UT 2003 on. The Radeon 8500 has shown a significant vertex and pixel shader advantage when utilized; obviously that is not being used here.
I guess Sweeney was right when he stated UT 2003 is a DX7 engine with a DX8 overlay.
 
DaveBaumann said:
If things were so dependent on poly throughput then you'd expect Parhelia to be romping away. I think you need to be thinking more about overall efficiency.

Well, I'm ignoring Parhelia scores for now, as there is something very wrong with it so far :)

OK, but in comparison to the two most stressful game tests before now (SS:SE & Comanche 4), they do not show anywhere near the kind of 40-50% advantage the 4400 has over the 8500 in this test.
 
ActionNews said:
Seems like my KyroII will be OK for UT 2003 :D!
In this test the KyroII sits between a GeForce2 Ultra and a GeForce4 MX440 :D! Not bad at all :)! There are playable framerates at 1024x768, 32-bit, and High Detail settings :)!

CU ActionNews

Lol, so you are saying that 32.4fps is playable for a game such as this, where fps is most important? :rolleyes: :LOL:
 
OK, but in comparison to the two most stressful game tests before now (SS:SE & Comanche 4), they do not show anywhere near the kind of 40-50% advantage the 4400 has over the 8500 in this test.

Comanche 4 is hugely CPU/system limited, so the differences aren't going to be that great. With SS:SE you'd have to be careful about the settings being used (i.e. if aniso is enabled then it will be a big leveller).
 
The high detail results look really strange for the 8500: a quarter of the fps of the medium detail settings at higher resolutions. Seems like a bug of sorts. Or maybe FSAA is being used as well as TruForm; would even that drop the fps by 3/4? I can't think of any other game that causes that drastic a drop in fps between medium and high detail. Really strange; I don't know what to make of it. Compare the % drop between high and medium settings for the GeForce cards, then compare the % drop for the 8500. Hopefully this is something fixable.
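
To make that comparison concrete, this is the % drop calculation I mean (the fps figures below are made-up placeholders, not the actual benchmark numbers):

Code:
# Percentage of framerate lost going from medium to high detail.
def pct_drop(medium_fps, high_fps):
    return (medium_fps - high_fps) / medium_fps * 100

# Hypothetical illustration: a typical GeForce-style drop versus the
# roughly quarter-of-medium-fps behaviour the 8500 seems to show.
print(f"GeForce-style: {pct_drop(80.0, 60.0):.0f}% drop")  # 25%
print(f"8500-style:    {pct_drop(80.0, 20.0):.0f}% drop")  # 75%, i.e. 1/4 the fps

If the GeForce cards lose on the order of a quarter of their fps while the 8500 loses three quarters, something other than the normal detail cost is going on.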
 
We keep saying that, but even Parhelia's line graphs are exactly the same as a GF4Ti's or 8500's in Serious Sam. Strange.
 
Isn't the entire benchmark article fairly useless without specifying whether the driver settings for the cards were default and/or providing similar image quality?

This was the original complaint expressed concerning using this benchmark...and it still stands in my opinion. Without that this benchmark seems perhaps only useful for speculation by the masses who do not give such concerns any weight.

I'm not saying that the results are definitely inaccurate, but that we have no clue, or even the beginning of assurance, that they are accurate as of yet. The Parhelia 2x aniso limit wasn't even mentioned that I noticed... I mean, does anyone have anything I missed that points to why these results tell us anything at all at this point?
 
Anand doesn't even bother to show pics, or assure us that everything is being rendered correctly. We're basically taking him at his word. He does have an excellent reputation, but a pic or two wouldn't hurt.

I'm also not sure why on Earth he used 16-bit at all.
 
Daemon_UK said:
ActionNews said:
Seems like my KyroII will be OK for UT 2003 :D!
In this test the KyroII sits between a GeForce2 Ultra and a GeForce4 MX440 :D! Not bad at all :)! There are playable framerates at 1024x768, 32-bit, and High Detail settings :)!

CU ActionNews

Lol, so you are saying that 32.4fps is playable for a game such as this, where fps is most important? :rolleyes: :LOL:

Yes it is. My old P2 350 + 12MB Voodoo2 ran Quake3 at 640x480, 1/2 texture detail, at about 20-25 FPS (in-game average), yet that was enough to play and win.
 
Framerate expectations have increased, rightly or wrongly.

GLQuake on my Voodoo1 at 30fps was an eye-opening experience for me and I assume 30fps in GLQuake was what Carmack would call "satisfyingly quick". Even back then folks were trying to get higher framerates (you know how it is... overclocking the Voodoo1, messing with GLQuake configs) but the generally accepted feeling was that "Wow, GLQuake looks absolutely marvelous on a Voodoo1 and plays great as well!"... all at 30 to 45fps, which was enough back then.

What a change we have witnessed between that time and now.

I suppose it has to do with new technologies: first we're "wowed" by the graphics aspect, and later the performance aspect comes in as competition grows.

Nowadays, folks look at articles with video card comparison graphs and see that the top-performing cards are getting 100fps whereas the "lesser" competitors get maybe 50fps. Thus a conclusion is reached.

Mindset, due to websites.
 
Rev, but don't you adapt to what you are used to? We have another debate going where people "prove" 60fps is not enough. How did they survive when 60fps was a dream? I don't know :)

Plus it depends on the game. I, for one, am finding fps in the 20s in "a leaked game demo" unplayable, yet I find fps in the 30s smooth enough in other games.
 
Randell said:
Rev, but don't you adapt to what you are used to? We have another debate going where people "prove" 60fps is not enough. How did they survive when 60fps was a dream? I don't know :)

Plus it depends on the game. I, for one, am finding fps in the 20s in "a leaked game demo" unplayable, yet I find fps in the 30s smooth enough in other games.

Well, it's like with everything else. If you drive a car that has less than 100hp it might seem OK, but when you change to a 150hp car you might find that 100hp was not enough, and so on. Then you will finally find yourself buying a 500hp car that is "enough"... or guess again ;-)

And it really depends on the game. Most games don't really need a stellar 60+fps. There are other types of games, not just FPSs :)
 