Poor OpenGL performance on ATI cards?

Goragoth

Regular
I was reading some reviews of graphics cards in a magazine today, and among the cards were some Radeons (9700/9500) and some GeForces (4200/4600). Three benchmark results were given, among them an OpenGL test, and the Radeons scored significantly lower on it than the GeForces. I don't remember the exact numbers, but I did see a Radeon 9700 Pro getting something around 40% less than a Ti4200, which seems rather bad. The reviews commented on this and said that while the Radeons' D3D performance is good, they suffer from bad OpenGL performance.

Can someone please confirm or deny this (preferably deny)? I haven't heard this mentioned before, and indeed Doom3, which is coded in OpenGL, runs very well on the R300 boards. So are these results due to poor drivers that have since been updated (I don't know exactly how recent the reviews were, but no more than a couple of months at most, I would guess, since it's a current mag)? Or is it just plain BS and they were using some bogus test, or is there some truth here? A Ti4200 outperforming a 9700 Pro by around 40% simply looks wrong, but now I have to know the truth, and B3D is the best place to find it :D
 
Were you reading the "Nvidia Fan Kit" or something? Radeon cards do usually have better performance in D3D, but only by a very small margin.

The only place you will see a GF4 Ti4200 beat the R9700 is in Quake 3 at 640*480*16 with all the eyecandy set to minimum. But that isn't representative of real performance at all.
 
The GF4 Ti4200 could be faster in some subtests of SPECviewperf 7. However, that's not really important for gamers.
 
What magazine was it? I'm very curious whether or not this is some no-name magazine. I would doubt very much that they scored that much lower.

later,
 
The only place you will see a GF4 Ti4200 beat the R9700 is in Quake 3 at 640*480*16 with all the eyecandy set to minimum. But that isn't representative of real performance at all.

I play at those settings.
I have never known my R300 to run slow, so I really couldn't care less. :)
 
OpenGL performance

There are areas where the GF4 will beat the R300: professional OpenGL applications, for example. Especially if you're using Unwinder's SoftQuadro patch... :)
 
In my experience, it is generally the case that nVidia's hardware performs better under OpenGL, and ATI's under Direct3D. That doesn't mean that OpenGL performance is all that poor on ATI's cards, just worse than under D3D.
 
*Nappe1 gives everyone a box of matches and a barrel of napalm.

...so, if you are going to start a flame war, at least do it with passion and with the proper toolkit.

because nothing is more boring than a pathetic flame war...
 
Goragoth said:
I was reading some reviews of graphics cards in a magazine today, and among the cards were some Radeons (9700/9500) and some GeForces (4200/4600). Three benchmark results were given, among them an OpenGL test, and the Radeons scored significantly lower on it than the GeForces. I don't remember the exact numbers, but I did see a Radeon 9700 Pro getting something around 40% less than a Ti4200, which seems rather bad. The reviews commented on this and said that while the Radeons' D3D performance is good, they suffer from bad OpenGL performance.

Can someone please confirm or deny this (preferably deny)? I haven't heard this mentioned before, and indeed Doom3, which is coded in OpenGL, runs very well on the R300 boards. So are these results due to poor drivers that have since been updated (I don't know exactly how recent the reviews were, but no more than a couple of months at most, I would guess, since it's a current mag)? Or is it just plain BS and they were using some bogus test, or is there some truth here? A Ti4200 outperforming a 9700 Pro by around 40% simply looks wrong, but now I have to know the truth, and B3D is the best place to find it :D

It's not an issue to be denied or acknowledged, but rather understood. ATI's OpenGL drivers for its mainstream 3D gaming parts (R9700P, etc.) are set up and tuned differently from its FireGL drivers, because of the different purposes the respective products serve under OpenGL. The benchmarks you looked at, although widely used, are for some reason widely misunderstood to somehow relate to 3D gaming, which they don't. Even though on a couple of them the 9700P's OpenGL driver beats the GF FX quite handily, the benchmarks here cannot be inferred to correlate with 3D gaming performance for either card.
 
In my experience, it is generally the case that nVidia's hardware performs better under OpenGL, and ATI's under Direct3D. That doesn't mean that OpenGL performance is all that poor on ATI's cards, just worse than under D3D.
It has been my experience that Nvidia has convinced a few companies to primarily support Nvidia's proprietary OpenGL extensions instead of the standard ARB ones, which has little to do with the hardware.
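For what it's worth, the difference usually shows up right at engine startup, where the code decides which extension path to use. Here is a minimal sketch in C of that decision, assuming a GL context is already current; the extension strings are the real ARB/NV names, but has_ext(), choose_fragment_path() and the render_path_t enum are just illustrative, not anyone's actual engine code:

#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_ARB, PATH_NV, PATH_FIXED } render_path_t;

/* Hypothetical helper: true if `name` appears in the GL_EXTENSIONS string.
 * strstr is good enough for a sketch; robust code would tokenize on spaces
 * so a longer extension name can't mask a shorter one. */
static int has_ext(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

/* Hypothetical startup decision: prefer the vendor-neutral ARB path,
 * fall back to the NV-specific one, then to fixed function. */
static render_path_t choose_fragment_path(void)
{
    if (has_ext("GL_ARB_fragment_program"))
        return PATH_ARB;
    if (has_ext("GL_NV_fragment_program"))
        return PATH_NV;
    return PATH_FIXED;
}

An engine written only against the NV_* path skips the ARB check entirely and drops everything else to fixed function, which is exactly how "extension politics" can end up looking like an OpenGL hardware gap on other vendors' cards.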
 
good, bad and ugly

you don't need to ask around here, since you will get all the usual ATI/NV employees chiming in about their own products.

it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm
 
Re: good, bad and ugly

shaderman said:
you don't need to ask around here, since you will get all the usual ATI/NV employees chiming in about their own products.

it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm

heh... that is soooo last year. ;)

The FX drivers are certainly not without extraordinary issues right now.
 
Re: good, bad and ugly

shaderman said:
you don't need to ask around here, since you will get all the usual ATI/NV employees chiming in about their own products.

it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm

That's strange. I thought the nVidia fanboys had switched their argument to "nVidia's Drivers Suck, once they become mature the GeForceFX will gain +50% performance and 4 extra pipelines!"
 
Functionality has been the hallmark of nVidia's drivers. Significant performance increases have always come a few months after the inception of a new architecture.
 
Re: good, bad and ugly

BoddoZerg said:
shaderman said:
you don't need to ask around here, since you will get all the usual ATI/NV employees chiming in about their own products.

it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm

That's strange. I thought the nVidia fanboys had switched their argument to "nVidia's Drivers Suck, once they become mature the GeForceFX will gain +50% performance and 4 extra pipelines!"

LOL, ohh soo strange it is :?
 
Chalnoth said:
Functionality has been the hallmark of nVidia's drivers. Significant performance increases have always come a few months after the inception of a new architecture.

Which is quite ironic given the state of the current FX drivers, particularly relating to D3D *functionality*.

------------------

According to Nvidia, the GeForceFX was released before the end of 2002... talk about shoddy drivers...

You can't have it both ways. You can't say "but the GeForceFX isn't available yet" or "Wait for shipping drivers"... and then out of the other side of your mouth say how the GeForceFX was the "best chip of 2002".
 
Re: good, bad and ugly

shaderman said:
you don't need to ask around here, since you will get all the usual ATI/NV employees chiming in about their own products.

it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

Really? So Carmack's going to weigh in on the state of D3D drivers, too? *chuckle*

Carmack's focus is exceedingly narrow, obviously, and suits his own purposes. That's as far removed from being a broad statement as you can get. He makes no bones about the narrowness of that perspective. Indeed, he is often quite verbose in describing it.


he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm

That ventured guess was probably true of the technology landscape prior to September '02. At the present rate, this will have entirely flip-flopped within the next few months, I would venture to guess.

I didn't see anything in Carmack's .plan update that I would construe as him saying nVidia's drivers are significantly better. I do recall him complaining about the noise level and not recommending the GF FX 5800 Ultra to anyone at the present time. In fact, I recall several positive things in his comments contrasting ATI's drivers with nVidia's--such as rendering at higher precision, for instance, and being 2x as fast on the ARB path. I don't think Carmack is quite the nVidia shill you seem to think. Most developer enthusiasm for nVidia, such as still exists, was based on the state of the 3D chip marketplace 6-9 months ago and earlier. If nVidia is unable to soon regain the position relative to its competitors that it held at that time, "this too shall pass," I would venture to guess...;)
 