Why Does ATI Get Beat Down on OGL

David Orton mentioned it was coming soon in the last update. (I believe).

I think it takes years to write drivers.
 
rwolf said:
David Orton mentioned it was coming soon in the last update. (I believe).

I think it takes years to write drivers.

Especially if you don’t take it seriously and make it an actual priority.

Which is obviously the case here. It does not take *years* to write a stinking driver. I bet a team of 5 could write a driver from scratch in 5 months (OK, at least within 12).

They have been supposedly working on this for almost 3 years now. They have KNOWN they needed to fix their OpenGL driver since the 8500 and have *supposedly* been taking it seriously since the 9700.

Which I find laughable.

I am frankly tired of the endless BS from ATi. It doesn't take Nvidia years to fix issues with their driver; they basically have solid performance on both sides of the fence. There is absolutely no excuse for ATi's OpenGL driver except the plain, flat-out TRUTH, which is that they don't freaking care. They only say they care to keep you buying their product.

The truth is they feel that OpenGL is not that important and if they get blown out in 2-3 games a year then so be it. OpenGL is supposedly going the way of the dodo anyway.

This should have become obvious to all of you when, after 3 years to get it fixed, they completely BLEW OFF Doom 3 as not being important enough to get their driver straight for.
 
And don't take Hellbinder's opinion as anything. He's one of those people who expected unbelievable things from the X1Ks and is now so pissed that he has to bash everything about ATi for a while so he feels better.

The truth is, ATi doesn't care much, really. The majority of games are D3D right now, and it's only going to lean that way even more. The only engine worth anything that's done in OpenGL, it seems, is the Doom 3 engine and the games that use it. ATi sees this as only a minor area, and they'd simply rather optimize the games that use it than do a complete rewrite.
 
Skrying said:
The only engine worth anything that's done in OpenGL, it seems, is the Doom 3 engine and the games that use it. ATi sees this as only a minor area, and they'd simply rather optimize the games that use it than do a complete rewrite.

And with id switching their development platform over to the Xbox 360, once the Doom 3 engine runs its natural lifespan for licensees. . . .

I run Pacific Fighters in OGL when benchmarking for SimHQ since I feel that one game out of 7 or 8 using that particular API is a fair representation of the PC market.
 
AFAIK the OGL driver rewrite is primarily being done via/for the workstation side, so it has to come out of there before going to the consumer side. I asked about this last year and the reply then was "a couple of products' time", and I took "products" to mean architectures.
 
It came up in the conference call this time in the context of the workstation market, and of using the X1000 family in conjunction with that new driver to make serious inroads into the upper-middle and upper levels of the workstation market. Their strength so far has been in the lower-middle and lower segments.

Also supporting that, the last interview I saw with the driver team that addressed it suggested that the emphasis was stability and compatibility, with performance gains being a nice thing to have, but pretty much a by-product if they come at all.

Based on all that, I would imagine we'd see that rewrite arrive with the first FireGL product featuring X1000-class hardware.
 
In conjunction, I would like to see real Linux drivers from ATI. While driver comparisons are infrequent, Anand did one back in Dec 2004, and the then-current ATI offerings were dismal. The 5700U beating ATI's R420 Pro in the game I play the most? Unacceptable.

Linuxhardware.org did a more recent review comparing old and new ATI drivers in early 2005, and while performance did increase slightly, AA was still broken.

I would like to join most people in rejoicing in the arrival of R520, but I'm not going to pay $450 for a card that could be outperformed by a $250 card, nor am I going to install Windows.
 
John Reynolds said:
And with id switching their development platform over to the Xbox 360, once the Doom 3 engine runs its natural lifespan for licensees. . . .

Linux isn't going anywhere (IMO either way, mind you) and I think the Mac will become much more attractive come next summer. Also, there are some mixed signals coming out of ATi if your analysis is correct: JC's next engine will still be OGL on the PC; in fact, the prime reason he gave for switching to the Xbox 360 for the following 6 months was precisely that the drivers (from both IHVs, I gather) were not where he would have liked them to be. And ATi was backing Prey at this year's E3.

Personally I think it is a matter of priorities, though I doubt that increasing their OGL team slightly would somehow inhibit those other priorities.
 
Skrying said:
And don't take Hellbinder's opinion as anything. He's one of those people who expected unbelievable things from the X1Ks and is now so pissed that he has to bash everything about ATi for a while so he feels better.

The truth is, ATi doesn't care much, really. The majority of games are D3D right now, and it's only going to lean that way even more. The only engine worth anything that's done in OpenGL, it seems, is the Doom 3 engine and the games that use it. ATi sees this as only a minor area, and they'd simply rather optimize the games that use it than do a complete rewrite.

That is complete nonsense....
 
So, we always hear "Oh, ATI's OpenGL driver sucks," but what exactly does that mean? What, it's slower than NVIDIA parts in Doom 3? Well, considering NV cards can output two z-only pixels per clock per pipeline and how D3's shadows operate, yeah, that makes perfect sense. But, for example, if you made a set of GLSL shaders of increasing complexity and benchmarked them on NVIDIA cards versus ATI cards, what's the performance delta (on both sides) compared to a D3D app with identical shaders using HLSL? I know there's been a stigma associated with ATI's OGL drivers for years, and I'm curious about why that is, exactly. Is it even a problem for consumer-level cards (I recall a FireGL review at... THG, I think? that showed the FireGL card getting dominated by a previous-gen Quadro, but obviously this hasn't been the case for consumer-level cards AFAICT)?

Some more examples: we have plenty of theoretical benchmarks for D3D (Dave seems to like RightMark), but none for OGL. In a lot of B3D reviews, all we have to go on for OGL is Doom 3, which is not very useful in providing insight as to whether or not ATI OGL drivers suck compared to their NV counterparts. So... what exactly is the issue? "OpenGL sucks on ATI cards" is a mantra that's lasted for years, but upon reading this thread, I realized I didn't really know why, short of the "cause performance sucks," which is not very helpful.

Just some more explanation before I'm considered a ****** or ATI apologist or something. (oh hey, the filter circumvention thing doesn't work anymore)
 
The Baron said:
Some more examples: we have plenty of theoretical benchmarks for D3D (Dave seems to like RightMark), but none for OGL. In a lot of B3D reviews, all we have to go on for OGL is Doom 3, which is not very useful in providing insight as to whether or not ATI OGL drivers suck compared to their NV counterparts.
Thanks for mentioning the true reason for Doom3's performance issues on ATI hardware. :)

I was going to add that there are some canned OGL benchmarks: VulpineGL and GLExcess. While they are dreadfully old, both can still tax the GPU/API if higher resolutions/AA are employed.

We badly need a more modern OGL benchmark suite. It would be ideal for mapping actual OGL performance without IHV-specific leanings... or at least with some fair/balanced mix of workloads so that one architecture's strengths versus the other's even out.
 
Sharkfood said:
Thanks for mentioning the true reason for Doom3's performance issues on ATI hardware. :)
I think you're interpreting my comments regarding the disparity between ATI and NV performance in Doom 3 to mean that NV is cheating, Doom 3 was built around NV hardware, or something to that effect. While there is that one shader replacement (at least) that they are doing (which does affect the output in certain situations), that's not what I'm talking about here at all. Just the same double-z-only output that has existed since NV30. It obviously is important in Doom 3, which is why basing claims about the quality of ATI's OpenGL driver on its performance in Doom 3 compared to NVIDIA's seems ridiculous.
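
For context, a rough sketch of the kind of depth/stencil-only work Doom 3's renderer generates (the standard depth-fail stencil shadow structure) is below; drawSceneGeometry() and drawShadowVolumes() are hypothetical placeholders rather than id's actual code, and the pass ordering is simplified. Passes 1 and 2 write no color at all, which is exactly where double-speed z/stencil output pays off.

Code:
#include <GL/gl.h>
#include <GL/glext.h>   // for GL_INCR_WRAP_EXT / GL_DECR_WRAP_EXT

// Hypothetical placeholders for the app's geometry submission.
void drawSceneGeometry();
void drawShadowVolumes();

void renderShadowedFrame()
{
    glEnable(GL_CULL_FACE);

    // 1. Depth-only pre-pass: lay down the Z buffer with color writes off.
    //    This is the kind of pass where NV30+ parts can emit two
    //    z-only pixels per pipe per clock.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    glDepthFunc(GL_LESS);
    drawSceneGeometry();

    // 2. Stencil shadow volumes (depth-fail / "Carmack's reverse"):
    //    still no color, no depth writes, only stencil updates.
    glDepthMask(GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    glCullFace(GL_FRONT);                             // draw volume back faces
    glStencilOp(GL_KEEP, GL_INCR_WRAP_EXT, GL_KEEP);  // increment on depth fail
    drawShadowVolumes();

    glCullFace(GL_BACK);                              // draw volume front faces
    glStencilOp(GL_KEEP, GL_DECR_WRAP_EXT, GL_KEEP);  // decrement on depth fail
    drawShadowVolumes();

    // 3. Lighting: color back on, shade only unshadowed (stencil == 0) pixels.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glDepthFunc(GL_EQUAL);
    drawSceneGeometry();   // per-light shaders would be bound here
}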

I was going to add that there are some canned OGL benchmarks: VulpineGL and GLExcess. While they are dreadfully old, both can still tax the GPU/API if higher resolutions/AA are employed.

We badly need a more modern OGL benchmark suite. It would be ideal for mapping actual OGL performance without IHV-specific leanings... or at least with some fair/balanced mix of workloads so that one architecture's strengths versus the other's even out.
Forget comparing architectures right now. That is entirely separate from what I'm talking about. If you write an app in Direct3D that uses shaders written in HLSL and you write a functionally identical (as much as possible) app that uses OpenGL and GLSL shaders, will the latter be consistently slower on ATI hardware? If so, why is that?
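
To make that apples-to-apples idea concrete, here is a rough sketch of the OpenGL half: build fragment shaders of increasing arithmetic length, draw a fullscreen quad for a fixed number of frames, and time each variant. A D3D9/HLSL app using the same shader bodies would be the other half of the comparison. GLFW and GLEW are used purely for convenience (my choice, not anything from the thread), error checking is omitted, and the iteration counts are arbitrary.

Code:
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <chrono>
#include <cstdio>
#include <string>

// Build a trivial GLSL program whose fragment ALU cost scales with 'iterations'.
static GLuint buildProgram(int iterations)
{
    const char* vs = "void main() { gl_Position = gl_Vertex; }";
    std::string fs =
        "void main() {\n"
        "  vec4 c = gl_FragCoord * 0.001;\n"
        "  for (int i = 0; i < " + std::to_string(iterations) + "; ++i)\n"
        "    c = c * c + vec4(0.5);\n"
        "  gl_FragColor = c;\n"
        "}\n";
    const char* fsSrc = fs.c_str();

    GLuint v = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(v, 1, &vs, nullptr);
    glCompileShader(v);
    GLuint f = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(f, 1, &fsSrc, nullptr);
    glCompileShader(f);
    GLuint p = glCreateProgram();
    glAttachShader(p, v);
    glAttachShader(p, f);
    glLinkProgram(p);
    return p;
}

int main()
{
    glfwInit();
    GLFWwindow* win = glfwCreateWindow(1024, 768, "glsl-scaling", nullptr, nullptr);
    glfwMakeContextCurrent(win);
    glewInit();
    glfwSwapInterval(0);                        // disable vsync

    const int frames = 200;
    const int steps[] = { 4, 16, 64, 256 };     // increasing shader complexity
    for (int iters : steps) {
        GLuint prog = buildProgram(iters);
        glUseProgram(prog);

        glFinish();                             // drain pending work before timing
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < frames; ++i) {
            glClear(GL_COLOR_BUFFER_BIT);
            glRectf(-1.f, -1.f, 1.f, 1.f);      // fullscreen quad
            glfwSwapBuffers(win);
        }
        glFinish();                             // wait for the GPU to finish
        auto t1 = std::chrono::steady_clock::now();

        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("%4d iterations: %.2f ms/frame\n", iters, ms / frames);
        glDeleteProgram(prog);
    }
    glfwTerminate();
    return 0;
}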
 
The Baron said:
I think you're interpreting my comments regarding the disparity between ATI and NV performance in Doom 3 to mean that NV is cheating, Doom 3 was built around NV hardware, or something to that effect.
Not implied at all. It's just fairly common knowledge that shadows in Doom3 favor NV hardware.

When a developer has feature implementation choices at his/her disposal, they are always going to favor the IHV with the best support and their own internal development platform. This isn't cheating; it's developing on what they have and whose number is in the Rolodex. NV's OGL support (in both drivers and dev support) outclasses ATI's. It might take 5-6 driver revisions and a year to get behavioral quirks with an extension handled with ATI, whereas NV might get you a patch or helpful workaround in a matter of days or weeks. I'm sure Quadro support, popularity, and expertise have some impact on this.

I'm still leaning towards more generic OGL benchmarks, with alternate approaches accumulating into a final "score"... that would let each IHV's architectural benefits and disadvantages come out to somewhat of a "wash".
 
Ah, Pete, I was going to mention Chronicles (which, by the way, is a game I had fun with and didn't almost fall asleep in, unlike Doom3 *cough*).

If we could confirm that there are no differences between the default D3D and OpenGL paths in the Serious Sam 2 engine (in terms of effects rendered; I can't see any in the demo, but that isn't conclusive until the final game), then it might be another scenario worth investigating.

***edit:

http://www.xbitlabs.com/articles/video/display/radeon-x1800_14.html

http://www.xbitlabs.com/articles/video/display/geforce-7800_12.html
 
The Baron said:
So, we always hear "Oh, ATI's OpenGL driver sucks," but what exactly does that mean?
It means, for me, that I have yet to see an OpenGL benchmark where the Radeon X1800 XT has done better than the GeForce 7800 GT, despite typically having significantly higher performance than the latter in Direct3D games (I'm not even sure I've seen the XT beat out the 6800 Ultra, but I won't testify to that).

If you add up the OpenGL games listed in this thread, I believe you'll find four separate OpenGL games where ATI is very far behind indeed.
 
I have to wonder whether a good portion of NV's advantage in OGL is because they are doing lower-quality filtering.

In COD, on my 9600 XT, the difference in performance between Bilinear and Trylinear (at 2xAA/8xAF) is 30-40%. I've benched this in several places with FRAPS -- the performance boost in this game from using lower filtering is amazing.
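
For reference, the filtering levels being compared map to texture state roughly as follows ("Trylinear" itself is ATI's driver-side adaptive trilinear optimization, so it is not something an application requests directly); GL_TEXTURE_MAX_ANISOTROPY_EXT comes from EXT_texture_filter_anisotropic.

Code:
#include <GL/gl.h>
#include <GL/glext.h>   // for GL_TEXTURE_MAX_ANISOTROPY_EXT

// Bilinear: linear filtering within a single mip level only.
void setBilinear()
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

// Trilinear: also blends between the two nearest mip levels, so each
// sample costs roughly twice the texel fetches of the bilinear case.
void setTrilinear()
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

// Anisotropic filtering (the 8xAF part) is layered on top of either mode.
void setAnisotropy(float level /* e.g. 8.0f */)
{
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, level);
}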
 
Blastman said:
I have to wonder whether a good portion of NV's advantage in OGL is because they are doing lower-quality filtering.

In COD, on my 9600 XT, the difference in performance between Bilinear and Trylinear (at 2xAA/8xAF) is 30-40%. I've benched this in several places with FRAPS -- the performance boost in this game from using lower filtering is amazing.

Actually, there are far fewer filtering-related optimizations in OpenGL on GeForces than in D3D. Since the 78.03 betas, the difference between "quality" and "high quality" AF in OGL is more or less around 3%, whereas under some circumstances in D3D it can reach a >25% difference.

Here's a full report about it:

http://users.otenet.gr/~ailuros/7803.pdf

The ~3% difference in OGL carries over to all OGL games I've tested so far.
 