Poor OpenGL performance on ATI cards?

Re: good, bad and ugly

shaderman said:
it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better.

Wrong. I'll add emphasis:

I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality

Sounds to me like Carmack's use of the FX is more about convenience than better drivers. That, of course, is self-fulfilling, because:

1) NV30's ARB2 path sucks balls wrt performance, forcing Carmack to use the NV30 path.

2) ATI's latest gen doesn't "need" hardware-specific paths; the industry-standard ones are just fine.
 
I realize that this kind of OpenGL performance test has little to do with gaming performance, especially since most games are written in D3D these days. My guess is that the test doesn't use shaders and probably not even textures but is more of a raw geometry test aimed at 3D DCC package performance.
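For what it's worth, here's a minimal sketch of what I imagine such a test looks like: pure immediate-mode vertex submission, wireframe, no shaders or textures, which is exactly the workload a DCC viewport generates. The function name and setup are my own invention, not anything from the actual benchmark:

[code]
// Hypothetical DCC-style raw geometry test: no shaders, no textures,
// just immediate-mode vertex submission through the driver.
#include <GL/gl.h>

void drawWireframeMesh(const float* verts, int triCount)
{
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);  // wireframe viewport look
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < triCount * 3; ++i)
        glVertex3fv(&verts[i * 3]);             // one call per vertex
    glEnd();
}
[/code]

Time a few hundred frames of that and the score is basically vertices per second through the driver's immediate-mode path, which is exactly the sort of thing a gaming-oriented consumer driver may not be tuned for.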

The test was just called "OpenGL Test" or something similar, and the magazine was just something I was browsing in a bookshop, so I wouldn't particularly trust it. It just seemed interesting. I didn't think I would ever see a Ti4200 get a higher score than a 9700Pro in anything.

As to the argument that ATI's professional drivers are different from its consumer drivers: sure, but so are Nvidia's. That's why the software patches work; the professional cards are similar to the consumer cards, and it's (mostly) the drivers that make the difference.

I'm perfectly happy with my 9700Pro, and no, it's not slow in anything, but I was intrigued by those benchmarks 8)
 
the thing is that OpenGL is used extensively in the workstation industry, and those benchmarks are aimed at that market. however, trying to base game performance on them is like declaring the winner of a football game based on the number of fouls. :rolleyes:
 
like declaring the winner of a football game based on the number of fouls

We used to play a game that used that as a scoring system, not sure I'd define it as football though. :D

Sorry, talkative mood tonight.
 
It has always been my experience that OpenGL and DirectX perform pretty similarly if used in a similar way. The only time there is a significant difference is when DX and OGL have very different interfaces and deal with something in a very different way. Rendering to texture, for instance. I'm not sure which performs better there, but I would guess DX has a slight edge since it doesn't have to deal with context switches.
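To make the context-switch point concrete, here's a rough sketch of the GL side circa now, where render-to-texture goes through a pbuffer. The handle names are illustrative and error handling is omitted; assume the pbuffer was created via WGL_ARB_pbuffer:

[code]
// Why GL render-to-texture costs a context switch: the pbuffer is a
// separate drawable, so you have to re-bind the GL context to it.
#include <windows.h>
#include <GL/gl.h>

extern HDC   windowDC, pbufferDC;   // pbufferDC from wglGetPbufferDCARB
extern HGLRC windowRC, pbufferRC;

void renderToTexture()
{
    wglMakeCurrent(pbufferDC, pbufferRC);  // <-- the context switch
    // ... draw the texture contents here ...
    wglMakeCurrent(windowDC, windowRC);    // switch back to the window
    // ... bind the result as a texture (WGL_ARB_render_texture) ...
}
[/code]

D3D just points the device at another surface (SetRenderTarget) without leaving the device context, which is presumably where the edge comes from.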
 
Re: good, bad and ugly

WaltC said:
shaderman said:
it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

Really? So Carmack's also going to weigh in on the state of D3d drivers, too? *chuckle*

Some emphasis in red to help your reading comprehension. (Yes, I'm being a jerk.)
 
The following JC quote is what I find most interesting and somewhat confusing:

"Trying to keep boneheaded-ideas-that-will-haunt-us-for-years
out of Direct-X is the primary reason I have been attending the Windows
Graphics Summit for the past three years, even though I still code for OpenGL."

Sounds like JC is preparing to move over to D3D. That would be interesting, to say the least. Why else would he care about D3D? If I were him, I would be happy that MS screwed up, as it would mean more GL users, which means better/faster vendor support.
 
Do you see video cards advertised as OpenGL ARB-compliant or DirectX 8-compliant? That's why JC has to concern himself with DX development.
 
JD said:
Sounds like JC is preparing to move over to d3d. That would be interesting to say the least. Why else would he care about d3d then? I were him I would be happy that ms screwed up as it means more gl users which means better/faster vendor support.
No, that's not it at all. He knows that vidcard capabilities are largely built to meet d3d specifications. If a future version of d3d screws something up, vidcard hardware will get messed up as well.
 
Re: good, bad and ugly

Joe DeFuria said:
shaderman said:
it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better.

Wrong. I'll add emphasis:

I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality

Sounds to me like Carmack's use of the FX is more about convenience than better drivers. That, of course, is self-fulfilling, because:

1) NV30's ARB2 path sucks balls wrt performance, forcing Carmack to use the NV30 path.

2) ATI's latest gen doesn't "need" hardware-specific paths; the industry-standard ones are just fine.

funny. he DID switch to the NV drivers because they are ** significantly ** more stable than the ATI variety. JC wouldn't say that, because he's a diplomat.

but it's obvious. he switches to an FX, despite its obvious flaws, for the drivers and to exercise new rendering paths.

he doesn't need to use an FX in his primary to exercise rendering paths. he can use a target system for that.

when you load/unload the driver a lot, you want a solid driver. that he chose an FX for his primary system is significant ...

- sm
 
Re: good, bad and ugly

shaderman said:
funny. he DID switch to the NV drivers because they are ** significantly ** more stable than the ATI variety. JC wouldn't say that, because he's a diplomat.
- sm

"Must... control... fist... of... death..."
 
When he's working on ARB2 he can use either ATI or Nvidia cards; if he's working on a dedicated NV extension he needs an NV card, so it makes sense that he'd be using an NV30. That said, for all we know he may have finished that work and is now playing with a 9800Pro and its unlimited shader instructions.
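For the curious, this is roughly how an engine can tell which paths and limits the hardware exposes. The extension names are the real ones; the selection logic is just my guess, not anything from id:

[code]
// Guessing at path selection: check extension strings, then query the
// ARB fragment-program instruction limit.
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstring>
#include <cstdio>

// Fetched elsewhere via wglGetProcAddress, as usual for extensions.
extern PFNGLGETPROGRAMIVARBPROC glGetProgramivARB;

static bool hasExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != 0;
}

void reportRenderPaths()
{
    if (hasExtension("GL_NV_fragment_program"))
        printf("NV30 path available\n");
    if (hasExtension("GL_ARB_fragment_program")) {
        GLint maxInstr = 0;
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_NATIVE_INSTRUCTIONS_ARB, &maxInstr);
        printf("ARB2 path, %d native instructions max\n", maxInstr);
    }
}
[/code]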
Also, in reference to driver stability, he said that quite some time ago. If you're going to use these kinds of quotes, maybe you should add a date to them. I don't see reviewers playing "Hunt the Real Driver" where ATI is concerned!
 
Re: good, bad and ugly

shaderman said:
funny. he DID switch to the NV drivers because they are ** significantly ** more stable than the ATI variety. JC wouldn't say that, because he's a diplomat.
Right. And you know the inner workings of Carmack's mind, I suppose?

but it's obvious. he switches to an FX, despite its obvious flaws, for the drivers and to exercise new rendering paths.

It is obvious that he needs to make a special rendering path for the nv30 because its ARB2 performance is so terrible, yes. The other is merely an assumption on your part - or are you reading his mind again?

he doesn't need to use an FX in his primary to exercise rendering paths. he can use a target system for that.
What about shader length restrictions? That's a good reason to use the FX over a 9700. Or maybe he does need to do most of his "new path creation" on his primary machine. But then I guess you know more about him and how he programs for Doom3 than anyone else alive, right? After all, you can read his mind!
when you load/unload the driver a lot, you want a solid driver. that he chose an FX for his primary system is significant ...
What the hell are you talking about?

The only thing significant here is your blatant fanboyism.
 
Re: good, bad and ugly

shaderman said:
he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.

- sm

You'd be mistaken. We all use R300s here. We do have GeForces around, but only for testing.
 
Any developer working with DX9 is using an R300. Think about it. Some may switch to NV once they actually ship an FX.

ATI's drivers have been pretty stable for the last 8 months. Even Derek Smart is using a 9700 now, which definitely says a lot about ATI's support.
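It really is as simple as a caps check; something like this (illustrative, not any particular engine's code) currently only passes in hardware on an R300-class card:

[code]
// DX9 caps check: does the HAL device do pixel shader 2.0?
#include <d3d9.h>

bool supportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
[/code]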
 
Re: good, bad and ugly

shaderman said:
he uses an FX (regardless of the 85+ dB) because NV's drivers are significantly better. in fact, i would venture to guess that 70% of game developers use NV products in their development machines.
- sm
Weird. And there I was, certain that his .plan file said he uses an NV30 because he cant run the most paths on it. I guess he must have been lying. :LOL:
 
Carmack was probably using the FX because he was trying to get Doom III running on it. He has had an R300 for so long that Doom III is probably optimized for it. He probably wanted to try the different rendering paths, such as fp16 and fp32, on the FX. I wonder if the fan has driven him crazy yet.
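The fp16/fp32 choice is exposed per instruction in NV_fragment_program, which is exactly why the FX needs its own path. This program text is a toy example of the syntax, not anything from Doom III:

[code]
// NV_fragment_program precision suffixes: R = fp32, H = fp16, X = fx12.
// NV30 runs the H/X forms much faster than full fp32.
const char* nv30Fragment =
    "!!FP1.0\n"
    "DEFINE scale = {0.5, 0.5, 0.5, 0.5};\n"
    "TEX H0, f[TEX0], TEX0, 2D;\n"   // sample into an fp16 temp
    "MULH H0, H0, f[COL0];\n"        // half-precision multiply (fast)
    "MULR R0, H0, scale;\n"          // fp32 where precision matters
    "MOVR o[COLR], R0;\n"
    "END\n";
[/code]

The ARB2 path has no such knobs; everything runs at the hardware's default precision, which on NV30 means full fp32 and hence the performance hit.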
 
Re: good, bad and ugly

flf said:
WaltC said:
shaderman said:
it's clear that NV makes better software all around (GL and DX), and the only proof you need regarding GL drivers is to ask Carmack.

Really? So Carmack's also going to weigh in on the state of D3d drivers, too? *chuckle*

Some emphasis in red to help your reading comprehension. (Yes, I'm being a jerk.)

*chuckle* :LOL:
 
rwolf:
That's the point I was attempting to make. However, I just noticed I accidentally typo'd "can" as "cant" :oops:
 