Why Does ATI Get Beat Down on OGL

Mintmaster said:
You're completely twisting my statement with a dumbass's interpretation of propositional logic.

Are there any D3D games that use stencil shadows and Carmack's reverse algorithm? No, so you can't compare them. In any scientific experiment, you have to control your variables. I guarantee you that if Doom3 were written in D3D, there would be a very similar performance deficit. In fact, I have done some stencil work in D3D and found a big speedup on NVidia's hardware. For that game, it has nothing to do with OGL. This new "fix" by ATI only brings the X1K's AA percentage drop in line with that of D3D games; it won't let ATI beat NVidia.
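(For illustration, here's a minimal sketch of what depth-fail, i.e. "Carmack's reverse", stencil state setup looks like in Direct3D 9. This shows the general technique only, not Mintmaster's actual code; the device and the shadow-volume draw calls are assumed to exist elsewhere.)

Code:
#include <d3d9.h>

// Sketch of depth-fail ("Carmack's reverse") stencil setup in Direct3D 9.
// Illustrative only; the device and shadow-volume geometry come from elsewhere.
void SetDepthFailStencilStates(IDirect3DDevice9* dev)
{
    // Write neither color nor depth; only the stencil buffer is updated.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
    dev->SetRenderState(D3DRS_ZFUNC, D3DCMP_LESS);

    // Two-sided stencil lets the volume go down in one pass with no culling.
    dev->SetRenderState(D3DRS_STENCILENABLE, TRUE);
    dev->SetRenderState(D3DRS_TWOSIDEDSTENCILMODE, TRUE);
    dev->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);

    // Front (clockwise) faces: decrement stencil where the depth test fails.
    dev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
    dev->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_DECR);
    dev->SetRenderState(D3DRS_STENCILFAIL, D3DSTENCILOP_KEEP);
    dev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);

    // Back (counter-clockwise) faces: increment where the depth test fails.
    dev->SetRenderState(D3DRS_CCW_STENCILFUNC, D3DCMP_ALWAYS);
    dev->SetRenderState(D3DRS_CCW_STENCILZFAIL, D3DSTENCILOP_INCR);
    dev->SetRenderState(D3DRS_CCW_STENCILFAIL, D3DSTENCILOP_KEEP);
    dev->SetRenderState(D3DRS_CCW_STENCILPASS, D3DSTENCILOP_KEEP);
}

// Pixels whose stencil count ends up non-zero are in shadow. Note the whole
// technique hinges on fragments that FAIL the depth test, which is exactly
// what coarse hierarchical-Z rejection is designed to throw away early.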

Chalnoth's example of the UT2003 OpenGL/D3D renderer is a controlled environment: you have the same workload going through different APIs. We don't have any data, but that's a good experiment.

What I'm saying is that if you race a black dog and a white dog on a track, and then race them again on grass after breaking the white dog's leg, you can't conclude that white dogs suck on grass. In Doom3 and Riddick, ATI is maimed by Hi-Z (that's hierarchical Z; the rest of HyperZ is okay) not working: depth-fail stencil updates happen on fragments that fail the depth test, so the hardware can't coarsely reject those fragments and has to disable Hi-Z. The deficit has little to do with the API.

The question is why ATI slows down in some other OGL games.

No, the real question is: why didn't they freaking FIX IT in hardware by now?
 
Joe DeFuria said:
Well, to be fair, ATI needs to release that public beta of the 5.11 drivers with the fix built in... then I would expect most sites to revisit some benchmarks.

I wonder how long WHQL takes? I can see a (justifiable, in my view) hissy if ATI tries to rush this into a non-WHQL release for reviewer benchies against whatever-it-is the rumors say NV will have 'round XT availability time. Sauce for the goose and all that.
 
geo said:
I wonder how long WHQL takes? I can see a (justifiable, in my view) hissy if ATI tries to rush this into a non-WHQL release for reviewer benchies against whatever-it-is the rumors say NV will have 'round XT availability time. Sauce for the goose and all that.

I take my same stance as always: as long as the driver is available to the public in some form direct from the provider, I don't really care about WHQL. When it's made public, all aspects of the driver can be scrutinized by the public, and that's what matters.
 
I'd agree that's one level of quality control. It seems to me ATI has a standard and a history for Catalyst, though, and I'd at least want to know the driver had been submitted for WHQL as well, and then follow up that the resulting WHQL-certified version is also provided to the public and delivers the same performance. Standards you assumed voluntarily should not break under the pressure of convenience; bend a little maybe ;) , but not break.

Edit:
We currently have the industry’s only public release commitment. Anyone who owns an ATI product can count on 8-10 driver updates a year. Of course all our updates are further audited by a third party (Microsoft in this case), and receive WHQL (Windows Hardware Quality Labs) Certification. Our drivers are always fully supported by us, fully tested, and are for all the Windows Operating Systems (that are currently supported by Microsoft).
Everyone always asks: where is ATI going to go from here, and how can they continue to improve? I want to officially announce that in the near future we will do even better. CATALYST will now officially be posted a minimum of 12 times a year! That’s right, monthly updates (and note I threw the word ‘minimum’ in there). These updates will always include performance boosts, stability enhancements, and innovative new features. They will be fully unified, for all OSes, and of course will be Microsoft certified.
 
Don't get me wrong, I put some value on WHQL.

It's just that for reviewing purposes (especially where significant performance changes are involved), "the public" should have access to the drivers used. If the provider is not willing to let the public have at it at least in beta form, then the drivers should not be made available to reviewers either.
 
Joe DeFuria said:
Don't get me wrong, I put some value on WHQL.
I don't. I've used WHQLed Volari drivers. I've used WHQLed DeltaChrome drivers. It's not even funny what kinds of bugs you can have in your drivers and still pass WHQL. IMO this boils down to pure money-making and political games.
The only practical reason why you'd ever want WHQL is to get rid of the Windows XP nag dialog. And you can easily just turn that off, especially if you're an OEM who ships systems with the OS preinstalled.
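(For what it's worth, here's a rough C++/Win32 sketch of that tweak. The registry location, HKLM\SOFTWARE\Microsoft\Driver Signing with value "Policy", is my assumption about where XP keeps the driver-signing policy, so treat it as illustrative rather than gospel.)

Code:
#include <windows.h>
#include <stdio.h>

// Hypothetical sketch: set the XP driver-signing policy to "Ignore" (0x00),
// which suppresses the unsigned-driver warning dialog. The registry path is
// an assumption; verify it on a real system before use. Needs admin rights.
int main()
{
    HKEY hKey;
    LONG rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                            "SOFTWARE\\Microsoft\\Driver Signing",
                            0, KEY_SET_VALUE, &hKey);
    if (rc != ERROR_SUCCESS) {
        printf("Could not open key (error %ld)\n", rc);
        return 1;
    }

    BYTE ignore = 0x00;  // 0 = Ignore, 1 = Warn (default), 2 = Block
    rc = RegSetValueExA(hKey, "Policy", 0, REG_BINARY, &ignore, sizeof(ignore));
    RegCloseKey(hKey);

    if (rc != ERROR_SUCCESS) {
        printf("Could not set value (error %ld)\n", rc);
        return 1;
    }
    printf("Driver signing policy set to Ignore.\n");
    return 0;
}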
Joe DeFuria said:
It's just that for reviewing purposes (especially where significant performance changes are involved), "the public" should have access to the drivers used. If the provider is not willing to let the public have at it at least in beta form, then the drivers should not be made available to reviewers either.
I totally agree.
 
Joe DeFuria said:
Don't get me wrong, I put some value on WHQL.

It's just that for reviewing purposes (especially where significant performance changes are involved), "the public" should have access to the drivers used. If the provider is not willing to let the public have at it at least in beta form, then the drivers should not be made available to reviewers either.

Oh, I don't disagree with that. I'd probably even go a step further and say it is more important than WHQL if you put me in that damnable "either/or" place I try mightily to avoid. I just want both, or at least a serious step towards the second (submission), based on their own standards and statements. You know what they say --if you'll cheat at solitaire, who knows what other character faults you may have. :LOL:
 
I wonder what would happen if ATI just dropped OpenGL support altogether on their gaming cards. Would OpenGL become the new Glide for games? Vista certainly makes you wonder what the future holds for OpenGL gaming. The only reason I like OpenGL is that it isn't limited to Windows; other than that, I bet ATI and co. would prefer to maintain only one set of drivers for their products.
 
trinibwoy said:
That's what I was waiting for: tests on an XL. Very impressive. Don't think it will catch a GT with those numbers, though. Oh, thanks neliz, I didn't even notice it was only at 4x. Seems like sireric wasn't joking about the "empirical" nature of the fix...

I think Jawed is on a streak with regard to the way the MC handles data, and 4xAA samples seem to fall RIGHT in its alley.
Now, a conspiracy theorist would see this as ATI optimizing for the "high-end" benchmarks, since most sites only seem interested in running a game with 4xAA for "apples to apples" sake.

Oh well, *IF* they implement this on a game-by-game basis we might see some nice improvements with 2x and 6x as well, but for me it seems someone just struck gold and driver dev. ISN'T getting a day off this weekend. Do you think ATI will stick to its normal release schedule now that they have some OGL gold in their hands?
 
trinibwoy said:
That's what I was waiting for: tests on an XL. Very impressive. Don't think it will catch a GT with those numbers, though. Oh, thanks neliz, I didn't even notice it was only at 4x. Seems like sireric wasn't joking about the "empirical" nature of the fix...
You missed this :p

Exhibit 'A' & Exhibit 'B'.

;)
 
Why fix it...

Oh, I don't know... perhaps to sell more hardware, make more money, and earn an even better reputation?

There are still a few big games coming that use OpenGL, specifically in a "Doom 3" way. Why not fix the issue so that the benchmarks are dominated on all fronts? Don't you think that would in turn sell more hardware? Of course it would.

There would no longer be an "it's good, but it sucks at X".
 
bdmosky said:
Since when did Nvidia lose in almost every Direct3D app? I thought it was pretty even there. Nvidia wins some, ATI wins some.
Guess ya missed the entire page 2-3 discussion concerning older OpenGL applications (Quake3, Serious Sam, etc.), where ATI's cards remain quite competitive. Websites just don't use these as benchmarks anymore since they require special care to ensure you're not CPU-bound due to engine age... but at higher resolutions with AA/AF they are GPU-bound, and ATI's OGL performance remains similarly competitive.

That was the entire gist: by the same logic, "NVidia sucks at Direct3D" if "win some, lose some" becomes the measuring stick for "competitive" versus "sucks at XYZ API".
 