OpenGL guy
Hellbinder said:
Or how about the fact that they sell next to nothing in the professional space?

Is that so? Care to quote some market share figures? I think you'll be surprised.
OpenGL guy said:
Is that so? Care to quote some market share figures? I think you'll be surprised.

Chalnoth said:
Old, looking for newer:
http://www.xbitlabs.com/news/video/display/20031002161856.html
Edit:
Well, I know that current data exists, but I haven't found it for less than $5000.
bigz said:
They used the 81.82 'reviewers' drivers in the X1800XT review, which are said to have similar performance to 81.84. I beta tested them over the weekend before the X1K launch but had issues with them in some titles. Release 80 brings some new performance improvements and should be pretty good once the bugs disappear.

Mintmaster said:
Man, look at the huge discrepancy with the following numbers from June:
http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page15.asp
The 7800 has improved a good 17%, though half of that is likely due to the different test system (FX-55 vs. FX-57).
ATI's X1800XT, however, scores equal to the X850XT.
The old Firingsquad numbers, however, bear almost no relation to XbitLabs' numbers, even though both used the same drivers. Maybe a different game level?
It's really hard to make sense of all this with such a huge variance in data. If you look back to the 9700 days, the 5900 Ultra was being beaten. Yup, NVidia being beaten in an OpenGL game by a lower model from ATI:
http://firingsquad.com/hardware/sapphire_atlantis_9800_pro_ultimate_review/page8.asp
Then the X800XT and the 6800U were quite close:
http://firingsquad.com/hardware/ati_radeon_x800/page15.asp
In the 7800 review, the framerates jumped up, as they did in the X1800 review.
What confuses me most is that ATI's hardware scales with resolution similarly to NVidia's. I don't know how that can be due to drivers, and that's why I always thought it was due to Doom3 and Riddick using stencil shadows the "Carmack's reverse" way. Maybe poor memory management? I would think it's very easy for this sort of low-level driver code to be shared between the DX and OGL drivers. It doesn't make sense to me.
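For anyone who hasn't seen it, this is roughly what the "Carmack's reverse" (depth-fail) stencil pass looks like in OpenGL. It's only a sketch of the general technique, not Doom 3's or Riddick's actual renderer; drawScene() and drawShadowVolume() stand in for whatever geometry submission the engine does.

Code:
/* Depth-fail ("Carmack's reverse") stencil shadow volumes, sketch only.
   drawScene() and drawShadowVolume() are placeholders for the engine's
   geometry submission; the stencil buffer is assumed cleared to 0. */

/* Pass 1: fill the depth buffer (ambient/depth pass). */
glDepthFunc(GL_LESS);
glDepthMask(GL_TRUE);
drawScene();

/* Pass 2: stencil-only volume passes, no color or depth writes. */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 0, ~0u);
glEnable(GL_CULL_FACE);

/* Back faces of the volume: increment where the depth test FAILS. */
glCullFace(GL_FRONT);
glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);   /* (stencil fail, depth fail, depth pass) */
drawShadowVolume();

/* Front faces of the volume: decrement where the depth test FAILS. */
glCullFace(GL_BACK);
glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
drawShadowVolume();

The whole point is that the stencil update happens on fragments that fail the depth test, i.e. exactly the fragments a coarse hierarchical-Z scheme would like to reject before they ever reach the stencil unit, which I assume is what the HyperZ comment below is getting at. Engines typically also use GL_INCR_WRAP_EXT/GL_DECR_WRAP_EXT from EXT_stencil_wrap so the counts don't saturate.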
Colourless said:
HyperZ doesn't work with the 'Carmack' Reversed method. Kills the ATI efficiency.

Are you sure about that? I thought the 9800 had some changes for that. 3DCenter did an article stating this, and then updated it, backtracking on it. I thought that related to HiZ.
Hellbinder said:
Especially if you don’t take it seriously and make it an actual priority.
Which is obviously the case here. It does not take *years* to write a stinking driver. I bet a team of 5 could write a driver from scratch in 5 months (OK, but at least within 12).
They have supposedly been working on this for almost 3 years now. They have KNOWN they needed to fix their OpenGL driver since the 8500 and have *supposedly* been taking it seriously since the 9700.
Colourless said:
HyperZ doesn't work with the 'Carmack' Reversed method. Kills the ATI efficiency.

Yup, that's why I always thought the whole OpenGL driver thing was overblown, especially with what I saw back in the R300 days. Chalnoth, I know ATI theoretically has no fillrate disadvantage with AA on for the stencil-only passes, but without Hi-Z they'll be drawing a lot of pixels they don't need to. NVidia can even skip stenciled areas very quickly.
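To spell out the fill-rate point: once the volumes have marked the stencil, the per-light pass is only supposed to shade pixels whose stencil is still 0. Whether a chip can reject the shadowed pixels before running the fragment shading (early stencil, Hi-Z style) rather than after is exactly the kind of difference being described. A sketch only; drawSceneWithLightShader() is a placeholder for the engine's per-light geometry pass.

Code:
/* Per-light pass: additively shade only unshadowed pixels (stencil == 0). */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_FALSE);
glDepthFunc(GL_EQUAL);                /* relight only the surface already in the depth buffer */
glStencilFunc(GL_EQUAL, 0, ~0u);      /* shadowed pixels have stencil != 0 */
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);          /* accumulate this light's contribution */
drawSceneWithLightShader();

Nothing in the API asks for that rejection to happen early; it's entirely down to what the chip and driver do with the stencil test.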
Mintmaster said:
Yup, that's why I always thought the whole OpenGL driver thing was overblown, especially with what I saw back in the R300 days. Chalnoth, I know ATI theoretically has no fillrate disadvantage with AA on for the stencil-only passes, but without Hi-Z they'll be drawing a lot of pixels they don't need to. NVidia can even skip stenciled areas very quickly.

That's all well and good, but nVidia shows a lead even in OpenGL games that don't use any stencil shadowing.
Mintmaster said:
Chalnoth, does ATI scale similarly in UT2003/UT2004 as well? If you're right, then I guess it is just a matter of laziness with respect to OpenGL. I just find it unlikely that such optimizations would have anything to do with the API, and I would think a little code sharing would be a simple matter.

Unfortunately, I don't have that data. I didn't do scaling tests, as I was only interested in comparing one card to another, and I no longer have the hardware to redo the tests. And yes, you'd think some code sharing would be a simple matter, which is what makes me think that ATI just doesn't care.
rwolf said:
I am sure they care. They have optimized for Doom3 a fair bit.

Doom 3 is a special case, and even there ATI only seems to have reduced its deficit to roughly what it shows in other games, like City of Heroes (and soon City of Villains), Neverwinter Nights, Knights of the Old Republic, Riddick, and others.
Chalnoth said:
Doom 3 is a special case, and even there ATI only seems to have reduced its deficit to roughly what it shows in other games, like City of Heroes (and soon City of Villains), Neverwinter Nights, Knights of the Old Republic, Riddick, and others.
rwolf said:
Another issue might be that the game developers wrote the optimal path for Nvidia extensions, then wrote a workaround for ATI that was less than optimal.

I don't think that's the case. Take Doom 3 as an example: the "default" renderer uses no NV extensions, and in fact uses the ARB_fragment_program extension, which was originally written by ATI.
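To make the "no NV extensions" point concrete: using the ARB path is just a matter of checking the extension string and feeding program text through the generic ARB entry points. A minimal sketch, assuming an extension loader such as GLEW has already resolved the function pointers (glewInit() called with a current context); fp_source is a trivial placeholder, not anything from Doom 3.

Code:
#include <string.h>
#include <GL/glew.h>

/* Placeholder ARB_fragment_program text; a real one would do the lighting math. */
static const char *fp_source =
    "!!ARBfp1.0\n"
    "MOV result.color, fragment.color;\n"
    "END\n";

int setup_arb_fragment_program(void)
{
    GLuint prog;
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (!ext || !strstr(ext, "GL_ARB_fragment_program"))
        return 0;  /* fall back to a fixed-function or vendor-specific path */

    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fp_source), fp_source);
    if (glGetError() != GL_NO_ERROR)
        return 0;  /* program string was rejected */

    glEnable(GL_FRAGMENT_PROGRAM_ARB);
    return 1;
}

Since ARB_fragment_program is vendor-neutral, the same program text runs on any card that exposes it.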
"I think the performance of Doom3 had more to do with someone having an axe to grind than anything else."

There's no way that that is the case. The only suggested inefficiency here is with the "Carmack's reverse" algorithm for stencil shadows. Carmack chose this algorithm because it's mathematically more efficient than the more "standard" stencil shadow volume methods, and it turns out to be more robust as well.
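For anyone unfamiliar with the two variants being compared: the older depth-pass method updates the stencil where the depth test passes, and it breaks whenever the near plane clips a shadow volume with the camera inside it; the depth-fail version sketched earlier avoids that, provided the volumes are capped. Just the stencil-op difference, with the same placeholder drawShadowVolume():

Code:
/* Depth-pass ("standard" z-pass): count where the depth test PASSES.
   Breaks when the near plane cuts a volume open in front of the camera. */
glCullFace(GL_BACK);                       /* render front faces of the volume */
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);    /* increment on depth pass */
drawShadowVolume();
glCullFace(GL_FRONT);                      /* render back faces of the volume */
glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);    /* decrement on depth pass */
drawShadowVolume();

/* Depth-fail ("Carmack's reverse") swaps both the face selection and the
   depth condition: back faces increment and front faces decrement where the
   depth test FAILS, so a camera sitting inside a capped volume still counts
   correctly. */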