RussSchultz said: Sigh. The server ate my message.
To recap: I'll be painted an nvidiot for saying this, but I agree-ish.
From what I have heard, 1.4 isn't being supported in games.
I'm going to refer to "PS 1.4 functionality" rather than to the name "PS 1.4".
Well, Doom 3 does support that functionality, and it is reasonable to presume that games taking similar shadow-creation approaches will benefit from it as well. Current games without such approaches do not, but it is not reasonable to assume future games also will not, unless they specifically ignore the benefits it can offer. It is a logical fallacy to make a statement like "that many of the pixel shaders use specific elements of DX8 that are promoted by ATI but aren't common in current games" while trying to promote the "CineFX" architecture, and I don't think they could sustain it while trying to promote the GF FX in the same breath (and notably absent from that quote is any mention of the GF FX or CineFX).
It is natural that games which utilize PS 2.0 functionality can also benefit from PS 1.4 functionality as a fallback target, and this simple point is what their statement completely sidesteps.
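To put that fallback argument concretely: a game written for PS 2.0 does not have to drop all the way to PS 1.1 on lesser hardware. A minimal sketch of the kind of caps check involved, assuming a Direct3D 9 device (the path names here are purely illustrative, not any real engine's):

// Pick the most capable pixel shader path the device exposes (C++, D3D9).
// The ShaderPath names are hypothetical; only the caps query is real API.
#include <d3d9.h>

enum ShaderPath { PATH_PS20, PATH_PS14, PATH_PS11, PATH_FIXED };

ShaderPath ChooseShaderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    // 1.4-class hardware (more textures and instructions per pass) can often
    // run an approximation of a 2.0 effect in fewer passes than 1.1 can.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) return PATH_PS20;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) return PATH_PS14;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) return PATH_PS11;
    return PATH_FIXED;
}

Whether a particular effect actually survives that step down is up to the developer, of course, but the point stands that 1.4 gives a closer rung to land on than 1.1.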
Of course, I'm not writing games, so I don't know for sure. NVIDIA works with a lot of developers (though I'm sure they don't get too many 1.4 questions), but supposedly Futuremark has been doing more than simply writing cool-looking demos/benchmarks and has actually been determining what the future will look like. Who to believe? I dunno.
About efficiency...well, is it efficiency in general, or efficiency on GF cards in particular? I think nVidia has that complaint about DX pixel shaders above 1.3 entirely, not to mention the current ARB fragment shader functionality, even on the GF FX due to the design decisions they made, and they are making noise about it in regard to 3DMark because of its wide consumer recognition. But maybe there is some valid efficiency issue...will nVidia educate us about it, or is this just going to be talk without any backup?
About the first game test being single-textured...hmm...from what I read, it is not single-textured. Do they mean just one "color texture"?
Sounds a lot like the 8500 versus the GF 4/GF 3...except the 8500 was clocked slower than the competition, and the GF FX is clocked faster. Also, I'm not sure the theoretical advantages of the GF FX would map especially well to games, whereas the 8500's advantages seem to have done so successfully. It would certainly help if nVidia made their case with some example shader code to illustrate their statements, and it is not as though they would want to keep developers in the dark on how to optimize for the GF FX. They (and others) have much more shader experience under their belt than ATI did when trying to make the case for the 8500.
Hmm...well, it would be interesting to see benchmarks of the extent to which Cg optimizations help compared to DX 9 HLSL compilation across a variety of cards (for example, DX 9 HLSL and Cg for the GF FX, then both for the 9700 Pro).
I think (personally) that the tests should use HLSL and let the best man win. That would give each vendor the ability to use their card to the best of its abilities. Of course, if the HLSL can't be reduced to work on 1.1, what do you do...
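For what it's worth, checking whether a given HLSL shader "reduces" to a lower profile is mechanical: you just recompile it against each target. A rough sketch, assuming the D3DX9 utility library (the shader here is a made-up example, not anything from 3DMark, and which profiles it actually fits is precisely what the test would tell you):

// Compile one HLSL function against several pixel shader profiles and report
// which ones accept it (C++, D3DX9).
#include <d3dx9.h>
#include <stdio.h>
#include <string.h>

static const char* kShader =
    "sampler diffuseMap : register(s0);\n"
    "sampler normalMap  : register(s1);\n"
    "float4 main(float2 uv : TEXCOORD0, float3 lightTS : TEXCOORD1) : COLOR\n"
    "{\n"
    "    float3 n = tex2D(normalMap, uv).xyz * 2 - 1;  // unpack tangent-space normal\n"
    "    float  d = saturate(dot(n, normalize(lightTS)));\n"
    "    return tex2D(diffuseMap, uv) * d;\n"
    "}\n";

void TryProfiles()
{
    const char* profiles[] = { "ps_1_1", "ps_1_4", "ps_2_0" };
    for (int i = 0; i < 3; ++i)
    {
        LPD3DXBUFFER code = NULL, errors = NULL;
        HRESULT hr = D3DXCompileShader(kShader, (UINT)strlen(kShader),
                                       NULL, NULL, "main", profiles[i],
                                       0, &code, &errors, NULL);
        printf("%s: %s\n", profiles[i], SUCCEEDED(hr) ? "compiles" : "does not compile");
        if (code)   code->Release();
        if (errors) errors->Release();
    }
}

If the 1.1 target rejects it, the benchmark (or game) is back to hand-writing a lower-spec fallback or dropping the effect, which is exactly the "what to do" problem above.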
MDolenc has pointed out that HLSL won't address the central issue nVidia has, which is that their pre-GF FX architecture is simply less capable with regard to some advanced techniques, and that this deficiency has increasing opportunities to be exposed going forward. As for PS 2.0 and beyond, I do agree that HLSL usage might offer opportunities for improved performance in the future.
Anyways, it is interesting. It's also interesting that HardOCP is following this line of thinking. Two minds that think alike? Or a mindless drone?
I dunno. I don't follow how it was OK for 3DMark 2001 to be used when it had a test using features that didn't exist in games (and weren't foreseen to appear for a while yet), where it simply failed to run on other cards; how it was still OK the last time one vendor's architecture's functionality wasn't fully exposed by the test; yet for some reason this time, when that happens, it disqualifies 3DMark as a benchmark.
It could be viewed as them trying to correct past mistakes, but the case doesn't seem as strong now as it was then, so that seems a bit bass ackwards. But perhaps future information will change that outlook.
Of course, the most cynical outlook is that the site's real justification is gaining significant "nVidia brownie points" in a way that doesn't cost it significant "ATI brownie points". With the whole "nVidia dropping the Quack dime" episode brought to light, this will likely be a popular view.
For myself, I lean more towards guessing they are "doing the right thing too late", with a healthy skepticism that it is indeed the right thing right now, if only to avoid getting mired in politics given such a scarcity of real information.
EDIT: for clarity