I'm not complaining one way or the other about the differences in speed. My point is that the best comparison between two different cards in a game is to run them both on the "standard" path at the application level.
ARB2 is an OpenGL standard, is it not? I don't think you can compare ATI's path to Nvidia's path, because frankly we don't know whether they were optimized evenly, or if that's even possible. However, if you code directly to ARB2, then it's up to the hardware manufacturers to adhere to the standard set forth by the OpenGL Architecture Review Board. Now correct me if I'm wrong here, but isn't a level playing field exactly what a review is supposed to provide?
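To make that concrete, here's a minimal sketch, not from any actual engine, of how a renderer of that era might pick its fragment path by checking the OpenGL extension string. The path names mirror Doom 3's ARB2/NV30/R200 naming; the choose_path() function itself is hypothetical, though the extensions it checks for are real:

    #include <string.h>
    #include <GL/gl.h>

    /* Hypothetical path selection: prefer the vendor-neutral ARB2 path,
       fall back to vendor-specific paths only when the standard one is
       missing. Requires a current GL context. */
    const char *choose_path(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (!ext)
            return "none";
        if (strstr(ext, "GL_ARB_fragment_program"))
            return "ARB2";  /* the ARB-ratified standard path */
        if (strstr(ext, "GL_NV_fragment_program"))
            return "NV30";  /* Nvidia's proprietary path */
        if (strstr(ext, "GL_ATI_fragment_shader"))
            return "R200";  /* ATI's proprietary path */
        return "ARB";       /* basic fallback */
    }

The point being: the first branch is the only one both vendors implement against the same spec, which is why it's the only apples-to-apples benchmark.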
I read this forum and see people bitch about how difficult it is to make apples-to-apples image quality comparisons for AA and AF, because the vendors' mode names and implementations don't line up. What's the difference here in comparing an apples-to-apples rendering path? If you want to show the differences between the Nvidia and ATI paths specifically, then I think that should be a separate portion of the review.
But the bulk of the review, imo, should be run on the standard path. If reviewers don't stress the standard path, do you honestly think card manufacturers will see any need to support those standards when they can count on developers supporting their proprietary extensions instead?
I don't buy a card to play one game. I buy a card that will support every game I want to purchase, and run them without needing uber-optimizations for each one, because frankly not every developer is going to write them.
I don't see what the problem is with asking for adherence to standards, or why it draws the rolleyes and the attitude. I think it's a fair statement and a fair expectation.