dizietsma said:
Dare I say "apples to apples, oranges to oranges". This forum has only been going on about it for the last XXX months.
But...the problem is that the issue is not "apples to apples" and never has been. Here's the apples-to-apples comparison people believe exists:
ATi's trilinear optimizations = nVidia's trilinear optimizations
That has never been true and isn't true today. What is true is:
R420 does not = nV40
Catalysts do not = Forcenators
ATi's trilinear optimizations do not = Forcenator trilinear optimizations
ATi's universal optimizations do not = nVidia's universal optimizations
It's evident that we cannot get an apples-to-apples comparison from the hardware, drivers, and optimizations, because they are all different and all unequal.
So, how can we get "apples to apples"...?
By comparing products of the same generation and price range produced by each IHV, and using the APIs, in-game IQ comparisons, and synthetic benchmarks (to test API features not yet supported in shipping games) as the "equalizers" for the comparison. Other than that, everything else is apples-to-oranges.
The big mistake I see repeated on this subject is the quite unreasonable assumption that ATi's trilinear optimizations are the same as nVidia's trilinear optimizations, and that you can turn nVidia's off whereas ATi's are always on.
First, ATi's particular approach to optimizing trilinear is not the same approach taken by nVidia--in terms of code, method, and implementation. They are fundamentally different optimizations. Right off the bat, then, it's plain that any direct comparison between the optimizations is apples-to-oranges.
Second, ATi's trilinear optimization is set to function conditionally and automatically, which means it turns itself off under the right conditions. However, it is improper, in my view, to suggest that there's something wrong with what ATi has done here simply because people want to "turn off" ATi's tri optimizations just as they "turn off" nVidia's, so that they can get an "apples-to-apples" comparison.
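To make "conditional and automatic" concrete, here is a minimal C++ sketch of what an adaptive trilinear decision could look like. The function names, the per-texel difference metric, and the threshold are all my own illustrative assumptions, not ATi's actual driver logic; the point is only that such an optimization governs itself per texture rather than waiting on a control-panel switch.

```cpp
// Hypothetical sketch of a conditional ("adaptive") trilinear decision.
// The metric and names are illustrative assumptions, not ATi's driver code.
#include <cstdint>
#include <cstdlib>
#include <cstddef>
#include <iostream>
#include <vector>

enum class FilterMode { FullTrilinear, ReducedTrilinear };

// Both inputs are assumed to be at the same resolution (the finer mip level
// box-filtered down). If adjacent mip levels differ noticeably, keep full
// trilinear so the transition stays smooth; if they are near-identical, a
// cheaper blend is visually indistinguishable, so the optimization kicks in.
FilterMode chooseTrilinearMode(const std::vector<std::uint8_t>& mipA,
                               const std::vector<std::uint8_t>& mipB,
                               int threshold)
{
    long totalDiff = 0;
    for (std::size_t i = 0; i < mipA.size() && i < mipB.size(); ++i)
        totalDiff += std::abs(int(mipA[i]) - int(mipB[i]));

    long avgDiff = mipA.empty() ? 0 : totalDiff / long(mipA.size());
    return (avgDiff > threshold) ? FilterMode::FullTrilinear
                                 : FilterMode::ReducedTrilinear;
}

int main()
{
    std::vector<std::uint8_t> detailedMip = {10, 200, 30, 250};
    std::vector<std::uint8_t> flatMip     = {120, 120, 125, 118};
    std::vector<std::uint8_t> similarMip  = {122, 119, 126, 117};

    // Very different mip levels -> the optimization turns itself off.
    std::cout << (chooseTrilinearMode(detailedMip, flatMip, 16)
                      == FilterMode::FullTrilinear) << '\n';    // prints 1
    // Nearly identical mip levels -> reduced (optimized) filtering.
    std::cout << (chooseTrilinearMode(flatMip, similarMip, 16)
                      == FilterMode::ReducedTrilinear) << '\n'; // prints 1
}
```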
The problem with that turn-it-off approach is that I've seen no evidence to date which conclusively proves that when you "turn off" the tri optimizations in the Forcenator control panel, all such optimizations in the drivers are in fact turned off. Absent compelling proof that nVidia's bri-defeat switch actually functions reliably and predictably across the entire spectrum of 3d games, even if we could "turn off" ATi's just as we "turn off" nVidia's, we still could not be assured of an apples-to-apples comparison unless we could demonstrate the efficacy of a bri-defeat switch in both products. (A recent example is the set of Forcenators released to coincide with the R420 launch reviews, in which it came to light later that the cp bri-defeat switch was somehow conveniently "broken" in that set of drivers.)
I am of the opinion that the Forcenator bri-defeat switch will be shown to work in some titles, but to have little to no effect in others. I would be pleased to be shown wrong in this opinion, without a doubt. I think about it this way:
The default for nVidia's brilinear optimization in its drivers is "always on." Indeed, it was "always on" long before nVidia added the "off" switch to the control panel, which certainly suggests that brilinear became a fundamental component of nVidia driver design long before any thought was given to turning it "off" via a cp switch. This in turn suggests to me that turning it off via a cp switch might not always work predictably.
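Here is a deliberately simplified, hypothetical C++ sketch of how that can happen. It is not nVidia's code and names nothing real in the Forcenators; it only shows the general software pattern in which an always-on optimization baked into many code paths later gets a "disable" setting that not every path remembers to check.

```cpp
// Hypothetical illustration of a retrofitted "off" switch failing silently:
// the optimization defaults to on in several code paths, and only the paths
// someone remembered to update consult the new control-panel setting.
#include <iostream>

struct DriverSettings {
    bool disableBrilinear = false;   // control-panel switch, added late
};

// Newer path: updated to honor the switch.
bool brilinearOnPathA(const DriverSettings& s) { return !s.disableBrilinear; }

// Older path: written when brilinear was unconditional; never checks it.
bool brilinearOnPathB(const DriverSettings&)   { return true; }

int main()
{
    DriverSettings s;
    s.disableBrilinear = true;  // user "turns off" the optimization

    std::cout << "Path A uses brilinear: " << brilinearOnPathA(s) << '\n'; // 0
    std::cout << "Path B uses brilinear: " << brilinearOnPathB(s) << '\n'; // 1
}
```

Run as written, path A honours the switch and path B ignores it, which is exactly the "works in some titles, little effect in others" behaviour I'm expecting.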
Had nVidia desired from the start to enable user control over its brilinear optimizations, I would have imagined that the driver default would be "brilinear off" and that a cp switch would enable you to turn it on. Had nVidia's brilinear optimization first appeared in that configuration, I would be far less skeptical of the "off" switch than I presently am (since it would be a "brilinear on" switch instead, and would have been introduced as an optional component from the start). But such is not the case: the "off" switch can be "broken," yet apparently the brilinear optimizations themselves cannot be broken in the Forcenators.
See what I mean? If brilinear were indeed fundamentally an optional setting in the Forcenators, I'd expect to see instances of "brilinear on" being "broken"--instead of what we've actually seen, which is that the driver mechanism to turn brilinear off has been "broken." The upshot is that brilinear looks like a fundamental component of the Forcenators, and the so-called "off" switch nothing more than an unreliable, unpredictable driver hack added much later, one which has already been "broken" at least once. And, as I said, even when not "broken," it remains to be seen how reliably and predictably the "off" switch actually defeats the optimization across all 3d games.
OK, so where does that leave us with regards to an apples-to-apples comparison? Since we do not yet know whether brilinear can actually be defeated in the Forcenators reliably and predictably, for comparison purposes we must consider the Forcenators to always be "on" with regard to brilinear (the exceptions being the cases in which we can demonstrate that the defeat switch isn't broken). But this presents a problem, because although ATi's trilinear optimization has no cp defeat switch, it turns itself on and off according to its pre-programmed conditional response. So it is always going to be apples-to-oranges when comparing the nVidia approach to optimization with ATi's.
In the end we come full circle and arrive where we started: we compare these products, apples-to-apples, by way of pricing and generation, by way of API support as demonstrated in games and synthetics, and by way of observed IQ across the software spectrum. That is as close to "apples-to-apples" as we can reasonably expect to get, in my view. I think a lot of people have taken what is a fairly complex issue and grossly oversimplified it.