Ailuros said:
At the very least a good reviewer knows exactly what he wants to test and how; if he can't see a difference between optimisations on and off, he should hand in his pen and admit he's too dumb to write reviews. Someone who knows what he's doing doesn't need guidelines in the first place.
Yes, and a good reviewer should also know right off the bat to ignore official nVidia instructions on how to "best benchmark" an nVidia product, right? Of course, since nVidia goes to so much trouble to spell out a particular method reviewers should use expressly for benchmarking, we have to assume that nVidia doesn't have as high an opinion of reviewers in general as you do...
(If nVidia didn't believe they could be manipulated, they wouldn't even try, right?)
More to the point, you keep talking about optimization, whereas nVidia says only that its G7x products should be benchmarked at Quality rather than High Quality. The exact role of optimization there is inferred by you rather than stated by nV.
The point is that you've found another tidbit and made an elephant out of a mouse. Texture filtering optimisations have been common ground for years; no, I don't agree with them at all times either, which is why I personally want the ability to switch them off.
Excuse me? *I* found the tidbit? Heh...
Did I start this thread? Did I do the original IQ article? Did I write the "review benchmarking instructions" nV published for its G7x products?
Since the answer to all three questions is obviously "no," I'm not involved at all, am I?...
And you weren't the only one back then owning an R300. However, I closed one eye to the angle-dependency in full conscience, had to resort to 3rd party tools like rTool to disable texturing stage optimisations, and more often than not kept the MIPmap bias stuck at +0.5 to save myself from eye-cancer from the occasional aliasing I ran into. That still didn't change my opinion of the R300, which I still consider a milestone of the past few years for the market, and of course for ATI too.
I always run AF and FSAA together, now as well as then, and whatever the proclaimed advantage of non-angle-dependent AF considered in isolation, the combined AF/FSAA IQ of the R300 completely castrated nV25...
I could see that with both eyes open, and without rTool.
Yeah, of course, let's mix in everything we can remember about either company, and hell, why not add past horror stories from the 3dfx era to the mix...
Yes, it's so boring not to be a revisionist, isn't it? I guess I'll have to be boring, then...
Back to reality, though: the G70 is fine with the soon-to-be-released drivers, and although the issue was highly annoying, it seems to be gone now.
Wonderful news--and yet another reason I don't use nV today. The fact is that everything nVidia has ever made "will be fine with an upcoming driver release"...
I know that I must be peculiar, but I'm a present-tense kind of guy.
B3D is on fine terms with NVIDIA, last time I checked; in fact, I'd prefer that the B3D reviews analyze filtering-related optimisations in detail. I wonder exactly what driver settings those were actually written with.
I didn't say anything about B3d reviews, and I agree they have always been fine. I was talking about the atmosphere in the B3d *forums*, where some people are still confused about the difference between optimizations and cheating on benchmarks...