When enough is enough (AF quality on G70)

trinibwoy said:
With respect to that article, how do we know that the performance improvement is solely due to additional optimizations given that the older drivers do not officially support the G70 and were "hacked" for the purpose of the comparison?

Well, for starters, why is it, do you think, that nVidia actually recommends that reviewers *not use* High Quality IQ mode for benchmarking? Further, they make the unprecedentedly silly comment that the High Quality IQ is "too high" (above industry standards) for reviewers to use when benchmarking (apparently nVidia is of the opinion that most of its customers revel in crappy IQ--and judging by some comments I read that is probably true.) Take a look at nV's official comments about this topic and all doubt should be forever erased...;)

More to the point, I'm surprised that at this late date that anyone at all is surprised to see nVidia compromising acceptable IQ for the sake of publicity benchmarks centered around frame-rate performance. This has been a solid pattern at nV for years--and is the primary reason I stopped using nV products in 2002. Basically, what is acceptable IQ for nV is not acceptable for me.
 
croc_mak said:
7800 has 8 more texture units than 6800...So, the benefits of any quality/performance tradeoff optimization that cuts the texture quality to decrease texture filtering cycles are higher on 6800 than 7800.

These sorts of optimizations might make some sense at the 6200 level... but a $500 card having to stoop to lower quality levels sucks IMHO


Well, I was thinking of it from a percentage point of view, not in terms of an overall performance loss.
 
Le Inq:
NVIDIA WAS very fast to react to a story we pixellated yesterday. We wrote here about Nvidia's over optimisation. Well, as expected Nvidia said it's a bug and it could not be fixed by changing to high quality driver setting, which was the case with Geforce 6 series cards.
Yup, they reacted to Fuad's article all right :D
 
WaltC said:
Well, for starters, why is it, do you think, that nVidia actually recommends that reviewers *not use* High Quality IQ mode for benchmarking? Further, they make the unprecedentedly silly comment that the High Quality IQ is "too high" (above industry standards) for reviewers to use when benchmarking (apparently nVidia is of the opinion that most of its customers revel in crappy IQ--and judging by some comments I read that is probably true.) Take a look at nV's official comments about this topic and all doubt should be forever erased...;)

More to the point, I'm surprised that at this late date that anyone at all is surprised to see nVidia compromising acceptable IQ for the sake of publicity benchmarks centered around frame-rate performance. This has been a solid pattern at nV for years--and is the primary reason I stopped using nV products in 2002. Basically, what is acceptable IQ for nV is not acceptable for me.

Walt, next time you quote somebody, please ensure that your response is relevant to the comments that you quoted :rolleyes: I understand you revel in every opportunity to decry anything Nvidia does, but please do so without hinging on other people's on-topic posts.
 
Particleman said:
It was the third-person perspective or the realtime AI calculations in botmatches that were the problem.

Anyways, here are my results using my own ONS-Primeval timedemo recorded in first person on a 7800 GTX. First person is closer to what you would get in gameplay, so I feel this is more accurate than botmatches. Quality was tested with (trilinear optimization off, aniso filter optimization on, aniso sample optimization off).

High Quality: 99.430864

Quality: 139.988691

What this tells me is that botmatches suck for testing AF performance, because when I used the UMark botmatches it was 87.750137 on High Quality and 89.507195 on Quality.

I just tried and got a similar performance loss. Strange.
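For reference, the size of that gap is easier to judge as a percentage. A minimal sketch in C (my own arithmetic, not part of either post) that turns the quoted scores into the relative gain of "Quality" over "High Quality":

```c
#include <stdio.h>

/* relative speedup of "Quality" over "High Quality", in percent */
static double gain_pct(double quality_fps, double high_quality_fps) {
    return (quality_fps - high_quality_fps) / high_quality_fps * 100.0;
}

int main(void) {
    /* figures quoted above */
    printf("ONS-Primeval 1st person: %.1f%%\n", gain_pct(139.988691, 99.430864)); /* ~40.8% */
    printf("UMark botmatch:          %.1f%%\n", gain_pct(89.507195, 87.750137));  /* ~2.0%  */
    return 0;
}
```

Roughly 41% in the first-person timedemo versus about 2% in the botmatch, which is what drives the conclusion above.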
 
WaltC said:
Well, for starters, why is it, do you think, that nVidia actually recommends that reviewers *not use* High Quality IQ mode for benchmarking? Further, they make the unprecedentedly silly comment that the High Quality IQ is "too high" (above industry standards) for reviewers to use when benchmarking (apparently nVidia is of the opinion that most of its customers revel in crappy IQ--and judging by some comments I read that is probably true.) Take a look at nV's official comments about this topic and all doubt should be forever erased...;)

Because the competition delivers its products with optimisations enabled too.

More to the point, I'm surprised that at this late date that anyone at all is surprised to see nVidia compromising acceptable IQ for the sake of publicity benchmarks centered around frame-rate performance. This has been a solid pattern at nV for years--and is the primary reason I stopped using nV products in 2002. Basically, what is acceptable IQ for nV is not acceptable for me.

Actually, optimisations have been commonplace for years.

I personally have nothing against any optimisation as long as I have the choice to disable it.

Nothing would surprise you concerning NVIDIA as long as it's negative; what else is new exactly? :p
 
croc_mak said:
7800 has 8 more texture units than 6800...So, the benefits of any quality/performance tradeoff optimization that cuts the texture quality to decrease texture filtering cycles are higher on 6800 than 7800.

These sorts of optimizations might make some sense at the 6200 level... but a $500 card having to stoop to lower quality levels sucks IMHO

Exactly my thoughts. Furthermore, the G70s are so abysmally CPU-bound in today's systems and games that you'll have a hard time noticing a significant difference in real-time gaming conditions anyway.
 
Particleman said:
It was the third-person perspective or the realtime AI calculations in botmatches that were the problem.

Anyways, here are my results using my own ONS-Primeval timedemo recorded in first person on a 7800 GTX. First person is closer to what you would get in gameplay, so I feel this is more accurate than botmatches. Quality was tested with (trilinear optimization off, aniso filter optimization on, aniso sample optimization off).

High Quality: 99.430864

Quality: 139.988691

What this tells me is that botmatches suck for testing AF performance, because when I used the UMark botmatches it was 87.750137 on High Quality and 89.507195 on Quality.


Of course botmatches are suboptimal for those cases; third-person scenarios are usually abysmally CPU-bound. It's actually the same in racing sims and replays, and is exactly the reason why I don't consider benchmarks from saved replays reliable. The camera doesn't flip all around when you're driving a car, does it?

In any case how many bots did you add while recording that timedemo?

***edit:

You may also want to try this one:

http://www.3dcenter.de/downloads/ut2004-primeval.php

16 bots and 1st-person view; it gives me only a 6% difference, though, between HQ and all optimisations enabled.

I tried a couple of scenarios in UMark and actually got up to a 30% difference between Quality (all optimisations on) and HQ. The CTF-Face3 flyby gives me a 12% difference, and since flybys are better suited for fill-rate measurements, that sounds more realistic. It's not as if there's a scenario in UT2k4 where you can bring a G70 to its knees, even in High Quality.
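The CPU-bound argument can be put into rough numbers. A back-of-the-envelope sketch in C (the CPU/GPU time splits and the 30% filtering gain below are hypothetical, chosen purely to illustrate the point, not measured from the thread):

```c
/* Amdahl-style illustration of why a texture-filtering optimisation shows up
   less in CPU-bound scenes (botmatches, 3rd-person cameras) than in
   fill-rate-bound ones (flybys). All numbers are hypothetical. */
#include <stdio.h>

/* overall speedup when only the GPU share of frame time gets faster */
static double overall_speedup(double gpu_share, double gpu_speedup) {
    return 1.0 / ((1.0 - gpu_share) + gpu_share / gpu_speedup);
}

int main(void) {
    double filter_gain = 1.30; /* assume the optimisation makes GPU work 30% faster */

    /* fill-rate-bound flyby: nearly all frame time spent on the GPU */
    printf("flyby    (90%% GPU): %.1f%% faster\n",
           (overall_speedup(0.90, filter_gain) - 1.0) * 100.0);

    /* CPU-bound botmatch: most frame time spent on AI/game code */
    printf("botmatch (20%% GPU): %.1f%% faster\n",
           (overall_speedup(0.20, filter_gain) - 1.0) * 100.0);
    return 0;
}
```

With the same filtering optimisation, the fill-rate-bound flyby improves by roughly 26% while the CPU-bound botmatch gains under 5%, which matches the pattern in the timedemo results quoted above.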
 
trinibwoy said:
Walt, next time you quote somebody, please ensure that your response is relevant to the comments that you quoted :rolleyes: I understand you revel in every opportunity to decry anything Nvidia does, but please do so without hinging on other people's on-topic posts.

Heh...;) It seems to me that I simply answered your "how do we know..." question very directly. I explained precisely "how we could know..." Look, if you are going to ask a question please don't complain when people answer it...;)
 
Ailuros said:
Because the competition delivers its products with optimisations enabled too.

Well, the kicker is that if I desire I can run my Catalysts on "High Quality" with optimizations turned off. Or I can run the Cats in High Quality with optimizations on, in varying degrees. In fact, the Catalysts specifically separate optimization control from the various standard IQ quality modes that you can choose from--so that, for instance, I could set the drivers to "low quality" IQ mode and *still* turn the optimizations off.

Please enlighten me as to where, in nVidia's instructions to reviewers as related in this thread, nVidia states that running in "Quality" mode is equivalent to running with optimizations on and *nothing else.* In fact, please tell me where it is in these instructions that nVidia mentions the word "optimizations" at all. The only thing I saw was a plea for reviewers not to use "High Quality" when benchmarking nV products, on the grounds that the IQ produced by High Quality was too good for mortal man (and as such would "unrealistically" slow down the frame-rate benchmarks nV wants to see published about its products)...;)

Secondly, as the bulk of the responses in this thread abundantly demonstrates, nV's recommended "Quality" mode for benchmarking is hardly satisfactory for anything *except* frame-rate benchmarking. I.e., it isn't an acceptable setting IF IQ is as important a consideration for the reviewer as frame-rate performance.

Actually, optimisations have been commonplace for years.

Yes, both good and bad optimizations have indeed been common for years...;)

Good = frame-rate performance improves without degradation of IQ.

Bad = frame-rate performance improves while IQ degrades.

The differences are very clear. What is uncommon in the last few years, especially since 2002, is that the term "optimization" itself has picked up this negative connotation. Before that, the only optimizations discussed were the good ones. Which IHV do you find primarily responsible for that sad state of affairs?

I personally have nothing against any optimisation as long as I have the choice to disable it.

Nothing would surprise you concerning NVIDIA as long as it's negative; what else is new exactly? :p

As long as what nV does seems to me to be negative, my comments as to those things will reflect it.
 
WaltC said:
Please enlighten me as to where, in nVidia's instructions to reviewers as related in this thread, nVidia states that running in "Quality" mode is equivalent to running with optimizations on and *nothing else.* In fact, please tell me where it is in these instructions that nVidia mentions the word "optimizations" at all. The only thing I saw was a plea for reviewers not to use "High Quality" when benchmarking nV products, on the grounds that the IQ produced by High Quality was too good for mortal man (and as such would "unrealistically" slow down the frame-rate benchmarks nV wants to see published about its products)...;)

And what exactly do reviewers end up testing in the end on either one? Since I can see that the majority out there tests with default quality driver settings for both, that "plea", as you describe it, doesn't surprise me.

Maybe you'd prefer, for example, to see a comparison between "brilinear" and trilinear, but I'd prefer to see apples-to-apples comparisons.


Secondly, as the bulk of the responses in this thread abundantly demonstrates, nV's recommended "Quality" mode for benchmarking is hardly satisfactory for anything *except* frame-rate benchmarking. I.e., it isn't an acceptable setting IF IQ is as important a consideration for the reviewer as frame-rate performance.

Which they'll most likely fix in a driver soon, and from what I can see so far the results are quite satisfactory on both the IQ and the performance level.


Yes, both good and bad optimizations have indeed been common for years...;)

Good = frame-rate performance improves without degradation of IQ.

Bad = frame-rate performance improves while IQ degrades.

The differences are very clear. What is uncommon in the last few years, especially since 2002, is that the term "optimization" itself has picked up this negative connotation. Before that, the only optimizations discussed were the good ones. Which IHV do you find primarily responsible for that sad state of affairs?

First things first: ALL optimisations will inevitably degrade image quality; it then comes down to how much they degrade it, after all. One separation I won't deny is that between legitimate and illegitimate optimisations.

Texture filtering optimisations have been commonplace since the Voodoo era, unless of course you consider the late 3dfx's last products to have been capable of real, full trilinear filtering.

The R100 was, from what I can remember, the first PC desktop part with angle-dependent AF. But does it really matter who started what? The answer would probably be yes if it were NVIDIA.

As long as what nV does seems to me to be negative, my comments as to those things will reflect it.

You've never sounded to me like you're willing to give NV a fair chance whatsoever, but it might actually just be a wrong impression on my part. If the amount of criticism towards ATI is at the same level and I've missed it, then of course you have my apology ;)
 
I apologize in advance for my ignorance, but is there a link to a good source that has done an in-depth study/benchmarking of the 7800 vs. the X800 using image-quality settings as close to equal as possible?

Any "respected" links of the sort is appreciated.
 
Ailuros said:
First things first: ALL optimisations will inevitably degrade image quality; it then comes down to how much they degrade it, after all. One separation I won't deny is that between legitimate and illegitimate optimisations.
Not really true. Do optimizing compilers degrade accuracy of compiled C code? I sure hope not. Similarly, optimizing compilers for graphics chips should also maintain accuracy. There are plenty of other optimizations that don't degrade image quality.
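A tiny illustration of the kind of optimisation OpenGL guy means. This is a minimal sketch of my own (ordinary C, nothing from an actual shader compiler): hoisting a loop-invariant multiply changes how the result is computed, not what the result is.

```c
#include <assert.h>
#include <stdio.h>

/* naive: recomputes the loop-invariant product on every iteration */
static int sum_naive(const int *v, int n, int scale, int bias) {
    int sum = 0;
    for (int i = 0; i < n; ++i)
        sum += v[i] + scale * bias;
    return sum;
}

/* "optimised": the invariant product is hoisted out of the loop */
static int sum_hoisted(const int *v, int n, int scale, int bias) {
    int k = scale * bias;
    int sum = 0;
    for (int i = 0; i < n; ++i)
        sum += v[i] + k;
    return sum;
}

int main(void) {
    int v[] = {3, 1, 4, 1, 5, 9, 2, 6};
    int n = (int)(sizeof v / sizeof v[0]);
    int a = sum_naive(v, n, 7, 2);
    int b = sum_hoisted(v, n, 7, 2);
    assert(a == b);   /* fewer operations, bit-identical result */
    printf("naive=%d hoisted=%d\n", a, b);
    return 0;
}
```

The same idea carries over to a driver's shader compiler: removing redundant work is free, whereas reducing the number of texture samples taken is a different kind of trade-off.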
 
OpenGL guy said:
Not really true. Do optimizing compilers degrade accuracy of compiled C code? I sure hope not. Similarly, optimizing compilers for graphics chips should also maintain accuracy. There are plenty of other optimizations that don't degrade image quality.

I actually just had texture filtering optimisations in the back of my mind.
 
silence said:
So, Nvidia is back to over-optimizing? While they have the best product on the market? :rolleyes:

I frankly don't know whether this was intentional or not, but it doesn't make that much sense either, since the differences in performance - especially on G70 - seem way too small to justify even the thought.
 
Ailuros said:
I frankly don't know whether this was intentional or not, but it doesn't make that much sense either, since the differences in performance - especially on G70 - seem way too small to justify even the thought.

Okies... I know it's there (yeah, I read the INQ, shame on me), but I didn't comment before.
I could understand doing it in the NV30 days... but now?
And if you say the gains are marginal... why do it? Most people don't even know what we are talking about... hell, 95% of users don't change their _MONITOR_ refresh rate, let alone play with advanced settings...

IMO, stupid.
 
I believe some people are overreacting to this. nV's filtering algorithm obviously doesn't work as it should in some cases. It will be fixed. End of story. There isn't a conspiracy in every corner, methinks. Plus, if you expect nV to release a (fixed) driver that takes a large performance hit, think again. It will just optimize its filtering so that it does the job better. Which is not bad, as long as visual degradation does not occur.
 