When enough is enough (AF quality on g70)

Sxotty said:
Well, it is just the nature of competitiveness. I could see all kinds of shimmering on a 9800 Pro when I went to it from NV's previous offerings, yet people lauded it. I saw shimmering on my X800 PE and people lauded it. I don't have an NV40, but ATI started the whole AF mess as far as I am concerned, and until consumers show with their wallets that they have had enough of lowered quality, both companies will have an incentive to slowly lower quality to increase speed, since that is what the average Joe seems to be most interested in.

meh...

You could hardly see shimmering on the 9800, as the optimisations only apply to the 9600 through X800.
 
caboosemoose said:
The shimmering texture issue is a tricky one. There is currently a clear difference between ATI and NVIDIA, but NVIDIA are handling it pretty cannily. They have it set just below the level at which most reviewers either A. notice it or B. judge it to be bad enough to merit a ticking off. One thing critics of review sites should note is that reviewers often do not have long with the boards before publishing. Given that they have to get through a number of benchmark tests (with both the new card and existing cards), it's harder than you might imagine to spend time carefully analysing IQ. That sort of analysis often comes later, typically when the contest has pretty much already been decided.

It's interesting that NVIDIA consider this level of IQ acceptable even when they have a decent hardware advantage.

Incidentally, I have no problem recording the shimmering effect in HL2 using FRAPS for those who are looking for an easy way to see or record it.

Shimmering issues on GF6 aren't new. They were discussed almost a year ago:
http://www.beyond3d.com/forum/showthread.php?t=13700

so lack of time is hardly a problem
 
Razor1 said:
I understand. I'm trying to take videos of UT2004 in a flyby of Deck 17, but the recorder I'm using, Camtasia Studio 2, seems to have high CPU usage and it's lagging badly. If I can get it working, then I'll do other levels too.

Tridam, what settings did you use and how did you apply them? Through the CP or in game?

AF 8x, game INF, lod 0.0

To avoid the lag, it's best to record at a lower framerate with slomo. You can change the video framerate afterwards.
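The retiming arithmetic can be sketched like this (illustrative Python; the function name and the 0.5 slomo factor are my own examples, not FRAPS settings):

```python
def retimed_fps(capture_fps: float, slomo: float) -> float:
    """Playback framerate that restores real-time speed.

    capture_fps: rate the capture tool records at (frames per real second)
    slomo: game speed factor (e.g. 0.5 = half speed via the UT2004 console)

    Capturing at 16 fps while the game runs at half speed yields
    16 frames per 0.5 seconds of game time, i.e. 32 frames per
    game second -- so the video must be retimed to 32 fps.
    """
    return capture_fps / slomo

# e.g. recorded at 16 fps with "slomo 0.5":
print(retimed_fps(16, 0.5))  # -> 32.0
```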
 
tEd said:
Actually, with the Quality setting the reviewers benchmarked, there is no doubt that it shimmers; claiming anything else is simply a lie. On High Quality it probably depends very much on the game and situation.

To play devil's advocate, I should say that the monitor plays a huge role in the perception of shimmering. With blurry CRTs the shimmering effect is reduced, and on slow TFTs it's hidden.

I think it's best seen with good CRTs.
 
Tridam said:
To play devil's advocate, I should say that the monitor plays a huge role in the perception of shimmering. With blurry CRTs the shimmering effect is reduced, and on slow TFTs it's hidden.

I think it's best seen with good CRTs.

Yeah, maybe nobody has any good CRTs anymore ;)
 
OK, recorded with FRAPS. I was getting around 15-17 fps, so I set FRAPS to record at 16 fps. I did two recordings: one set up the way you did, and one with everything set from the control panel, with 8x AF and the LOD clamp turned on in the drivers.

I don't see a difference; neither shows shimmering in Deck 17. What's the best way to compress these with XviD? Right now they are both around 150 MB.

Oh wait, there is shimmering; I was looking at the walls when I should have been looking at the floor :). Let's see what HQ does.

HQ removes the shimmering. OK, now I just have to wait for my 7800 to come in.
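On the compression question: the usual approach with XviD is two-pass encoding against a target bitrate derived from the desired file size. A quick sketch of that arithmetic (illustrative Python; the function name and figures are my own examples):

```python
def target_kbps(size_mb: float, duration_s: float, audio_kbps: float = 0.0) -> float:
    """Video bitrate (kilobits/s) that hits a target file size.

    size_mb: desired total file size in MB (1 MB = 1024 KB)
    duration_s: clip length in seconds
    audio_kbps: bitrate reserved for an audio track, if any
    """
    total_kbps = size_mb * 8 * 1024 / duration_s  # MB -> kilobits, per second
    return total_kbps - audio_kbps

# Squeezing a 10-minute flyby into ~50 MB, video only:
print(round(target_kbps(50, 600)))  # -> 683
```

Feed the result to the encoder's two-pass target-bitrate mode; the first pass gathers complexity statistics so the second can hit the size closely.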
 
At least this subject matter is being discussed, and I respect posters with the courage to raise these subjective discussions. Gaining framerate at the expense of image quality has always gone on; it's nothing new. It's the sites' responsibility to raise these points so the end user can make an informed buying decision; it's not NVIDIA's or ATI's job, imho. It's easy to point the finger at NVIDIA or ATI, etc., but what would you do if you had to sell product? The sites are supposed to protect us from marketing; do they?
 
tEd said:
Yeah, maybe nobody has any good CRTs anymore ;)
LCD is newer.. newer is better.. always ;)
There are no downsides to LCDs ;)
They produce perfectly accurate colors and have perfect contrast ratios, as well as having accurate specifications for both contrast ratios and pixel response times.
There is no cheating in spec numbers at all.
Bottom line:
Newer is better, bar none.
 
Tweaker said:
You could hardly see shimmering on the 9800, as the optimisations only apply to the 9600 through X800.
Um, well, I could see it, sorry. I played a lot of Morrowind back then and there was plenty of weird stuff with AF in it. Perhaps it is not the exact same thing, and like I said, I really don't care too much at the moment. I am tired of the 3D bit until prices come down from the stratospheric highs that exist right now. Maybe they are made from oil :p
 
What consoles match PC specs currently? Except for the new generation of consoles, there is no way you can compare a console to a PC: much lower resolutions, and lower-resolution textures because of memory constraints.

The NV2A in the Xbox is, hardware-wise, capable of vastly superior filtering quality to any of the junk that ATI and NVIDIA have released in the last several years (ever, for ATI). The performance hit is enormous and hence it is under-utilized; however, it would be very nice to have the option of running that level of quality on newer setups (particularly G70 SLI setups).
 
BenSkywalker said:
The NV2A in the Xbox is, hardware-wise, capable of vastly superior filtering quality to any of the junk that ATI and NVIDIA have released in the last several years (ever, for ATI). The performance hit is enormous and hence it is under-utilized; however, it would be very nice to have the option of running that level of quality on newer setups (particularly G70 SLI setups).
What? Can you back up that silly statement with any facts, pics, or even more wacky ideas/opinions? Or is this just a theoretical spec debate?
 
NV2x hardware had probably the most "complete" filtering of any consumer card available: the trilinear filter was a fairly smooth 8 bits, and the anisotropic filtering covered all angles. The issue was that performance was fairly shoddy, especially with AF enabled.

Of course, this does raise the question of what filtering is going to be like on the next-gen consoles. ATI are saying that Xenos is different from current desktop chips, but how different is an unknown quantity; if PS3 doesn't have the same level of quality that G70 does, then I would assume the quality we see is driver-related, not hardware.
 
Jawed said:
Ooh, that sounds interesting, what is it?

Jawed

Well, on ATI cards shader arithmetic and texture filtering run in parallel, so if the shader does enough calculation, turning on AF can be free.

On NV cards, a texture read that needs multiple cycles blocks shader execution, so AF always has a performance hit.
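A toy cost model of that difference (my own simplification for illustration, not the actual scheduling behaviour of either chip):

```python
def pixel_cost(alu_cycles: int, tex_cycles: int, overlap: bool) -> int:
    """Cycles to shade one pixel under a simplified pipeline model.

    overlap=True  ~ ALU and texture units run in parallel (the ATI case):
                    the slower of the two hides the other.
    overlap=False ~ a multi-cycle texture fetch stalls the shader
                    (the NV case): the costs add up.
    """
    return max(alu_cycles, tex_cycles) if overlap else alu_cycles + tex_cycles

# Say a bilinear fetch takes 1 cycle and 8x AF pushes it to 8 cycles,
# with 10 cycles of shader math per pixel:
print(pixel_cost(10, 1, overlap=True))   # -> 10
print(pixel_cost(10, 8, overlap=True))   # -> 10  (AF hidden behind the math: "free")
print(pixel_cost(10, 1, overlap=False))  # -> 11
print(pixel_cost(10, 8, overlap=False))  # -> 18  (AF always costs)
```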
 
You can't judge g70 texture filtering by nv4x -- they're different:

6800_7800_anim.gif


Now, what you should probably remember is that NV40 had some filtering issues too when it was released, but those were fixed in later drivers.

The question is why G70 has the same issues if they were already fixed once for NV4x.
 
DegustatoR said:
You can't judge g70 texture filtering by nv4x -- they're different:

6800_7800_anim.gif


Now, what you should probably remember is that NV40 had some filtering issues too when it was released, but those were fixed in later drivers.

The question is why G70 has the same issues if they were already fixed once for NV4x.

To make them look better (faster) against the competition and their own little brother? Just a thought.

Or maybe G70 just can't do better than that; I wouldn't rule that out either.
 
In the list of innovations introduced by NVIDIA, the manufacturer says the 7800 features more efficient anisotropic filtering. First important point: we have to keep in mind that, because of its architecture, the GeForce 7800 GTX's performance cost will be partially hidden when a complex filter is enabled, since it has a higher number of pipelines than ROPs.

But this isn't all. NVIDIA has once again modified anisotropic filtering. It's hard to tell what the differences are, but clearly something is new. Unfortunately, it sometimes leads to a noticeable reduction in quality in movement. We can clearly see a shimmering effect on some parts of textures, more or less obvious according to the texture's level of detail, orientation, or even its layer (the first is less impacted when multi-texturing). This shimmering is less noticeable with the GeForce 6800.

Further in the article:

This bright new portrait can't, however, hide the downside of the new anisotropic filtering, which sometimes reduces graphic quality. This shouldn't happen with a GPU of this calibre, and the performance war should respect some limits.

http://www.behardware.com/articles/574-5/nvidia-geforce-7800-gtx.html
 
The NVIDIA control panel has been fairly flaky, from what I can tell, about turning off the trilinear/AF optimizations. For the setting to stick 100% of the time in my case, I had to switch off the trilinear/AF optimizations under the global profile and reboot; after that, nearly every game was relatively shimmer-free. (From what I can recall, nHancer did manage to get things cleaned up without a reboot.)

I say "nearly every game" because certain ground textures do still exhibit the shimmer effect, but in most cases it's a lot more tolerable than with the optimizations on.

It might possibly be a bug; a guy at [H] found this. I don't know what card he has, but I'll look into this too.
 
Tweaker said:
You could hardly see shimmering on the 9800, as the optimisations only apply to the 9600 through X800.
To echo Ben and Dave, the 9700P was better than the 8500 because it allowed for trilinear to be used in combination with AF, but both were worse quality than the NV2x series (NV20/GF3, NV2A/Xbox, NV25/GF4) in both angle-dependency and interpolation precision/resolution/bitness. See here and here.

So, back to the quality/performance trade-off. Dedicate more transistors to AF, or spend them on something more readily apparent in the typical review's onslaught of benchmarks? Quite a few people accepted the 8500's IQ tradeoff just to be able to use AF all the time with a minimal performance hit, so ATI and nV may not have as much incentive to spend engineering resources on it, especially when p/reviews tend to gloss over IQ differences because of time or other constraints.
 