NV40 IQ

Well, if you think of the quality of the rendered image itself, I would say the 9800XT.

Why?

- Far Cry "light-effect blockiness" as seen on Tom's Hardware (driver/software issue?)
- Lack of gamma-corrected AA, non-programmable sample patterns, and a strange distribution of AA modes.
- AF no longer angle-independent (possibly also a driver issue).
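On the sample-pattern point: where the sub-pixel samples sit matters a lot for near-horizontal and near-vertical edges. A rough Python sketch with hypothetical 4x sample positions (illustrative, not NV40's actual pattern) shows why a rotated grid beats an ordered grid:

```python
# Hypothetical sketch: how many distinct grey levels a 4x pattern can
# produce on a near-horizontal edge. An ordered grid has only 2 distinct
# sample rows; a rotated grid has 4, so it yields more coverage steps.

OG4 = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
RG4 = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(pattern):
    """Distinct coverage fractions a horizontal edge can produce as it
    sweeps through the pixel (samples below the edge count as covered)."""
    rows = sorted({y for _, y in pattern})
    edges = [0.0] + [y + 1e-6 for y in rows]
    counts = {sum(1 for _, y in pattern if y < e) for e in edges}
    return sorted(c / len(pattern) for c in counts)

print(coverage_levels(OG4))  # [0.0, 0.5, 1.0] - only 3 grey steps
print(coverage_levels(RG4))  # [0.0, 0.25, 0.5, 0.75, 1.0] - 5 steps
```

More distinct steps means smoother-looking gradients along shallow edges, which is why a non-programmable or poorly chosen pattern is worth complaining about.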

Those are my $0.02 after almost 24h of (p)review and discussion digestion.

With Regards
Kjetil
 
They look close, except for the puzzling (to me) Far Cry and Lock-On banding. AA and AF look close enough to no longer be a serious point of contention, particularly at 1280x1024+.
 
The FiringSquad site had nice IQ pics in which the 9800 predominantly looked better to me. The 6800 had banding problems on the elevons of the jet in LOMAC.
 
Pete said:
AA and AF look close enough to no longer be a serious point of contention, particularly at 1280x1024+.
Well, once developers get into hardcore shaders (which will finally start to be heavily encouraged by NVidia, since NV40 rocks at pixel shaders), 1280x1024 is not very feasible. Check out rthdribl - even NV40 only gets 45-60 FPS with 4xAA at 640x480. I know the demo looks damn good on a 9700 even at that resolution, so the difference between good and great AA can be noticeable.

I think the 3DMark2003 airplane shots are the best examples of how much difference gamma correction can make. Look at the bottom edge of the main airplane's wings, or the bottom airplane's tail. I guess it's possible your monitor has a non-standard gamma, though.
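For what it's worth, the effect is easy to demonstrate numerically. A small Python sketch (assuming a simple power-law gamma of 2.2, which is only an approximation of real display behaviour) of resolving a half-covered black/white edge pixel:

```python
# Hypothetical sketch (not NV40/R300 hardware code): resolving AA samples
# in gamma space darkens edges compared to a gamma-correct resolve.
# Assumes a power-law display gamma of 2.2.

GAMMA = 2.2

def resolve_gamma_space(samples):
    """Naive resolve: average the stored (gamma-encoded) values."""
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples):
    """Gamma-correct resolve: convert to linear light, average, re-encode."""
    linear = [s ** GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1.0 / GAMMA)

# A 2x AA pixel half covered by a white polygon on a black background:
samples = [0.0, 1.0]
naive = resolve_gamma_space(samples)      # stored value 0.5
correct = resolve_gamma_correct(samples)  # stored value ~0.73

# On a gamma-2.2 display the naive 0.5 emits only 0.5**2.2, about 22% of
# full brightness, so the edge gradient looks too dark and steppy; the
# corrected value emits the intended 50%.
print(f"naive: {naive:.3f}, gamma-correct: {correct:.3f}")
```

That darkened intermediate step is exactly the kind of thing visible along the wing and tail edges in those airplane shots.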
 
Heh, what I don't get is the point of using 4xAA @ 640x480. At that point I'd definitely put higher resolution as a priority above AA. To each their own I suppose.
 
While the 6800 has improved AA, it's effectively limited to 4x and still isn't gamma-corrected.

In all of the screenshots I have seen today it is mighty hard to say that the 9800XT has significantly better IQ all the time, but I haven't seen any where I doubted it was at least the equal of the 6800.

It should be a good battle once these cards (nv40/r420) start shipping in a month or so.
 
I'd say the image quality is pretty even between NV40 and R3xx. As said, there's a slight advantage in R3xx AA, but I believe it isn't a substantial difference. They are both clearly better than NV3x cards.
 
I think most of the artifacting/banding on the 6800 will be gone in the near future. They are, I think, leftovers of the "optimisations" in nVidia's driver. The difference, up to 4x, is actually small enough to be dismissed. Yes, gamma correction makes ATI's better, but by how much? Fact is, they are so close that anyone with an open mind should call it a draw...
 
Very slight edge to the R300, as I really do like ATi's AA and the games I play show this up a lot.

In other areas it's very definitely a draw.
 
Reverend said:
Can you specify/explain what is meant by "better image quality" first?

You are sooooooo bad! ;)

How bout this: If a NV40 runs its fan at full speed, and there's no one around to hear it, is it still somewhat noisy? :LOL:

Yea, yea, I know OT! Sorry bout that!
 
Reverend said:
Can you specify/explain what is meant by "better image quality" first?

I mean like when in games like Far Cry you can see all the pixels on the floor on the NVIDIA card and none on the ATI, and stuff like that. But from what I'm learning it's a driver issue, not a problem with the card itself. Also, does the floor problem in Far Cry have to do with AA/AF or with pixel shaders?
 
The point for me is that the NV40 is so fast that a true 8x MSAA mode would actually be usable in most games. So the AA of the NV40, although improved at 4x, still disappoints me quite a bit.
 
madshi said:
The point for me is that the NV40 is so fast that a true 8x MSAA mode would actually be usable in most games.

Usable, yes. Useful, not very, IMO. The majority of games seem to use alpha tests or something else that breaks MSAA. Developers need to stop using alpha tests or anti-alias them using multisample masking. Until then, 4x MSAA with high resolution is fine with me.
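To illustrate the alpha-test problem: with multisampling the shader, and thus the alpha test, runs once per pixel, so all of a pixel's samples pass or fail together. A hypothetical Python sketch of the difference versus per-sample evaluation (which is effectively what supersampling or multisample masking gives you):

```python
# Hypothetical sketch of why alpha testing defeats MSAA: the alpha test
# is evaluated once per pixel, so coverage is all-or-nothing regardless
# of sample count. Per-sample evaluation yields partial coverage instead.

ALPHA_REF = 0.5  # assumed alpha-test threshold

def msaa_alpha_test(pixel_alpha, num_samples):
    """Alpha test once per pixel: every sample passes or fails together."""
    return num_samples if pixel_alpha >= ALPHA_REF else 0

def per_sample_alpha_test(sample_alphas):
    """Alpha test per sample: edge pixels get fractional coverage."""
    return sum(1 for a in sample_alphas if a >= ALPHA_REF)

# An edge pixel of an alpha-tested texture (fence, foliage): the four
# sample positions straddle the cutout boundary.
samples = [0.2, 0.4, 0.6, 0.8]
pixel_alpha = sum(samples) / len(samples)  # the single shaded value, 0.5

print(msaa_alpha_test(pixel_alpha, 4))   # 4 of 4 covered: hard edge, no AA
print(per_sample_alpha_test(samples))    # 2 of 4 covered: smoothed edge
```

So raising the MSAA level smooths polygon edges further while leaving these cutout edges exactly as jagged as before, which is why they stand out more at higher AA levels.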
 
EasyRaider said:
Usable, yes. Useful, not very, IMO. The majority of games seem to use alpha tests or something else that breaks MSAA. Developers need to stop using alpha tests or anti-alias them using multisample masking. Until then, 4x MSAA with high resolution is fine with me.
That doesn't make much sense to me. Right, MSAA doesn't anti-alias alpha tests. But not all games use them, and the games that do don't use them everywhere. So there are still lots of edges that MSAA works fine on. And for sure a good 8x MSAA mode would look better on those edges than a 4x MSAA mode. So I don't understand what 4x vs 8x has to do with alpha tests. Why do you think 4x MSAA is useful, while 8x MSAA is not so useful?
 
madshi,

It's because the higher the MSAA level, the more alpha tests stand out. OK, it's better to have polygon edges anti-aliased than to have nothing anti-aliased. But in games with alpha tests, I prefer the highest possible resolution to minimize general aliasing, and then add 2x or 4x MSAA on top of that to further smooth polygon edges. I feel that just 2x AA looks good enough at 1280 and above. Additional fillrate would be better spent on improving lighting IMO, but that is beside the point.

8x AA wouldn't be completely useless, but I think it would be far better to have 2x or 4x AA with everything anti-aliased.
 
IMO higher-order AA is more important on digitally-driven flat-panels than on analogue FPs or CRTs. In top-end TFTs you can absolutely see every pixel, and they look square, which isn't the case with CRTs at high resolution (nor some of the analogue TFTs I've seen). So you get some anti-aliasing from the display device "for free".
 
The typical TV will give you even more "free anti-aliasing". Kinda like Quincunx or any other blur filter.

So it really depends both on the display device and the ratio of display size to viewing distance.
 
EasyRaider said:
It's because the higher the MSAA level, the more alpha tests stand out. OK, it's better to have polygon edges anti-aliased than to have nothing anti-aliased. But in games with alpha tests, I prefer the highest possible resolution to minimize general aliasing, and then add 2x or 4x MSAA on top of that to further smooth polygon edges. I feel that just 2x AA looks good enough at 1280 and above. Additional fillrate would be better spent on improving lighting IMO, but that is beside the point.

8x AA wouldn't be completely useless, but I think it would be far better to have 2x or 4x AA with everything anti-aliased.
But given the raw power of the NV40, a lot of games could probably be played at 1600x1200 with 6xMSAA, if only the NV40 could do 6xMSAA.
 