New AA options in Detonator

Well, I have both an X800 and a 6800, and I was using the X800 as my primary gaming card. My primary games are HL2/Counter-Strike: Source, EQ2, and BF1942. I recently switched to the 6800 system. I don't see any big IQ differences at 4x (the predominant level I play at). In fact, as far as edge AA goes, they are mostly the same in motion. Texture AA is not the problem; I see mostly equivalent levels on both cards (e.g. still some slight shimmering on each). The problem is games using shaders. CS:S and EQ2 in particular, which have a lot of shaders, cause tons of aliasing that isn't fixable by anything save better shaders or supersampling.

Load up CS:S for example, go to Dust1 or Dust2 and look at the floor where it turns to dark shiny tiles. The shader which puts specular highlights/shine on the tiles is a flicker-fest on both cards.

I would give a slight edge in texture AA (AF) to the X800, but the differences are not monumental, and certainly not deserving of the comment "sucks ass". Given the state of shader AA, I'm almost about to turn off anti-aliasing altogether, since the aliasing introduced by shaders spoils everything, and if I'm going to have flickering floor tiles, I may as well turn off AA and get back more performance and higher resolutions.

Hopefully the G70/R520 can run games like CS:S with super-sampling. (It makes no sense to say "fix the shaders", since I already bought the games, and I need a solution which fixes legacy games, not future games)
 
trinibwoy said:
swaaye said:
I'm also a bit perplexed that some of you call people biased fanboys when these people own both cards. I don't think someone that invests $400 or more into both ATI & NV fits into that category.

That's not what determines whether or not someone is a fanboi. There are a lot of people who were arguing over the FX/R300/NV40/R420 who still own Ti 4200s and 8500s. ;) It's one thing to say that you own both cards, but bias is a very difficult (read: impossible) thing to hide.

There are people who own both but only have positive things to say about one brand, and negative about the next. What do you call that?

Heh, well yeah, those people bother me too and I run into them all the time. They're a group who don't like to read up on things and are easily swayed by experience with one "good" card (of course they have no experience with the other side, so wth). Brand loyalty is scary and a LOT of people seem to fall into it.

If you've been in this industry long enough, I'd say you sort of become immune to fan-boy syndrome. When you once looked forward to the release of, say, the Matrox Millennium...the sheer number of subsequent hype-fests sort of numbs you :)

(btw, the forums are deleting words (f a n b o y) on me for some reason) :LOL:
 
trinibwoy said:
Hmmm, good point. Do you have an example of where I can look for this shimmering? Here's a short list of the games I have (I assume this doesn't occur in RTS titles or flight sims):

UT2004
BF1942
Call of Duty
Medal of Honor (entire series)
Aliens vs Predator 2
Serious Sam
Splinter Cell + Pandora Tomorrow
Prince of Persia
Far Cry
HL2 demo
NOLF

And is the shimmering the same as the texture crawling some people complain about?

Yes, for me shimmer = texture aliasing. It's akin to when I moved from a V5 to an 8500 and a lot of games had sharper LOD and increased texture aliasing that needed at least 2xAA and some AF to sort out. The 9700 Pro was better than the 8500 in every game, and now the 6800 seems slightly worse in most games in that respect. I fiddled a lot with the showstoppers (BF1942/CoH) before I found out I had to wait for driver fixes.

Of that list I only have Bf1942 and CoD.

BF1942 and CoH are meant to be specific bugs/game issues (due to 6800 LOD selection, according to ChrisRay/Chalnoth). I never saw CoH fixed, but even with the LOD clamp BF1942 is worse than on my 9700 Pro.

Other games that I would say are just generally worse for shimmering around mip-map bands, regardless of optimisations, are CoD and Doom 3 (floor gratings); WoW can have some really bad shimmering as well.
 
Here are some FSAA 4x screenshots. It took 4 years, but nVidia finally achieved the FSAA quality of the 3Dfx Voodoo 5 :LOL: But ATI's result is even better; note the near-horizontal lines (black half). Some edges are clearer, so fewer crawling artifacts should be seen in motion.


nVidia NV40 - FSAA 4x


3Dfx VSA-100 - FSAA 4x


ATi RV350 - FSAA 4x



sorry for grammar and spelling mistakes
 
Well, I looked at the first three on the left: the almost-horizontal one, the one above that, and the one slightly above that. I switched rapidly back and forth between the images, and I don't see a big difference. In fact, the second line above the horizontal on the left looks slightly better on the NV40.

Edge aliasing is not the primary problem in next-gen titles, it's texture shimmering, alpha textures, and shaders IMHO. I think multisampling as currently implemented is at a dead end. I'm not really that interested in 8x or 16x MSAA. Something needs to be done about alpha textures (which are still used in many games instead of geometry) and of course, people who write "cheap" shaders and neglect to account for aliasing induced by shaders.

Sure, edge AA could still be improved, by removing the "gray shimmer" that you get from today's 4x-6x edge AA. But it is not what glaringly stands out and annoys one today.
 
It really depends on how the games handle colors. If you need to crank up gamma because the game doesn't care about color spaces, the gamma-adjusted downsampling will actually be counter-productive (on my display, it's visibly worse at gamma 1.3 already).
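To illustrate the point: gamma-adjusted downsampling blends the AA samples in (approximately) linear light, assuming a fixed display gamma; if the user then cranks gamma further in the game or driver, that assumption no longer holds and the edge gradients come out wrong. A minimal sketch of the two blending approaches, assuming a hypothetical display gamma of 2.2 (the actual hardware values are not public):

```python
# Sketch: plain vs gamma-adjusted averaging of two edge samples.
# The assumed display gamma of 2.2 is illustrative, not the hardware's value.

DISPLAY_GAMMA = 2.2

def plain_average(samples):
    """Average the stored (gamma-encoded) sample values directly."""
    return sum(samples) / len(samples)

def gamma_adjusted_average(samples, gamma=DISPLAY_GAMMA):
    """Convert samples to linear light, average, convert back."""
    linear = [s ** gamma for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / gamma)

# A pixel 50% covered by a white polygon on black:
samples = [0.0, 1.0]
print(plain_average(samples))           # 0.5
print(gamma_adjusted_average(samples))  # ~0.73, displays as ~50% brightness
```

If the user then applies a second gamma ramp on top of this output, the ~0.73 value is pushed even higher and the edge blend looks over-brightened, which is the counter-productive case described above.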
 
trinibwoy said:
Wasn't there a site that had mouseovers of these somewhere?

Really don't know. These screenshots are my own production.

A bit OT, but I have one real "diamond":

FSAA 8x, GeForce FX (this pattern was only used in non-WHQL drivers; this one is from 61.40). Enjoy:





If nV could call this FSAA 8x, then where's the problem with ATI's 10x and 14x? :D
 
This is really a very stupid pattern, but it leaves the hope that the pattern is actually more programmable than it seems.
 
Xmas: Yes, this is the ugliest pattern I have ever seen.

trinibwoy: NV40 8x is much better; wait a minute and I'll prepare it. NV40 8x is comparable to ATi's 6x.
 
no-X said:
Here are some FSAA 4x screenshots. It took 4 years, but nVidia finally achieved the FSAA quality of the 3Dfx Voodoo 5 :LOL: But ATI's result is even better; note the near-horizontal lines (black half). Some edges are clearer, so fewer crawling artifacts should be seen in motion.
Okay, what you have to recognize here is that due to the way that ATI does the gamma-adjusted FSAA, the mode that is better will be highly dependent upon the gamma settings and your monitor. On the monitor I'm currently looking at, the ATI shot and the NV shot are nearly indistinguishable.
 
So here are some "higher AA" screenshots :)


Parhelia (rev.2) [16xFAA] (1.71.60_107.01whql)


(no pattern screenshot, 16x OG)

Radeon 9600 XT [6x] (ATi 4.6 WHQL)





GeForce FX 5900XT [8x] (61.40 non WHQL)






GeForce FX 5900XT [8x] (56.72 WHQL)



(patterns are from an FX 5700, but it is the same)


GeForce 6800Ultra [8x]





Voodoo 5 - 6000 [8x] - perhaps someday...
 
Chalnoth said:
no-X said:
Here are some FSAA 4x screenshots. It took 4 years, but nVidia finally achieved the FSAA quality of the 3Dfx Voodoo 5 :LOL: But ATI's result is even better; note the near-horizontal lines (black half). Some edges are clearer, so fewer crawling artifacts should be seen in motion.
Okay, what you have to recognize here is that due to the way that ATI does the gamma-adjusted FSAA, the mode that is better will be highly dependent upon the gamma settings and your monitor. On the monitor I'm currently looking at, the ATI shot and the NV shot are nearly indistinguishable.
Heh, true. With the gamma settings I use for every day desktop use, ATI's 6x looks better than the others by far. However, if I pump up my gamma settings to what I use for gaming (or viewing screenshots), NVidia's 8x on the 6800 looks best. At the same settings the 5900 with the 56.72 drivers is approximately equal to ATI's.
 
no-X said:
So here are some "higher AA" screenshots :)


Voodoo 5 - 6000 [8x] - perhaps someday...
ATi's 6x AA appears to be the best.
I guess the real merit of nVidia's 8x is for texture aliasing/alpha textures.
 
Horizontal lines (NV40 8x) are much better than vertical. The EER is 8x4, so ATi (6x with an EER of 6x6) gives better results on vertical lines; horizontal lines are worse (compared to NV40 8x).

edit: radeonic2: it's a pity that NV40 can't support RGSS + programmable MS. An RGSS/sparse-MS combination would make sense.
 
no-X said:
Horizontal lines (NV40 8x) are much better than vertical. EER is 8x4, so ATi (6x with EER 6x6) gives better results at vertical lines, horizontal lines are worse (compared to NV40 8x).
Not seeing that on my HP M90 19" CRT.
ATi looks all-around better.

 
radeonic2 said:
no-X said:
So here are some "higher AA" screenshots :)


Voodoo 5 - 6000 [8x] - perhaps someday...
Ati's 6X aa appears to be the best.
I guess the real merit of nvidia's 8x is for texture aliasing/ alpha textures.

Yeah, it does look better than the 8x shot. Is it down to sample resolution? It looks like ATi has full 6x resolution on both axes, while nVidia's 8x only has 4x on the horizontal but 8x on the vertical, since there is no horizontal jitter between the texture sampling positions.
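The "resolution on both axes" question above is exactly what EER (edge equivalent resolution) measures: the number of distinct horizontal and vertical sample offsets in a pattern, which determines how many intensity steps a near-vertical or near-horizontal edge gets. A small sketch with illustrative sample positions (not the actual hardware offsets):

```python
# Sketch: EER of an AA sample pattern = count of distinct x offsets by
# count of distinct y offsets. Patterns below are made up for illustration
# and are NOT the real chip sample positions.

def eer(pattern):
    xs = {round(x, 4) for x, y in pattern}
    ys = {round(y, 4) for x, y in pattern}
    return len(xs), len(ys)

# 4x ordered grid: samples share rows and columns -> EER only 2x2
og4 = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x rotated grid: every sample on its own row and column -> EER 4x4
rg4 = [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

# sparse 6x: six unique offsets on each axis -> EER 6x6
sparse6 = [(i / 6 + 1 / 12, ((i * 5) % 6) / 6 + 1 / 12) for i in range(6)]

print(eer(og4))      # (2, 2)
print(eer(rg4))      # (4, 4)
print(eer(sparse6))  # (6, 6)
```

This is why a sparse 6x pattern can beat an 8x hybrid mode on one axis: if two of the eight samples share horizontal offsets with the other four, the pattern only resolves 4 steps in that direction, matching the 8x4 vs 6x6 comparison above.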
 
Just for the record, that 8xS screenshot from the NV40 is showing a 4*8 EER; if you were willing to ignore supersampling performance penalties even more, 16x (4xRGMS + 4xOGSS) actually yields an 8*8 EER, and in terms of EER exclusively it's equivalent to the Voodoo5 6000's 8xRGSS.

Frankly, I was hoping for a few more speculations about what those hypothetical new AA tidbits for G70 could stand for. Instead I've read almost 3 pages of rechewed stuff from the past.

Yes, there is a problem with NVIDIA's new angle-dependent AF algorithm, but it seems to be purely between the algorithm and applications not recognizing default "0" LOD values. I recall being among the first to officially document said issue in a review, and I speculated back then about where the problem originates; apparently I wasn't much off base either. Call it beginner's luck if you please, since I don't really need laurels either :p

It appears strictly with AF enabled; it isn't due to filtering optimisations, nor does it go away when you switch to high quality. Furthermore, even supersampling won't cure it; it just pushes the problematic area further away by a -0.5 LOD shift (with 2xSSAA, that is; with 4xSSAA by -1.0) and so on. If you colourize MIPmaps in an application, you might notice that the aforementioned problematic area is only a "band" the size of one MIPmap, and it usually appears before the first red MIPmap.
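The -0.5/-1.0 numbers follow from simple mip-selection arithmetic: every doubling of the sample count per pixel effectively halves the texel footprint, which sharpens mipmap selection by half a level. A quick sketch of that relationship (just the arithmetic, not driver code):

```python
import math

def ssaa_lod_shift(samples_per_pixel):
    """LOD shift implied by ordered-grid supersampling: each doubling of
    the per-pixel sample count moves mip selection by -0.5 of a level."""
    return -0.5 * math.log2(samples_per_pixel)

print(ssaa_lod_shift(2))  # -0.5
print(ssaa_lod_shift(4))  # -1.0
print(ssaa_lod_shift(8))  # -1.5
```

So supersampling only relocates the problematic mip band rather than removing it, which matches the observation that the band just moves further into the distance at each SSAA step.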

In most cases switching the negative LOD bias slider to "clamp" cures the problem; if not - and only in rare cases - a bit of extra positive LOD bias might be needed as well. The question would be why it didn't turn up on former GeForces, or why it doesn't happen on competing products. I'll call it an "algorithmic bug" for simplicity's sake; otherwise an added driver switch for it wouldn't really be necessary.

One thing I agree with here is Democoder's points about texture aliasing. I myself am extremely finicky about texture aliasing, and way too many people out there consider "extra sharpness" ideal. On a still screenshot, and up to some degree, probably yes; yet in real-time motion I don't want any dancing meanders across my screen. And yes, apart from any aforementioned bugs (or call them whatever you want), the primary focus for hardware designers - especially with increasing shader lengths - should be more on the texture than on the polygon edge department of antialiasing. I won't deny that I'd be a sucker for an 8*8 or even higher EER, but if I get tons of texture aliasing, my appetite for antialiasing diminishes quickly. And yes, I can understand when Democoder says that it's often so bad that he'd rather switch off MSAA entirely.

Uhmmm wait I'm ranting again.....oooops :oops:
 