Future of MSAA?

Chalnoth said:
Personally, it's the blurring of the text with Quincunx that really gets me. I don't think I'd really notice the slight blurring of the scenery.


slight blurring????.....you are blind then.
 
mboeller said:
Chalnoth said:
Personally, it's the blurring of the text with Quincunx that really gets me. I don't think I'd really notice the slight blurring of the scenery.


slight blurring????.....you are blind then.

It's a tad more noticeable in real time. It also depends (as weird as it may sound) on the monitor and its age/clarity. I just did some tests on another comp with a 5900 and a quite old monitor, and the difference is truly less noticeable; even aliasing is overall less noticeable. Just a possible theory.
 
mboeller said:
Chalnoth said:
Personally, it's the blurring of the text with Quincunx that really gets me. I don't think I'd really notice the slight blurring of the scenery.


slight blurring????.....you are blind then.

You are probably comparing the 2x OGSS (supersampled shot) to the 2 MSAA based shots.

The reason it is less blurry compared to the first 2 (for those who are still wondering, the first image is straight 2x MSAA, the second is Quincunx) is that supersampled AA sharpens texture quality.

Having said that, the screenshots are quite murky. Did you make use of the "force mipmaps --> Trilinear" and the quality or high quality image setting, Ailuros?

Anyhow, here is a PSP8 difference image between Pics 1 & 2 (the 2 MSAA modes).

http://members.ozemail.com.au/~gregorystanford/settings.jpg

http://members.ozemail.com.au/~gregorystanford/difference.png

As you can see, only text and object edges differ. MSAA can't affect texture quality except at object edges.
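For anyone who wants to reproduce that kind of difference image without PSP8, here's a minimal sketch in Python (the filenames are placeholders for your own two screenshots, which must be the same size):

Code:
# Rough equivalent of the PSP8 "difference" blend: identical pixels
# come out black, differing pixels light up.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("shot_2xmsaa.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("shot_quincunx.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b).astype(np.uint8)  # per-channel absolute difference
Image.fromarray(diff).save("difference.png")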
 
radar1200gs said:
You are probably comparing the 2x OGSS (supersampled shot) to the 2 MSAA based shots.

I highly doubt he is, since I labeled each screenshot specifically as to what method was used.

The reason it is less blurry compared to the first 2 (for those who are still wondering, the first image is straight 2x MSAA, the second is Quincunx) is that supersampled AA sharpens texture quality.

We do know that difference too, ever since websites used to compare plain multisampling against supersampling performance.

Having said that, the screenshots are quite murky. Did you make use of the "force mipmaps --> Trilinear" and the quality or high quality image setting, Ailuros?

Why murky? I don't see what texture filtering has to do with anything, as long as the same filtering methods are being used in each of the three shots. It'll blur a tad more with Quincunx regardless of whether I use bi-, "bri-" or trilinear filtering.

You like Quincunx and that's perfectly acceptable. I personally cannot stand it, whether in still shots or in real time, and yes, I can see the difference on any occasion.

As you can see, only text and object edges differ. MSAA can't affect texture quality except at object edges.

Polygon edges and intersections. However, the 5-tap blur filter doesn't affect poly edges/intersections only, and that's been my major disagreement with the method all along.

Once more: Quincunx = 2xRGMS + 5-tap blur filter; while the former affects poly edges and intersections exclusively, the latter goes a tad further.
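To make that concrete, here's a sketch of the 5-tap filter using the commonly cited Quincunx weights (1/2 for the center tap, 1/8 for each diagonal tap). The real hardware taps samples at half-pixel offsets on the multisample grid; this only approximates the effect on an already-resolved image, but it shows why the blur isn't confined to poly edges: every pixel gets blended with its neighbors, textures and text included.

Code:
import numpy as np

def quincunx_blur(img):
    # img: float array of shape (H, W, C), values in 0..1
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    return (0.5   * p[1:-1, 1:-1] +              # center tap
            0.125 * (p[:-2, :-2] + p[:-2, 2:] +  # four diagonal taps
                     p[2:, :-2] + p[2:, 2:]))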
 
OpenGL guy said:
radar1200gs said:
Yes, I know what Quincunx is doing. I was waiting for someone to ask this question, actually.

The difference is that you don't HAVE to enable Quincunx if you don't want to - you have a choice. A choice you don't get with ATi and gamma correct AA.
Let's just beat that dead horse some more, shall we? When you can enable gamma correct AA on your card, maybe you'll have a point.

<<SNIP>>
http://www.nvnews.net/vbulletin/showthread.php?t=41532
[Attached screenshots: 4xaa_gc.png (4x AA with gamma correction) and 4xaa_nogc.png (4x AA without)]

Heh, I knew the 70.41's were good, just didn't know how good. Early days by the looks of things, but a sign of things to come.

Technically, of course, it's not my card doing the GC FSAA (I have NV3x, not NV4x), but it's good news for NV4x owners (Quadros use the same GPU as the consumer cards, just different drivers).
 
:rolleyes:

Heh, I knew the 70.41's were good, just didn't know how good. Early days by the looks of things, but a sign of things to come.
When ATI has gamma correction it's a bad thing, but as soon as nVidia has it, it's a good thing? Priceless.

If it is enabled on the Quadros, it is not a given that it will be on the normal boards, as FSAA performance isn't going to be as important there, and if they need extra steps to do it (which you would guess they would, if it's taken this long) it might cost more performance. Let's see some benchmarks.
 
whql said:
:rolleyes:

Heh, I knew the 70.41's were good, just didn't know how good. Early days by the looks of things, but a sign of things to come.
When ATI has gamma correction it's a bad thing, but as soon as nVidia has it, it's a good thing? Priceless.

If it is enabled on the Quadros, it is not a given that it will be on the normal boards, as FSAA performance isn't going to be as important there, and if they need extra steps to do it (which you would guess they would, if it's taken this long) it might cost more performance. Let's see some benchmarks.

No, you have my argument all wrong. Gamma Correct FSAA is only a bad thing when you have no control over it. That's why ATi's implementation is bad. GC FSAA on nVidia will hopefully be just one method out of many available to the end-user.

It will probably use more performance/resources, etc., but that doesn't mean it won't still have uses for older or less demanding games.
 
radar1200gs said:
whql said:
:rolleyes:

Heh, I knew the 70.41's were good, just didn't know how good. Early days by the looks of things, but a sign of things to come.
When ATI has gamma correction it's a bad thing, but as soon as nVidia has it, it's a good thing? Priceless.

If it is enabled on the Quadros, it is not a given that it will be on the normal boards, as FSAA performance isn't going to be as important there, and if they need extra steps to do it (which you would guess they would, if it's taken this long) it might cost more performance. Let's see some benchmarks.

No, you have my argument all wrong. Gamma Correct FSAA is only a bad thing when you have no control over it. That's why ATi's implementation is bad. GC FSAA on nVidia will hopefully be just one method out of many available to the end-user.

It will probably use more performance/resources, etc., but that doesn't mean it won't still have uses for older or less demanding games.
Did you even read the lil snippet and article?
http://www.gzeasy.com/itnewsdetail.asp?nID=16498
 
The problem is that ATI's implementation does not allow the adjustment of the gamma that is used in the recombination. For my monitor, the value that ATI uses has always been a bit high. What we really need to see is a user-selectable gamma setting for the recombination, as well as a wizard in the drivers that walks users through setting it properly for their monitor.
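Roughly, a user-selectable resolve would look like this sketch (the gamma parameter is the adjustable knob; ATI's resolve behaves as if it were fixed near 2.2):

Code:
import numpy as np

def resolve(samples, gamma=2.2):
    # samples: array of shape (N, ...), N AA samples per pixel, values 0..1
    linear = np.power(samples, gamma)    # framebuffer gamma -> linear light
    avg = linear.mean(axis=0)            # blend where light actually adds
    return np.power(avg, 1.0 / gamma)    # back to framebuffer gamma

# Blending 0.0 and 1.0 naively gives 0.5, which a gamma-2.2 monitor
# displays at only ~22% intensity; the gamma-aware blend gives ~0.73,
# which the monitor displays at the intended 50%.
print(resolve(np.array([0.0, 1.0])))     # ~0.7297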
 
I honestly cannot tell what those images are testing, seriously.

Gamma Correct FSAA is only a bad thing when you have no control over it. That's why ATi's implementation is bad.

That's bad logic. It's ATI not giving you control that is bad. The implementation can be good. Now I see why your logic is nonexistent.
 
Deathlike2 said:
Gamma Correct FSAA is only a bad thing when you have no control over it. That's why ATi's implementation is bad.
That's bad logic. It's ATI not giving you control that is bad. The implementation can be good.
ATI isn't giving you control of GC in its MSAA implementation. That's why this implementation is worse than one where such control is given.

The main thing is that there is no other GC MSAA implementation than ATI's today. So it's hardly worse than something theoretically good but non-existent for now. Let's see whether this GC MSAA on NV4x will make it into the ForceWare for GeForces, not just Quadros.
 
radar1200gs said:
Gamma Correct FSAA is only a bad thing when you have no control over it. That's why ATi's implementation is bad.
Chalnoth said:
The problem is that ATI's implementation does not allow the adjustment of the gamma that is used in the recombination. For my monitor, the value that ATI uses has always been a bit high. What we really need to see is a user-selectable gamma setting for the recombination, as well as a wizard in the drivers that walks users through setting it properly for their monitor.
This would appear to be a problem only compared to hardware that does allow for user-adjustable gamma. There is no such hardware in the consumer space as of yet, though we'll see what nV can do with those new drivers (and how much of a hit that extra shader pass will incur).

And ATI appears to be gamma correcting to the NTSC and HDTV standard of 2.2, according to RTR 2/E p.110: "A value of 2.2 has been proposed as part of a standard color space for computer systems and the Internet, called sRGB."

Control is usually preferable (when paired with education), but calling ATI's implementation "bad" or a "problem" appears disingenuous.

BTW, do LCDs offer a more standard gamma correction target than CRTs, or is it still influenced more by ambient conditions?
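As a side note on the sRGB point above: the sRGB curve isn't a pure 2.2 power law. It's a piecewise function with a linear toe that merely tracks x^(1/2.2) closely, so a fixed-2.2 resolve is a reasonable but not exact match for an sRGB display. A quick comparison:

Code:
import numpy as np

def srgb_encode(x):
    # Piecewise sRGB transfer function: linear toe, then a 2.4-power
    # segment, which together approximate an overall gamma of ~2.2.
    x = np.asarray(x, dtype=float)
    return np.where(x <= 0.0031308,
                    12.92 * x,
                    1.055 * np.power(x, 1.0 / 2.4) - 0.055)

x = np.linspace(0.0, 1.0, 5)
print(srgb_encode(x))            # sRGB encoding
print(np.power(x, 1.0 / 2.2))    # pure 2.2 power law, for comparison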
 
Except for the fact that on my previous monitor (I haven't tested rigorously on my new one), ATI's use of a gamma setting of 2.2 was so high that it was almost as bad as no gamma at all, just in the opposite direction. Not quite, but almost. So it was better than no gamma correction, but only just.
 
Chalnoth said:
Except for the fact that on my previous monitor (I haven't tested rigorously on my new one), ATI's use of a gamma setting of 2.2 was so high that it was almost as bad as no gamma at all, just in the opposite direction. Not quite, but almost. So it was better than no gamma correction, but only just.
So blame the monitor. Obviously it's not giving the correct color space.
 
OpenGL guy said:
Chalnoth said:
Except for the fact that on my previous monitor (I haven't tested rigorously on my new one), ATI's use of a gamma setting of 2.2 was so high that it was almost as bad as no gamma at all, just in the opposite direction. Not quite, but almost. So it was better than no gamma correction, but only just.
So blame the monitor. Obviously it's not giving the correct color space.

Lots of cheap monitors (and there are far more cheap monitors than expensive ones in common usage) have all sorts of trouble displaying images as intended, even when adjusted as well as possible (never mind the brightness/contrast sins their owners then commit on top of that). In addition, there are three different white points (color temperatures) in common use: 9300K, 6500K and 5500K.

On top of all that, monitors age and their image deteriorates over time, requiring adjustment until adequate adjustment can no longer be made and the monitor reaches the end of its effective life span.

ATi's hardwired solution takes none of that into account, so rather than blame the monitor, which at least allows adjustments, blame ATi for not providing any.
 
Chalnoth said:
Since when was 2.2 gamma "correct"?
Did you read my post? 2.2 was proposed as the PC standard, and apparently it's already the NTSC and HDTV standard.
 