Messing with LOD bias... and a question about AA and consume

Reverend said:
Aw shoot, now I've gone ahead and done it... while taking all the screenshots, a thought crept into my head to actually make an article out of this... and so I shall.

I was hoping you would. While you're working on your article, it would also be interesting to use the MDK2 and NOLF shots, like you did with your edge AA comparison of the Voodoo 5 vs. GeForce 3 :p
I assume you will be using AA for the LOD Bias shots.
 
tobbe said:
Aren't Doom's shots showing the difference between supersampling and multisampling (and the implicit LOD differences) rather than an explicit difference in default LOD?
I.e., the LOD would be the same if AA weren't used?

ATI defaults to a more aggressive LOD than NVIDIA in both D3D and OpenGL when FSAA is off.
 
Bambers said:
tobbe said:
Aren't Doom's shots showing the difference between supersampling and multisampling (and the implicit LOD differences) rather than an explicit difference in default LOD?
I.e., the LOD would be the same if AA weren't used?

ATI defaults to a more aggressive LOD than NVIDIA in both D3D and OpenGL when FSAA is off.

Doesn't that make either NV's or ATI's OpenGL a non-conformant implementation, since AFAIK mipmap selection is unambiguously defined?
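For reference, here is a minimal sketch of the kind of mipmap selection being discussed: the GL spec computes a level-of-detail value lambda = log2(rho) + bias (where rho is the texel-to-pixel scale factor) and picks a level from it. The rounding rule below is a simplification of the spec's exact thresholds, and the level count is made up for the example.

```python
import math

def mip_level(texels_per_pixel: float, lod_bias: float = 0.0,
              num_levels: int = 10) -> int:
    """Simplified nearest-mipmap selection:
    lambda = log2(rho) + bias, rounded and clamped to the level range."""
    lam = math.log2(texels_per_pixel) + lod_bias
    return max(0, min(num_levels - 1, round(lam)))

# Four texels per pixel selects level 2 at default bias (log2(4) = 2);
# a -1.0 bias picks the sharper level 1 instead.
print(mip_level(4.0))        # 2
print(mip_level(4.0, -1.0))  # 1
```

If two drivers plugged different default biases into this formula, they would pick different levels for the same scene, which is exactly the visible difference being argued about here.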
 
Cheated pictures...

Thanks tkopp! Now I can see all of the images.

Actually, if you look closer at Geforce4xAANoAniso.jpg, it can clearly be seen that a 2D blur filter was applied to it after it was taken.

The blur filter kernel is way larger than the Quincunx filter, and since it is supposed to be 4xAA it shouldn't produce full-screen blur at all.

You can compare it to the increased-LOD image and look at the edges of the closest torch. The edges look blurrier in the default-LOD image, but LOD does nothing to polygon edges. Note that this is the same amount of blur you get on the textures compared to the Radeon shot.


I find it pathetic when fans of a video card provide faked images of the other company's products, just to prove their card is the best...
(And no, I'm not an nVidia fan.)
 
Here's the magnified part of these "screenshots".

doomtrooper-compare.png


Not only is the "Geforce Normal" shot blurred, you can see that the "Radeon Aniso" shot has light pixels around the dark edges.
I won't say what it is; I'll leave it to people familiar with image filters to figure out which effect has such a side effect.

And no, JPEG compression doesn't do this. If you set the compression quality that low (another way to cheat), the image would get blocky. So these filters were applied before JPEG compression.
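The eyeball test above can be made quantitative. A crude sketch of my own (an illustration, not the method actually used in the thread): post-capture blur flattens edge transitions, so the maximum horizontal gradient at a hard edge drops well below its original value.

```python
def max_gradient(rows):
    """Largest absolute horizontal difference in a grayscale image,
    given as a list of rows of pixel values; a crude sharpness measure."""
    return max(abs(a - b) for row in rows for a, b in zip(row, row[1:]))

def box_blur_3tap(rows):
    """Crude 3-tap horizontal box blur with clamped edges, standing in
    for whatever 2D filter might have been applied to a screenshot."""
    out = []
    for row in rows:
        n = len(row)
        out.append([(row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3
                    for i in range(n)])
    return out

sharp = [[0] * 8 + [255] * 8 for _ in range(4)]  # a hard vertical edge
blurred = box_blur_3tap(sharp)
print(max_gradient(sharp))    # 255
print(max_gradient(blurred))  # 85.0
```

Running such a measure over the polygon edges of two "identical settings" screenshots would expose an after-the-fact blur without having to trust anyone's eyes.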
 
Rev-

Another interesting topic. In the V5 days, combining a slightly more aggressive LOD and then throwing on AA was a nice way of adding more detail without the expense of adding moiré, shimmering and crawl.

The 8500 is the same way, but a bit tougher to tune, as anisotropy brings a fairly aggressive LOD shift along with it. Luckily, SS does indeed clean this up nicely.

I also don't know about anyone else, but has anyone been able to get the LOD bias adjustment to work on the 8500? In early drivers it was stuck at its fuzziest setting, and now it seems to be stuck at its sharpest setting. OGLLODBias from 0 to 8 and everything in between has no effect in almost all the drivers I've tried, these being the 71xx to 912x drivers for Win9x.

I also don't think all LOD bias is created equal. As these settings are tuning some sort of hardware texel-sampling feature, it's not assured that an LOD bias setting is applied equally to all mipmaps. I've noticed on the GF3 that there is some sort of disproportionate usage per mipmap level with the standard LOD bias adjustment in D3D (I haven't studied the new LOD bias for OGL closely yet). It seems an LOD bias of -1.0 doesn't have nearly as dramatic an effect on farther mipmaps as on closer ones. On the 8500 and V5, on the other hand, an aggressive LOD bias setting is usually fairly linear across all mipmap levels.
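The "linear across all mipmap levels" behavior can be sketched numerically. Assuming (hypothetically) that texels-per-pixel grows linearly with distance, a uniform bias shifts every mip transition by the same factor: a -1.0 bias doubles every transition distance, near and far levels alike, which is what the 8500 and V5 appear to do and the GF3 apparently does not.

```python
import math

def selected_level(distance, lod_bias=0.0):
    """Idealized mip selection: lambda = log2(distance) + bias, floored.
    'distance' is a stand-in for the texel-to-pixel scale factor."""
    return max(0, math.floor(math.log2(distance) + lod_bias))

# A uniform -1.0 bias shifts every mip level by exactly one step,
# i.e. each transition distance doubles, near and far alike.
for d in (2, 4, 8, 16):
    print(d, selected_level(d), selected_level(d, -1.0))
```

A per-level weighting like the one apparently seen on the GF3 would show up here as the two columns differing by more than a constant step.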

You have your work cut out for you, Rev, as showing aliasing and crawl in screenshots is no easy task. What can look perfectly reasonable in a still shot can be a moiré-fest in motion.

Hyp-X:
On SS2-
>>"There is no way to reduce this shimmering on GF3 cards as MSAA changes nothing on textures (except Quincunx - but that's not the right way to reduce aliasing). "<<

I've been posting this *everywhere* concerning my GF3 and SS2. In the very first single-player level, as you approach the stairs, the shimmering on the steps is just horrible, and no matter what settings for LOD, filter booster or anisotropy, I just cannot remove this noise. Greatly reduce it? Yes. But remove it completely? No. On my 8500 with 4xQ SV, it's gone.

I'm still playing around with the 4xS mode with RivaTuner and NVMax. Some way to tune this out of the GF3 has to exist, and not at the expense of reducing closer mipmap detail to the point of blur.
 
Hyp-X said:
And no jpeg compression doesn't do this. If you set the compression quality to that low (another way to cheat), the image would get blocky. So these filters were applied before jpeg compression.

Whoops, meant to say that you have to be careful in looking at JPEGs, as it's better to have the raw source files.
 
Re: Cheated pictures...

Doomtrooper said:
Hyp-X said:
Thanks tkopp! Now I can see all of the images.

Actually, if you look closer at Geforce4xAANoAniso.jpg, it can clearly be seen that a 2D blur filter was applied to it after it was taken.

The blur filter kernel is way larger than the Quincunx filter, and since it is supposed to be 4xAA it shouldn't produce full-screen blur at all.

You can compare it to the increased-LOD image and look at the edges of the closest torch. The edges look blurrier in the default-LOD image, but LOD does nothing to polygon edges. Note that this is the same amount of blur you get on the textures compared to the Radeon shot.


I find it pathetic when fans of a video card provide faked images of the other company's products, just to prove their card is the best...
(And no, I'm not an nVidia fan.)

As far as I know the LOD adjustment was at default. I didn't take the shots, but this was confirmed by another Ti4600 user who posted his screenshot at default LOD.

This was his shot:
RTCW%20%231.jpg

Then, using a third-party tweaker, he adjusted the LOD to -1.5:
RTCW%20%232.jpg
 
Re: Cheated pictures...

Doomtrooper said:
As far as I know the LOD adjustment was at default. I didn't take the shots, but this was confirmed by another Ti4600 user who posted his screenshot at default LOD.

This was his shot:
...

Ok. This new shot has no added blur.

Of course, this is a completely different shot (different position, resolution), so it can't be compared to the previous ones.

Edit: I mistyped the quote tag...
 
I'm looking at LOD here, like the gun detail and the walls. The blurring, I thought, was what I used to see with Quincunx on my old GeForce 3, so I never thought anything of it. I do see the blurring is less in the newer shots, but it's still there.
 
The only way the quality thing is ever going to get a satisfactory answer is if someone sets up a blind test with motion video and enough people to get some trustworthy MOSes. Since that is not going to happen, I wish people would just stick to giving their opinions. Screenshots are fine for pointing out still-image artifacts, but screenshots don't move. Can't we just let the quality thing rest as far as proof is concerned? :( All this half-assed "proof" gets no one anywhere.
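For what it's worth, the MOS (mean opinion score) in such a blind test is just the arithmetic mean of the 1-5 subjective quality ratings each viewer assigns. A tiny sketch, with invented ratings purely for illustration:

```python
def mean_opinion_score(ratings):
    """MOS: arithmetic mean of 1-5 subjective quality ratings."""
    return sum(ratings) / len(ratings)

# Hypothetical ratings from a blind A/B test of two cards' output.
card_a = [4, 5, 4, 3, 5, 4]
card_b = [3, 3, 4, 2, 3, 3]
print(mean_opinion_score(card_a))
print(mean_opinion_score(card_b))
```

The hard part, of course, is not the arithmetic but recruiting enough blinded viewers for the numbers to mean anything.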
 
Screenshots are fine for pointing out still-image artifacts, but screenshots don't move. Can't we just let the quality thing rest as far as proof is concerned? :( All this half-assed "proof" gets no one anywhere

Well, I don't personally believe it has anything to do with "proof" so to speak, but instead with ensuring that the information or portrayals given are of high value to the gamer and don't mislead people into a false conclusion. I think that's what Rev is talking about, at least from my point of view.

There are lots of screenshots floating around of Serious Sam 2, Jedi II and the like at LOD -1.6, 8x anisotropy and 2xAA that look absolutely gorgeous... clear, defined, sharp and incredible. Amazingly enough, today's generation of cards can handle this and still obtain playable framerates for a lot of people in a lot of games.

But the unsaid thing is the effect on texture aliasing that such gorgeous shots *don't* show when in motion.

The whole topic at hand is the amount of detail LOD bias shifting reveals: the increase in detail and the loss of blur. The part being overlooked is the impact this has on texture aliasing, and for people critical of texture aliasing, that impact is self-defeating (the additional aliasing artifacts outweigh the IQ improvement from the increase in detail).

So, more appropriately, what is needed is a good discussion of the pros *and* cons of using a more aggressive texture mip LOD bias, with an eye not only on the improvements in texture detail but also on any negative impact from added moiré, shimmering and noise... as well as methods to obtain the best of both worlds: incredible texture detail with minimal aliasing.
 
It is very hard to rely on OBJECTIVE screenshots, which is why I always hoped Beyond3D would come back, as it's one of the few sites that used to say it like it is, and I hope it continues.
My major beef with LOD bias is the performance hit that cards take, which no one ever mentions. If you look at ALL the reviews done on the big sites like Anandtech, Tom's Hardware etc., LOD is never mentioned and screenshots are few to none. What they do like to do is put up nice graphs that really mean squat unless it's an apples-to-apples comparison.
If the faster card is running at a HIGHER LOD, say 0, and takes a performance hit to match the IQ of the slower one... this SHOULD be mentioned. :-?
 
But that's just as meaningless; the way different cards use the bias and perform filtering in general isn't even the same... What I am interested in is what gives the best perceptual quality. Personally, I'd love for reviewers to just tweak resolution, LOD bias, filtering and anti-aliasing to what they find optimal on each card independently in some actual games... and at the end of that, compare how much they like the way it feels against the other cards where they did the same. Sure, that just gives me their opinion, but at the end of the day that opinion means more than numbers for which I have absolutely no idea how they will correlate to my personal experience when I get the card.

I'd rather have an honest opinion than a meaningless number.

Marco

PS. I'm not saying objective measures are not valuable; I'm only saying they are valuable only as a cost-saving device... their value derives from the fact that they correlate well with the subjective experience (higher framerate/resolution/supersampling-ratio/anisotropy/etc. means higher quality, that sort of thing). But the moment that correlation breaks down, for instance because a number on one architecture does not mean the same as on another, the usefulness of the measure breaks down and it should not be used. Given how differently supersampling, texture filtering etc. are now handled on different architectures, I feel that NONE of the objective measures available to us (framerate/resolution/supersampling-ratio/anisotropy/etc.) are any longer very useful for comparing between them, whether you include the LOD bias or not. Since the quality of a screenshot does not correlate well with the quality of the moving image either, screenshots too are not very useful, again regardless of the LOD bias.
 
Bravo, MfA; that's exactly my main beef.

I think that is also why people enjoy Rev's analysis of things, as well as the commentary from *real gamers* who use the hardware and point these things out. This is where all the real, practical and valuable information is obtained.

I also enjoy the folks at 3DGPU, since they are usually pretty good about mentioning the things other sites do not, and what's not said is oftentimes much more important and valuable than what is said at most review sites.

Benchmark graphs of FPS scores at two completely different and unequal resulting image qualities are useless information. I fail to see any value at all in this standard process.
 
Two or maybe three shots taken during gameplay can show texture aliasing. You don't necessarily need 60 FPS video to detect problems in rendering. Just highlight an area of note and show two images at a slightly different angle or with slight movement. Flipping between the two can show most texture problems, with the added bonus of keeping the data files rather small. Here is an animated GIF (134 KB) I did showing texture aliasing on a Radeon card (really, only two frames would have been enough to show the moiré problem very apparently):

moire_texaliasing.gif


Just some ideas I hope are helpful.
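The two-frame flip can even be reduced to a single number. A toy sketch of my own (the "frames" below are synthetic stand-ins, not real screenshots): point-sample a fine checkerboard texture at two slightly shifted offsets and count how many pixels flip. An undersampled texture flickers far more than a magnified, well-sampled one.

```python
def sample_checker(x, y, scale):
    """Point-sample an unfiltered checkerboard; a worst case for moire."""
    return (int(x * scale) + int(y * scale)) % 2

def render(width, height, scale, offset):
    """Render one 'frame' with the view shifted horizontally by offset."""
    return [[sample_checker(x + offset, y, scale) for x in range(width)]
            for y in range(height)]

def flicker_fraction(frame_a, frame_b):
    """Fraction of pixels that change between two frames."""
    changed = sum(a != b for ra, rb in zip(frame_a, frame_b)
                  for a, b in zip(ra, rb))
    return changed / (len(frame_a) * len(frame_a[0]))

# Shift the view by half a pixel and compare frames.
coarse = flicker_fraction(render(16, 16, 0.9, 0.0),
                          render(16, 16, 0.9, 0.5))  # undersampled texture
fine = flicker_fraction(render(16, 16, 0.1, 0.0),
                        render(16, 16, 0.1, 0.5))    # well-sampled texture
print(coarse > fine)  # True
```

The same kind of per-pixel difference between two real screenshots would quantify exactly the shimmer the animated GIF shows.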
 
Noko-
Your scaled example doesn't even need more than a single still frame... and it is not symptomatic of the types of comparisons being created.
 
I have an idea on how to show texture aliasing on a web page, but I am not sure if it will work or, if it does, whether it will tax B3D's bandwidth. But I will try.
 
Hyp-X: I've also noticed the texture aliasing in SS2, and it annoys the hell out of me.

Sharkfood: You said 4xQ SV on your Radeon 8500 took away the shimmering. What resolution do you run that at? Because mine still isn't as smooth as I'd like at 1024x768 with 2xQ SV.

jack
 