Radeon 8500 Aniso vs Geforce 4 Aniso

So by adjusting LOD on a GF4 to match Radeon texture clarity, the LOD has to be set so low it would cause terrible texture aliasing on the GF4, aliasing which isn't apparent on the 8500.

Or did I miss something?
 
That's possible. But since I haven't seen a Radeon 8500 in action myself, I can't be sure whether what looks 'terrible' to me would look so terrible to many other people. It just does not bode well...

However, one thing to keep in mind is that the Radeon 8500 does support 16-degree anisotropic, so in some limited cases, the 8500 will show less aliasing (generally when a surface is viewed at a great distance and high angle...).

I personally didn't see the comparison, but it most certainly should have been done at the exact same degree of anisotropic filtering.
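
For what it's worth, here's a minimal sketch (mine, not from the comparison, and the helper name is made up) of how a test could pin both cards to the same degree through the GL_EXT_texture_filter_anisotropic extension, assuming the drivers expose it:

[code]
/* Sketch only: request the same degree of anisotropic filtering on
 * either card via GL_EXT_texture_filter_anisotropic, assuming the
 * extension is exposed by the driver. */
#include <GL/gl.h>
#include <GL/glext.h>   /* defines GL_TEXTURE_MAX_ANISOTROPY_EXT */

void set_aniso_degree(GLuint texture, GLfloat requested)
{
    GLfloat max_supported = 1.0f;

    /* Query the card's ceiling (16 on the 8500, 8 on the GF4). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);
    if (requested > max_supported)
        requested = max_supported;

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}
[/code]

Clamp both cards to 8x with something like that and at least the degree is out of the equation.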
 
Chalnoth,

I always understood that the Radeon's 16x was equal to nVidia's 8x. They are both 64-tap anisotropic filtering, but since the Radeon's is bilinear-based, its 16 x 4 samples = 64 taps, versus nVidia's trilinear at 8 x 8 samples = 64 taps.

I could be wrong, but that's why people here always compare the Radeon's 16x to nVidia's 8x.
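
The arithmetic as a toy snippet, under the assumption (from the figures above) that each bilinear probe reads 4 texels and each trilinear probe reads 8:

[code]
/* Toy arithmetic for the tap counts described above: each bilinear
 * probe reads 4 texels, each trilinear probe reads 8, so the total tap
 * count is (aniso degree) x (texels per probe). */
#include <stdio.h>

int main(void)
{
    int radeon_taps  = 16 * 4;  /* 16x aniso, bilinear probes  */
    int geforce_taps =  8 * 8;  /*  8x aniso, trilinear probes */

    printf("Radeon 8500 16x bilinear : %d taps\n", radeon_taps);
    printf("GeForce 4   8x trilinear : %d taps\n", geforce_taps);
    return 0;
}
[/code]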
 
The other interesting tidbit is how IQ comparisons that are so critical of the 8500's aliasing completely forgive much larger and more obvious aliasing flaws on the GF4.

Fire up Quake3, load Q3DM2, and find this area. You may apply as much anisotropic filtering and as much AA as you possibly can. Key on the areas circled in blue and watch the aliasing as you walk across the bridge (screenshot scaled to 800x600 for modem users; to see the aliasing, play at 1024x768):
q3alias.txt


This is incredibly clean on the 8500. The shimmering, crawling aliasing is NOT there at all with the 9031 drivers + 8500.

I find it humorous how I have to take screenshots, rotate textures 45 degrees, magnify 400x or otherwise make special concessions to have texture aliasing issues "pointed out" for the 8500... yet the texture aliasing on my GF4's is brutally obvious to the naked eye, and this is somehow ignored or totally overlooked.

Cheers,
-Shark
 
Randell said:
I could be wrong, but that's why people here always compare the Radeon's 16x to nVidia's 8x.

Because ATI's max anisotropy is 16, there will be some rather rare situations where it looks better. A prime case was the earlier Serious Sam shot, where the floor was viewed straight across at an extreme distance. This is obviously not a common situation, which is why I generally ignore it in arguments.

Sharkfood said:
Fire up Quake3, load Q3DM2, and find this area. You may apply as much anisotropic filtering and as much AA as you possibly can. Key on the areas circled in blue and watch the aliasing as you walk across the bridge (screenshot scaled to 800x600 for modem users; to see the aliasing, play at 1024x768):

I just have to say that I've never seen noticeable aliasing on my GeForce DDR or GeForce4. I also do not have Quake3 (nor do I care to get it), so if you can find a game that I do have, maybe I'll believe you. Right now, I just suspect that it's due to some different settings that you had on those machines, as I've never witnessed it myself, and I've certainly not seen anybody else claiming bad aliasing on a GF card.
 
This is getting long winded and boring.

When I first experienced aniso via a GF3, the difference was stark to me, but only when I specifically did such a critical analysis. It comes down to more detail further into the distance as well as less visible texture aliasing.

If I really get down to playing a game for fun, however, I will probably not miss aniso.

It's fine discussing this in terms of 3D tech but take a second for a reality check.

Screenshots are overused in this aspect. I'm to blame, so are others.
 
Ahh, good to see that anecdotal evidence, cherry picking, and self-selected samples are still a tradition when debating.


Is it that hard for people to actually comprehend that neither card will be an "overall IQ" winner, and that some implementations will look better in some situations, and some not? For every screenshot you take, you can pick areas with problems and areas with no artifacts. I can do this on every card.

Likewise, there will be performance tradeoffs and IQ tradeoffs for each and every setting on a card. Maybe one day a pixel-perfect card will come out that has zero artifacts anywhere and high performance.

This endless bickering and zooming in on your favorite screenshot is never going to prove anything.
 
I just have to say that I've never seen noticeable aliasing on my GeForce DDR or GeForce4.

That I find very hard to believe... Either that, or you have a MAG 17F monitor or similar. :)

I also do not have Quake3 (nor do I care to get it), so if you can find a game that I do have, maybe I'll believe you.

Maybe the prior statement wasn't so hard to believe after all. DO you have any games? :) I'm sure I've never seen any "texture aliasing" if I use only MS Word or Outlook, but once ya' fire up a few modern to semi-modern 3D games, it's pretty obvious.

What games DO you have? OGL or otherwise, it doesn't matter. NOLF? Maybe SS or SS:SE?

Rev-
It's fine discussing this in terms of 3D tech but take a second for a reality check.
Screenshots are overused in this aspect. I'm to blame, so are others.

I'd say a "reality check" is in order if you've played games without aliasing for any substantial amount of time then revert to a product that has blatant, obvious aliasing. Again, assuming decent gear (good monitor, games with textures with some amount of detail and contrast, etc.etc.).

The problem is screenshots DON'T show the aliasing at all... they can only be used to highlight where the problem areas are located... not that such pointing out should be needed, as it's blatantly obvious.
 
Sharkfood said:
Maybe the prior statement wasn't so hard to believe after all. DO you have any games? :) I'm sure I've never seen any "texture aliasing" if I use only MS Word or Outlook, but once ya' fire up a few modern to semi-modern 3D games, it's pretty obvious.

What games DO you have? OGL or otherwise, it doesn't matter. NOLF? Maybe SS or SS:SE?

Haven't bought a whole lot of games recently...I'm currently waiting for UT2k3 and Neverwinter Nights, but here's a summary of my current library:

Unreal Tournament
Morrowind
Alien vs. Predator
X-Wing: Alliance
Freespace 2
Half-Life
Quake 2
Unreal
Final Fantasy 7 & 8
Baldur's Gate 1 & 2
Icewind Dale
Planescape: Torment
Starcraft
Diablo II

And a bunch more older ones.

Oh, but I will have to retract my previous statement. I have seen a fair amount of texture aliasing in Freespace 2. But I'm really not ready to call that a hardware issue... I guess all I can say is that in no OpenGL game have I seen any noticeable texture aliasing.

Yeah, I haven't been getting that many new games, but since my true love in gaming lies with RPG's, that should be fairly understandable, as few decent ones have shipped lately (And I do NOT consider Diablo-style games RPG's...those are action games with character building...).
 
Chalnoth said:
Oh, but I will have to retract my previous statement. I have seen a fair amount of texture aliasing in Freespace 2. But, I'm really not ready to call that a hardware issue...

It is expected. Freespace 2 does not use MIP-mapping at all.
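
For anyone curious, here's a minimal sketch in OpenGL terms (mine, not Freespace 2's actual code, and the helper name is made up) of why that matters: a single-level texture sampled with a plain GL_LINEAR minification filter shimmers when minified, while building a MIP chain and using a mipmapped filter is what normally suppresses that aliasing.

[code]
/* Sketch only (not Freespace 2's actual code): without MIP-maps the
 * minified texture is sampled from the full-resolution image, which
 * shimmers; a MIP chain plus a mipmapped minification filter is what
 * normally suppresses that aliasing. */
#include <GL/gl.h>
#include <GL/glu.h>

void upload_texture(GLsizei w, GLsizei h, const void *rgba)
{
    /* Aliasing-prone path: single level, no MIP-maps.
       glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA,
                    GL_UNSIGNED_BYTE, rgba);
       glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); */

    /* MIP-mapped path: full chain plus trilinear minification. */
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, w, h, GL_RGBA,
                      GL_UNSIGNED_BYTE, rgba);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
[/code]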
 