Which is better, higher res and no AA, or lower res and AA?

Which resolution/AA mode is best?

  • 1280*960 with 4X AA
    Votes: 0 0.0%
  • 1024*768 with 6X AA
    Votes: 0 0.0%

  Total voters: 208
1280x1024 w/4x or 2x FSAA for me (Radeon 9500NP - monitor is a 5:4 Samsung 172T-MM LCD).

I am still surprised how well this card performs - I was prepared for the worst when the 9500NP-to-9700 mod failed, but it's actually not bad at all. :)

MuFu.
 
JF_Aidan_Pryde said:
Many people back then argued that 800x600@4RGAA is better/just_as_good as 1600x1200. By the same logic, 80x60@40xAA would be just as good as 1600x1200 no AA.

I realise it's not crucial for the argument, but shouldn't that be 80x60@400xAA? (In order to raise sampling as much as pixel count is lowered.)
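The scaling correction above can be sanity-checked in a couple of lines (a throwaway sketch, not from the thread): to keep the total sample count constant, the AA factor must rise by the same factor by which the pixel count falls.

```python
# Going from 1600x1200 (no AA, 1 sample per pixel) down to 80x60,
# the pixel count drops by a factor of 400, so matching the total
# sample count requires 400x the samples per pixel, not 40x.
base_pixels = 1600 * 1200   # reference resolution, 1 sample/pixel
low_pixels = 80 * 60        # drastically reduced resolution
aa_needed = base_pixels // low_pixels
print(aa_needed)            # 400
```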
 
Nick said:
800x600 with 16x :D

When playing an action game, you won't see the pixels even at 640x480. Unless of course 'patience' is your kind of action game, or you have a 22" monitor at 10 cm.

Maybe, but the image will look "blurrier" due to fewer pixels being on the screen, whereas playing at 1024x768 and higher will look "sharper".

I play 1024x768 as a minimum
 
As others have said, 1280x1024 at whatever AA mode I can get acceptable performance out of, given that this is the native resolution of my LCD display.

Nite_Hawk
 
UberLord said:
Maybe, but the image will look "blurrier" due to fewer pixels being on the screen, whereas playing at 1024x768 and higher will look "sharper".

I play 1024x768 as a minimum
I think you might have misunderstood my argument. When playing a fast-paced game (i.e. you're not standing still at any moment), you absolutely don't see the individual pixels. It already looks blurred just because you're moving fast.

Bigger resolutions don't change this much. You just don't have the time to see those extra pixels. Only when the pixels are so big that you directly see them as boxes instead of dots (lower than 640x480 on a 17") does resolution start to matter.

Most people probably just decided what to answer to this poll by looking at a screenshot, or at least by not moving very fast. That's just as dumb as judging image quality in Minesweeper...
 
I hate games (like NWN) that render the overlaid text too small when you up the resolution.

(At least I think it was NWN. I think tribes2 did the same thing. And BF1942)
 
They added "high res fonts" a patch or two ago. Just a touch too big for 1280x960/1024 (for me), but just right for 1600x1200 I think.
 
Nick said:
UberLord said:
Maybe, but the image will look "blurrier" due to fewer pixels being on the screen, whereas playing at 1024x768 and higher will look "sharper".

I play 1024x768 as a minimum
I think you might have misunderstood my argument. When playing a fast-paced game (i.e. you're not standing still at any moment), you absolutely don't see the individual pixels. It already looks blurred just because you're moving fast.

No, I don't see the individual pixels, but the overall picture seems sharper, more clear, better defined.

And is UT or UT2003 not fast paced enough? :LOL:


Bigger resolutions don't change this much. You just don't have the time to see those extra pixels. Only when the pixels are so big that you directly see them as boxes instead of dots (lower than 640x480 on a 17") does resolution start to matter.

But you see more pixels.
For a really good example, take a top-of-the-range digital camera and a 3-year-old digital camera. Take the same photo with each, get them printed on A4 paper at a decent camera shop, and compare. You'll notice that the newer camera's picture is much clearer and sharper. This is primarily due to the pixel density roughly doubling on modern cameras.
 
IMO, there's no straight answer for this... When I am using my 21" FDTrinitron, bigger resolution wins over AA. But when I am using my TFT (primarily only during vacations, because I'd rather not move a 21 inch CRT around on every trip I need to take my computer with me... ;) ) 1024x768 is the optimal resolution, so extra AA comes in handy.

Of course, whenever performance stays acceptable, I try to select the best combination of the biggest resolution and 4xAA with at least 8xAF. :)

EDIT: uhh... it seems I really need some sleep... way too high a typo-to-word ratio.
 
Mintmaster said:
I assume that's a GF3/GF4 or something? I think the 9700's 4x FSAA has a marked improvement over its 2xFSAA, so 1280x1024 w/ 4xAA is significantly better than 1600x1200 w/ 2xAA. My choice would be reversed with a GF4, however, due to the ordered grid 4x.

Well, not in my case. I tested with my own program with my own sample pattern (similar to VSA-100's). 4X FSAA is still not good enough. I don't think that ATI can be much better than that.

I guess Nagorak should specify the card that we're using :)

That's true. NVIDIA's 4X OGSS is not worth it. 4XS is a bit better though.
 
Nick said:
I think you might have misunderstood my argument. When playing a fast-paced game (i.e. you're not standing still at any moment), you absolutely don't see the individual pixels.
I disagree. I have a very nice 22" Iiyama CRT, and at 1024x768 I can still see the individual pixels in the game, even in motion, even in fast FPS games. At anything less than that it becomes annoying. For example: at 800x600 the actual scanlines on the monitor become very visible.

For reference, I run my desktop at 1792 x 1344.
 
Nick said:
I think you might have misunderstood my argument. When playing a fast-paced game (i.e. you're not standing still at any moment), you absolutely don't see the individual pixels. It already looks blurred just because you're moving fast.

Bigger resolutions don't change this much. You just don't have the time to see those extra pixels. Only when the pixels are so big that you directly see them as boxes instead of dots (lower than 640x480 on a 17") does resolution start to matter.

Most people probably just decided what to answer to this poll by looking at a screenshot, or at least by not moving very fast. That's just as dumb as judging image quality in Minesweeper...

This is simply NOT true.
In FPSs like Serious Sam, where there are large draw distances, high res ABSOLUTELY impacts your gaming performance.
It's the difference between seeing an enemy and not seeing him.
Anything below 1024x768 is a large disadvantage in SS/SS:SE, just as an example.
 
Given that many of the factors (such as the display device) are left completely unspecified, I would simply choose the option with the highest sampling rate. The options equate to:
  1. 3.8x10^6 samples
  2. 4.9x10^6 samples
  3. 4.7x10^6 samples
This implies that "2" would be the best choice.
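The arithmetic behind the two options visible in the poll (resolution pixel count times AA sample factor) can be reproduced like this; a quick sketch, with the 3.8x10^6 figure for the first option taken as given since that option isn't quoted in the thread:

```python
# Total samples per frame = horizontal pixels * vertical pixels * AA factor.
options = {
    "1280*960 with 4X AA": 1280 * 960 * 4,
    "1024*768 with 6X AA": 1024 * 768 * 6,
}
for name, samples in options.items():
    print(f"{name}: {samples / 1e6:.1f}x10^6 samples")
# 1280*960 with 4X AA: 4.9x10^6 samples  (the highest of the three)
# 1024*768 with 6X AA: 4.7x10^6 samples
```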
 
I don't play fast-paced FPSs, I play creeping, slow, methodical FPSs. Half-Life, Unreal, Thief, Deus Ex, NOLF, just to name a few. I like to sneak around a lot and catch sight of my enemies before they can see me. When you're crawling around a corner, or trying to see if there's a monster behind that rock in the distance, you need all the detail you can get. Thus, resolution is the top priority.

However, because I don't run around like a chicken with my head cut off, taking unnecessary damage and wasting all my ammo, visual quality becomes a lot more apparent. Even at 1280x1024 you can notice pixel popping and jagged edges fairly easily. If my card were capable of running 4x FSAA at that resolution while still getting acceptable framerates, it would be the ideal setting IMO. Unfortunately, the only game I can do that in is Thief, and the performance drops so much that I don't think the quality gained is worthwhile. Trying to make Thief look good with its 8-bit textures is an exercise in futility anyway ;)

Now, what was my point again? Oh yeah--my monitor can't do 1600x1200, so I voted for 1280*960 with 4X AA.
 
I voted 1024x768 with 6xFSAA because of the game BF1942. My video card is an R9700 Pro, and when I play BF1942 online the difference is very noticeable with 4xFSAA turned on; everything I look at in this game is so smooth. Sweet :LOL:
 
UberLord said:
But you see more pixels.
For a really good example, take a top-of-the-range digital camera and a 3-year-old digital camera. Take the same photo with each, get them printed on A4 paper at a decent camera shop, and compare. You'll notice that the newer camera's picture is much clearer and sharper. This is primarily due to the pixel density roughly doubling on modern cameras.
LOL, what did I just tell you? Don't compare it with screenshots or photographs, but with a game in the heat of the action!
 
Bigus Dickus said:
I disagree. I have a very nice 22" Iiyama CRT, and at 1024x768 I can still see the individual pixels in the game, even in motion, even in fast FPS games. At anything less than that it becomes annoying. For example: at 800x600 the actual scanlines on the monitor become very visible.

For reference, I run my desktop at 1792 x 1344.
Dickus, I already said in my first post that bigger monitors are an exception when they are not placed further away. BTW, the fact that you start to see scanlines at 800x600 has nothing to do with resolution or antialiasing from a theoretical point of view; it's just a flaw of CRT monitors that shouldn't be 'solved' by using higher resolutions. It's a technical limitation even the best CRTs have, but it shouldn't influence your answer to this poll.

It's simply a physical fact that on a 'perfect' monitor at the correct viewing distance, the pixels in 1024x768 mode are too small to see individually. In other words, you can't tell the difference between one black dot and two grey dots next to each other. That's when anti-aliasing matters. If this isn't the case on your CRT and you think it's bad, then next time buy a 17" LCD and you won't have this problem.

Sitting further away when playing might also help :LOL:
 