MSAA + HDR benchmarks?

Mintmaster said:
So even if ATI doesn't improve their scores, I think if you compared the image output side by side then the X1800XT would win by a decent margin. It'll be way better than the 256MB 7800GTX's output, and notably better than the 512MB GTX's output even considering the 36% resolution loss per axis.
But that's only considering the IQ of polygon edges! Considering overall IQ, I'm not sure I'd pick 1024x768 with 4xAA over plain 1600x1200; in fact, I rather doubt it.
 
incurable said:
But that's only considering the IQ of polygon edges! Considering overall IQ, I'm not sure I'd pick 1024x768 with 4xAA over plain 1600x1200; in fact, I rather doubt it.

The other problem is LCDs and their native resolutions. The X1800 XT will imo need to handle at least 1280*1024 with MSAA + HDR, otherwise it's not that interesting for most people using LCDs (1024x768 is not a very common resolution these days). 1280*1024 with 2x MSAA is sure as hell a lot better than 1280*1024 without it, but I'd like to see if the XT can handle that in a new game first, e.g. UE3 engine based games.
 
Bjorn said:
but I'd like to see if the XT can handle that in a new game first, e.g. UE3 engine based games.
Yeah, probably not. But UE3 won't be available for some time yet, and as such we'll have much better hardware available by that time.

With the breakneck pace of hardware, it's really most important to just look at what the hardware you buy can do today. When the games come out that can use new hardware, then buy new hardware. If you don't want to pay the money, buy cheaper hardware.
 
incurable said:
But that's only considering the IQ of polygon edges! Considering overall IQ, I'm not sure I'd pick 1024x768 with 4xAA over plain 1600x1200; in fact, I rather doubt it.
No, I took everything into account (which is why I said "even considering the resolution loss").

I'd say polygon edge quality is way better, equal to at least 3200x2400 if not higher. For the rest of the image, you're only right to a certain degree. Higher resolution is only going to make a difference under texture minification, i.e. beyond the first mipmap transition.

Now, without anisotropic filtering, higher resolution gives you quite a big boost in texture sharpness, but if you play without AF (on recent cards) you clearly don't care about image quality anyway; AF is many times more important than AA. With anisotropic filtering, however, the first mipmap transition is quite far away for the majority of the screen, so higher resolution doesn't improve things much.
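Here's a rough back-of-the-envelope sketch of what I mean, in Python. It's a toy LOD model, not how any particular card actually computes it, and every number in it (eye height, texel density, pixels per radian, 16x AF) is made up purely for illustration:

import math

# Toy model: where the mipmap transitions land on a textured floor,
# with plain trilinear vs. anisotropic filtering.
def lods(dist, eye_height=1.8, texels_per_meter=256, px_per_radian=1200, max_aniso=16):
    angle = math.atan2(eye_height, dist)      # grazing angle of the view ray
    ray_len = math.hypot(eye_height, dist)
    # One screen pixel projected onto the floor, measured in texels:
    minor = (ray_len / px_per_radian) * texels_per_meter
    major = minor / math.sin(angle)           # stretched along the depth axis
    lod_trilinear = math.log2(major)          # isotropic filtering follows the long axis
    taps = min(major / minor, max_aniso)      # AF spends extra taps along the long axis...
    lod_aniso = math.log2(major / taps)       # ...so its LOD is driven by the short axis
    return lod_trilinear, lod_aniso

for d in (2, 5, 10, 20, 40):
    tri, aniso = lods(d)
    print(f"{d:3d} m ahead: trilinear LOD {tri:5.2f}, 16xAF LOD {aniso:5.2f}")

At any given distance the AF LOD is up to log2(16) = 4 mip levels lower on oblique surfaces, so the transitions land much further out and the part of the screen that actually benefits from extra resolution shrinks.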

One of my friends was saying even 640x480 is fine as long as there's 6xAA (e.g. for newer games on an older video card). That's essentially DVD quality, after all.

Bjorn said:
The other problem is LCDs and their native resolutions. The X1800 XT will imo need to handle at least 1280*1024 with MSAA + HDR, otherwise it's not that interesting for most people using LCDs (1024x768 is not a very common resolution these days).
First, remember that this is only one game; in fact, it's the first game to use HDR. Splinter Cell is another story. The point is that disabling 4xAA gives you 23% more performance, and going one resolution down gives you 50-60% more performance. It's only a coincidence that the 7800GTX looks ideal at 1280x1024 in this game. Though I suppose my post was specifically about Far Cry.

Second, you can do something like run at a custom resolution like 1280x720 to restrict LCD rescaling to only one direction. That's only 17% more pixels than 1024x768.

Third, I know it's best to run at a native resolution, but would you seriously give up AA for that? I'm going to make some comparison screenshots and start a poll, because I don't believe it.

Fourth, how many people in the market for a $500+ video card have a 1280x1024 LCD? Many will have a CRT, since CRTs are still the best for gaming thanks to their response time and resolution flexibility. Many will have a better LCD, like the popular 2001FP/2005FPW/2405FPW models from Dell, each of which has a higher resolution. Again, future games give no guarantee of playability at 1280x1024, and surely LCD owners know that.
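For reference, here's the pixel math being thrown around in this post as a quick Python sketch (assuming frame cost scales roughly with pixel count, which only really holds when you're fill-rate limited):

resolutions = {
    "1024x768":  1024 * 768,
    "1280x720":  1280 * 720,
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
}
base = resolutions["1024x768"]
for name, px in resolutions.items():
    print(f"{name:>9}: {px:8d} px, {px / base:.2f}x of 1024x768")

1280x720 is about 1.17x the pixels of 1024x768 (the 17% above), 1280x1024 is about 1.67x, and 1600x1200 is about 1.46x of 1280x1024, which is roughly in line with the 50-60% per resolution step figure.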
 
Mintmaster said:
I'd say polygon edge quality is way better, equal to at least 3200x2400 if not higher. For the rest of the image, you're only right to a certain degree. Higher resolution is only going to make a difference under texture minification, i.e. beyond the first mipmap transition.
Correction: it's going to make a difference beyond the first mipmap transition for the lower resolution, 1024x768, which will be significantly closer than the transition at 1600x1200.

As for giving up AA for resolution, that all depends on the game and the resolution difference. For example, some games just don't look very good at the highest resolutions, while others have very little polygon contrast or lots of alpha-test surfaces, either of which minimizes the improvement from FSAA.
 
Chalnoth said:
With the breakneck pace of hardware, it's really most important to just look at what the hardware you buy can do today. When the games come out that can use new hardware, then buy new hardware. If you don't want to pay the money, buy cheaper hardware.
Are you serious?

I'm still on a 9700PRO, and if I had just looked at "current games" back in early 2003, the Geforce FX was not far behind. Today, though, games like FarCry, SC:CT, HL2 and Painkiller would all suck on the latter unless I disabled PS2.0 where possible. Looking to the future is very important.

I buy hardware with the intent of reducing resolution as games get tougher. You think I should buy cheaper hardware more frequently? The 6600GT has what, maybe 35% higher performance with AA/AF on average? That's not even one resolution bump! That $150 will do a lot more good for me in the future.



Bjorn, here is the fact: ATI loses around 20% for 4xFSAA with HDR. That's comparable to the hit without HDR, and less than the hit past GPUs took without HDR. The performance hit of HDR on R520 is comparable (some games more, others less) to that of G70 with the exception of FarCry.

Unless you think AA was worthless in the past, you can't say it's worthless now.
 
Chalnoth said:
Correction: it's going to make a difference beyond the first mipmap transition for the lower resolution, 1024x768, which will be significantly closer than the transition at 1600x1200.
Well yeah, I never said otherwise. But even when you're beyond that transition, the detail improves with the square root of the pixel count, hence the 36%-per-axis number I gave. As for how many interior pixels on the screen are affected by a resolution increase, that depends on the texture resolution.
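Quick check of that square-root relationship in Python, using nothing but the two resolutions already under discussion:

import math
hi = 1600 * 1200
lo = 1024 * 768
print(lo / hi)              # ~0.41: 1024x768 has about 41% of the pixels
print(math.sqrt(lo / hi))   # ~0.64: detail per axis, i.e. the ~36% loss per axis

So a ~2.4x drop in pixel count is only a ~36% drop in per-axis detail, and even that only matters in the minified parts of the image.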

Chalnoth said:
while others have very little polygon contrast or lots of alpha-test surfaces, either of which minimizes the improvement from FSAA.
Well, if low contrast prevents you from seeing edge quality improvements, then why wouldn't it also diminish your ability to see texture quality improvements from higher resolution? I don't think this makes a resolution gain relatively more valuable than FSAA.

For alpha-tested textures, you're right, though transparency antialiasing makes sure your rendering power goes where it's needed most. In fact, I think this tips the scale even more towards FSAA, not less.
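Here's a toy 1-D sketch in Python of why plain MSAA misses alpha-test edges while transparency AA catches them. The sample positions, the cutout function and the helper names are invented purely for illustration; real hardware is more involved than this:

SAMPLES = [0.125, 0.375, 0.625, 0.875]   # 4 sample positions across one pixel

def alpha(x):
    # A cutout (alpha-tested) texture edge running through the pixel at x = 0.5.
    return 1.0 if x < 0.5 else 0.0

def plain_msaa_coverage(pixel_center=0.5):
    # The shader (and its alpha test) runs once per pixel, so the single
    # result applies to every sample: all-or-nothing across the cutout edge.
    return 1.0 if alpha(pixel_center) >= 0.5 else 0.0

def transparency_aa_coverage():
    # Supersampled alpha test: the alpha is evaluated per sample instead.
    passed = sum(1 for s in SAMPLES if alpha(s) >= 0.5)
    return passed / len(SAMPLES)

print("plain MSAA coverage:      ", plain_msaa_coverage())        # 0.0 or 1.0: hard edge
print("transparency AA coverage: ", transparency_aa_coverage())   # 0.5: smoothed edge

Geometry edges still get the per-sample treatment under plain MSAA; it's only the edges created inside a triangle by the alpha test that need the transparency AA path.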
 
Mintmaster said:
Well, if low contrast prevents you from seeing edge quality improvements, then why wouldn't it also diminish your ability to see texture quality improvements from higher resolution?
The visual deficiency of lower resolution on textures is blurring. The visual deficiency of lack of FSAA is aliasing. These are rather different effects, and it's certainly going to be possible for one to dominate over the other.

For alpha-tested textures, you're right, though transparency antialiasing makes sure your rendering power goes where it's needed most. In fact, I think this tips the scale even more towards FSAA, not less.
True, but I believe current drivers (from both vendors) have issues with transparency antialiasing when AA is turned on in the game.
 
I've got a quick question: has anybody messed with RivaTuner and tried to run one of the lower pure supersampling modes together with HDR on a G70?
 
Chalnoth said:
The visual deficiency of lower resolution on textures is blurring. The visual deficiency of lack of FSAA is aliasing. These are rather different effects, and it's certainly going to be possible for one to dominate over the other.
They are both effects related to spatial frequency. Texture detail will improve with higher resolution only when the information has a higher frequency than the display's pixel grid can resolve (i.e. during minification). Edges also carry information at a higher frequency than the pixel spacing allows.

Your argument is basically saying that lower contrast makes it harder to perceive the high-frequency step created by an edge. If that's the case, then you'll also have a harder time seeing the higher-frequency information in the texture.

I can agree with you, however, if you find the temporal aspect of edge aliasing overwhelmingly more important than the spatial aspect.
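A tiny 1-D toy in Python/numpy to show what I mean by both being spatial-frequency effects. The resolutions and the sine frequency are arbitrary, picked only so the effect is obvious:

import numpy as np

n_fine, n_coarse = 512, 32        # "texture" resolution vs. "screen" resolution
freq = 28                         # cycles across the signal; Nyquist for 32 samples is 16

x = np.linspace(0.0, 1.0, n_fine, endpoint=False)
signal = np.sin(2 * np.pi * freq * x)

# Point sampling (no prefilter, no AA): take every 16th texel.
point_sampled = signal[:: n_fine // n_coarse]

# Prefiltered (mipmap-style box filter), then sampled: average each block of 16 texels.
prefiltered = signal.reshape(n_coarse, -1).mean(axis=1)

# Point sampling keeps full contrast but at a bogus low frequency (aliasing);
# prefiltering removes the unrepresentable frequency entirely, so contrast drops (blur).
print("point-sampled peak-to-peak:", np.ptp(point_sampled))
print("prefiltered peak-to-peak:  ", np.ptp(prefiltered))

Both are failures to represent frequencies above what 32 samples can hold; point sampling turns them into a false pattern, prefiltering into blur. Which one bothers you more, especially in motion, is the real question.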
 
Mintmaster said:
Bjorn, here is the fact: ATI loses around 20% for 4xFSAA with HDR. That's comparable to the hit without HDR, and less than the hit past GPUs took without HDR. The performance hit of HDR on R520 is comparable (some games more, others less) to that of G70 with the exception of FarCry.

Unless you think AA was worthless in the past, you can't say it's worthless now.

If it's going to be 20% in most games then it's definitely a good feature. Perhaps it'll be even less when/if we get games that use the lower-precision FP blending modes. And with regards to using non-native resolutions on LCD monitors, well, my monitor isn't that good at non-native resolutions. But from what I have read in reviews, that differs quite a lot between different monitors (LCDs, of course).
 
The monitor isn't the only thing that downsamples; your eyes do too. If I show a 4096x4096 texture on a 2-inch display capable of actually resolving every pixel, the result will still be an optical downsample (as well as a retinal downsample), unless you have 20/5 vision and hold the screen up to your nose. (I read a paper a while ago about wavefront-constructed contact lenses giving someone who had 20/12 vision something like 20/5, and the subject actually perceived the world as somewhat aliased.)
 