Resolution increases versus performance and overall IQ

Rys

Graphics @ AMD
EDIT: Spun this side chain out of a conversation in the Radeon R9 Fury reviews thread about the performance and image quality trade-offs of ever-increasing resolutions.

Honestly, I've never really understood the fascination with chasing ever increasing resolutions for game rendering. I'm fully behind it for regular computer use, to get great text quality and overall fidelity on the desktop, but for games I really struggle to see the point. I'd much rather developers focused their efforts on fantastic looking pixels and stable frame delivery at lower resolution, with great hardware upscaling if you've got a high DPI display. Gaming is not what high DPI is for.

Nirvana for me is a 4K 100Hz+ variable refresh rate display and a single high-end GPU with a great scaler. I'd run at 4K for non-gaming use, then for gaming it'd be 1080p or 1440p at good framerate, without worrying about v-sync, with the best looking pixels possible.

Better looking and fast, stable delivery should come waaaaay before natively rendering at 4K.
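A minimal sketch of the scaling arithmetic behind that choice of render resolutions, assuming a 3840x2160 panel and a simple hardware scaler (the panel size and the Python framing are assumptions for illustration):

# Scale factors from common render resolutions to a 3840x2160 panel.
# 1080p maps exactly 2x per axis (each source pixel covers a 2x2 block),
# so a hardware scaler can upscale it without fractional filtering;
# 1440p is a 1.5x fractional scale and needs interpolation.
panel_w, panel_h = 3840, 2160
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    sx, sy = panel_w / w, panel_h / h
    print(f"{w}x{h} -> 4K: {sx:.2f}x per axis, {sx * sy:.2f}x the pixel count")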
 

There's at least one thread somewhere about this. There are games where higher resolution is definitely useful, the primary example would be flight simulators (where noticing a bandit that occupies a single pixel as early as possible can make the difference between life and death) but I think it applies to first-person shooters too, just to a lesser degree.

For other kinds of games it matters a lot less, but of course as consumers we have no control over the quality of the lighting (beyond choosing the highest game settings), so the only thing we can actually chase is higher resolution. For game developers, it is probably easier to pick a single rendering technique with switchable options and then scale the resolution from one device to another than to use radically different rendering techniques for each target device.

In a market dominated by multi-platform games, I think that's an important factor.
 
Yeah, I'm not saying it's dumb to chase high resolution for all games, but for the vast majority it's pretty stupid. Higher resolution does more to waste the most precious and costly resource a GPU has (bandwidth), and generates more back pressure for a modern GPU microarchitecture to soak up, than anything else in modern rendering.
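A minimal back-of-the-envelope sketch of why that is; the bytes-per-pixel figures below are illustrative assumptions, not measurements from any particular engine:

# Framebuffer footprint scales linearly with pixel count, so per-pass
# read/write traffic at 4K is roughly 4x that of 1080p before compression.
# Assumed: 4 bytes/pixel back buffer, 16 bytes/pixel "fat" G-buffer.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} Mpix, "
          f"~{pixels * 4 / 2**20:.1f} MB back buffer, "
          f"~{pixels * 16 / 2**20:.1f} MB G-buffer")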
 
It's understandable that gaming keeps progressing towards higher resolutions, as long as GPUs can keep pace.
[attached image: resolution comparison with an FHD call-out]
 
Haha. So if the big image shown 1:1 on my "FHD" monitor shows more detail than the little FHD call out bubble thing, then what?
 
then for gaming it'd be 1080p or 1440p at good framerate, without worrying about v-sync, with the best looking pixels possible.
You are aware scaling doesn't look great? Do you run games at 960x540 upscaled on a 1080p display?
 
Lots of things in realtime graphics today don't look great. All weighed up, for me, there's a big list of things ahead of cheap upscaling that make me think something looks bad.
 
Ah so we should forget resolutions that are actually playable in favor of 4k slideshows. Good point.

So you are saying that lowering resolution is better than lowering other details like AA? Really?

Yes, you should forget. Because we all need to move forward to 4K as soon as possible.
I do not want to destroy my eyesight with ugly resolutions. :(
 
Ah so we should forget resolutions that are actually playable in favor of 4k slideshows. Good point.

That's an unfair criticism on its own. Ultra settings used in reviews have diminishing returns considering the performance hit they incur. Big Kepler and Hawaii could handle it well enough with a few settings turned down, without much loss in IQ.

http://forums.anandtech.com/showthread.php?t=2364951

Besides, games that are slideshows at 4K would still be demanding enough at lower resolutions to take the CPU bottleneck out of the equation for AMD cards, which seems to be their major stumbling block at lower resolutions.

The worst scenario for AMD cards is with 120Hz monitors. The best is VSR at lower resolutions, where they had a slide showing next to no performance hit for it, while NVIDIA cards incur a single-digit percentage hit or so with DSR, according to a review I read. Of course, I haven't seen a reviewer test it that way.
Kudos to AMD's marketing division. :LOL:
 
So you are saying that lowering resolution is better than lowering other details like AA? Really?

Yes, increasing resolution can't compensate for lighting or geometry. You would be much better off using any spare rendering time on those things.

Yes, you should forget. Because we all need to move forward to 4K as soon as possible.
I do not want to destroy my eyesight with ugly resolutions. :(

If 1080p "destroys your eyesight" you should really go see a doctor.

That's an unfair criticism on its own. Ultra settings used in reviews have diminishing returns considering the performance hit they incur. Big Kepler and Hawaii could handle it well enough with a few settings turned down, without much loss in IQ.

Agreed on diminishing returns for maxing out settings in most games. Same can be said for resolution though. Going from 1080p to 4k is nice - I use DSR wherever I can. But after a certain minimum PPI it's better to spend processing power on prettier pixels and not just more of them.
 
DSR and 4K aren't comparable though. Shadows on very high vs. ultra usually don't look different enough to justify the performance loss.

Hardwareluxx is the most Fury-favouring review as of yet (still waiting for ixbt's), even if you aren't considering only 4K ultra. Taking a 50 fps average as the target:

Crysis 3: 1600p, 1xAA 1xAF: 27%
Bioshock: 4K ultra: 28%
BF4: 4K, 1xAA 1xAF: 34%
CoH2: 1600p, no AA, 1xAF: 12% (AA seems to cut performance in half)
Metro LL: 1600p, no AA, 1xAF: 19%
Tomb Raider: 4K, FXAA: 36%

And a loss in Grid2.
 
So you are saying that lowering resolution is better than lowering other details like AA? Really?

Yes, you should forget. Because we all need to move forward to 4K as soon as possible.
I do not want to destroy my eyesight with ugly resolutions. :(
To my eyes 1080p with 4x rotated grid supersampling looks better than 4k with no AA. Jaggies blow.

We should focus on getting realistic visuals at 1080p (even target 720p if you want, but make sure you have lots of good AA).

If you think about it, I don't think many of us would complain about 1080/720p HDTV quality visuals.
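For what it's worth, the sample budgets in that comparison work out the same; a minimal arithmetic sketch, assuming 4 supersamples per 1080p pixel versus one sample per native 4K pixel:

# 1080p with 4x supersampling shades as many samples as native 4K,
# but resolves them down to 1080p pixels: spatial resolution is traded
# for edge quality. A rotated grid only changes where the 4 samples sit
# inside each pixel, to cover near-horizontal/vertical edges better.
w, h = 1920, 1080
samples_1080p_4x_ssaa = w * h * 4
samples_4k_no_aa = (2 * w) * (2 * h)
print(samples_1080p_4x_ssaa, samples_4k_no_aa)    # 8294400 8294400
print(samples_1080p_4x_ssaa == samples_4k_no_aa)  # True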
 
They probably meant 2x SSAA - but yes, 1x SSAA would mean taking 1 sample per pixel.
 
To my eyes 1080p with 4x rotated grid supersampling looks better than 4k with no AA. Jaggies blow.

We should focus on getting realistic visuals at 1080p (even target 720p if you want, but make sure you have lots of good AA).

If you think about it, I don't think many of us would complain about 1080/720p HDTV quality visuals.
Agreed.
What would be interesting is rendering with something like 8xMSAA into a 720p buffer and using a custom resolve to 1080p (perhaps with some edge-finding algorithm as well).
This would give thin geometry superior definition compared to native 1080p, while reducing overall shading work, or at least giving better sample placement.
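A minimal CPU-side sketch of that kind of custom resolve, in Python with NumPy; the 8x sample offsets, buffer layout, and plain box filter are illustrative assumptions, not any particular API's resolve path:

import numpy as np

# Illustrative 8x sample positions inside a source pixel, in [0,1)^2.
# Real hardware MSAA patterns differ; these are just spread-out points.
SAMPLE_OFFSETS = np.array([
    [0.0625, 0.4375], [0.4375, 0.8125], [0.8125, 0.5625], [0.3125, 0.1875],
    [0.6875, 0.9375], [0.9375, 0.0625], [0.1875, 0.6875], [0.5625, 0.3125],
])

def custom_resolve(msaa, src=(1280, 720), dst=(1920, 1080)):
    # msaa: (src_h, src_w, 8, 3) colour samples. Each 720p pixel carries 8
    # samples; each 1080p output pixel covers a 2/3 x 2/3 patch of a source
    # pixel, so it receives ~3.6 samples on average. Splat every sample into
    # the output pixel its position lands in and average (a box resolve);
    # an edge-finding pass could reweight the samples here instead.
    sw, sh = src
    dw, dh = dst
    out_sum = np.zeros((dh, dw, 3))
    out_cnt = np.zeros((dh, dw, 1))
    for s, (ox, oy) in enumerate(SAMPLE_OFFSETS):
        dx = np.clip(((np.arange(sw) + ox) * dw / sw).astype(int), 0, dw - 1)
        dy = np.clip(((np.arange(sh) + oy) * dh / sh).astype(int), 0, dh - 1)
        np.add.at(out_sum, (dy[:, None], dx[None, :]), msaa[:, :, s, :])
        np.add.at(out_cnt, (dy[:, None], dx[None, :]), 1.0)
    return out_sum / np.maximum(out_cnt, 1.0)  # guard the rare empty pixel

frame = np.random.rand(720, 1280, 8, 3)       # stand-in 8xMSAA 720p buffer
print(custom_resolve(frame).shape)            # (1080, 1920, 3)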
 
I have a native 1440x900 monitor and a 1920x1080 TV, so a resolution increase beyond that is wasteful for me. Sure, it gives better AA, but I prefer cheap post-AA and higher overall graphical FX.
 
Even FSAA isn't a substitute for higher resolution. At some point more pixels are needed to resolve the finer details, while AA is simply masking out the high-frequency artefacts and discontinuities. I think the screen resolution should go up in tandem with the overall graphics fidelity and detail. After all, what's the point of running an old title like Q2 in UHD or QHD mode if the assets' definition is well below that point? Sure, it provides cleaner detail for objects in the distance as a byproduct.
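A minimal 1D toy sketch of that distinction; the half-pixel-wide feature and the sample positions are made up purely for illustration:

import numpy as np

def sample(positions, start=10.55, width=0.5):
    # 1.0 where a sample lands on a bright feature half a pixel wide.
    return ((positions >= start) & (positions < start + width)).astype(float)

# a) 1 sample per pixel at base resolution: with this alignment the feature
#    is missed entirely; shift it slightly and it pops to full brightness.
base = sample(np.arange(32) + 0.5)

# b) 4x supersampling at base resolution: the feature is always caught, but
#    gets averaged down to half brightness across a full-width pixel. The
#    artefact is masked; the detail is not resolved.
ss = sample((np.arange(32)[:, None] + (np.arange(4) + 0.5) / 4).ravel())
ss = ss.reshape(32, 4).mean(axis=1)

# c) 2x resolution, 1 sample per pixel: the feature gets (roughly) its own
#    half-width pixel at full brightness, i.e. it is actually resolved.
hires = sample((np.arange(64) + 0.5) / 2)

print(base.max(), ss.max(), hires.max())  # 0.0 0.5 1.0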
 
Even FSAA isn't a substitute for higher resolution. At some point more pixels are needed to resolve the finer details, while AA is simply masking out the high-frequency artefacts and discontinuities. I think the screen resolution should go up in tandem with the overall graphics fidelity and detail. After all, what's the point of running an old title like Q2 in UHD or QHD mode if the assets' definition is well below that point? Sure, it provides cleaner detail for objects in the distance as a byproduct.

That could be useful for Quake 2 snipers. :D
 