*downscaled* Resolving Resolution

Well that video isn't actually of Crysis being played (and even if it were, the viewer wouldn't be the person playing) so it doesn't actually go against Shifty Geezer's point.

What that video does show though - and with startling clarity - is that there are lots of entirely worthwhile and valuable things to spend processing power on instead of or in addition to increasing the resolution.

The difference in resolution is way down the list of things that you'd want to improve first if you were trying to make Crysis on the 360 have the same visual impact as the maxed out, pimped out, processor and memory wasting PC version.

Edit: If you did have 2-3 times the power to spend on 360 Crysis and all you used it for was increasing the resolution to 1920x1080, you'd have either a) done it because that's all the business case called for (time, money and sales), or b) messed up.
 
Shifty, I'm sure a man such as yourself can see how blurry the upscaled shot looks without resorting to excuses such as 'It's not the same art'.
The excuses aren't there to change people's perception. Is the upscaled game worse looking than the 1080p native version? Yes, and indisputably so. But would the 720p game look horrible on my TV whereas the 1080p version wouldn't? No. Certainly not to the degree that a static screen suggests. 1080p would be better, but by a degree. In the same way I can watch 720p films on a 1080p TV and not be aware that they aren't 1080p without looking them up. It depends on what distance you're viewing from, what your TV is like (I've seen some pretty atrocious smearing on LCDs), and the nature of the content.

The simple, overzealous assertion "upscaling is horrible" doesn't gel with most people's experience, even if that's how you feel about it.
 
Static screenshots aren't indicative of how the thing looks and feels when being played.
But would the 720p game look horrible on my TV whereas the 1080p version wouldn't? No. Certainly not to the degree that a static screen suggests.

Why do you always write that static screenshots would be inappropriate to demonstrate the differences? It's not as if static screenshots show something completely different from what you see when you play a game, is it?

In the same way I can watch 720p films on a 1080p TV and not be aware that they aren't 1080p without looking them up.

Film does not suffer from the aliasing which videogames usually suffer from. So that might not necessarily be an appropriate analogy (at least as long as there are jaggies in videogames).

And upscaling usually makes aliasing even worse. It's almost like putting a magnifying glass over those jaggies.

Upscaling from 1280x720 to 1920x1080 for example makes those jaggies quite a bit "bigger".

And then there also is this thing that is actually not shown in static screenshots: jaggies that suddenly start to "crawl" as soon as there is movement in the scene... :eek:
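
To put a rough number on the "bigger" jaggies point: here is a minimal sketch (plain Python, my own illustration, not anything from the thread) of what a nearest-neighbour 1280x720 to 1920x1080 upscale does to a one-pixel stair step on an aliased edge.

Code:
SRC_W, SRC_H = 1280, 720
DST_W, DST_H = 1920, 1080

axis_scale = DST_W / SRC_W                       # 1.5x per axis
pixel_scale = (DST_W * DST_H) / (SRC_W * SRC_H)  # 2.25x total pixels
print(f"{axis_scale:.2f}x per axis, {pixel_scale:.2f}x pixels overall")

# A perfectly aliased diagonal edge in the 720p frame: the edge moves one
# column per row (a one-pixel stair step).
src_edge = list(range(10))

# Nearest-neighbour upscale: display row y samples source row int(y / 1.5),
# and the edge column is stretched by the same 1.5x factor.
dst_edge = [round(src_edge[int(y / axis_scale)] * axis_scale) for y in range(15)]
print(dst_edge)  # rows repeat and the edge now jumps two columns at a time:
                 # the one-pixel step has become a ~1.5 pixel step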
 
Why do you always write that static screenshots would be inappropriate to demonstrate the differences?
Because it's true. But that doesn't mean motion masks upscaling effects 100%. It only makes them less noticeable.

Film does not suffer from the aliasing which videogames usually suffer from. So that might not necessarily be an appropriate analogy (at least as long as there are jaggies in videogames).
So if the choice is rendering 1080p with no AA, or 720p with AA that eliminates jaggies, which is then better?

And upscaling usually makes aliasing even worse. It's almost like putting a magnifying glass over those jaggies.

Upscaling from 1280x720 to 1920x1080 for example makes those jaggies quite a bit "bigger".

And then there also is this thing that is actually not shown in static screenshots: jaggies that suddenly start to "crawl" as soon as there is movement in the scene... :eek:
I don't understand this whole argument. I never said upscaling is perfect and has no downside. I never said graphics in motion have no upscaling problems. Just that upscaled graphics aren't 'horrible' universally and a pain to watch. In most cases a typical player probably can't tell much difference, and with improved IQ at rendering eliminating aliasing issues, it'd be even less noticeable next-gen. People can present comparison vids and screenshots all day long, but that doesn't change the fact that me and many others play upscaled 720p games without our eyes being gouged out of their sockets by the ghastly mess. It also doesn't change the fact that we watch plenty of 720p content upscaled without checking the screen to see if someone's been smearing Vaseline all over it.

On a scale of 0 to 10 for horribleness, where 0 is unwatchable and 10 is pixel-perfect loveliness, standard SD pictures pixel-upscaled to 1080p would count as a 1, and I'd place 720p graphics upscaled to 1080p at an 8.
 
On a scale of 0 to 10 for horribleness, where 0 is unwatchable and 10 is pixel-perfect loveliness, standard SD pictures pixel-upscaled to 1080p would count as a 1, and I'd place 720p graphics upscaled to 1080p at an 8.


I'd go maybe a 7...

I really hope we move to 1080P next gen. My brother really made me conclude that a few months ago; he's a PC gamer and mentioned how he hates the low res of console games when he plays them (and he's not really a freak about any visual things).

Ever since he said that, I have tended to agree with him.

I agree, though normally 720P on a 1080P display doesn't bother me on consoles; I'm used to it (but it would drive me crazy to do anything on my PC upscaled). But I would call it low res in this day and age and hope next-gen consoles pack the punch to move up.
 
Upscaling from 1280x720 to 1920x1080 for example makes those jaggies quite a bit "bigger".

Except you have twice the GPU resources to spend on those 1280x720 pixels, so you can have much better filtering and thus much higher quality pixels.

Would you rather play a game at 1920x1080 with no anisotropic filtering and no anti-aliasing, struggling to run at 30 fps, or play at 1280x720 with 16x anisotropic filtering and 2x or 4x MSAA at a rock solid 30 fps, with more complex shaders to boot?

I'd pick the latter every day of the week.

Cheers
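
For what it's worth, the per-pixel budget behind that trade-off is easy to put a number on. A back-of-the-envelope check (plain Python, my own illustration; the 30 fps figure is just the one used in the post above):

Code:
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_720  = 1280 * 720    #   921,600 pixels
print(f"1080p shades {px_1080 / px_720:.2f}x the pixels of 720p")   # 2.25x

# At a fixed GPU budget, the time available per pixel scales the same way.
frame_ns = 1e9 / 30     # one 30 fps frame, in nanoseconds
print(f"~{frame_ns / px_1080:.0f} ns/pixel at 1080p vs "
      f"~{frame_ns / px_720:.0f} ns/pixel at 720p")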
 
I really hope we move to 1080P next gen. My brother really made me conclude that a few months ago; he's a PC gamer and mentioned how he hates the low res of console games when he plays them (and he's not really a freak about any visual things).

Ever since he said that, I have tended to agree with him.

I agree, though normally 720P on a 1080P display doesn't bother me on consoles; I'm used to it (but it would drive me crazy to do anything on my PC upscaled). But I would call it low res in this day and age and hope next-gen consoles pack the punch to move up.

I have way more trouble with what I consider crap framerate (~30) and low-res textures than with rendering resolution.
720p might be just fine if that's the price to pay for AA and high quality scenes, along with rock solid 60fps.
But I digress...
 
Except you have twice the GPU resources to spend on those 1280x720 pixels, so you can have much better filtering and thus much higher quality pixels.

Would you rather play a game at 1920x1080 with no anisotropic filtering and no anti-aliasing, struggling to run at 30 fps, or play at 1280x720 with 16x anisotropic filtering and 2x or 4x MSAA at a rock solid 30 fps, with more complex shaders to boot?

I'd pick the latter every day of the week.

Cheers

Honestly I'd probably pick the former. I'm about as far from a framerate stickler as you can get; 25-30 FPS with maxed gfx is best imo. The only thing I'd like is to have some AF on the 1080P choice, maybe 2X or 4X. Give me that and I'll easily take the 1080P option there.

Obviously it's a personal matter, but why stop at 720p? Have even prettier pixels at 600P, or 480P.

It's all a tradeoff. Last gen "we" decided it was overall better to have uglier pixels at ~720P than prettier ones in SD. I think the time to move to 1080P is nigh. Additionally, the strain of moving to 1080P is less (2.25x the pixels of 720P, vs 3x in the 480P>720P transition).
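
Rough numbers for that last point (plain Python, my own figures; the SD baseline is assumed to be 640x480, and anamorphic 720x480 would give ~2.7x rather than 3x):

Code:
sd     = 640 * 480      #   307,200 pixels
hd720  = 1280 * 720     #   921,600 pixels
hd1080 = 1920 * 1080    # 2,073,600 pixels

print(f"SD   -> 720p : {hd720 / sd:.2f}x the pixels")      # 3.00x
print(f"720p -> 1080p: {hd1080 / hd720:.2f}x the pixels")  # 2.25x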

60 FPS is a never ending battle you'll never win, so might as well leave that out :p
 
Because it's true. But that doesn't mean motion masks upscaling effects 100%. It only makes them less noticeable.

I'd say motion makes aliasing and other upscaling problems more noticeable. Shimmering is more noticeable than static jagged edges.
 
Film does not suffer from the aliasing which videogames usually suffer from. So that might not necessarily be an appropriate analogy (at least as long as there are jaggies in videogames).
So if the choice is rendering 1080p with no AA, or 720p with AA that eliminates jaggies, which is then better?

Why 1080p without AA?

On a native 1920x1080 display, even 1920x1080 with just SMAA 1x (Preset "High" or "Ultra") looks better than 1280x720 with supersampling anti-aliasing upscaled to 1920x1080.

The upscaled 1280x720 image with supersampling anti-aliasing still looks "soft"/"blurry"/"out of focus"/etc. due to the upscale to 1920x1080.

But the native 1920x1080 image with just SMAA 1x looks a lot "cleaner"/more "precise"/more "detailed"/etc. (SMAA appears to be quite a bit less "blurry" than FXAA).

So even supersampling anti-aliasing apparently cannot prevent upscaling from making the picture blurry and so on.

And something else:

Why do you try to compare playing videogames with watching film?

Both are completely different things, don't you think?

You interact with videogames. You play them.

With film you don't. With film you're just a "spectator".
 
It depends (as ever, there's no simple black-and-white, does-or-doesn't answer). Jaggie motion can be distracting, but persistence of vision (or of the display) can also overlay multiple different jaggies in the same space over a small period of time, contributing to a form of temporal AA. Furthermore, jaggie perception is affected by contrast, and in some games a lack of AA isn't very noticeable, so the choice of higher resolution to combat them has to be factored against the cost. Plus there's the option of AA when going with lower resolution.

Or in short, there's no right/wrong answer, despite some claims that lower resolution is always bad. A person who's super sensitive to lower resolution would prefer a higher resolution, naturally, although I dare say better IQ enhancements might meet with their approval more often than not, and that's the choice facing devs. And there's also the issue of temporal resolution as well. The blinkered pursuit of higher resolution is not the most sensible approach when designing a game's output targets, and could result in a far worse experience than more flexible compromises.
 
Why 1080p without AA?
As an example of the sorts of compromises that have to be made this generation, which is providing the games that people are presenting as evidence that "higher resolution = better."

Why do you try to compare playing videogames with watching film?
The ideal is games that look as good as films. That requires more processing per pixel. The higher the resolution, the lower the per pixel quality, such that a machine that could render photorealistic graphics at 320x200 will render non-photorealistic images at 1080p. The issue is where to strike the balance. Those advocating higher resolution above all else are presenting a very unbalanced approach to selecting performance targets.
 
Given how much DOF is used nowadays, they could also consider rendering stuff that falls within the Z focus range at full res, while all the rest, whose Z falls into the out-of-focus areas, is rendered at lower res. That would especially help with cutscenes, which tend to be DOF heavy; it would let them bump up the render quality there.
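
A minimal sketch of that idea (my own illustration in Python, not anything the poster or sebbi actually shipped): classify screen tiles by how defocused they will be and pick a render scale accordingly. The circle-of-confusion maths is the standard thin-lens approximation; the focal length, f-number and pixel thresholds are made-up example values.

Code:
def coc_pixels(depth_m, focus_m, focal_mm=50.0, f_number=1.8,
               sensor_mm=36.0, screen_w_px=1920):
    """Approximate circle-of-confusion diameter in screen pixels (thin-lens model)."""
    f = focal_mm / 1000.0
    coc_m = (f * f / f_number) * abs(depth_m - focus_m) / (depth_m * (focus_m - f))
    return coc_m / (sensor_mm / 1000.0) * screen_w_px

def tile_render_scale(tile_depth_m, focus_m):
    """Full res where DOF blur would hide nothing, lower res where it hides everything."""
    c = coc_pixels(tile_depth_m, focus_m)
    if c < 1.0:
        return 1.0    # in focus: native resolution
    if c < 4.0:
        return 0.5    # mildly blurred: half resolution is plenty
    return 0.25       # heavily blurred: quarter res, the DOF blur hides it

# Example: camera focused at 5 m, tiles at various depths.
for d in (4.9, 5.0, 5.5, 8.0, 30.0):
    print(f"{d:5.1f} m -> render at {tile_render_scale(d, focus_m=5.0):.2f}x resolution")

In a real renderer the low-res tiles would be composited back in before (or as part of) the DOF pass, but the classification step above is the whole trick.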
sebbi's post is very interesting because of the smart use of technology, much in the way nature works. :smile:

A NASA scientist, Rich Terrile, talked about his theory that real life itself may just be a simulation, like any of today's computer games. And, like with the quantum particles of the universe, only the things that matter are shown.

He mentions GTA IV on the PS3 as an example:

“The natural world behaves exactly the same way as the environment of Grand Theft Auto IV,” explained Terrile. “In the game, you can explore Liberty City seamlessly in phenomenal detail. I made a calculation of how big that city is, and it turns out it’s a million times larger than my PlayStation 3.”

“You see exactly what you need to see of Liberty City when you need to see it, abbreviating the entire game universe into the console. The universe behaves in the exact same way. In quantum mechanics, particles do not have a definite state unless they’re being observed. Many theorists have spent a lot of time trying to figure out how you explain this. One explanation is that we’re living within a simulation, seeing what we need to see when we need to see it.”

Terrile went on to explain that, with computers growing in power at an incredible rate (acting in accordance with Moore’s Law), it in fact won’t be that long before we’re able to recreate similar world-building experiments in our computers, creating sentient beings with lives of their own within a computer. Not only that, but we’ll be able to do it for an entire digital planet’s population.

“Perhaps in the next 10 to 30 years we’ll be able to incorporate artificial consciousness into our machines,” concluded Terrile.

http://www.nowgamer.com/news/161516..._the_same_as_liberty_city_nasa_scientist.html

This basically echoes the words sebbi used, where he says true quality should be shown where/when it is needed.
 
To those who claim "1080p would be a waste" (or whatever):

Just wondering:

Have you ever watched the 3.2 GB 1080p @ 165 Mbps version of that UE3 "Samaritan" demo video available over there for example:



on a large native 1920x1080 display (the operative words being "165 Mbps version")?

:mrgreen:;)
 
So if the choice is rendering 1080p with no AA, or 720p with AA that eliminates jaggies, which is then better?

From personal experience on PC, I would take the native 1080p

While AA would no doubt improve the IQ of 720p, it just can't compete with the clarity of 1080p; on the same size screen you're talking 1.5x the linear PPI (2.25x the pixels)... that kind of thing is hard to beat from an IQ point of view.

Developers could well target 1080p with FXAA/MLAA next gen, which I feel would be a much better choice than 720p with 2x or 4x MSAA.

1080p in general would require less AA to achieve a jaggie-free look compared to 720p; 1080p with 2xMSAA would not be far off 720p with 4xMSAA.
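
A rough sample-count comparison backs that last remark up (plain Python, my own figures; it only counts colour/depth samples and ignores MSAA colour compression, shading rate and resolve cost):

Code:
samples_1080_2x = 1920 * 1080 * 2   # 4,147,200 samples
samples_720_4x  = 1280 * 720 * 4    # 3,686,400 samples

print(f"1080p 2xMSAA: {samples_1080_2x:,} samples")
print(f"720p  4xMSAA: {samples_720_4x:,} samples")
print(f"ratio: {samples_1080_2x / samples_720_4x:.2f}x")   # ~1.13x, same ballpark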
 
From personal experience on PC, I would take the native 1080p
I don't doubt that you would. However, not everyone shares your sentiments, or perception of resolution. If the choice is between a console that's advertised as "Full HD 1080p games" and another console that 'looks better', Joe Gamer will pick the latter (all other things being equal).
 
Somehow it's a bit "funny"/baffling how this forum used to be so well known for its "pixel counting activities" and threads such as:

"Neverending Upscaling/Resolutions/AA etc Thread"
"Image Quality and Framebuffer Speculations for Unreleased Games"
"Image Quality and Framebuffer Analysis for Available Games"

but as soon as someone mentions 1080p should be mandatory for next-gen console videogames, suddenly a lot of users start to whine and scream "Please no, it would be such a waste!" and so on (or whatever) :???::eek::D:p:mrgreen:;).
 