Digital Foundry Article Technical Discussion Archive [2013]

Yup, 55", and from where they'd sit in front of the coffee table it'd be ~5-6 feet to the TV. So it's well within the ability of people with 20/20 vision or better to technically see the difference. And the difference is VERY obvious when I show my computer desktop at that distance. That changes when it comes to photos, video, and games (without the UI displayed).

That's the point, though: even when that is the case, people generally can't tell what the resolution is without a side-by-side comparison. Some can tell most of the time if there's a "one after another" comparison, but still not nearly 100% of the time. Depending on the source material, some content is easier to identify than other content. But even then, as I said, most people can't reliably tell the difference even when they should be able to.

I have no doubt that if I had a side-by-side comparison from in front of the coffee table, most people would be able to tell which resolution is which, depending on the source material. Something simple like a pie chart or a static scene of a city skyline (high contrast, lots of sharp lines) would be easy. A 720p game with 4x SGSSAA or RGSSAA would be relatively difficult to distinguish from a 1080p game with AA. And if the 1080p-rendered game had no AA while the 720p game had the aforementioned AA, some of the knowledgeable computer videophiles might correctly pick the source resolution, but most others would probably pick the 720p image as the higher resolution because it looks better and cleaner (fewer jaggies).

Regards,
SB

I did some testing, and I have to give it to you: your method does make it harder than I thought it would be. I'll have to ask a friend to come over and do a proper blind test.

I took one 1080p and one 720p screenshot; both are with 8x AA and 16x AF.

http://sdrv.ms/19SxIZz

If one downloads those and flips back and forth between them in full screen, the difference is very big and it's easy to separate them even from a long distance on my 50" screen. I'm also confident that once I do a proper blind test I will pass it quite easily from 5-6 feet away, but your method of testing, where you really don't have a reference point, is a lot trickier.
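A minimal sketch of how the presentation order for such a blind test could be randomized (the file names are placeholders; both screenshots are assumed to already be scaled to the display's native resolution so only the source resolution differs):

```python
# Minimal randomized blind A/B test: the viewer is shown images in a hidden,
# shuffled order and only logs a guess per trial.
import random

IMAGES = {"1080p": "shot_1080p.png", "720p": "shot_720p.png"}  # placeholder paths
TRIALS = 10

def run_blind_test():
    correct = 0
    for trial in range(1, TRIALS + 1):
        truth = random.choice(list(IMAGES))  # hidden ground truth for this trial
        print(f"Trial {trial}: display {IMAGES[truth]} full screen (don't tell the viewer which).")
        guess = input("Viewer's guess (1080p/720p): ").strip()
        correct += (guess == truth)
    print(f"{correct}/{TRIALS} correct; pure guessing averages about {TRIALS // 2}.")

if __name__ == "__main__":
    run_blind_test()
```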
 
Yup, that's why it's so difficult to test yourself. Chances are you've seen the images beforehand even if you attempt to make them random. Hence you'll already know what subtle things to look for.

Going into it totally blind (with no reference beforehand), like most consumers in their living room with a new game, video, etc. makes it extremely difficult to tell if what is displayed on screen is 720p or 1080p at typical living room distances.

For games, there's one thing that helps give the resolution away, and that's the UI. But if you then render the UI natively at 1080p, as this current generation is likely to do, it becomes nigh impossible for most people to tell the resolution without a side-by-side comparison. And since most people don't have two TVs in their living room, they will never know unless someone tells them.
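As a rough sketch of that arrangement (illustrative only; a nearest-neighbour upscale stands in for whatever filter a real scaler uses), the scene goes into a sub-1080p buffer, gets upscaled, and the UI is then composited on top at native resolution:

```python
# Sketch: low-res scene buffer, upscaled to the display, native-res UI composited on top.
import numpy as np

SCENE_W, SCENE_H = 1280, 720   # internal rendering resolution
OUT_W, OUT_H = 1920, 1080      # display / framebuffer resolution

scene = np.random.rand(SCENE_H, SCENE_W, 3).astype(np.float32)  # stand-in 720p render

# Nearest-neighbour upscale (index map from output pixels back to source pixels).
ys = np.arange(OUT_H) * SCENE_H // OUT_H
xs = np.arange(OUT_W) * SCENE_W // OUT_W
upscaled = scene[ys[:, None], xs[None, :]]

# UI layer authored directly at 1080p, with an alpha mask (a white bar as a stand-in HUD).
ui_rgb = np.zeros((OUT_H, OUT_W, 3), dtype=np.float32)
ui_alpha = np.zeros((OUT_H, OUT_W, 1), dtype=np.float32)
ui_rgb[40:80, 40:600] = 1.0
ui_alpha[40:80, 40:600] = 1.0

# Standard "over" composite: the HUD stays pixel-sharp regardless of the scene resolution.
final = ui_rgb * ui_alpha + upscaled * (1.0 - ui_alpha)
print(final.shape)  # (1080, 1920, 3)
```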

Ryse is a perfect case in point. Many people that saw it in person thought it was 1080p. Even when their face was literally just 2-3 feet from the screen.

Regards,
SB
 
I think you guys are focusing too much on specific pixel counts. This shouldn't be about "which resolution is the game running at", which is always going to be difficult to tell since different games cope better or worse with low resolutions. It should simply be about "which looks better", which, without a comparison point, is obviously impossible to say.

So citing Ryse as an example of why 1080p isn't important, because people can't tell the game isn't running at 1080p even from up close, is an invalid argument IMO. It would only be valid if there were both 1080p and 720p versions of Ryse running either side by side or sequentially on the same screen and people still couldn't say which one looked better.

I've done a fair bit of testing around this myself in the past on my 50" TV from around 12 feet, so much less optimal than the viewing distances you guys are using. While I doubt I could identify the specific resolution a game is running at from that distance, I know for certain I could tell which version of the game looked better if I played it at both 720p and 1080p one after the other. In fact it's a very real compromise I have to make, since 3D gaming through my TV allows only 720p @ 60fps or 1080p @ 30fps. I choose differently depending on the game.
 
Good point, PJB. Just because you can't tell that Ryse isn't running 1080p doesn't mean it wouldn't look better if it were.

Or it could look worse, if the upscaler affords the "free perfect antialiasing" that they would then have to add on top of the rendering at 1080p instead, which could make some effects suffer.

Just saying.
 
Upscalers don't and can't add free perfect antialiasing. Even if the upscaler includes a post-fx AA (which an upscaler shouldn't really), it'll be inferior to a true AA and inferior to what developers could implement themselves in engine. 900p means higher possible quality AA than 1080p could afford, everything else being equal.
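To put rough numbers on "everything else being equal", the raw pixel budgets work out like this:

```python
# Pixel counts for the resolutions under discussion.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / full_hd:.0%} of 1080p)")
# 720p:    921,600 pixels (44% of 1080p)
# 900p:  1,440,000 pixels (69% of 1080p)
# 1080p: 2,073,600 pixels (100% of 1080p)
```

So 900p shades roughly 30% fewer pixels than 1080p, which is the budget that could instead go into better AA or other effects.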
 
I think you guys are focusing too much on specific pixel counts. This shouldn't be about "which resolution is the game running at", which is always going to be difficult to tell since different games cope better or worse with low resolutions. It should simply be about "which looks better", which, without a comparison point, is obviously impossible to say.

I think the point is not which resolution looks 'better' but, if a developer can improve IQ in other areas, is the difference between 1080p and 900p *important*? These consoles will have tradeoffs and compromises; there isn't a limitless amount of power available to them, so it's just about prioritizing tradeoffs and deciding where to slot 'resolution' in the food chain.
 
My friends, anyone who wants to dig into the art of scaling can't miss the EXQUIRES test suite.

http://exquires.ca/


Some people have missed my earlier post on the matter:
Developers may be biased by working on monitors at less than 1.5 m, and samples focused at that distance are not a good measure of the trade-off involved.
Gamers have an increased contrast sensitivity function, and in the words of the Imaging Science Foundation the most important aspects of picture quality are (in order): 1) contrast ratio, 2) color saturation, 3) color accuracy, 4) resolution. Resolution is only 4th on the list, so look at the other factors first. Also, be sure to calibrate your display!

http://en.wikipedia.org/wiki/Optimum...ewing_distance
To clarify, these distances are calculated based on visual acuity, and that's very different from vernier acuity (which governs how visible aliasing is).
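A back-of-the-envelope sketch of what those acuity-based charts compute, using the common 1-arcminute-per-pixel criterion for 20/20 vision (and, as noted, this is plain visual acuity, not vernier acuity):

```python
# Distance beyond which a single pixel on a 16:9 panel subtends less than
# 1 arcminute, i.e. a 20/20 viewer can no longer resolve individual pixels.
import math

def acuity_distance_m(diagonal_inches, horizontal_pixels):
    width_m = diagonal_inches * 0.0254 * 16 / math.sqrt(16**2 + 9**2)  # 16:9 panel width
    pixel_pitch = width_m / horizontal_pixels
    return pixel_pitch / math.tan(math.radians(1 / 60))

for label, px in (("720p", 1280), ("1080p", 1920)):
    d = acuity_distance_m(50, px)
    print(f'50" {label}: individual pixels blend together beyond ~{d:.1f} m (~{d * 3.28:.1f} ft)')
# 50" 720p:  ~3.0 m (~9.8 ft)
# 50" 1080p: ~2.0 m (~6.5 ft)
```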

Freiburg Visual Acuity Test + Contrast Test + Vernier Test + Grating Test
http://michaelbach.de/fract/index.html
 
I think the point is not which resolution looks 'better' but, if a developer can improve IQ in other areas, is the difference between 1080p and 900p *important*? These consoles will have tradeoffs and compromises; there isn't a limitless amount of power available to them, so it's just about prioritizing tradeoffs and deciding where to slot 'resolution' in the food chain.

I completely agree. My argument was purely about how noticeable higher resolution is, not about the cost/benefit ratio of it on a fixed performance platform.
 
A critical point that some people seem to miss with regard to the resolution of a rasterized image is that what is resolved to the framebuffer is all the information you will get about the scene being rendered.

This isn't like film or any high-resolution source image where you have a lot more data to work with before it's downsampled. Resolving sub-pixel detail is still a very big problem, and attempting to anti-alias those small details without enough data to work with still causes artifacts, because non-SS anti-aliasing methods simply can't properly reconstruct that missing data.

Given identical games running at 720p and 60 fps (one with supersampling and the other with a common post-process AA), I would be confident that 9 out of 10 people (that one person being blind) would take the supersampled image because the artifacts would be far less noticeable.

For that reason I strongly disagree with people who say lower-resolution buffers are adequate with console-level AA, because the final display resolution isn't the issue; it's the source resolution. Realistically these new consoles aren't going to be supersampling, leaving post-AA and (unlikely) MSAA to fill in the gaps. There is no substitute for resolution.
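A toy illustration of that missing-data argument, using a synthetic edge rather than real game output: rasterise the same edge once at 4x4 supersampling and resolve it down, and once at native resolution with a blur applied afterwards. Only the supersampled path recovers the true fractional coverage of each pixel; the post-blur can only smear the binary result it was handed.

```python
# Toy demo: supersampling preserves sub-pixel coverage that post-process blur cannot recover.
import numpy as np

N, SS = 32, 4  # output resolution and supersample factor

def raster(res):
    """Binary rasterisation of the half-plane y < 0.37*x + 0.2 on a res x res grid."""
    y, x = np.mgrid[0:res, 0:res] / res
    return (y < 0.37 * x + 0.2).astype(np.float32)

# Supersampled path: rasterise at 4x, box-filter down -> true fractional edge coverage.
hi = raster(N * SS)
ss_resolve = hi.reshape(N, SS, N, SS).mean(axis=(1, 3))

# Post-process path: rasterise at native res (hard 0/1), then blur the binary result
# with a 5-tap average (border wraparound ignored for this toy).
lo = raster(N)
blurred = (np.roll(lo, 1, 0) + np.roll(lo, -1, 0) +
           np.roll(lo, 1, 1) + np.roll(lo, -1, 1) + lo) / 5

row = N // 2
print("supersampled resolve:", np.round(ss_resolve[row, 24:30], 2))  # graded true coverage
print("blurred 1x raster:   ", np.round(blurred[row, 24:30], 2))     # a smear that doesn't match it
```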
 
Upscalers don't and can't add free perfect antialiasing. Even if the upscaler includes a post-fx AA (which an upscaler shouldn't really), it'll be inferior to a true AA and inferior to what developers could implement themselves in engine. 900p means higher possible quality AA than 1080p could afford, everything else being equal.

Perhaps some personal puffery there, as the images seem to have no obvious aliasing and plenty of visual clarity, and going by Crytek's boss's own statement...

Ryse runs at 1600x900 for best perf & res, we apply our upscaler for AA, framebuffer native 1080p. SAME as E3 Xbox One! No change, No compromise

http://www.eurogamer.net/articles/2...sses-xbox-one-exclusive-ryses-900p-resolution
 
Yup, that's why it's so difficult to test yourself. Chances are you've seen the images beforehand even if you attempt to make them random. Hence you'll already know what subtle things to look for.

Going into it totally blind (with no reference beforehand), like most consumers in their living room with a new game, video, etc. makes it extremely difficult to tell if what is displayed on screen is 720p or 1080p at typical living room distances.

I still have to point out that IMO your method of testing, while very interesting and useful, is not necessarily more or less useful than flipping between pictures back and forth. In your method the deck is stacked against the viewer; it's basically the hardest-case scenario, while flipping between pictures back and forth is the easiest. When you lose the reference point you start to second-guess yourself: when watching the 1080p image you'll be thinking "is this it, or did it look even sharper?", and while watching the 720p footage it's the opposite.

Your method requires the largest differences between images to be able to sort them out, and even if someone fails your test it doesn't mean that the higher resolution doesn't make a difference. But, like I said, that doesn't invalidate your test either.
 
I still have to point out that IMO your method of testing, while very interesting and useful, is not necessarily more or less useful than flipping between pictures back and forth. In your method the deck is stacked against the viewer; it's basically the hardest-case scenario, while flipping between pictures back and forth is the easiest. When you lose the reference point you start to second-guess yourself: when watching the 1080p image you'll be thinking "is this it, or did it look even sharper?", and while watching the 720p footage it's the opposite.

Your method requires the largest differences between images to be able to sort them out, and even if someone fails your test it doesn't mean that the higher resolution doesn't make a difference. But, like I said, that doesn't invalidate your test either.
SB's case most closely replicates actual experience. How often will the same game be released for the same system at two resolutions? If you're playing a game on a console, and can't tell it's not 1080p without comparing against the game running at 1080p somewhere else, have you really lost anything?
 
There are the technical bits about being able to distinguish individual pixels. That alone has a lot of variables: contrast, movement, eyesight, and training. This last bit is often forgotten. The first time I saw Blu-ray on my new TV, back when I first got it, it felt like looking outside into the real world; now I see individual pixels, panning issues with converted 24Hz footage, LED red-shift and lousy compression artifacts (especially when they go together in dark scenes). The first time I saw the new iPad I was amazed and shocked at how great our photos looked, but that passed too, and now I often see individual pixels again (or flaws in the camera).

Also, there's something like an emotional response to certain things that people don't know about. I remember the first time I heard a glass fall while playing back a (borrowed) LaserDisc with extremely high-quality sound and good speakers, and there was an emotional response (a cringe) like the one I'd have to a real glass falling.

We'll get to a point in time where this disappears and things just look real, but we're definitely not there yet (and it may require 3D).
 
If 4K and Retina content do arrive en masse, then perhaps we will become more aware of resolution differences over time. Otherwise, it may be somewhat difficult because a lower res game can impress the gamers with other eye candy and art style.

It's an open question for me, but the counter argument is: if we go VR and better 3D again, the higher res game will have the extra headroom to do it.
 
Upscalers don't and can't add free perfect antialiasing. Even if the upscaler includes a post-fx AA (which an upscaler shouldn't really), it'll be inferior to a true AA and inferior to what developers could implement themselves in engine. 900p means higher possible quality AA than 1080p could afford, everything else being equal.
Traditional scalers can't, and they give worse quality with aliased input.

One can resolve AA into a bigger buffer; it should be nicer than just post-AA and a basic upscale, especially if one uses temporal re-projection and/or edge detection as in SMAA.
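A mechanical sketch of the "resolve into a bigger buffer" idea, illustrative only: a synthetic gradient stands in for the shaded samples, so this shows the plumbing rather than the quality difference. The 2x2 samples of a 900p render are binned straight onto a 1080p grid instead of being resolved to 900p and then upscaled.

```python
# Direct resolve of supersampled 900p samples onto a 1080p grid vs resolve-then-upscale.
import numpy as np

RW, RH, SS = 1600, 900, 2   # render resolution and supersample factor
OW, OH = 1920, 1080         # output resolution

# Sample centres in normalised [0, 1) screen space, with a dummy shaded value per sample.
sy = (np.arange(RH * SS) + 0.5) / (RH * SS)
sx = (np.arange(RW * SS) + 0.5) / (RW * SS)
samples = sy[:, None] * 0.5 + sx[None, :] * 0.5

# Path A: bin every sample into whichever 1080p pixel its position falls in, then average.
oy = np.minimum((sy * OH).astype(int), OH - 1)
ox = np.minimum((sx * OW).astype(int), OW - 1)
direct = np.zeros((OH, OW))
counts = np.zeros((OH, OW))
np.add.at(direct, (oy[:, None], ox[None, :]), samples)
np.add.at(counts, (oy[:, None], ox[None, :]), 1.0)
direct /= counts

# Path B: conventional resolve to 900p first, then a basic nearest-neighbour upscale.
resolved_900p = samples.reshape(RH, SS, RW, SS).mean(axis=(1, 3))
uy = np.arange(OH) * RH // OH
ux = np.arange(OW) * RW // OW
upscaled = resolved_900p[uy[:, None], ux[None, :]]

print("mean abs difference between the two paths:", np.abs(direct - upscaled).mean())
```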
 
If 4K and Retina content do arrive en masse, then perhaps we will become more aware of resolution differences over time. Otherwise, it may be somewhat difficult because a lower res game can impress the gamers with other eye candy and art style.

It's an open question for me, but the counter argument is: if we go VR and better 3D again, the higher res game will have the extra headroom to do it.

TBH, the best you can expect WRT VR quality on this generation of consoles is the PS4 being less bad than the XBOne. Neither of these consoles is remotely capable of pushing the resolution and refresh rate needed to do VR optimally.
 
Yep, I think if VR catches on during this generation (and I think it has a shot this time), it will be the primary driver for another hardware generation.
 
So what about post-processing effects added into the mix? If you look at a game like The Order 1886, it looks sub-HD due to the blurring caused by chromatic aberration.
 
So what about post-processing effects added into the mix? If you look at a game like The Order 1886, it looks sub-HD due to the blurring caused by chromatic aberration.

You mean sub-1080p, right? I think it still looks pretty sharp. 720p is still HD, and it doesn't look 720p. But, yeah, post-processing tends to be lower res, as do a lot of other effects that will affect image quality.
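For anyone curious what that kind of softening does mechanically, a crude sketch of chromatic aberration as a uniform per-channel shift (real implementations scale the offset radially from the image centre):

```python
# Crude chromatic-aberration approximation: shift the red and blue channels apart so
# edges pick up coloured fringes and the image loses apparent sharpness.
import numpy as np

def chromatic_aberration(img, offset=2):
    """img: float array of shape (H, W, 3); returns a fringed copy."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], offset, axis=1)   # push red to the right
    out[..., 2] = np.roll(img[..., 2], -offset, axis=1)  # push blue to the left
    return out

# Tiny synthetic test: a white vertical bar on black gains a red fringe on its right
# edge and a blue fringe on its left edge.
frame = np.zeros((8, 16, 3))
frame[:, 6:10, :] = 1.0
print(chromatic_aberration(frame)[0, 4:12, 0])  # red channel now sits two pixels to the right
```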
 