Digital Foundry Article Technical Discussion Archive [2013]

I think it's because most are looking at 40-46" TVs from 6-8'. If you can't make out individual pixels, will squeezing more into the same space make a noticeable difference?

I sit about 6 feet from a 50" screen and the difference between 720p and 1080p on a computer game is huge. Moving closer (or bigger screen) and increasing resolution will enable you to see more detail. Lowering resolution and moving further away will perhaps hide the differences, but it hides the details also. Not a good option for me.
 
I think it's because most are looking at 40-46" TVs from 6-8'. If you can't make out individual pixels, will squeezing more into the same space make a noticeable difference?
In terms of output resolution, you quickly start running into extreme diminishing returns. Viewing a heavily-supersampled 720p image on my 37" TV from 10' seems to be well within my "I don't particularly care anymore" range.

However, I said "heavily supersampled" for a reason. Output resolution will restrict how sharp your image can be... but by itself it says nothing about whether or not you're doing a good job dealing with aliasing. And high-frequency high-amplitude visual components can need a ton of sampling in order to look stable.

Before I made this post, I decided to run a little experiment. What's a good example of a high-frequency high-amplitude visual component from this gen? Halo 3's normal-mapped specular highlights, obviously. Now, Halo 3 samples at 640p. If what some people say is true, and all that matters is that you can't easily distinguish side-by-side pixels, it's almost certainly the case that Halo 3 should look totally stable and fine on my 37" TV if I use a massive viewing distance like 15', right? I went to a place in game with normal-mapped Forerunner surfaces and started looking left and right with the flashlight on to make them shimmer. I was never able to measure how far back I'd have to be to make the shimmering stop, because the aliasing was still blatantly obvious when I ran into the farthest part of my building from which the TV is still visible. I measured that viewing distance to be ~53'.
Let me reiterate that: the aliasing from Halo 3's specular reflections is easily visible when viewed on a 37" TV from a distance of AT LEAST FIFTY-THREE FEET.

So yeah. 720p versus 1080p probably doesn't matter all that much in terms of visual clarity for console gamers who sit a substantial distance from their TVs (although with how often people sit less than two yards from 50" screens...). But in terms of sampling sufficiently to produce a stable image? A 720p backbuffer versus a 1080p backbuffer can make a very visible difference, even from a large viewing distance. Your eyes can pick out inaccurate garbage crawling and shimmering, even if they can't easily distinguish individual pixels.
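For some rough numbers on the "can't make out individual pixels" side of this, here's a quick back-of-the-envelope Python sketch (assuming a 16:9 panel and the usual ~1 arcminute acuity figure) of how big one pixel looks at the distances above. Even at 53' a 720p-class pixel on a 37" TV is way below the acuity limit, which is exactly why the crawling and shimmering is what you notice rather than the pixels themselves.

Code:
import math

def pixel_arcminutes(diagonal_in, horizontal_px, distance_ft, aspect=16 / 9):
    """Angle subtended by one pixel, in arcminutes, on a 16:9 screen."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)  # screen width from the diagonal
    pixel_in = width_in / horizontal_px
    distance_in = distance_ft * 12
    return math.degrees(2 * math.atan(pixel_in / (2 * distance_in))) * 60

# Figures from the post: 37" TV, 720p-class output, various viewing distances.
for d in (10, 15, 53):
    print(d, "ft:", round(pixel_arcminutes(37, 1280, d), 2), "arcmin per pixel")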
 
But then how many next-gen games will be 640p with no kind of AA? We'll be looking at 720p at a minimum and >900p more generally with various improved AA solutions.

I agree that the difference between 720p and 1080p should be clear enough for most now.
 
...
still visible. I measured that viewing distance to be ~53'.
Let me reiterate that: the aliasing from Halo 3's specular reflections is easily visible when viewed on a 37" TV from a distance of AT LEAST FIFTY-THREE FEET.
....

If you are talking about shimmering and highlit pixels going on and off because the chosen effect has higher-frequency detail than the resolution allows, it can probably be perceived from even further away, but that's not the point. From how much further away can you tell those pixels apart, i.e. that the pixel at (60, 100) was lit instead of the one at (59, 99)?

It's the developers' responsibility to choose the appropriate shader, with details having adequate frequency. You could have high-contrast shimmering at 4K if you chose to do so; it's very easy with a procedural shader with inadequate frequency detail (you could sample at nanometer scale, after all).
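To put some numbers on "inadequate frequency detail", here's a tiny 1-D sketch (a sine standing in for procedural shading detail, sampled once per pixel). Detail safely below the Nyquist limit barely changes when the camera nudges a tenth of a pixel; detail far above it swings wildly for the same nudge, which is the shimmer.

Code:
import math

SAMPLES = 16  # pixel samples across the surface; the Nyquist limit here is 8 cycles

def shade(freq_cycles, offset_px):
    """Sample a sine 'detail' pattern once per pixel, with the camera shifted by offset_px pixels."""
    return [math.sin(2 * math.pi * freq_cycles * (x + offset_px) / SAMPLES)
            for x in range(SAMPLES)]

# Nudge the camera by a tenth of a pixel and see how much each sample changes.
for freq in (3, 37):  # 3 cycles is safely below the Nyquist limit; 37 is far above it
    a, b = shade(freq, 0.0), shade(freq, 0.1)
    print(freq, "cycles -> max per-pixel change:",
          round(max(abs(x - y) for x, y in zip(a, b)), 2))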
 
But then how many next-gen games will be 640p with no kind of AA? We'll be looking at 720p at a minimum and >900p more generally with various improved AA solutions.

I agree that the difference between 720p and 1080p should be clear enough for most now.

Actually we'll be looking at 1080p scan-out resolution with native 1080p HUDs overlaying the game content. That's quite a bit different from what we have now. I suspect the difference won't be 'clear enough for most'.
 
It's the developers' responsibility to choose the appropriate shader, with details having adequate frequency.
There are a lot of intricacies underneath that sentence, though.

Even plain old geometric edges have no maximum detail frequency. You can sort of throw a "44kHz it" argument at wide surfaces, since the jaggie crawling is the most noticeable issue (I stopped noticing Halo 3's geometric aliasing on large objects like cliffs at around 20 feet from the 37" screen), but what about thin objects that can be viewed from many distances, and whose undersampling will cause highly visible frame-to-frame differences in luminance/colour on a distinguishable small slice of the screen? There are ways to tackle the issue, but it's not entirely trivial.

I'm not saying that we always need extremely huge resolutions, just that for practical intents and purposes you can get reasonably tangible benefits by using a >720p or >1080p backbuffer even at TV viewing distances. When people say stuff like "720p is as much as you can make use of at 12 feet on a 32" screen" or some such talk, they're often not all that wrong assuming that the image is being generated in a theoretically perfect way, but that's not usually a good assumption.
It gets complicated, basically.
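To make the thin-object case concrete, here's a toy sketch (all numbers made up) of a bright line half a pixel wide drifting across the screen. With one point sample per pixel centre its brightness snaps from full-on to zero as the line drifts; averaging a bunch of sub-samples, i.e. supersampling, approximates the line's actual coverage and fades smoothly instead.

Code:
LINE_WIDTH = 0.5  # line width in pixels (thinner than one pixel)

def one_sample(line_pos, pixel):
    """Point-sample at the pixel centre: the line is either hit or missed entirely."""
    centre = pixel + 0.5
    return 1.0 if line_pos <= centre < line_pos + LINE_WIDTH else 0.0

def supersampled(line_pos, pixel, n=16):
    """Average n sub-samples across the pixel: approximates the line's area coverage."""
    hits = sum(1 for i in range(n)
               if line_pos <= pixel + (i + 0.5) / n < line_pos + LINE_WIDTH)
    return hits / n

# Drift the line a quarter of a pixel per frame and watch pixel 2's brightness.
for frame in range(5):
    pos = 2.1 + 0.25 * frame
    print(frame, one_sample(pos, 2), supersampled(pos, 2))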
 
People thought Black Ops II ran at 1080p? I sure hope no one here did.

True story.

720p to 1080p should present a clear difference to most.

While I agree with that, I still think that image quality and scene complexity would be more noticeable to ~90% of console gamers.

720p is different from 480p in that it's enough, IMHO: you can show enough detail to start caring more about the quality of that detail.
So as long as resources are limited, it makes less sense to increase resolution further, IMHO.
 
Why are people going back and forth arguing about this?

Bottom line is that until we get an Xbox One in our homes with the games running on our TVs, we have no idea what it will look like, good or bad.
 
Why are people going back and forth arguing about this?

Bottom line is that until we get an Xbox One in our homes with the games running on our TVs, we have no idea what it will look like, good or bad.

Looks good running at the old vacation home.
 
I can only repeat myself here.

You guys have spent a generation playing games at 720p without complaining - at least I've never seen anyone expressing problems with full 720p games. Some got flak for upscaling, but only here - and there were people who thought Black Ops II on the PS3 was running at native 1080p. This means that the resolution was not much of a problem in the real world.

Also, if I had to make a choice, I'd prefer a 720p render with as few aliasing issues as possible, at the best rendering quality the developer can push. Yeah, 1080p would be sharper, but I'd rather not see low-res geometry and textures with lots of aliasing in more detail...

Exactly how I see it as well. But then I'm the kind of guy who'll deliberately opt for anti-aliasing methods most people here would label destructive, because I'll take a slight blur over the fake look that comes courtesy of the excessive sharpness of real-time graphics. Heck, most modern games spend tons of resources on effects that deliberately "worsen" image quality anyway.
 
I sit about 6 feet from a 50" screen and the difference between 720p and 1080p on a computer game is huge. Moving closer (or bigger screen) and increasing resolution will enable you to see more detail. Lowering resolution and moving further away will perhaps hide the differences, but it hides the details also. Not a good option for me.

Geez, how do you get that close? With a coffee table in my living room, the closest I can get to the TV is approximately 9-12 feet, depending on how much room I want to leave for people to walk between the TV and the coffee table. And out of all the families I know, my sofa is the closest to the TV.

Hence, while I game at 2400x1500 or 2560x1600 on PC, I game at 720p (HTPC output) on the TV with no noticeable difference in IQ other than in-game text taking up more of the screen. 1080p brings zero benefits and tons of drawbacks. By going 720p I can enable more effects, which leads to a more pleasing experience. Sure, I could put a beefier GPU in my HTPC to do the same effects at 1080p, but why? No noticeable difference, but far greater power consumption and initial hardware cost (beefier GPU).

As well, if people couldn't tell that 540p and 600p games weren't 720p (a far more noticeable difference) during the current generation, they certainly are not likely to notice if a game isn't 1080p. And that was extremely common this generation, with people thinking that COD: MW was 720p (and some even thought it was 1080p) until they were told otherwise.

Regards,
SB
 
I'm about 9-10' from a 60" screen, I can see the difference between 720p and 1080p. I can usually spot the ones that are less, too, based on the size of the jaggies.
 
I'm about 9-10' from a 60" screen, I can see the difference between 720p and 1080p. I can usually spot the ones that are less, too, based on the size of the jaggies.

I'd love to have you over for my blind resolution test on my 55" TV. Not a single person (lots of people with 20/10 or better eyesight have tried it) has successfully managed to tell the difference between 720p and 1080p images, videos, or games (games always have 2-4x MSAA or SSAA and always without UI). The best someone managed was to guess the correct resolution 53% of the time, which is well within the margin of error.

And a ton of them, just like you, have claimed to be able to tell the difference. Not saying you wouldn't be the first, but past empirical evidence doesn't support that claim. It's one thing to look at an image that you know is 720p and the same image that you know is 1080p and then pick which one looks better. It's not so easy when you have no clue what resolution an image/video/game is running at and cannot view it side by side with the opposing image.

It may be possible in my blind resolution test that if I had two TVs side by side, someone with 20/10 vision might be able to pick correctly. However, when you can't do that, it's basically almost impossible.

Regards,
SB
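A quick binomial sanity check on that 53% figure (the trial counts below are just assumptions for illustration): for any plausible number of trials, 53% correct is basically indistinguishable from coin-flipping.

Code:
from math import comb

def p_at_least(n_trials, k_correct, p=0.5):
    """Probability of getting at least k_correct right by pure guessing (binomial tail)."""
    return sum(comb(n_trials, k) * p ** k * (1 - p) ** (n_trials - k)
               for k in range(k_correct, n_trials + 1))

# Trial counts are assumed; the actual numbers aren't given.
for n in (30, 100):
    k = round(0.53 * n)
    print(n, "trials,", k, "correct -> chance of doing that well by guessing:",
          round(p_at_least(n, k), 2))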
 
I'd love to have you over for my blind resolution test on my 55" TV. Not a single person (lots of people with 20/10 or better eyesight have tried it) has successfully managed to tell the difference between 720p and 1080p images, videos, or games (games always have 2-4x MSAA or SSAA and always without UI). The best someone managed was to guess the correct resolution 53% of the time, which is well within the margin of error.

And a ton of them, just like you, have claimed to be able to tell the difference. Not saying you wouldn't be the first, but past empirical evidence doesn't support that claim. It's one thing to look at an image that you know is 720p and the same image that you know is 1080p and then pick which one looks better. It's not so easy when you have no clue what resolution an image/video/game is running at and cannot view it side by side with the opposing image.

It may be possible in my blind resolution test that if I had two TVs side by side, someone with 20/10 vision might be able to pick correctly. However, when you can't do that, it's basically almost impossible.

Regards,
SB

To exclude confirmation bias, did you invent/run this test before, or after you learned about the Xbox1 specs?

I myself did notice the difference between 1080p and 720p on many occasions. When I was watching some Blu-rays I had, I noticed that the image quality was not that great, even though both my TV and PS3 were displaying 1080p. When I looked it up on the internet, it turned out that you could downscale those movies to 720p, then upscale to 1080p, and see no difference from the original 1080p image. So the movie was scanned at 720p and then upscaled.

A game, however, is much more prone to detection: the horizon in racing games will always be sharper in 1080p, because you can, and will, focus on it.
Every shooter will have you focus on distant pixels as well.
So in conclusion: I am very happy that Sony, for example, went with a really powerful system.
And Xbox1 can also display 1080p games, albeit with fewer effects and static, low-precision lighting in the case of Forza 5.
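For anyone who wants to repeat that downscale/upscale test, a sketch along these lines does it (the frame file name is only a placeholder): shrink a 1080p frame grab to 720p, scale it back up, and diff it against the original. If the difference is negligible, the "1080p" source carried no detail beyond 720p to begin with.

Code:
from PIL import Image, ImageChops

# Hypothetical frame grab; the file name is only an example.
original = Image.open("frame_1080p.png").convert("RGB")  # 1920x1080 source frame

# Downscale to 720p, then upscale back to 1080p, as described above.
roundtrip = (original
             .resize((1280, 720), Image.LANCZOS)
             .resize((1920, 1080), Image.LANCZOS))

# A tiny per-pixel difference means the source had little detail above 720p.
diff = ImageChops.difference(original, roundtrip)
print("per-channel (min, max) differences:", diff.getextrema())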
 
I'm also 9-10' from (unfortunately) a 32" screen, so 1080p doesn't mean much to me. I can differentiate 1080p with no AA vs 720p with no AA (not really sure about 900p vs 1080p), but once they apply AA, then I'm lost. So I prefer 720p with better pixel quality. For example, if I can have 720p with good AA vs 1080p with no AA, then I'll choose 720p. Basically, I will choose the resolution that produces little to no aliasing while still maintaining detail (no QAA stuff for 720p, but maybe at 1080p it would fit better).
 
Sitting more than 6' from a 40" TV, 720p and 4xMSAA about covered it for most games. I'd take a wider field of view over more pixels with that setup.

And Xbox1 can also display 1080p games, albeit with fewer effects and static, low-precision lighting in the case of Forza 5.
You are aware that Forza 5 is 60 frames/s and a launch title? "1080p" shouldn't be a problem for Xbox 1 either.
 
Geez, how do you get that close? With a coffee table in my living room, the closest I can get to the TV is approximately 9-12 feet, depending on how much room I want to leave for people to walk between the TV and the coffee table. And out of all the families I know, my sofa is the closest to the TV.

Well, I'll be the first to admit that my living room makes some trade-offs vs. a regular living room and is basically a dedicated entertainment room.

The video is pretty bad quality, but that is my current setup...

You can also view the TV unobstructed from that couch behind the divan. If necessary I could move about 45cm closer to the screen with this setup, and naturally further back. Before I bought my 50" TV I was contemplating a 42" model due to the ability to change the distance at will.

I would like to have a 65" 4K TV with this current viewing distance and perhaps move the couch in the back 50cm closer and move the back speakers and that divider which they stand on behind the couch, this way I can have two usable stationary viewing distances for varying quality of content with surround speaker setup and also still have some room to move either watching positions if needed.

edit: 2 pictures
 
...or, in the case of the PS4, you get 1080p at the expense of lower fps, as with Killzone, Infamous, and Driveclub...
...or an insane amount of bloom, motion blur, and DOF gets put into the scene in post-process to "blur up" the image intentionally.

Yeah, thanks, I'll take 60fps any time.
 
...or, in the case of the PS4, you get 1080p at lower fps, as with Killzone, Infamous, and Driveclub.
...or an insane amount of bloom, motion blur, and DOF gets put in post-process to "blur up" intentionally.

Yeah, I'll take 60fps any time.

I don't see how 1080p 30fps is bad, or how it is PS4-specific. I think gamers are expecting 1080p 30fps as the new baseline, with 60fps being achieved via concessions. I don't think anyone was expecting sub-1080p 30fps, or 720p anything.
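Some rough pixel-throughput numbers behind that trade-off (only a proxy, since plenty of rendering costs don't scale with resolution): going from 720p30 to 1080p60 is roughly 4.5x the per-second pixel work, which is where the concessions come from.

Code:
# Back-of-the-envelope pixel throughput and frame budget per output mode.
modes = {
    "720p30":  (1280 * 720, 30),
    "720p60":  (1280 * 720, 60),
    "1080p30": (1920 * 1080, 30),
    "1080p60": (1920 * 1080, 60),
}
base = modes["720p30"][0] * modes["720p30"][1]
for name, (pixels, fps) in modes.items():
    throughput = pixels * fps
    print(f"{name}: {throughput / 1e6:.0f} Mpix/s "
          f"({throughput / base:.2f}x 720p30), {1000 / fps:.1f} ms per frame")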
 
30 fps for a racing game is bad, like Forza 1 bad.
Native resolution is a limit; frame rate is also a limit.
Some prefer one, others prefer the other.
As a gamer I expect most things to run above 30fps, because 30fps is pretty unbearable to my eyes.
 