Screen size & resolution & distance

zed

Started from this thread:
http://beyond3d.com/showthread.php?t=64079&page=5

People here have no doubt seen charts like the following on the internet:
[image: resolution_chart.jpg]


Now, these are based on the viewer having 20/20 vision, which for some reason a lot of people think is perfect vision, but it isn't. For example, see the following:

"Contrary to popular belief, 20/20 is not actually normal or average, let alone perfect, acuity. Snellen, he says, established it as a reference standard. Normal acuity in healthy adults is one or two lines better. Average acuity in a population sample does not drop to the 20/20 level until age 60 or 70."
http://lowvision.preventblindness.org/eye-conditions/how-visual-acuity-is-measured

Without further ado, I present a more accurate screen-size graph;
you can check the source and pick out any errors I may have made:
http://auzed.com/crap/screensize.html
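For anyone who wants to check the numbers behind the graph, here is a minimal sketch of the underlying geometry (my own illustration, not from the page; it assumes one pixel per arcminute of visual angle at 20/20, and a 16:9 panel):

```python
import math

# A minimal sketch of the geometry behind these charts: assume a viewer
# resolves one pixel per `acuity_arcmin` of visual angle
# (1.0 arcmin ~ 20/20, 0.75 arcmin ~ 20/15), on a 16:9 panel.
def max_resolvable_width_px(diagonal_in, distance_in, acuity_arcmin=1.0):
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    # smallest pixel pitch the eye can still separate at this distance
    pitch_in = 2 * distance_in * math.tan(math.radians(acuity_arcmin / 60) / 2)
    return width_in / pitch_in

print(max_resolvable_width_px(55, 9 * 12))        # ~1530 px: 20/20 tops out near 1080p
print(max_resolvable_width_px(55, 9 * 12, 0.75))  # ~2030 px: 20/15 sees past 1920
```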

If there's demand I'll make one for phones.
cheers zed
 
I'm more interested in field of view in relation to screen size and distance than resolution (i.e. my 23" PC monitor compared to my 60" TV). I already know all I need to know about resolution, which is what I can see with my own eyes. I couldn't give a damn what some chart says I can see (that's not referring to you specifically, but to anyone who posts one of these charts and tries to use it as "proof" that I'm not supposed to be able to see 1080p at such-and-such a distance).

No one should ever use a chart to determine whether or not they can "see" 1080p at a certain distance. Use your own eyes. If you can personally see a difference, then you can see a difference; it's as simple as that. Even if you're outside the range that the chart says you "should" see.
 
Now, these are based on the viewer having 20/20 vision, which for some reason a lot of people think is perfect vision, but it isn't. [...]

These charts are useless because they are all calculated from the Snellen visual acuity test you mention, but ignore the fact that the test concerns a black-and-white image on a piece of paper across the room. As you point out, he was creating a system of relative measures in order to establish some baseline of normalcy. As such, it takes no account of emitted light, color, motion, artifacting, etc., all of which have an impact on someone playing a game on a television.
 
No one should ever use a chart to determine whether or not they can "see" 1080p at a certain distance. Use your own eyes. If you can personally see a difference, then you can see a difference; it's as simple as that. Even if you're outside the range that the chart says you "should" see.
Yes, that's what prompted me in the first place: Apple telling us that at X distance we can't see individual pixels on a Retina screen (even though I and other people I asked could plainly see them), or the above graph, which says the average person can't see better than 1080p on a 55" screen at 9 feet, even though in a blind test 48 of 49 people could tell the difference between 1080p and 2160p content at those parameters.
Link for details:
http://www.hdtvtest.co.uk/news/4k-resolution-201312153517.htm

@Brad Grenz: fair enough, but see the test in the link above, which involves watching content on a TV (exactly what is being discussed); you will see that ~20/15 fits in very nicely with the real-world test.
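To put a number on that, here is a small sketch (my own, using the same one-pixel-per-arcminute simplification the charts use) of how far out a 55" 1080p panel still shows resolvable pixel detail, i.e. the distance beyond which extra resolution stops helping:

```python
import math

# Hypothetical helper, same simplified geometry as the charts: the
# farthest distance at which one pixel of a 16:9 panel still subtends
# `acuity_arcmin` of visual angle.
def full_benefit_distance_ft(diagonal_in, width_px, acuity_arcmin=1.0):
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pitch_in = width_in / width_px
    return pitch_in / (2 * math.tan(math.radians(acuity_arcmin / 60) / 2)) / 12

print(full_benefit_distance_ft(55, 1920))        # ~7.2 ft at 20/20 (what the charts say)
print(full_benefit_distance_ft(55, 1920, 0.75))  # ~9.5 ft at 20/15 (fits the 9 ft test)
```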
 
These charts are useless because they are all calculated from the Snellen visual acuity test you mention, but ignore the fact that the test concerns a black-and-white image on a piece of paper across the room. As you point out, he was creating a system of relative measures in order to establish some baseline of normalcy. As such, it takes no account of emitted light, color, motion, artifacting, etc., all of which have an impact on someone playing a game on a television.

Correct me if I'm wrong, but by introducing more factors like emitted light etc., you are only increasing (or decreasing) the range where a higher resolution might still be distinguishable. In other words, the lines will just take a slightly steeper angle, but there is still a point, depending on the screen size and resolution, where the difference becomes moot. I don't think these graphs should be taken literally anyway, since not everyone has the same eyesight, but more as a ballpark kind of reference.

For console gamers, this is something I think is quite relevant, as most tend to play games from further away than, for example, PC gamers, who, due to their setup, usually find themselves much closer to the screen.
 
These also depend on TV calibration. If a ballpark figure for the console-playing population is found, the console resolution comparisons can end!
 
Well, science has some tips on this issue. It definitely changes over time, as too many factors collude. But a nice recent read is http://www.schubincafe.com/2012/07/07/all-you-can-see

Detail or "sharpness" is proportional to the area under the MTF curve.
[image: MTF.jpg]


Any movement changes our sensitivity. My take would be to dynamically lower to 900p in tough sections to reduce judder and tearing, or to spend the savings on anisotropic filtering or AA.
HDR is way more important than resolution because it increases the slope of that curve, so the area improves more than 6x.
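As a purely illustrative sketch of that "area under the MTF" idea (the curve shapes and scale factors below are invented for the example, not taken from the article; only the method matters):

```python
import math

# Purely illustrative: compare the area under two hypothetical display
# response curves with a simple trapezoidal sum over spatial frequency
# (cycles/degree). The curves themselves are made up for this example.
def area(curve, f_max=30.0, steps=300):
    df = f_max / steps
    vals = [curve(i * df) for i in range(steps + 1)]
    return df * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

low  = lambda f: math.exp(-f / 8)         # hypothetical lower-contrast display
high = lambda f: 1.8 * math.exp(-f / 12)  # hypothetical higher-contrast display

print(area(high) / area(low))  # ratio of the "perceived detail" proxy
```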
 
Not too scientific a take, but dabbling in Xbox One now, I've got a few random thoughts:

I think next-gen assets are unmistakably a bigger factor than even 720p vs 1080p. I started out playing Forza 5 and Killer Instinct, one a full 1080p game, the other 720p. The thing is, when you switch to KI the lower resolution doesn't really jump out at you as a big problem. You don't go "man, this game looks crap" at all; the unwashed 95% probably would never notice unless you told them.

I think it's the same for BF4 (720p on XBO). It looks really good to me. However, I have this nagging feeling it's "rough" and that 1080p would help.

Next, the Destiny beta looked really clear and sharp to me on XBO. Very big difference from the X360 version. The combo of next-gen assets and 900p is just gravy, since next-gen assets at 720p really don't look bad, IMO.

However, I wonder if it's a case of not knowing what I'm missing with Destiny.

The thing is, the on-TV experience is different from DF and YouTube comparison videos. You view those sitting a foot away from a 1080p PC monitor, and they're compressed video besides.

The final thing is that I have a 42" TV which I sit a good 8-10 feet from. A 55" TV is 1.7 times the screen area, and a 60" is 2x+, I believe. So obviously I wonder whether 1080p would become a lot more noticeable if I upgraded TVs, as I plan to at some point.
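(For what it's worth, those area figures check out, since at a fixed aspect ratio screen area scales with the square of the diagonal; a quick check:)

```python
# Screen area scales with diagonal^2 at a fixed aspect ratio.
for diag in (55, 60):
    print(f'{diag}" vs 42": {(diag / 42) ** 2:.2f}x the area')  # ~1.71x and ~2.04x
```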
 
I think it's more a case of not knowing what you're missing. I know that on my PS4, where most games are 1080p, when I put on BF4 (which is 900p) you definitely notice it's not as clean/sharp.
 
Next, the Destiny beta looked really clear and sharp to me on XBO. Very big difference from the X360 version. The combo of next-gen assets and 900p is just gravy, since next-gen assets at 720p really don't look bad, IMO.

That's explained by Destiny on X360 running at only 1024x624 (not 720p) with mediocre texture filtering (as on PS3); that's why you notice such a big difference compared to 900p with good texture filtering.

900p has 2.25 times as many pixels as 1024x624, which is in fact a ~14.8:9 aspect ratio, not 16:9.

2.25 is exactly the pixel-count ratio between 720p and 1080p. ;)
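The arithmetic, for anyone who wants to check it:

```python
# Verifying the pixel-count ratios quoted above.
px_x360 = 1024 * 624    # Destiny on X360
px_900p = 1600 * 900    # Destiny beta on XBO
px_720p = 1280 * 720
px_1080p = 1920 * 1080

print(px_900p / px_x360)    # ~2.25
print(px_1080p / px_720p)   # exactly 2.25
print(1024 / 624 * 9)       # ~14.77, hence the "~14.8:9" aspect ratio
```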
 
I think next-gen assets are unmistakably a bigger factor than even 720p vs 1080p. I started out playing Forza 5 and Killer Instinct, one a full 1080p game, the other 720p. [...] The final thing is that I have a 42" TV which I sit a good 8-10 feet from. A 55" TV is 1.7 times the screen area, and a 60" is 2x+, I believe.

I have a hard time discerning 1080p Forza 5 from 720p Killer Instinct on my 32" TV.

Plus, now that you mention it, I work in a place where we have a 55" HDTV and I don't find it to be much larger than my 32" TV, although it is a very neat TV.

When 4K is standard on consoles, it will be the time to make the jump. For now, a new TV is the least of my worries.
 
42" at 8 feet is 720p range. For 1080p lower the distance to 6 feet!
55" 1080p is at 7 feet.

Rule of thumb is 1.5X tv height
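That rule lines up with the distances above; a quick check (assuming 16:9 panels):

```python
import math

# Quick check of the "about 3x picture height for 1080p" rule against
# the distances above, assuming 16:9 panels.
for diag_in in (42, 55):
    height_in = diag_in * 9 / math.hypot(16, 9)
    print(f'{diag_in}": {3 * height_in / 12:.1f} ft')  # ~5.1 ft and ~6.7 ft
```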
 
I already know all I need to know about resolution, which is what I can see with my own eyes. I couldn't give a damn what some chart says I can see (that's not referring to you specifically, but to anyone who posts one of these charts and tries to use it as "proof" that I'm not supposed to be able to see 1080p at such-and-such a distance).

No one should ever use a chart to determine whether or not they can "see" 1080p at a certain distance. Use your own eyes. If you can personally see a difference, then you can see a difference; it's as simple as that. Even if you're outside the range that the chart says you "should" see.

+1...

Though it's always fun to try and lie with numbers and charts aka "facts" to "win" an argument...:LOL:

These charts are useless because they are all calculated from the Snellen visual acuity test you mention [...]

+1...though it's always fun to post charts when one doesn't understand the scope of what they mean...:LOL:
 
+1...though it's always fun to post charts when one doesn't understand the scope of what they mean...:LOL:

Here's no chart, but actual test data:
http://www.hdtvtest.co.uk/news/4k-resolution-201312153517.htm
The results are now in, and an overwhelming majority of participants correctly identified the 4K TV, indicating that there exists a perceptible difference even from as far as 9 feet away on a 55in screen.
48 out of 49 people could tell the difference between 4K and 1080p at 9 feet on a 55" screen (not a black-and-white piece of paper ;) ), and some people here say people can't tell the difference between 720p and 1080p at similar distances/sizes :rolleyes:
Thanks for agreeing with me that those charts are wrong; that's the reason I made the thread in the first place, to correct this misinformation.

BTW, I prefer 4K at 58fps to 320p at 59fps.
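For the statistically minded: assuming the test was a two-way forced choice (pure guessing giving each person a 50% chance, which is my assumption, not stated above), 48 of 49 is about as far from chance as it gets:

```python
from math import comb

# Probability of 48 or more correct out of 49 under pure guessing
# (p = 0.5 per person) -- a simple binomial tail.
p = sum(comb(49, k) for k in (48, 49)) / 2**49
print(p)  # ~8.9e-14: effectively impossible by chance
```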
 
some people here say people can't tell the difference between 720p and 1080p at similar distances/sizes :rolleyes:

Isn't 4K two times 1080p? :LOL:

Didn't know 1080p was two times 720p...:LOL:

http://en.wikipedia.org/wiki/4K_res...ile:Digital_video_resolutions_(VCD_to_4K).svg

Looking at that graphic I would hope people could tell the difference between 4K and 1080p...:LOL:
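For reference (assuming UHD "4K", i.e. 3840x2160), the two gaps being joked about are not the same size:

```python
# Linear vs. total-pixel ratios (assuming UHD "4K" = 3840x2160).
print(3840 / 1920, (3840 * 2160) / (1920 * 1080))  # 2.0x linear, 4.0x pixels
print(1920 / 1280, (1920 * 1080) / (1280 * 720))   # 1.5x linear, 2.25x pixels
```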

BTW, I prefer 4K at 58fps to 320p at 59fps.

Why would anyone NOT prefer that, given nobody could distinguish between 58fps and 59fps? Unless you could distinguish that 1fps difference...;)
 
That chart is silly. Playing PS1 game on 4K tv with simulated 3D says that it looks more resolutions.

I don't actually have a PS1. I just wanted to give shifty some more mod work. :mrgreen:
 
That chart is silly. Playing PS1 game on 4K tv with simulated 3D says that it looks more resolutions.
How many more resolutions? This is a high-brow forum and we like to keep things as objective and scientific as possible. :yep2:
 