This misconception, that "human eyes can't resolve 4K resolution at a distance," is born from several misguided assumptions. SMPTE calculated that claim and its viewing distance recommendation based on several factors:
- it used the same method Dr. Hermann Snellen used in his famous eye chart to measure visual acuity (roughly one arcminute of resolvable detail for 20/20 vision; see the sketch after this list)
- it used regular SD/HD broadcast material
- it does not take video processing variables into account
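To make the acuity assumption concrete, here is a minimal sketch (Python; the 55-inch 16:9 panel is a hypothetical example of mine, and the helper names are not from any SMPTE document) of how a viewing-distance figure falls out of the one-arcminute, roughly 60 pixels-per-degree assumption.

```python
import math

def pixels_per_degree(diag_in, h_px, v_px, distance_in):
    """Pixels per degree of visual angle for a flat panel viewed head-on."""
    aspect = h_px / v_px
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # panel width from its diagonal
    pitch_in = width_in / h_px                           # width of a single pixel
    deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * distance_in)))
    return 1 / deg_per_px

def distance_for_ppd(diag_in, h_px, v_px, target_ppd):
    """Viewing distance (inches) at which the panel delivers target_ppd."""
    aspect = h_px / v_px
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / h_px
    return pitch_in / (2 * math.tan(math.radians(0.5 / target_ppd)))

# Hypothetical 55" 4K panel:
print(round(pixels_per_degree(55, 3840, 2160, 8 * 12), 1))   # ~134 ppd from 8 ft away
print(round(distance_for_ppd(55, 3840, 2160, 60) / 12, 1))   # ~3.6 ft: the distance where it drops to 60 ppd
```

Under this static model you would have to sit closer than about 3.6 ft to a 55" 4K panel before it dips below 60 ppd (one arcminute per pixel), which is where the "you can't see 4K from the couch" charts come from; the rest of this section is about why that static figure undersells what viewers actually perceive.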
The problem with using Snellen's method is that his eye test is performed with a stationary target, while real video content is anything but stationary. On top of that, current displays behave very differently from the real world when it comes to resolving moving objects. What we perceive as motion on a display isn't really motion; it is just pixels switching on and off, with flicker in between. A moving line in real life moves continuously, but on a display it does not, because the image is built from an array of fixed pixels. And since some displays have large gaps between pixels (a low pixel fill factor) and some do not, we can't draw a uniform conclusion from a handful of TV and projector samples.
In Snellen's eye test, static acuity indeed carries most of the weight, but once you start resolving real-life motion, dynamic acuity becomes important as well, and when such tests are done on displays, dynamic acuity matters even more, because a display's fixed pixel array structure is quite different from the real world. Pixelworks raised this point at SID and drew two conclusions:
1. “When the sampling-phase effects are taken into account, the perceivable pixel density limit increases from 60 ppd to nearly 90 ppd”.
2. “Common video processing techniques can cause the fixed-pixel structure of the display to be visible regardless of the display resolution. Fortunately, alternative approaches are available that can greatly reduce the effect of fixed-pixel sampling, as will be discussed in the next paper in this series”.
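As a rough illustration of what "sampling-phase effects" refers to, the toy sketch below box-samples a hard dark/bright edge onto a short row of fixed pixels; the grid size and fill factor are made-up parameters, not anything taken from the Pixelworks paper. The only point is that the same edge lands as a different pixel pattern at every sub-pixel position, which is exactly the kind of structure a static acuity chart never exercises but a moving image does.

```python
import numpy as np

def sample_edge(phase, n_px=8, fill=1.0):
    """Box-sample a hard dark/bright edge onto n_px fixed pixels.

    phase: sub-pixel position of the edge (in pixel widths)
    fill:  pixel fill factor (1.0 = no gap between pixel apertures)
    """
    edge_pos = n_px / 2 + phase                 # edge location in pixel units
    left = np.arange(n_px) + (1 - fill) / 2     # left border of each pixel aperture
    right = left + fill                         # right border of each pixel aperture
    bright_coverage = np.clip(right - edge_pos, 0, fill) / fill
    return bright_coverage.round(2)

# The same edge drifting across the grid: every sub-pixel phase produces a
# different pixel pattern, so a moving edge "shimmers" on a fixed-pixel display.
for phase in (0.0, 0.25, 0.5, 0.75):
    print(phase, sample_edge(phase))

# A lower fill factor (visible gaps between pixels) changes the pattern again,
# which is why fill factor matters for motion rendering too.
print("fill=0.7:", sample_edge(0.25, fill=0.7))
```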
Another thing to consider is pixel quality, because not all pixels are born equal. Even among 2160p/4K materials there are vast differences, ranging from 8 terabytes of Hollywood RAW camera capture all the way down to YouTube 4K. Some of Netflix's 4K material isn't even really 4K: Breaking Bad, for example, has been found to use the plain HD Blu-ray master upscaled to 4K. In fact, the Netflix version actually loses out to the 1080p Blu-ray original despite having four times the resolution. How? Because the Netflix 4K version carries less data than the BD original. Well... not quite, because Netflix uses HEVC (H.265). Since HEVC is said to bring at least a 40% compression improvement over H.264 (which HD Blu-ray uses), that efficiency gain should have covered the small bitrate deficit, but it didn't. Why? One Blu-ray compressionist claims that all Netflix encodes are automated because of the huge catalog, while for Blu-rays painstaking care is given to each and every title.
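To put rough numbers on the "should have covered the small bitrate deficit" step, here is a back-of-envelope sketch; the two bitrates are assumed ballpark figures, not measurements of the actual Breaking Bad encodes, and only the 40% figure comes from the claim above.

```python
# Illustrative back-of-envelope; the bitrates are assumed, not measured.
bd_1080p_avc_mbps = 25.0      # ballpark H.264 video bitrate on an HD Blu-ray
netflix_4k_hevc_mbps = 16.0   # ballpark HEVC bitrate for a 4K Netflix stream
hevc_gain = 0.40              # "at least 40% compression improvement" over H.264

# Express the HEVC stream as the H.264 bitrate it is nominally equivalent to.
avc_equivalent_mbps = netflix_4k_hevc_mbps / (1 - hevc_gain)
print(f"H.264-equivalent bitrate of the stream: ~{avc_equivalent_mbps:.0f} Mbps "
      f"vs ~{bd_1080p_avc_mbps:.0f} Mbps on the disc")
# ~27 Mbps vs ~25 Mbps: on paper the efficiency gain covers the raw bitrate gap,
# which is why encode quality (automated vs. hand-tuned) is left as the suspect.
```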
So now we have two more variables: bitrate and compression quality. Unfortunately, video guru Stacey Spears (president of Spears & Munsil, and the man who worked on the Xbox One's AV software interface) remarks that such 4K upscales are actually detrimental to compression efficiency, because the output is encoded at 4K. He uses The Martian UHD Blu-ray as an example: that movie's DI (digital intermediate) is only 2K, but it was upscaled to 4K so the studio can't be taken to court over the fact that every UHD Blu-ray disc already carries the "4K Ultra HD" logo.
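And here is that remark in the same back-of-envelope terms: encoding an upscale at 4K spreads the same nominal budget over four times as many pixels even though the upscale added no new detail. The budget figure simply carries over the assumption from the previous sketch.

```python
# Same assumed budget as above, now divided over the pixel counts being encoded.
avc_equivalent_mbps = 26.7          # carried over from the previous sketch (assumed)
px_1080p = 1920 * 1080
px_2160p = 3840 * 2160

bpp_native_hd   = avc_equivalent_mbps * 1e6 / px_1080p   # if encoded at its true 1080p detail level
bpp_upscaled_4k = avc_equivalent_mbps * 1e6 / px_2160p   # encoded as a 4K picture instead

print(f"bits per pixel per second, encoded as 1080p: {bpp_native_hd:.1f}")
print(f"bits per pixel per second, encoded as 4K   : {bpp_upscaled_4k:.1f}")
# The 4K encode gets a quarter of the per-pixel budget to describe pixels that
# carry no extra information, which is the sense in which upscaling before
# compression is "detrimental to compression efficiency".
```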
So if we were to use 4K upscales such as Netflix's 4K Breaking Bad as test material, nobody would be able to spot a picture quality improvement between HD and UHD displays, because the source is an HD Blu-ray. Change the test material to the kind of Apple ProRes 4:4:4 DI that Hollywood works with and many people would change their tune. And games are a different case again, because games are rendered at the kind of pristine, uncompressed quality that Blu-ray movie collectors can only dream of. I've seen my share of reference-quality Blu-ray movies on my plasma TV, and not one of them fared well next to games. Even leaving HDR aside, among SDR titles there is quite a big difference in dynamic range (my plasma TV has a very good contrast ratio, good enough to actually let me distinguish it), and on average games have far superior dynamic range to movies, because games are not bound by real-world constraints the way movies are (the choice of set, the weather, the lighting conditions, and so on). So when people say "people can't distinguish between 1080p and 4K at such and such a distance," you should understand that such claims are only ever about video content, never about games.
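For the "games are already uncompressed" point, here is a quick comparison of the data a console actually pushes to the display versus what even a premium disc delivers; the frame rate, bit depth, and disc bitrate are assumed ballpark figures, not specs.

```python
# Assumed, illustrative figures: a 4K 60 fps game output at 10-bit RGB,
# against a ballpark peak video bitrate for a UHD Blu-ray.
width, height = 3840, 2160
bits_per_pixel = 30            # 10 bits per channel, RGB
fps = 60
uhd_bd_video_mbps = 100        # assumed ballpark for disc video

game_output_gbps = width * height * bits_per_pixel * fps / 1e9
ratio = game_output_gbps * 1000 / uhd_bd_video_mbps

print(f"Game render output : ~{game_output_gbps:.1f} Gbit/s, never touched by a lossy encoder")
print(f"UHD Blu-ray video  : ~{uhd_bd_video_mbps} Mbit/s, roughly 1/{ratio:.0f} of that")
```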