4k resolution coming

Status
Not open for further replies.
In this case 145 m would be an effectively infinite, meaningless distance. Either the board would end up in a neighbouring district relative to the stadium's location, or from that distance it would look tiny. All seating sectors within the stadium are far fewer metres away anyway

I am leaving this pointless discussion. Now
 
In this case 145 m would be an effectively infinite, meaningless distance. Either the board would end up in a neighbouring district relative to the stadium's location, or from that distance it would look tiny.
No-one specified how big the jumbotron was. If it was large enough, it could occupy a large FOV and be retina quality (although that's infeasible for the display technology used). It comes down to FOV. There's no example you can cite where the FOV doesn't affect the perceived quality, where PPI is the only factor that matters regardless of FOV. This is because human visual acuity, and indeed all optics-based imaging, is based on an angular measure, not a linear one.
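The angular point can be made concrete with a quick back-of-envelope sketch (assuming the common 1 arcminute-per-pixel acuity figure; the jumbotron and monitor sizes below are made-up numbers for illustration):

```python
import math

def fov_of_display(width_m, distance_m):
    """Horizontal field of view (degrees) of a flat display at a distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

def pixels_for_fov(fov_deg, arcmin_per_pixel=1.0):
    """Pixels needed across a FOV so each pixel subtends the acuity limit."""
    return fov_deg * 60 / arcmin_per_pixel

# A hypothetical 30 m wide jumbotron at 145 m subtends about the same
# angle as a 0.5 m wide monitor at 2.4 m -- FOV, not size, is what counts.
print(round(fov_of_display(30, 145), 1))   # ~11.8 degrees
print(round(fov_of_display(0.5, 2.4), 1))  # ~11.9 degrees
print(round(pixels_for_fov(11.8)))         # ~708 pixels for retina quality
```

Whatever the physical sizes, two displays with equal FOV need equal pixel counts to look equally sharp.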
 
Incidentally, telescope resolution is measured in arcseconds, or some fraction thereof.

The moon is about 0.5 degrees across when we look at the sky, right?
Maybe 100 pixels is good enough for the moon, so a 50° FOV display would need 10,000 pixels, or 16,000 pixels wide for an 80° FOV. Ultra HD 8K would then be suited to a display that covers 38.4° of your FOV? (which is maybe big)

BTW pcchen, I wonder if you chose that same approximation: 0.3 arc minutes for a pixel, the moon is 30 arc minutes, so the moon is 100 "pixels" wide at human resolving power :)
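A quick sanity check of that approximation (a sketch using the same 0.3 arcmin-per-pixel figure from the posts above):

```python
MOON_ARCMIN = 30        # the moon subtends ~0.5 degrees = 30 arc minutes
ACUITY_ARCMIN = 0.3     # ~20/12 vision: one "pixel" per 0.3 arc minutes

moon_pixels = MOON_ARCMIN / ACUITY_ARCMIN
print(round(moon_pixels))                  # 100 "pixels" across the moon

# The same pixel density scaled to displays filling part of your FOV:
for fov_deg in (38.4, 50, 80):
    print(fov_deg, round(fov_deg * 60 / ACUITY_ARCMIN))
# 38.4 deg needs 7680 pixels (8K wide), 50 deg needs 10000, 80 deg needs 16000
```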
 
On laser printers you have a choice between 300 and 600 dpi, and 300 is damn good enough. But maybe 600 is good for old people who need to read legalese small print with a magnifying glass, or it acts like supersampling and helps with noise patterns.
Ha, I was about to use the laser printer example to illustrate just the opposite. ;)

Back in the eighties, when the first LaserWriter came out, I thought 300 dpi was absolutely gorgeous. Then I saw the first 600 dpi and I was blown away by the difference. (Linotype was 2400 dpi equivalent at the time, I believe?)

It's not about old people reading legalese, but simply how good it looks for normal sized characters.

The reason not to use this comparison in this context is that paper is a completely different medium. Our eyes are much better at resolving detail (and color shades) on a non-self-illuminating surface. And most laser printers are black-and-white with no way of doing anti-aliasing. So an LCD/OLED screen should need less resolution than paper printing.
 
BTW pcchen, I wonder if you chose that same approximation: 0.3 arc minutes for a pixel, the moon is 30 arc minutes, so the moon is 100 "pixels" wide at human resolving power :)

Ha, that's certainly interesting. Normally we'd think that the moon has more than 100 "pixels", but again, looking at this picture from Wikipedia:

http://upload.wikimedia.org/wikipedia/commons/thumb/e/e1/FullMoon2010.jpg/253px-FullMoon2010.jpg

it's 253x240, and that's still more detailed than looking with my naked eyes.
 
If you had 6 projectors and ran an Eyefinity setup (or NV Surround) at 5760x2160, but instead of a massive display you kept it at 24 inches, what PPI would you have?
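The arithmetic for that question is straightforward (a sketch; PPI is just the pixel diagonal over the physical diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a display's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 5760x2160 squeezed into a 24 inch diagonal:
print(round(ppi(5760, 2160, 24)))   # ~256 PPI
# Compare a plain 1920x1080 panel at 24 inches:
print(round(ppi(1920, 1080, 24)))   # ~92 PPI
```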
 
Incidentally, telescope resolution is measured in arcseconds, or some fraction thereof.

The moon is about 0.5 degrees across when we look at the sky, right?
Maybe 100 pixels is good enough for the moon, so a 50° FOV display would need 10,000 pixels, or 16,000 pixels wide for an 80° FOV. Ultra HD 8K would then be suited to a display that covers 38.4° of your FOV? (which is maybe big)

BTW pcchen, I wonder if you chose that same approximation: 0.3 arc minutes for a pixel, the moon is 30 arc minutes, so the moon is 100 "pixels" wide at human resolving power :)
We have to be a little careful, because there's more to human vision than retina resolution. The brain is able to discern (and infer) details smaller than the distance between each photoreceptor by accumulating information over time. Generally though, that's only on super-close inspection straining to see, and not when relaxed taking in a film or game, for which retina quality/1 arc minute is good enough along with AA if necessary.
 
I am playing with my phone: for example, with the camera on, I look at the display to see whether the image is the same as what I see without the phone. Unfortunately, looking around without the phone still delivers the real visuals, while the phone is not capable of fully re-creating reality as if through a window.

There are many deficiencies in your phone's camera sensor's ability to capture an image of your environment, and in its display's ability to represent that image, relative to your eyes' and brain's ability to do the same. Of those deficiencies, the resolution of the display is not close to the top of the list.
 
Except that you have just exposed a common fallacy that many people fall for.
But that article has a MASSIVE fallacy

it uses the 20:20 == perfect eyesight

WRONG, 20:20 ain't perfect. IIRC 20:20 is the minimum acceptable standard to get a driving licence etc; it's not perfect.
IIRC typical normal eyesight (before age etc. degrades it) is around 20:12, some up to 20:8
 
But that article has a MASSIVE fallacy

it uses the 20:20 == perfect eyesight

WRONG, 20:20 ain't perfect. IIRC 20:20 is the minimum acceptable standard to get a driving licence etc; it's not perfect.
IIRC typical normal eyesight (before age etc. degrades it) is around 20:12, some up to 20:8

How's that a fallacy? The main point of that article is that "distance is everything", and this does not change with how good your eyes are, or if you are actually an eagle :)

The normally accepted number (one pixel per 0.3 arc minutes, which I used in my posts above) is 20/12, which is actually very good vision for normal people. 20/8 is basically the physical limit of human pupil diffraction, so very few people retain such visual acuity for long. If you really want this, just increase the PPI number by 50%. It does not magically require a much higher PPI number.
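To make the 50% figure concrete, here is a sketch of the required PPI for each acuity level at a hypothetical 12 inch viewing distance (the distance is an assumption for illustration):

```python
import math

def required_ppi(distance_in, arcmin_per_pixel):
    """PPI at which one pixel subtends the given angle at the given distance."""
    pixel_in = distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_in

ppi_20_12 = required_ppi(12, 0.3)   # 20/12 vision: 0.3 arcmin per pixel
ppi_20_8 = required_ppi(12, 0.2)    # 20/8 vision: 0.2 arcmin per pixel
print(round(ppi_20_12), round(ppi_20_8))   # ~955 vs ~1432, i.e. 50% higher
```

The ratio stays 1.5 at any distance, so "increase the PPI number by 50%" holds regardless of where you sit.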
 
How's that a fallacy? The main point of that article is that "distance is everything"
Yes, distance is everything, but because all the numbers in the tables etc. use the 20:20 measurement, from reading that article you will get the idea that if you sit 3 m away from a 50" screen you can't tell the difference between 720p and 1080p (or whatever numbers the article says), where in fact most people with healthy eyes can. It's like when Apple came out with their retina thing, claiming the human eye couldn't distinguish the pixels at X distance: great marketing, but I conducted the experiment and saw that I could, and my eyes are in no way flash
 
...where in fact most people with healthy eyes can.
Which is what proportion of people? The reason opticians test for 20/20 is because most people are worse than that. Those who are much worse get their eyes tested and corrected. Those with slightly worse eyesight will never be aware they have less than 20/20 vision. Camera makers used to add a little dioptre correction to their viewfinder lenses on the assumption that most people are a little near-sighted.

This Googled report says half of US citizens don't have 20/20 vision, and that's probably only going to get worse as people spend ever-increasing amounts of time staring at small screens at near distance. The notion that lots of people have better than 20/20 vision doesn't work for me. There are such people, sure, but not enough of the populace to be a target IMO. It's certainly not the people 4k was developed for!
 
This Googled report says half of US citizens don't have 20/20 vision
you might wanna recheck the title of that report
Half of U.S. Adults Lack 20/20 Vision

Heres wiki's take (read citations if in doubt)

In humans, the maximum acuity of a healthy, emmetropic eye (and even ametropic eyes with correctors) is approximately 20/16 to 20/12, so it is inaccurate to refer to 20/20 visual acuity as "perfect" vision.[15] 20/20 is the visual acuity needed to discriminate two points separated by 1 arc minute—about 1/16 of an inch at 20 feet. This is because a 20/20 letter, E for example, has three limbs and two spaces in between them, giving 5 different detailed areas. The ability to resolve this therefore requires 1/5 of the letter's total arc, which in this case would be 1 minute. The significance of the 20/20 standard can best be thought of as the lower limit of normal or as a screening cutoff. When used as a screening test subjects that reach this level need no further investigation, even though the average visual acuity of healthy eyes is 20/16 to 20/12.
I rest my case :)
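The geometry in that quoted passage checks out numerically (a quick sketch; 20 feet = 240 inches):

```python
import math

distance_in = 20 * 12   # 20 feet in inches

# Size subtended by 1 arc minute at 20 feet:
one_arcmin_in = distance_in * math.tan(math.radians(1 / 60))
print(round(one_arcmin_in, 3))       # ~0.07 in, roughly 1/16 of an inch

# A 20/20 letter spans 5 arc minutes (3 limbs + 2 gaps):
print(round(5 * one_arcmin_in, 2))   # ~0.35 in
```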
 
you might wanna recheck the title of that report
Half of U.S. Adults Lack 20/20 Vision

Heres wiki's take (read citations if in doubt)


I rest my case :)

What case are you resting? The one where you missed the fact that the calculations that shifty was referencing are assuming an acuity of 20/12?
 
You rest your point? It's representative of people aged 20 or older. Prior to that the eyes are still undergoing changes, which is why laser eye surgery isn't generally done on people under 20. And as the eyes can continue to change for a time after 20, people still require screening from a laser eye specialist to make sure their eyes have reached a stable state before they can proceed with surgery.

Basically between the ages of 20 and 40 the human eye is at its most stable. Once past 40, the eyes start to suffer from Presbyopia, a condition that is pretty much unavoidable.

Hence it is no stretch to say that most people will have 20/20 vision or less. Unless you believe that people between the ages of 10-20 make up the majority of the population. :p This is, of course, compounded by the fact that prolonged use of computers, reading, or any other fixed distance viewing for prolonged periods of time will lead to degeneration of a person's vision over time. Those things just so happen to be quite prevalent in first world countries (computers, smartphones, tablets, etc. all contribute to this).

Regards,
SB
 
I rest my case :)
So 4k displays and super-high DPI phones are intended for the under-20s? The target audience for the ~£400 Galaxy S4 and Sony Xperia Z with their 440 dpi screens is 12 to 14 year olds? And UniversalTruth's LG 538 dpi phone is intended for 9 year olds to use held 5 inches from their face?

Furthermore, how many people complaining about seeing jaggies on 300 dpi phones are talking about seeing them in games and movies? Because a 4k TV isn't intended to display text at retina quality to be read from distance.
 
Shifty (and SilentBuddah), you're forgetting that people with bad vision, e.g. myself, wear glasses/contact lenses/get laser surgery to correct their vision to 20/20 or better.
I got some new ones 3 weeks ago in Chiang Mai (1/4 the price of NZ :) ); with these on, my vision was better than 20:20 (BTW, doctors etc. in Asia are better than Western ones, as they tend to listen to the customer; north Vietnam excepted). I tested this on the eye chart.
My point is we've got to stop using this 20:20 number as the holy grail of vision. I (who wear glasses) can spot the pixels in a retina display at the stated distance where you supposedly can't
 
I won't disagree. But at the same time, 20/20 is a good overall target. When you say you can spot the pixels on a retina display at the supposedly-can't distance, was that looking at a photograph or text/lines? Because minute details like that on a moving game or movie I'd hazard you couldn't notice. A laserbeam drawn at 60 fps with zero AA stepping at 1 arcminute resolution will, I'm confident, have the stepping invisible to pretty much everyone while playing the game.
 
Can't remember exactly, it was a while ago. I do remember looking at my first iOS game
https://itunes.apple.com/us/app/vampire-balls/id519606069?mt=8
and definitely noticing the pixels. Yes, it's harder when it's moving, but sometimes that makes them stand out even more, e.g. a polygon moving over a high-contrast background; the aliasing is very noticeable (at 30 cm). True, at 300 dpi it's way less apparent than at 150 dpi, but it's still there. I don't know what dpi you would need for it to disappear, 600 dpi perhaps?
 