More pixels or prettier pixels?

https://uihc.org/health-topics/what-2020-vision

Only about 35% of adults have 20/20 vision without glasses, contact lenses or corrective surgery.
Which spectacles-wearing people choose to watch TV without their eyewear?! You get your eyes corrected and then see the world in >20/20 vision (for a bit). What proportion of people don't have >20/20 vision when using whatever standard visual aids they have?
 
The article says 25% of adults don't have 20/20 vision even with corrective measures. Besides, not all of the remaining 75% of adults have better than 20/20 vision (some have exactly 20/20, and some have better).

So this is why 1080p is sufficient for many people at a certain distance. 1080p with better quality/framerate and 4K with more detail are both important for players.

What are you trying to argue? Do you think multiple display modes are unnecessary in next-gen? Or do you think 1080p is not sufficient as the baseline for next-gen consoles?
 
I understood you to be saying that 1080p was good enough and people didn't gain from higher resolutions, based on a commonly referred-to graph of distance/resolution/benefit. You didn't qualify it as 'some people' or a percentage.

My argument is just to point out that 20/20 vision doesn't denote perfect vision and shouldn't be the reference point, so that chart shouldn't be considered an end to such discussions. There is value in higher resolutions.
 
Without a doubt prettier.

For me, I'd take 1080p/24fps with this* quality a thousand times over 4K/60fps shaders.

*
You are assuming that particular trade off is possible at all.

Furthermore, it is interesting that you select 1080p/24fps as your standard, a framerate that requires careful scene planning to work even in films. Those conditions do not apply to games.

My vote is, for gaming, to apply processing power where it makes the most sense for game play. (Cut scenes of course being their own domain as far as tradeoffs are concerned.)

Note - that can mean resources can also be moved from "pretty/many pixels" to "interactive objects" or whatever else the developers deem important. Rendering quality simply isn't the main consideration determining the game play experience.
 

Very interesting points, and I agree with most of them. (Y)

Find one word where I said, or would have claimed, that it is possible?

What I said is that I prefer this quality a thousand times over 4K/60fps shaders, plain and simple.

And someday it may be possible... who can say with absolute certainty that in the coming years we won't have this quality in a game? At least in scripted games like Heavy Rain, or corridor games like Resident Evil 2 Remake, or even beyond?

Did you predict that NVIDIA would release RT hardware this year (counting tensor cores, cache, RT cores... maybe 50% to 2/3 of the total transistors dedicated to RT)? We need to remember that in March NVIDIA needed four Volta GPUs to reach the same result as one Turing card. What will come next in the coming years?

However, I agree with the part about needing evolution in gameplay, because in fact the graphics side has been developed more than physics, gameplay and AI.
 
USA sample of adults aged 18 to 79: https://www.cdc.gov/nchs/data/series/sr_11/sr11_003.pdf

median visual acuity, unaided: 20/19
median visual acuity, aided: 20/16.5

Well, we can discard the 20/19 figure, because if you need glasses to watch TV then you will normally wear glasses to watch TV, so in the USA the median visual acuity is 20/16.5.
These numbers seem to have been collected in 1962 (56 years ago). Can they represent visual acuity in recent years?


I understood you to be saying that 1080p was good enough and people didn't gain from higher resolutions, based on a commonly referred-to graph of distance/resolution/benefit. You didn't qualify it as 'some people' or a percentage.

My argument is just to point out that 20/20 vision doesn't denote perfect vision and shouldn't be the reference point, so that chart shouldn't be considered an end to such discussions. There is value in higher resolutions.
No, I agree that people can see more detail at higher resolutions if they have better vision or if they sit closer to the TV. So we need multiple display modes: 4K for those who prefer more detail, and 1080p for players who prefer better framerate/visual quality.



You should not just make things up and claim them as facts. You have to do actual studies on actual human beings to see whether people can see a difference with higher-resolution images at a given screen size/distance. None of that is presented in that article.
These results are deduced from the scientific definition of visual acuity. Therefore, people with 20/20 vision cannot distinguish the pixels of a 50" 1080p screen at 1.93 m. That follows from the definition.
https://en.wikipedia.org/wiki/Visual_acuity
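To make the "it follows from the definition" point concrete, here is a minimal back-of-the-envelope sketch, assuming the conventional 1-arcminute criterion for 20/20 acuity and a 16:9 panel. It lands near the ~1.93 m figure quoted above; the small gap comes down to how the pixel pitch and acuity threshold are rounded.

```python
import math

# Minimal sketch, assuming the conventional definition that 20/20 acuity
# resolves detail subtending 1 arcminute, and a 16:9 pixel grid.
ARCMIN_RAD = math.radians(1 / 60)  # one arcminute, in radians

def max_resolving_distance_m(diagonal_inches, width_px=1920, height_px=1080):
    """Distance (metres) beyond which a 1-arcminute eye can no longer
    separate adjacent pixels of the given screen."""
    diagonal_px = math.hypot(width_px, height_px)              # pixels along the diagonal
    pixel_pitch_m = (diagonal_inches * 0.0254) / diagonal_px   # size of one pixel, in metres
    return pixel_pitch_m / math.tan(ARCMIN_RAD)

print(f"{max_resolving_distance_m(50):.2f} m")  # ~1.98 m for a 50-inch 1080p panel
```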
 
Can they represent visual acuity in recent years?
Prolly not. I assume ppl's uncorrected eyesight is worse now, due to more use of screens etc.,
but I assume ppl's corrected eyesight is better, due to advances in tech.

so prolly something like
median visual acuity, unaided: 22/20
median visual acuity, aided: 20/15

Also keep in mind this ignores ppl under 18, who make up a larger percentage of game players; their eyesight would be at the better end of the spectrum. If I were to hazard a guess, the average game player's eyesight would be ~20/13. In fact, I am willing to put money on it being far closer to that than to 20/20.
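For a sense of scale, here is a rough sketch of what those Snellen values would mean for the 50" 1080p example a few posts up. It assumes the usual 1-arcminute-at-20/20 convention, linear scaling with the Snellen fraction, and my own ~1.98 m base distance; none of these numbers come from the CDC report.

```python
# Rough sketch: how far out each Snellen value could still separate the pixels
# of a 50" 1080p panel. Assumes 20/20 = 1 arcminute, linear scaling with the
# Snellen fraction, and a ~1.98 m base distance (my own estimate).
BASE_20_20_DISTANCE_M = 1.98

for label, denominator in [("20/20", 20.0), ("20/16.5", 16.5), ("20/15", 15.0), ("20/13", 13.0)]:
    # A 20/x observer resolves roughly x/20 of an arcminute, so the
    # pixel-resolving distance scales by 20/x.
    distance_m = BASE_20_20_DISTANCE_M * 20.0 / denominator
    print(f"{label}: pixels distinguishable out to ~{distance_m:.2f} m")
```

If the median gamer really is closer to 20/13 than 20/20, that pushes the "can't tell the difference" distance out by roughly half again, which is the whole point about 20/20 not being the right reference.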
 
I feel there is a sweet spot of resolution and pretty pixels, and I feel that next gen it might be something like 2K with upscaling.
 
Prolly not. I assume ppl's uncorrected eyesight is worse now, due to more use of screens etc.
Relevant...

https://www.independent.co.uk/news/...children-outside-more-guangzhou-a8433331.html
Failing to act put children at increased risk of going blind, according to researchers who observed around 4,700 infants from primary and junior high schools in Guangzhou, China. They found 12 per cent of children in grade one, who were around the age of seven, were nearsighted. This rose to 67 per cent by grade seven, when the children are around 13-years-old.
 
Isn't this mostly academic? Once consoles have gone 4K, they won't go back. But like the phone resolution wars that finally ended, 4K TVs will likely stay mainstream for a long time, even into gen 9 and 10, so we could finally see generation transitions that increase graphics with no resolution increase.

Imagine what the XBO X could do if devs built something from the ground up at 1080p. We could see that actually happen when we go from 4K -> 4K across PS5 and PS6. So just be patient.
 
Too bad PS5 to PS6 may be a time when silicon can't give the same 10x performance improvements we used to get decades ago.
 
Yes, I've been to China; a very high percentage of ppl wear glasses, and yes, I'm sure this is getting worse. But like I also said, I'm sure they can correct vision much better than they could 50 years ago. I know from my own experience 30 years ago: it's far more tech-orientated now. 30 years ago it was an eye chart and that was it; now you have machines that measure etc.
 
Too bad PS5 to PS6 may be a time when silicon can't give the same 10x performance improvements we used to get decades ago.
This is very true.
I’ll submit again that efficiency will win out over time. As advances in computational resources slow down, cleverness in terms of how you use those resources to achieve a given visual impression will be increasingly important. (Mobile being an ever larger part of the total computing economy will reinforce that.)
 
I'd prefer it if developers focused on 1080p for 3D.

I wonder how much more complex the shaders, lighting and number of objects could be versus native 4K with the same limited resources (rough pixel-budget sketch at the end of this post).

Shouldn't this be easy to gauge for people who do some basic 3D game development?

I'd love to see comparisons.

Please don't mention shadows going from something like "medium" to "ultra" or throwing in another AA multiplier. I somewhat think those are lazy approaches compared to what a 3D game dev might show by focusing on more assets, better shaders, etc. I mean something more obvious as eye candy, like the old PhysX fur/particles at 1080p vs no PhysX at 4K, where the difference is more obvious than slightly better shadows/AA.

I am by no means an expert though.

All I remember is that someone arguing with me, who said they were an experienced programmer, claimed that a focused 1080p, rather than 4K with the same resources, won't result in much better visuals, because architecture is a huge factor.
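As promised above, here is the zeroth-order pixel-budget arithmetic, with heavy caveats: the 33.3 ms frame budget and the purely linear scaling with pixel count are my assumptions, and real frames include work that doesn't scale with resolution (geometry, shadow maps, CPU/driver overhead), which is presumably part of the "architecture is a huge factor" point.

```python
# Zeroth-order pixel-budget arithmetic, purely illustrative. Assumes shading
# cost scales linearly with pixel count and ignores resolution-independent
# work, so the real-world gain from dropping to 1080p is smaller than shown.
FRAME_BUDGET_MS = 33.3  # one frame at 30 fps

resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K": 3840 * 2160,
}

pixels_4k = resolutions["4K"]
for name, pixels in resolutions.items():
    per_megapixel_ms = FRAME_BUDGET_MS / (pixels / 1e6)  # budget per million pixels
    relative_budget = pixels_4k / pixels                  # headroom vs native 4K
    print(f"{name}: {pixels / 1e6:.2f} MPix, {per_megapixel_ms:.1f} ms/MPix "
          f"({relative_budget:.2f}x the per-pixel budget of native 4K)")
```

So the raw headroom at 1080p is about 4x per pixel, but how much of that shows up on screen depends on how much of the frame was pixel-bound in the first place.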
 