The maths isn't too hard, although you need to know the human field of view/focal length and so on. Basically, divide the width of your screen in mm by its horizontal resolution to get the size of a pixel in mm. Then position yourself (working back from the eye's focal length) so that each pixel occupies the same amount of retina. The end result would be the 1080p set occupying a larger field of view than the 720p set, with the same size pixels on the retina and the same effective resolution.

How could the resolution be the same, when the 1080p set has ~2x the pixels?
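A minimal sketch of that calculation, assuming a 32" 720p panel (~708 mm wide) and a 42" 1080p panel (~930 mm wide) as hypothetical examples, using the small-angle relationship between pixel pitch and viewing distance:

```python
import math

def pixel_pitch_mm(screen_width_mm, horizontal_pixels):
    """Physical width of one pixel in millimetres."""
    return screen_width_mm / horizontal_pixels

def pixel_angle_arcmin(pitch_mm, distance_mm):
    """Visual angle one pixel subtends at the eye, in arcminutes."""
    return math.degrees(math.atan2(pitch_mm, distance_mm)) * 60

pitch_720  = pixel_pitch_mm(708, 1280)    # ~0.55 mm per pixel
pitch_1080 = pixel_pitch_mm(930, 1920)    # ~0.48 mm per pixel

d_720  = 2000.0                              # pick any distance for the 720p set (mm)
d_1080 = d_720 * (pitch_1080 / pitch_720)    # scale distance with pitch -> same angle per pixel

print(pixel_angle_arcmin(pitch_720, d_720))     # ~0.95 arcmin per pixel
print(pixel_angle_arcmin(pitch_1080, d_1080))   # same ~0.95 arcmin per pixel, but the 1080p
                                                # set fills a wider total field of view
```

So each pixel lands on the same patch of retina in both cases; the 1080p set just spends its extra pixels on a wider picture rather than a sharper one.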
Is there some kind of "formula" that relates visual acuity and pixel size, but also includes distance as a parameter?
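The usual rule of thumb is that 20/20 vision resolves detail down to roughly 1 arcminute, so the "formula" is just the visual angle a pixel subtends at a given distance, compared against that limit. A sketch, with the screen width and distance as made-up example numbers:

```python
import math

ARCMIN_20_20 = 1.0   # rough rule of thumb: 20/20 vision resolves ~1 arcminute of detail

def pixel_angle_arcmin(screen_width_mm, horizontal_pixels, distance_mm):
    """Visual angle of one pixel, in arcminutes, at a given viewing distance."""
    pitch = screen_width_mm / horizontal_pixels
    return math.degrees(2 * math.atan(pitch / (2 * distance_mm))) * 60

def pixels_visible(screen_width_mm, horizontal_pixels, distance_mm,
                   acuity_arcmin=ARCMIN_20_20):
    """True if individual pixels subtend more than the eye's resolving angle."""
    return pixel_angle_arcmin(screen_width_mm, horizontal_pixels, distance_mm) > acuity_arcmin

# e.g. a ~930 mm wide 42" 1080p set viewed from 2.5 m:
print(pixel_angle_arcmin(930, 1920, 2500))   # ~0.67 arcmin, below the acuity limit
print(pixels_visible(930, 1920, 2500))       # False: you can't pick out single pixels
```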
The problem here is we're changing two variables, resolution and size. If the 1080p screens weren't getting larger, they'd offer more resolution per square inch and so be higher fidelity. But as the screens get larger, you trade that fidelity for size. Then we throw in viewing distance as a third variable, and basically all attempts at objective analysis go flying out the window!
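One way to keep the comparison objective despite the three interacting variables is to collapse them into a single figure of merit: pixels per degree of visual field. A sketch with hypothetical screen sizes, showing how a bigger 1080p set at the same distance ends up roughly where a smaller 720p set started:

```python
import math

def pixels_per_degree(horizontal_pixels, screen_width_mm, distance_mm):
    """Angular resolution: how many pixels fit into one degree of visual field."""
    pitch = screen_width_mm / horizontal_pixels
    degrees_per_pixel = math.degrees(2 * math.atan(pitch / (2 * distance_mm)))
    return 1 / degrees_per_pixel

# Same 2.5 m viewing distance, three hypothetical sets:
configs = [
    ("32\" 720p",  1280, 708),    # smaller screen, fewer pixels
    ("32\" 1080p", 1920, 708),    # same size, more pixels -> higher fidelity
    ("48\" 1080p", 1920, 1063),   # more pixels but bigger -> the gain is spent on size
]
for name, pixels, width_mm in configs:
    print(f"{name}: {pixels_per_degree(pixels, width_mm, 2500):.0f} pixels/degree")
```

Here the 48" 1080p set works out to about the same pixels per degree as the 32" 720p one, which is exactly the trade-off described above.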