I really don't care what your charts say, the difference is there, I can see it. Stop assuming that everyone has the same limitations because 'smart' people said so.
Then you need to go see a doctor or something, because you are a medical marvel. You have 10x the visual receptor density of everyone else. This isn't based on a chart derived from medical and scientific research, but on the actual research into optics and eyeball anatomy. Not research that measured 100 people and decided that because they couldn't see the difference between 1080p and 720p then no-one else can, but real research spread over decades into rods and cones and pupils and light and nerves and stuff.
Scientific fact of the same sort that says matter is made out of elements made out of electrons, neutrons, and protons. Scientific fact of the same sort that says humans produce energy by controlled oxidation of carbohydrates, producing CO2 and H2O and turning ADP into ATP. The same sort of science that means if someone posted on a PC forum that they had overclocked their i7 to 12 GHz using air cooling with the stock fan and heatsink at 2000 rpm, you'd reply that they hadn't, because that's scientifically impossible.
There are several possibilities for your being able to see the difference between 1080p and 720p on a 55" set at a distance of 20 feet.
1) Science is wrong, and the actual visual acuity of the human eye is 10x what the scientists say
2) You are a remarkable individual with 10x the visual acuity of everyone else
3) My maths is wrong and I've made a ridiculous cock-up
4) The difference you saw was imagined
5) The difference you saw was caused by something else
In response to these:
1) Science is often wrong, and I don't place 100% faith in it. If someone tells me they can hear radio on their fillings, I'd consider it a possibility and seek a proper test rather than just dismissing it out of hand. If someone tells me they can hear above 25kHz, I might believe them. In this case, though, the research is well documented and makes sense across a wide range of disciplines. I see no reason to doubt the visual acuity figure of 1 arcminute for the average Joe, or 0.4 on the more optimistic evaluation. If you want to convince me otherwise, you'll need a stronger argument than "well, I can".
2) Very unlikely, but not impossible. This could only be proven with a proper investigation.
3) Quite possible. I'll recheck now...
Using Pythagoras, 55" diagonal on a 16:9 screen gives:
(16x)^2 + (9x)^2 = 55^2
256x^2 + 81x^2 = 3025
337x^2 = 3025
x^2 = 8.9
x ~ 3 inches
Therefore 16 units across = 16 x 3 = 48"
Viewing at a distance of 20' = 240" straight on forms an isosceles triangle. Taking half of that triangle, we have a right-angled triangle of width 24" and height 240". The angle of that triangle is found with arctan(24/240) = 5.71 degrees.
Therefore the FOV of the display is 11.42 degrees.
11.42 degrees / 1920 pixels = 0.00595 degrees per pixel
11.42 degrees / 1280 pixels = 0.00892 degrees per pixel
Human visual acuity at 1 arcminute means 11.42 x 60 = 685.2 samples in that viewfield
Human visual acuity at 0.4 arcminute means 11.42 x 60 / 0.4 = 1713 samples in that viewfield
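(If anyone wants to re-run that working rather than trust my phone-calculator thumbs, here's a rough Python sketch of the same arithmetic; the small differences from the figures above just come from me rounding x to 3 inches.)

```python
import math

diag_in = 55.0            # screen diagonal in inches
dist_in = 20.0 * 12       # viewing distance: 20 feet in inches

# 16:9 panel, Pythagoras: (16x)^2 + (9x)^2 = diag^2
x = math.sqrt(diag_in**2 / (16**2 + 9**2))
width_in = 16 * x                                                # ~48"

# Horizontal field of view: two right triangles of half the width over the distance
fov_deg = 2 * math.degrees(math.atan((width_in / 2) / dist_in))  # ~11.4 degrees

for pixels in (1920, 1280):
    print(f"{pixels} px wide: {fov_deg / pixels:.5f} degrees per pixel")

for acuity_arcmin in (1.0, 0.4):
    samples = fov_deg * 60 / acuity_arcmin                       # resolvable samples across the FOV
    print(f"{acuity_arcmin} arcmin acuity: {samples:.0f} samples")
```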
Ah, hang on...
There's your problem; a fault in my initial maths. My bad. I was missing a decimal place in my theoretical limit calculation, quite possibly because I keep hitting the '.' instead of the '0' on this mobile phone calculator! The actual higher-end resolution is 0.007 degrees, not 0.07. That places 1080p above the threshold, but 720p below it. Although that still makes your eyesight remarkably good, and I'd like to see a test of what you can and can't differentiate, it's not the order of magnitude difference I thought.
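Put another way, here's that corrected threshold check in degrees rather than arcminutes (just the same numbers as above, hard-coding the 11.42 degree field of view):

```python
fov_deg = 11.42                  # horizontal field of view worked out above

acuity_avg_deg  = 1.0 / 60       # ~0.0167 degrees, average-Joe acuity
acuity_best_deg = 0.4 / 60       # ~0.0067 degrees, the optimistic limit (the "0.007" above)

pixel_1080p = fov_deg / 1920     # ~0.0059 degrees per pixel
pixel_720p  = fov_deg / 1280     # ~0.0089 degrees per pixel

# A pixel wider than the eye's limit can be resolved; a finer one can't.
print("720p pixels resolvable at the optimistic limit: ", pixel_720p  > acuity_best_deg)   # True
print("1080p pixels resolvable at the optimistic limit:", pixel_1080p > acuity_best_deg)   # False
```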
That's why ideas should be double-checked and independently verified. It also shows science can be trusted, but my maths can't.