And I am not calling you blind, I just suggested your eyesight might not be perfect, which your glasses do suggest is true.
But faults in eyesight are addressed by the glasses. Chances are someone wearing glasses has better vision than someone who doesn't wear them and has never had their eyes checked, because the latter won't be aware of the small focus deviations they might have. Camera manufacturers used to add a +1 dioptre focus adjustment to viewfinders as it was considered most people's eyes were a smidgeon out. I don't imagine they do now, because viewfinders have built-in dioptre adjustment. Hence you're better off asking what specs wearers can see if you're after best-case evaluations!
Bourne isn't a tour de force of HD and not a great example,
I picked Bourne because its handheld-camera footage is constantly moving, meaning it's constantly blurring, hence the detail is lost: its fight scenes look no better in HD than SD because they're all just a blur. Scale that back for other movies with camera pans and moving people, and there's often a degree of blur present that crosses pixel boundaries.
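To put a rough number on that, here's a toy sketch (pure illustration; the helper names box_blur_1d and down_up are hypothetical) of why blur eats the resolution advantage: once the blur spans several pixels, halving the resolution and scaling back up loses almost nothing.

```python
import numpy as np

def box_blur_1d(row, width):
    """Simple horizontal box blur, standing in for camera motion blur."""
    kernel = np.ones(width) / width
    return np.convolve(row, kernel, mode="same")

def down_up(row, factor):
    """Downsample by averaging, then upsample by repetition (a crude SD round trip)."""
    trimmed = row[: len(row) // factor * factor]
    low = trimmed.reshape(-1, factor).mean(axis=1)
    return np.repeat(low, factor)

rng = np.random.default_rng(0)
sharp = rng.random(1920)           # one scanline of fine, pixel-level detail

for blur in (1, 3, 7):             # 1 = no blur, 7 = heavy motion blur
    src = box_blur_1d(sharp, blur)
    err = np.abs(src - down_up(src, 2)).mean()
    print(f"blur width {blur}: info lost by halving resolution = {err:.4f}")
```

The printed loss shrinks as the blur widens, which is the Bourne fight scene in miniature: the extra pixels are there, but the camera already threw the detail away.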
Pixar movies on my small 37 inch look fantastic. Yes, the low-res versions look nice, but not fantastic; there is a difference.
And I already mentioned the quality varies with scene and title, and HD is best suited to CGI because there are no optical artefacts from the camera!
How much you notice and care is clearly up to the person who is watching; that is the reality.
Yes, I agree. You can't call someone blind just because they don't notice that 1080p is better than SDTV on their TV. There are other factors; half of it is personal perception. Don't ignore the actual limits of cinematography though: a lot of what you're seeing in a filmed movie isn't pixel-perfect sharp.
Might depend on the game. A game like Red Dead Redemption that has a long draw distance can always use more resolution to see the finer details far off.
In principle you'd think that, but if you have buckets of SSAA, that does a surprising job of providing the information if not the fidelity. As my dad pointed out 20 years ago, you can see a single hair on a person's head in an SDTV broadcast, a hair far finer than the pixels that make it up, because the eyes interpret the information so well. In HD it'll look crisper, but there is a surprising amount of detail that can be caught and presented. HD will help tiny details stand out, and it will give a sense of crispness versus the approximation of SD res, but you don't lose a great deal from the experience without it.

Blade Runner clearly opens on a city of thousands of lights, with the Tyrell Corp HQ densely packed with hundreds of rooms. The lights may be sharper in HD, but that info isn't lost in SD. In a game situation, you don't need to see every single lightbulb in the distance, so just the impression of thousands of lights would work well enough. Especially when deliberate DOF is all the rage and modelling cinematic cameras is fashionable, some small amount of blurring of the background is probably seen as a good thing in some cases because it makes the foreground stand out.
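The single-hair point is just supersampling in action: a feature thinner than a pixel still shifts that pixel's colour when the samples are averaged, so the eye gets the information even without the fidelity. A minimal sketch, with illustrative numbers:

```python
import numpy as np

# One output pixel covered by a 4x4 grid of SSAA samples.
# A "hair" one sample wide crosses the pixel: far finer than the pixel itself.
samples = np.ones((4, 4))      # white background
samples[:, 1] = 0.2            # dark hair, roughly a quarter of a pixel wide

pixel = samples.mean()         # SSAA resolve: average all the samples
print(pixel)                   # 0.8 instead of 1.0: the hair darkens the pixel,
                               # so a sub-pixel feature still reaches the eye
```

The hair never resolves as a crisp line, but its presence is encoded in the pixel colour, which is exactly the "information if not the fidelity" trade.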
Still, though, it is fascinating to me how something so patently obvious to my eye, where I can easily tell the difference between DVD and Blu-ray from 15 feet away with one eye tied behind my back while someone repeatedly punches me in the head, is completely unnoticeable to others.
I had the same with framerate: I saw Age of Booty as clearly better in 720p than 1080p, but my friend didn't really notice the difference and was happy to stick with 1080p.
Incidentally, this is probably the same reason why abominations like QAA exist, since many like you probably can't tell the detail-loss difference anyway.
Ahem! I am no proponent of whole-image blurring! 720p with QAA lies somewhere between native 720p and a lower resolution upscaled. That is not the same as the difference between crisp 1080p and crisp 720p visuals, which are often viewed from a distance where the 1080p advantage falls well below its theoretical maximum.
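For anyone wondering why QAA softens the whole frame: the quincunx resolve weights each pixel's centre sample at 1/2 and four corner samples, shared with neighbouring pixels, at 1/8 each, so every pixel bleeds into the next. A rough sketch of that resolve (the function name and the toy edge data are my own; treat it as an approximation of the pattern, not the hardware's exact implementation):

```python
import numpy as np

def quincunx_resolve(centres, corners):
    """Quincunx resolve: centre sample weighted 1/2, the four surrounding
    corner samples (shared with neighbouring pixels) weighted 1/8 each.
    `corners` has one extra row/column so every pixel sees four corners."""
    h, w = centres.shape
    out = 0.5 * centres
    out += 0.125 * (corners[:h, :w] + corners[:h, 1:w + 1] +
                    corners[1:h + 1, :w] + corners[1:h + 1, 1:w + 1])
    return out

# A hard vertical edge: crisp, pixel-aligned detail.
centres = np.zeros((4, 4)); centres[:, 2:] = 1.0
corners = np.zeros((5, 5)); corners[:, 2:] = 1.0   # corner samples see the same edge

print(quincunx_resolve(centres, corners))
# Pixels beside the edge land between 0 and 1: the shared corner taps
# smear the edge, trading crispness for smoothness across the whole frame.
```

Even a perfectly aligned edge comes out with a grey band next to it, which is exactly the whole-image softening people object to.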
Let's be clear on my points here. For 720p versus 1080p, I'm saying the difference mostly isn't worth the cost, as most people can't perceive it. For films, the difference is even smaller, and often an SD movie doesn't look much worse than the HD version to many folk. Those are two separate cases though. I'm not saying games should render at SD res, because that is clearly a disadvantage. Anyone who started this gen on an SDTV and bought an HD one will have seen the huge improvement. Warhawk is pretty rough in SD. Borderlands isn't anywhere near as swish on an SD set. The eyes strain to see detail that isn't there.
So for games, IMO, 1080p ~ 720p in terms of resolving detail. And given the lower costs of rendering and the opportunities to do more effects etc., 720p is the preferred design choice for any developer. We've even had discussions on this board about how good a game might look if it rendered at SD res and used all that processing power to make it as photorealistic as possible!
Does your circle of friends disagree, and do your developer mates all strive for 1080p native rendering because they feel it's the better experience? (Yeah, that's pretty much a rhetorical question. Anyone can look up the rendering resolutions thread to see 720p is the resolution of choice, with very few trying to provide 1080p options.)