Benefit of 1080p over 720p: Is it worth it?

One thing I notice while gaming on my PC is that whenever I use a resolution one step below my monitor's native resolution (1440*900), I can easily spot even the smallest of differences, but I failed to find much difference when I compared 720p with 1080p on my HDTV.

I remember once hooking up my PC to my 40 inch HDTV to do a comparison between 720p and 1080p. The games I used were Operation Flashpoint: DR, COD:MW2, Team Fortress 2 and a few others, and I was barely able to find any differences apart from minor hints of blur caused by upscaling, and even that wasn't very apparent until I did a closer inspection. I never tried to compare them again simply because it's a chore to set up my PC with my HDTV.

Like on the new Tomb Raider game that plays from a 3/4 perspective. I A/B'd the 360 version against the PC. The 360 has a very good upscaler, so you would think it would be close, but the PC version at native 1920x1080 was clearly better looking. Because of the camera view and distance, the PC version let you see the finer details of everything.
Well, the new Tomb Raider game isn't 720p on 360.
 
Not at all. My eyes are tested and I wear corrective glasses, so they're just fine. As are plenty of other people's. Lots of people watch DVDs on their HDTVs thinking they're getting HD, because they can't notice any difference. It's disingenuous to assume that when my friends and I sat down to watch a 1080p movie for the first time and failed to see any advantage, and subsequent movies too, we're all blind. We'll sometimes point out when we notice a difference, like Blade Runner, or Monsters, Inc., but it's not that often. Are you telling me that The Bourne Supremacy, with all those wild camera motions, looks better at 1080p than at standard resolution? It's just a blur!

I don't know, I was pretty impressed the first time I watched a Blu-ray, which was Apocalypto, on my new-at-the-time 42" HDTV. It was incredibly clear, like looking through a window. Not all Blu-rays seem to be as high quality though.
 
And I am not calling you blind; I just suggested your eyesight might not be perfect, which your glasses do suggest is true.
But faults in eyesight are addressed by the glasses. Chances are someone wearing glasses has better vision than someone who doesn't and has never had their eyes checked, because the latter will not be aware of the small focus deviations they might have. Camera manufacturers used to add a +1 dioptre focus adjustment to viewfinders, as it was considered that most people's eyes were a smidgen out. I don't imagine they do now, because they have built-in dioptre adjustment. Hence you're better off asking what specs wearers can see if you're after best-case evaluations!

Bourne isn't a tour de force of HD and not a great example.
I picked Bourne because its handheld-camera footage is constantly moving, meaning it's constantly blurring, hence the detail is lost: its fight scenes look no better in HD than in SD because they're all just a blur. Scale that back for other movies with camera pans and moving people, and there's often a degree of blur present that crosses pixel boundaries.
Pixar movies on my small 37 inch look fantastic. Yes, the low-res versions look nice, but not fantastic; there is a difference.
And I already mentioned the quality varies with scene and title, and HD is best suited to CGI because there are no optical artefacts from the camera!

How much you notice and care is clearly up to the person who is watching; that is the reality.
Yes, I agree. You can't say that someone who doesn't notice 1080p is better than SDTV on their TV is blind. There are other factors; half of it is personal perception. Don't ignore the actual limits of cinematography though. A lot of what you're seeing in a filmed movie isn't pixel-perfect sharp.

Might depend on the game. A game like Red Dead Redemption that has a long draw distance can always use more resolution to see the finer details far off.
In principle you'd think that, but if you have buckets of SSAA, that does a surprising job of providing the information if not the fidelity. As my dad pointed out 20 years ago, you can see a single hair on a person's head in an SDTV broadcast, a hair far finer than the pixels that make it up, because the eyes interpret the information so well. In HD it'll look crisper, but there is a surprising amount of detail that can be caught and presented. HD will help tiny details stand out, and it will give a sense of crispness versus the approximation of SD res, but you don't lose a great deal from the experience without it. Blade Runner clearly opens on a city of thousands of lights, with the Tyrell Corp HQ densely packed with hundreds of rooms. The lights may be sharper in HD, but that info isn't lost in SD. In a game situation, you don't need to see every single lightbulb in the distance, so just the impression of thousands of lights would work well enough. Especially when deliberate DOF is all the rage, and with cinematic cameras being modelled, some small amount of blurring of the background is probably seen as a good thing in some cases because it makes the foreground stand out.
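The way supersampling preserves sub-pixel features like that single hair can be sketched in a few lines. This is a toy box-filter downsample, not any particular GPU's resolve pass, and the 4x4 tile is a made-up example:

```python
# Toy sketch of ordered-grid supersampling: each output pixel averages
# an N x N block of sub-samples. A bright sub-pixel feature (the "hair")
# isn't dropped; it raises the average of the pixel containing it.

def downsample(img, factor):
    """Box-filter an image (list of rows of floats) by an integer factor."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 super-sampled tile: one bright "hair" sample on a dark background.
tile = [[0.0] * 4 for _ in range(4)]
tile[1][2] = 1.0
print(downsample(tile, 4))  # the feature survives as partial brightness
```

The feature contributes 1/16th of the final pixel's value instead of vanishing, which is the "information if not fidelity" point above.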

Still, it is fascinating to me how something so patently obvious to my eye, where I can easily tell the difference between DVD and Blu-ray from 15 feet away with one eye tied behind my back while someone repeatedly punches me in the head, is completely unnoticeable to others.
I had the same with framerate, seeing Age of Booty as clearly better in 720p than 1080p, but my friend not really noticing the difference and happy to stick to 1080p.

Incidentally, this is probably the same reason why abominations like QAA exist, since many, like you, probably can't tell the detail-loss difference anyway.
Ahem! I am no proponent of whole-image blurring! 720p with QAA lies between 720p and a lower resolution upscaled. That's not the same as the difference between crisp 1080p and 720p visuals, often viewed from a distance where the 1080p advantage falls below its theoretical maximum.

Let's be clear on my points here. For 720p versus 1080p, I'm saying the difference mostly isn't worth the cost as most people can't perceive it. For films, the difference is even less and often an SD movie doesn't look much worse than the HD version to many folk. That's two separate cases though. I'm not saying games should render in SD res, because that is clearly a disadvantage. Anyone who started this gen on an SDTV and bought an HD one will have seen the huge improvement. Warhawk is pretty rough on SD. Borderlands isn't a pinch as swish on an SD set. The eyes strain to see detail that isn't there.

So for games, IMO, 1080p ~ 720p in terms of resolving detail. And given the lower costs of rendering and opportunities to do more effects etc., 720p is the preferred design choice for any developer. We've even had discussions on this board about how good a game might look if it rendered at SD res and used all that processing power to make it as photorealistic as possible!
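For reference, the pixel-count gap behind that rendering-cost argument is simple arithmetic:

```python
# Pixel budgets behind the 720p-vs-1080p design trade-off.
pixels_720p = 1280 * 720      # 921,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

ratio = pixels_1080p / pixels_720p
print(ratio)  # 2.25: every 1080p frame shades 2.25x the pixels of a 720p frame
```

That 2.25x covers fill rate and per-pixel shading, which is the budget a developer can instead spend on effects at 720p.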

Does your circle of friends disagree, and do your developer mates all strive for 1080p native rendering because they feel it's the better experience? (Yeah, that's pretty much a rhetorical question. Anyone can look up the rendering resolutions thread to see 720p is the resolution of choice, with very few trying to provide 1080p options :p)
 
Who are these many folks you talk about? Do you have anything to back such a statement up, or is it just you and your friends?
Look at all the discussion we've had on this board over the years about BRD movies, for starters...
 
Heh, HD video is brought into the discussion. I was talking only about in-game HD graphics :p
 
I definitely notice a difference in rendered 3D graphics on my 46" (soon to be 55") from 2m. It's a bit harder to notice it in video...
 
One thing I notice while gaming on my PC is that whenever I use a resolution one step below my monitor's native resolution (1440*900), I can easily spot even the smallest of differences, but I failed to find much difference when I compared 720p with 1080p on my HDTV.

That's because computer LCDs typically have rather horrendous scalers, especially when you start going to the budget/low-cost LCDs where manufacturers are looking for 1-5 USD price advantages over their competition. The scaler generally tends to be one of the first items skimped on, and even more expensive LCDs quite often skimp on scaler quality, especially since LCD manufacturers assume that in the great majority of cases people will be operating at the native panel resolution.

TV manufacturers, on the other hand, have to assume that there is just as much, if not more, chance that people will be watching video sources at non-native resolution. Compare, for instance, the number of BRD or HD DVD players sold to the number of HDTVs sold. A lot of people will still be watching SDTV broadcasts. Some will still have VHS players for movies that have yet to be released on DVD, much less BRD. And a LOT will still be watching DVD. And then you have situations where the TV panel doesn't match up with any SD, ED, or HD signal.

So TVs generally have far, far better scalers than computer LCDs. But even there you can find quite some variety. Some of the better TVs have scalers so good that it would be extremely difficult to tell whether a signal is SD or HD, if not for the fact that most SD content will have black borders on the sides due to the 4:3 aspect ratio.
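As a rough illustration of why scaler quality matters, here is a toy 1D resampler; it is not any real TV's algorithm (real scalers use much larger polyphase filter kernels). Nearest-neighbour, which the cheapest scalers approximate, keeps hard blocky steps, while even simple linear interpolation spreads an edge smoothly across the new pixels:

```python
# Toy 1D scanline resampler: nearest-neighbour vs linear interpolation.

def resample(line, new_w, linear=True):
    """Resample a 1D scanline of floats to new_w samples."""
    old_w = len(line)
    out = []
    for x in range(new_w):
        src = x * (old_w - 1) / (new_w - 1)   # map output position to source
        i = int(src)
        if not linear or i >= old_w - 1:
            # Nearest-neighbour: snap to the closest source sample.
            out.append(line[round(src)] if not linear else line[i])
        else:
            # Linear: blend the two neighbouring source samples.
            t = src - i
            out.append(line[i] * (1 - t) + line[i + 1] * t)
    return out

edge = [0.0, 0.0, 1.0, 1.0]             # a hard edge in the source scanline
print(resample(edge, 7))                # linear: the edge ramps gradually
print(resample(edge, 7, linear=False))  # nearest: the edge stays a hard step
```

The linear result introduces an intermediate 0.5 sample at the edge, which on screen reads as a softer but less blocky transition; that trade-off is what separates a cheap scaler from a good one.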

I have quite successfully fooled people in RL into thinking they were watching a HD video stream when in fact they were watching a DVD. Even those that claim they can distinguish between 720p and 1080p, and then made the claim they would never, ever mistake DVD for HD. :D I've won a lot of money on those bets. ;)

Regards,
SB
 
I'd rather have a 720p image with 4X FSAA than a 1080p image without it. They could use some anisotropic filtering as well. I was watching my brother play Bioshock 2 the other day on 360, and wow, talk about aliasing and ugly texturing!

The next consoles had better stop neglecting AA and AF. I'd love to see them figure out shader aliasing too, maybe with that new MLAA tech.
 
Look at all the discussion we've had on this board over the years about BRD movies, for starters...

That just isn't good enough. The discussion on this board about BRD is filled with console-war fragments and the leftovers from the bloody format war. It's the last place I would look for any conclusive evidence for the idea that most won't notice the difference between SD and HD.
 
That's because computer LCDs typically have rather horrendous scalers
I'm using a Dell 24", and 720p content generally looks very good.
Since my system is a little dated, I often prefer to game in 1280x720 with tons of AA rather than the native 1920x1200.

I sit about 2 feet from this screen which gives me a horizontal viewing angle of about 47 degrees. I really wouldn't want to go any smaller than that. A 30" is definitely on the horizon.
 