720p vs 1080p on same size screen?

slapnutz

Regular
I know similar questions have been asked, but I was after a specific scenario.

You have two TVs. Both are 42" or 46". You sit 3-4 meters away.


Case #1
720p game native res > console outputs at 720p > 720p tv receives signal as 720p > Displays result

Case #2
720p game native res > console outputs at 720p > 1080p tv receives signal as 720p > TV upscales signal to 1080p > Displays result

Case #3
720p game native res > console upscales to 1080p > 1080p tv receives signal as 1080p > Displays result


Which would offer the best result? (I'm guessing #1.)

Also, how much worse would #2 and #3 be compared to #1? I guess I'm trying to gauge the quality of modern upscaling hardware.
 
You're asking a broad question about hundreds of displays that all give different results. No one can answer which would be best for every display. Not all displays are built the same; that's why you get different quality regardless of which of the scenarios above you pick. Bottom line is you have to go look yourself. Your eyesight is different from that of anyone here who would claim that what they say is fact as to which is better.
 
"Which is best?" depends on your definition of "best".
#1 is the most accurate representation of what the game is rendering.
The upscaler on the 360 is very high quality. From a signal-processing point of view, #3 with a 360 is likely the highest quality signal. I'm not familiar with the details of the upscaler in the PS3, but I believe it is effectively bilinear.
The upscalers that are built-in to TVs vary wildly. #2 may or may not be an improvement over letting the PS3 handle it.

Bottom line:
#1 if you know you are going to play lots of exactly 720p games and you are a pixel-accurate purist.
#3 if you are playing 360 games and your definition of a "nice" image favors smoothness over hard, square pixels.
#2 if you are going to play PS3 games and you know that the upscaler in the TV is very high quality (better than bilinear; see the sketch below for what bilinear actually does).
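
For anyone curious what "effectively bilinear" means in practice, here's a minimal sketch in Python/NumPy (my own illustration, not how any console or TV actually implements it). Each output pixel is just a weighted blend of its four nearest source pixels, which is cheap but tends to look soft:

```python
import numpy as np

def bilinear_upscale(img, new_h, new_w):
    """Naive bilinear resize of a 2D grayscale image (illustration only)."""
    old_h, old_w = img.shape
    # map each output pixel centre back into input coordinates
    ys = (np.arange(new_h) + 0.5) * old_h / new_h - 0.5
    xs = (np.arange(new_w) + 0.5) * old_w / new_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, old_h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, old_w - 2)
    wy = np.clip(ys - y0, 0, 1)[:, None]   # vertical blend weight per output row
    wx = np.clip(xs - x0, 0, 1)[None, :]   # horizontal blend weight per output column
    tl = img[y0][:, x0]                    # the four neighbouring source pixels
    tr = img[y0][:, x0 + 1]
    bl = img[y0 + 1][:, x0]
    br = img[y0 + 1][:, x0 + 1]
    top = tl * (1 - wx) + tr * wx
    bot = bl * (1 - wx) + br * wx
    return top * (1 - wy) + bot * wy

# e.g. a 720p frame blended up to 1080p
frame_720 = np.random.rand(720, 1280)
frame_1080 = bilinear_upscale(frame_720, 1080, 1920)
```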
 
Actually, #1 is probably the worst option, unless you have a cheap 1080p TV but a good 720p TV - like an Olevia 1080p TV and a 720p Samsung of this year's model (not saying Olevia is bad, but I'd take the LNxx450 Samsung over any Olevia 1080p TV). Otherwise, I would go with the 1080p TV.

Between 2 and 3, it depends on the quality of your scaler. If your TV has a better scaler than your hardware, then use your TV's scaler, or vice versa. Rarely will a TV's scaler be better than a good hardware scaler, though. But in most cases, it doesn't make much of a difference IMO.
 
Thanks guys...

Well, the scenario is it's a balance between watching heaps of 1080p movies and playing heaps of 720p games. Roughly 40% 1080p movies and 60% 720p games. (Although I'll probably spend a million hours with GT5 in 1080p if it ever comes out.)

Thus my definition of "best" is the least "fuzzy/blurry" picture. Basically, if I get a 1080p tv, I'm hoping to avoid getting crappy upscaling issues where the image is "blurred" or "fuzzy" for games. I know hardware scaling has come a long way over the last several years but I do have memories of seeing horrid, fuzzriffic upscaling with older LCD monitors.

Maybe it's simply a matter of getting over this fear, since my new Dell monitor upscales extremely well from 1680x1050 > 1920x1200.

I'll just have to get the salesmen to show me both setups, since I'm going to be spending my $1000+ there. However, it looks like 1080p is safe. (Looking at Sony/Panasonic/Pioneer LCDs/plasmas.)
 
If you're in North America, I highly recommend the Samsung LNxxA650 (European equivalent is the 656). It has a great scaler, and it's arguably the best LCD behind the A950 series (LED backlit Samsungs). They are dirt cheap now compared to when they were released, especially in the US.

BTW, the A650/A750/A850/A860 all use the same panel; the higher models just have extra goodies like built-in woofers, DLNA support, video over WiseLink, etc., but the PQ remains essentially the same.

The Sony Bravias (V, W, Z series) are also great TVs.
 
If the scaler chip applies a perfect interpolation/anti-aliasing filter, you should get exactly the same image in all of them (no quality decrease/increase). However, perfect filtering requires an infinitely long filter, which is not practical (computational and storage requirements, plus the image size is fixed). On the bright side, the farther you go from the pixel, the smaller the filter taps get, so it is possible to truncate the AA filter after some point without any noticeable effects.
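
To make that concrete, here's a rough sketch of a truncated (Lanczos-windowed) sinc resampler for one row of pixels, in Python/NumPy. This is just to illustrate the principle of cutting the ideal filter off after a few taps; it's not what any particular scaler chip does:

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """Ideal sinc filter windowed/truncated at +/- a taps (Lanczos-a)."""
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / a)
    out[np.abs(x) >= a] = 0.0
    return out

def resample_row(row, new_len, a=3):
    """Resample one scanline to new_len samples with the truncated filter."""
    old_len = len(row)
    scale = old_len / new_len
    out = np.empty(new_len)
    for i in range(new_len):
        x = (i + 0.5) * scale - 0.5            # output pixel centre in input coords
        left = int(np.floor(x)) - a + 1
        idx = np.arange(left, left + 2 * a)    # only 2a taps instead of "infinite"
        weights = lanczos_kernel(x - idx, a)
        idx = np.clip(idx, 0, old_len - 1)     # clamp at the edges
        out[i] = np.dot(weights, row[idx]) / weights.sum()
    return out

# e.g. stretch a 1280-pixel scanline to 1920 pixels (720p -> 1080p width)
line_1080 = resample_row(np.random.rand(1280), 1920)
```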

That being said, this all requires some amount of computational power. With resolutions like 1080p/720p, my guess is any scaler chip will use even more simplified interpolation algorithms.

If you ask me, I would pick #1, but as I said, if the scaler is doing a good job in interpolation/anti-aliasing filtering, you should get very close results to #1.
 
#1 is pretty much out of the question unless you have a TV that you can set to display pixel-for-pixel on smaller resolutions, in which case it will display 720p inside a large windowbox. Which is vastly inferior to either of the other two options, and I don't see any reason for anyone to do that aside from pixel-counting.

Remember that if your TV is 1080p, then everything it displays must, by definition, be 1080p. The upscaling has to happen somewhere, or else the image won't fit the whole screen. Nine times out of ten, it happens in the TV and you're none the wiser. So in the vast majority of situations, option #1 is really option #2. A lot of people don't realize that, which is why Best Buy can make so much money selling upscaling DVD players... they neglect to mention that the buyer's TV is already upscaling it, it's just a matter of which one is better.

Now, as for 2 vs 3, it depends on your setup. Based on personal viewing, I'd go for option #2, but that's partially because A) my X360, which uses option #3, is analog through VGA, while my PS3, which uses option #2, is digital HDMI, and B) my TV's got a pretty good scaler in it.
 
#1 is pretty much out of the question unless you have a TV that you can set to display pixel-for-pixel on smaller resolutions, in which case it will display 720p inside a large windowbox. Which is vastly inferior to either of the other two options, and I don't see any reason for anyone to do that aside from pixel-counting.

Remember that if your TV is 1080p, then everything it displays must, by definition, be 1080p.

Case #1
720p game native res > console outputs at 720p > 720p tv receives signal as 720p > Displays result.

Your point still stands of course.

I think he's wondering whether to get a 720p TV or a 1080p TV, and whether 720p content (games) would look better or worse on the 720p TV or on the 1080p TV.

If you've got the money, I think you should buy the 1080p TV, since 720p games upscaled to 1080p wouldn't suffer as much loss (if any) compared to playing 1080p content on a 720p TV.

If he's planning to use a PS3, there's something else to consider when choosing game resolution. If I'm not mistaken, unlike the Xbox 360, the PS3 uses a different base resolution for 720p games and 1080p games (like 1280x720 vs 960x1080), so it's going to be kinda subjective which game resolution would suit you better.
 
You have two TVs. Both are 42" or 46". You sit 3-4meters away..
At 3m from a 46" display, you'd need a bit better than 20/20 vision to fully resolve even just 720p. Viewed at a lesser distance, a higher resolution display can look better, assuming it has a good scaler. Also, hardly any TVs are actually 720p anyway; most so-called "720p TVs" are really 768p.
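
For reference, the arithmetic behind that claim (assuming a 16:9 panel and the usual rough figure of one arcminute for 20/20 acuity) works out like this:

```python
import math

diag_in    = 46.0
distance_m = 3.0
# height of a 16:9 panel with that diagonal, in metres (~0.57 m)
height_m = diag_in * 0.0254 * 9 / math.hypot(16, 9)
pixel_m  = height_m / 720                     # one 720p line
arcmin   = math.degrees(math.atan2(pixel_m, distance_m)) * 60
print(f"one 720p line subtends ~{arcmin:.2f} arcmin at {distance_m} m")
# ~0.91 arcmin, i.e. slightly finer than the ~1 arcmin that 20/20 vision resolves
```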
 
Thanks guys.

Just a question about TV scaling. Most HDTVs nowadays have a USB input for picture viewing. If I were to put a 1280x720 BMP on a key and view it on the TV, would that be a good way of practically judging the scaling quality?

e.g. two USB keys, both with 720p images. View them on a 720p and a 1080p TV side by side.
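
For what it's worth, a pattern with single-pixel detail makes any scaler softness easy to spot. Here's a minimal sketch of generating one with Pillow (the filename is made up, and keep in mind a TV's photo viewer may apply its own scaling or overscan, so it's only a rough proxy for the HDMI game path):

```python
from PIL import Image, ImageDraw  # Pillow

# crude 1280x720 test pattern: single-pixel stripes show scaler blurring clearly
img = Image.new("RGB", (1280, 720), "black")
draw = ImageDraw.Draw(img)

# alternating 1-pixel vertical lines on the left half
for x in range(0, 640, 2):
    draw.line([(x, 0), (x, 719)], fill="white")

# alternating 1-pixel horizontal lines on the right half
for y in range(0, 720, 2):
    draw.line([(640, y), (1279, y)], fill="white")

img.save("scaler_test_720p.bmp")   # copy this to both USB keys
```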
 