1080p smoke and mirrors?

geo said:
Well, what this guy seems to be asserting is that for "most" current 1080p sets they will show 1080i at worse quality than 720p sets show 1080i. I'm not buying it without more than he's shown us.

Best Buy has a couple 1080p panels in my local store, and surprisingly they were the 2 worst looking panels there. All the LCDs were fed the same signal, but these panels just couldn't seem to handle scaling things cleanly. Feed them a true 1920x1080 signal (like from a computer), and they look gorgeous. Anything else, and they don't look so hot.

I have a feeling that these panels could look great, but to make them affordable the manufacturers have to cut costs somewhere. And these costs are probably being cut in the scaling/deinterlacing/etc... department.

I'd actually had my mind set on a 1080p set until I compared them, but I went with a 37" 720p LCD instead. My chances of watching a real 1080p signal on this thing are almost nil, so it didn't make much sense to spend the extra money for it, especially since quality suffered with everything else.

Note: I only saw a couple screens, so who knows if what I saw was representative of the whole market. But hey, it is a real world example. :)
 
[Image: Pioneer PDP-5000EX plasma]


http://www.avland.co.uk/pioneer/pdp5000ex/pdp5000ex.htm
 
JBark said:
Best Buy has a couple 1080p panels in my local store, and surprisingly they were the 2 worst looking panels there. All the LCDs were fed the same signal, but these panels just couldn't seem to handle scaling things cleanly. Feed them a true 1920x1080 signal (like from a computer), and they look gorgeous. Anything else, and they don't look so hot.

I have a feeling that these panels could look great, but to make them affordable the manufacturers have to cut costs somewhere. And these costs are probably being cut in the scaling/deinterlacing/etc... department.

I'd actually had my mind set on a 1080p set until I compared them, but I went with a 37" 720p LCD instead. My chances of watching a real 1080p signal on this thing are almost nil, so it didn't make much sense to spend the extra money for it, especially since quality suffered with everything else.

Note: I only saw a couple screens, so who knows if what I saw was representative of the whole market. But hey, it is a real world example. :)

Anecdotal evidence isn't conclusive, but I actually take that report more seriously than the original link! So thanks for that. Won't be in the market for one anytime soon anyway... still happy with the two 720p sets (62" DLP and 42" LCD).
 
I've actually noticed that at Best Buy as well, but I assumed it was something with their signal. Often certain TVs look like crap, but I usually assume it's because they have too many sets and the signal amplification isn't set up right, or something like that. Of course, you know what they say about assumptions :)

Geo, I have found that site to have very good information about HDTV antennas, so I find it hard to see why you think the guy is simply blowing smoke.

I certainly understand a healthy skepticism, though, especially since he did not provide specific models or, more importantly, info on the deinterlacing units in the sets. If you come up with some questions you would like to ask, we could (read: I would) email him and ask; perhaps we could clear up some of the misconceptions.

If we want to do that though we should get supporting links for assertions such as "There is a 1080p transmission format" etc...
 
Well, I don't want to seem like I'm picking on you, since I just cuffed you on another thread :LOL: --but where did anyone deny there is a 1080p transmission standard?

I won't hold myself out as an expert --but why would a transmission standard for 1080p be needed in order for a 1080p TV to show something close to 1080p by de-interlacing 1080i at the receiving set? As I read his argument, it's not that something close to that could not be done, it's that the TV guys are cutting cost corners (on sets that often cost $3k or more, btw) to not do it. And that's where I get dissonance, particularly when he won't point for either brickbats or bouquets at specific manufacturers and models.

This is a pretty expensive market still (particularly 1080p sets), dominated by enthusiasts with enthusiast tastes and enthusiast budgets --somebody wouldn't want to compete on quality? And we couldn't find some way, therefore, to compare good to bad and create lists of models for savvy consumers to check before they go shopping? It doesn't pass the smell test for me (but then neither did the X1900XTX specs, and we saw I was utterly wrong on that. So it happens :LOL: ).

Edit: You can always just point him at this thread with a "I got a couple of loudmouthed know-it-alls named geo and london-boy who are talking smack on you. . .why don't you drop by and stomp 'em into jelly?" ;)
 
As for Europe, EICTA has defined the following standard signals over HDMI/DVI and/or component as mandatory for HD Ready certification:
- 720p50
- 720p60
- 1080i50
- 1080i60
The panel must have at least 720 physical lines of pixels, but the horizontal resolution is not defined.
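To summarise that mandatory list as a quick lookup (a throwaway sketch; the set and function names are my own, just for illustration):

```python
# Mandatory HD Ready input formats per the EICTA list above.
HD_READY_SIGNALS = {"720p50", "720p60", "1080i50", "1080i60"}

def accepts_for_certification(signal: str) -> bool:
    """True if the signal is on the mandatory HD Ready input list."""
    return signal in HD_READY_SIGNALS

print(accepts_for_certification("1080i50"))  # True
print(accepts_for_certification("1080p60"))  # False: 1080p is not required
```

Note that 1080p in any form is absent from the mandatory list, which matches the observation below that 1080p50/60 input support is rare.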

Also, I have found in my tests that most LCD TVs with HD Ready certification accept
- 1080p24
- 1080p25
- 1080p30

...via DVI or HDMI. Again, 1080p50 and 1080p60 are extremely rare. In this light, I don't think we'll see devices supporting only 1080p signaling in Europe.
 
(Sorry I missed some posts, I have a life outside cyberspace... ;))



Sxotty said:
I dislike how a lot of LCDs are some random resolution; that just seems silly to me. Why aren't they exactly 1080 or 720, and not some in-between resolution?

Well the typical LCD resolution for 32" to 37" sets these days is 1366x768 which is strange but really isn't "random". It actually makes sense in PC-monitor terms (it's the 16:9 ratio of a resolution with 768 vertical lines, so it's basically 1024x768 but with a perfect 16:9 ratio). It also handles 720p feeds perfectly, so it was never a problem.
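As a quick sanity check on that arithmetic (a throwaway sketch, nothing more):

```python
import math

# 1366x768 is just 1024x768's 768 lines widened to a 16:9 ratio:
# 768 * 16/9 = 1365.33..., which panel makers round up to 1366.
width = 768 * 16 / 9
print(width)             # 1365.333...
print(math.ceil(width))  # 1366
```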

Best Buy has a couple 1080p panels in my local store, and surprisingly they were the 2 worst looking panels there. All the LCDs were fed the same signal, but these panels just couldn't seem to handle scaling things cleanly. Feed them a true 1920x1080 signal (like from a computer), and they look gorgeous. Anything else, and they don't look so hot.

Without names, that example is just useless. For all we know, they were just cheap 1080p sets that are nothing more than high-res monitors with TV tuners inside.
Even a Samsung, as cheap as it is, would have decent enough processing to make everything look good, or at least as good as 720p sets. When you start looking at the Panny, Sony or Pioneer sets, it's a whole different story.
The signal fed to them was obviously crap, and the sets probably weren't set up properly, as they never are in stores.

Not talking about you in particular, but I find it quite miffing that some people who are against 1080p for obvious cost reasons feel the need, to make themselves feel better about not being able to get one, to spread misinformation about how "1080p sets are WORSE than 720p sets". That's absolutely ridiculous: 1080p will not make things worse, it will make them better. A 1080p set from a decent company will look better than their past 720p efforts, and that's all there is to it.

There is one set in Europe, the Philips 37" 9830 LCD, which has a full 1920x1080 resolution (progressive, since it's an LCD) but can't take a 1080p signal properly (yet). The set upscales everything to its native 1080p, but even when fed a 1080p signal it downscales to 1080i during processing and then upscales back to 1080p, since the screen itself runs at that resolution. Very silly, but what the hell, the image quality is absolutely amazing.
 
Mortimer said:
london-boy said:
Well the typical LCD resolution for 32" to 37" sets these days is 1366x768 which is strange but really isn't "random". It actually makes sense in PC-monitor terms (it's the 16:9 ratio of a resolution with 768 vertical lines, so it's basically 1024x768 but with a perfect 16:9 ratio). It also handles 720p feeds perfectly, so it was never a problem.

1366x768 handles 720p anything but perfectly. A stupid-ass resolution that's no good for anything (not even for PCs, where you'd need 1368x768) except bragging rights for more pixels.

You ALWAYS have to scale. Idiotic.

Mmm, all the 720p videos I've seen running on LCDs look absolutely beautiful. The scaling needed really isn't that much, and generally they all do a great job. Sure, you don't get 1:1 pixel mapping, but on video you just can't notice, even at close distances.
 
Actually, there is a display here in Europe that has a 1920x1080 resolution, can display 1080p60, has HDMI, and measures 37 inches: the Amoi LC37AF1E.
The price in Sweden: $2900 (including 25% tax).

A 42" screen is soon to be released.
 
Mortimer said:
london-boy said:
Well the typical LCD resolution for 32" to 37" sets these days is 1366x768 which is strange but really isn't "random". It actually makes sense in PC-monitor terms (it's the 16:9 ratio of a resolution with 768 vertical lines, so it's basically 1024x768 but with a perfect 16:9 ratio). It also handles 720p feeds perfectly, so it was never a problem.

1366x768 handles 720p anything but perfectly. A stupid-ass resolution that's no good for anything (not even for PCs, where you'd need 1368x768) except bragging rights for more pixels.

You ALWAYS have to scale. Idiotic.

No you don't. :) With a PC you can get 1366 by setting a 1368 mode with 1366 active pixels. :) For example, I have a Philips 26PF5520D that can show its native resolution with this trick, without any scaling or side effects. Of course, tuning that resolution with PowerStrip is a bit tedious, but it's really worth it.

Again, most of these TVs don't accept their native resolution over DVI/HDMI, so with those you can't do much about it. (For example, Sony Bravias and the cheaper Samsung models only accept 720p, 1080i or 1080p; everything else is centered inside the next bigger supported resolution and then pushed through the scaler, which makes 1366x768 look like a napkin with thick black bars on all sides: the image is centered in a 1920x1080 frame and then scaled to fit the screen.)
 
I just (a few weeks ago) got a 42" Westinghouse 1080p. You can get them at BB for $2500-ish, which makes them one of the most affordable 1080p monitors (that's right, no TV tuner at all, but I'm feeding it via a Comcast HD DVR that's doing all the tuning). These sets do accept 1080p over HDMI and DVI, and AVS Forum members have confirmed they accept a true 1080p signal (in fact there is a 45-page thread on this TV), while some of the other 1080p LCDs were not able to do this (Sharp). The set has a pretty good scaler, so most SD does not look bad, and the HD signals from Comcast really make this set shine. Last night I was watching ESPN HD and just could not believe how clear and sharp everything was. I could see sweat on the players... then it occurred to me: do I really want to see that? I have tried various resolutions and feeds (480p, 480i, 720p and 1080i... don't have any 1080p sources yet) and it's done a good job. This set does not have the best scaler, but it did very well and has gotten some praise for being able to deinterlace properly.

Anyway, the reason I brought it up is that not ALL LCDs are the same, and this one seems to give a great picture for the price... Playing GRAW on it has been loads of fun :)
 
JB, on the AVS forums the Westinghouse LCDs have a very good reputation overall, so it is little surprise that they work well.

london-boy said:
Not talking about you in particular, but I find it quite miffing that some people who are against 1080p for obvious cost reasons feel the need, to make themselves feel better about not being able to get one, to spread misinformation about how "1080p sets are WORSE than 720p sets". That's absolutely ridiculous: 1080p will not make things worse, it will make them better. A 1080p set from a decent company will look better than their past 720p efforts, and that's all there is to it.

The opposite is true as well: those who wasted money buying a 1080p set that does not properly do what it should, because it scales and swaps the signal around unnecessarily, try to say it is wonderful to justify the cost they have already paid. I promise the guy who wrote that post is not against 1080p for cost reasons; he is just saying they haven't reached the point where the premium is worth it. I could buy one right now if I wanted, but I see no reason to until they settle down a bit.

Although that may be a perfect 16:9 resolution, as you brought up, it is still stupid not to just use 720p; one-to-one pixel mapping is the way it should always work. Because you have different resolutions you obviously need to pick one, but picking one in between is asinine. All the signal manipulations are dumb as well; I really don't know why it seems to be so complicated. And I have seen plenty of 1080p sets from Panasonic, Pioneer etc. I think they look good, but I haven't seen anything that simply blew me away compared to 720p sets from similar manufacturers.
 
Sxotty said:
he is just saying they haven't reached the point where the premium is worth it. I could buy one right now if I wanted, but I see no reason to until they settle down a bit.

I agree with that 100%.

Although that may be a perfect 16:9 resolution, as you brought up, it is still stupid not to just use 720p; one-to-one pixel mapping is the way it should always work. Because you have different resolutions you obviously need to pick one, but picking one in between is asinine.

For PC use, it really makes no difference in terms of usability, as you can output that resolution (or 1360/1368) anyway. And in terms of IQ, I'd rather have those 48 extra vertical pixels and 86 extra horizontal ones if I'm using it with a PC. For video, it really makes no difference, as the originals (movies and other video signals) are already so "naturally antialiased", so to speak, that you will never see a benefit from 1:1 pixel mapping. Even pausing a 720p movie and looking for flaws or benefits from 1:1 pixel mapping will not give you much, if anything, to complain about.
Of course, many people won't agree with me.
 
Sxotty, are you reading my mind? :) That's exactly how I feel about it as well. Hell, I was 100% sure I was getting one of those 37" Westinghouse LCDs, but once I compared it to other screens, I really didn't see any benefit from the 1080p. Everything looked either the same or worse on the Westinghouse, probably due to a lower-quality scaler.

100% of broadcast HD is 720p or 1080i, and will be for the foreseeable future. We will get 1080p with Blu-ray (and later on with HD DVD), but there's no guarantee that studios will actually take advantage of it for the majority of their releases.

If you're using a 1080p screen hooked up to a computer, then I can definitely see the benefit. You get a whole lot more real estate for regular computer work, and you've probably got the hardware to do some really good scaling of movies/TV to the screen's native resolution.

But in the real world, increasing the number of pixels does not always mean better picture quality. I can tell that easily enough now in my own house: the Sanyo Z4 projector downstairs is 1280x720 at 104"; the LG LCD upstairs is 1366x768 at 37". Using my PC to display the same thing on both, the projector always looks better than the flat panel. It all comes down to things like scalers, contrast levels, etc.

What we're seeing here is just an extension of a debate that's been going on in the front-projection world for years. When you just watch DVDs, does it really matter what your projector's resolution is, as long as it's above 720x480? You can't display what doesn't exist, so will watching a DVD on an 854x480 projector be much different from watching the same movie on a 1280x720 projector?

Same debate, different resolutions. And of course there's really no right answer, but that won't stop people from giving one. :)
 
london-boy said:
For PC use, it really makes no difference in terms of usability, as you can output that resolution (or 1360/1368) anyway. And in terms of IQ, I'd rather have those 48 extra vertical pixels and 86 extra horizontal ones if I'm using it with a PC. For video, it really makes no difference, as the originals (movies and other video signals) are already so "naturally antialiased", so to speak, that you will never see a benefit from 1:1 pixel mapping. Even pausing a 720p movie and looking for flaws or benefits from 1:1 pixel mapping will not give you much, if anything, to complain about.
Of course, many people won't agree with me.
London-boy, btw, I agree, and at the moment I use a PC, but I just dislike the idea that if I was not using a PC I would see a degradation in quality.

As to whether people will see a problem: I probably would not, but those people who are nitpicking about ATI's and NVIDIA's AA quality should definitely be able to see it :)

My main beef is: are they spending more money to give us that unnecessary extra resolution? (And I don't mean 1080p, I mean some in-between resolution like 1366x768.) Or perhaps it has something to do with the manufacturing process itself; if that is the case, maybe they need a 36.8" LCD panel instead of 37".

If they use the 1366x768 resolution then you end up having to scale everything, and that seems silly to me. Using an exact 720p panel would mean less scaling, because at least one type of signal would fit it natively. When the jump to 1080p is made they will actually use that resolution on the panel, from what I know, but perhaps by then people will be making 1120p panels instead, who knows :)
 
Sxotty said:
London-boy, btw, I agree, and at the moment I use a PC, but I just dislike the idea that if I was not using a PC I would see a degradation in quality.

My point is that not using a PC wouldn't show a degradation in quality, for the reasons I gave (1:1 pixel mapping being kinda useless in HD movie playback).

As to whether people will see a problem: I probably would not, but those people who are nitpicking about ATI's and NVIDIA's AA quality should definitely be able to see it :)

Well, people picking nits about ATI and NVIDIA IQ will do the same on 1366x768 panels, but not because of the panels: all ATI and NVIDIA cards can output 1360x768 or 1368x768, giving you 1:1 pixel mapping with either 3 empty pixels on each side (big deal!) or a tiny bit of overdraw (2 pixels' worth). 1:1 pixel mapping is all that's needed to avoid scaling, and all PCs can output those resolutions, even without PowerStrip (my NV35 automatically set itself to 1360x768 when I plugged my PC into my TV; I didn't have to change anything).
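My guess (and it is only a guess) as to why drivers pick 1360 or 1368 rather than 1366: display timings generally want the active width to be a multiple of 8, and 1366 isn't. A quick sketch:

```python
# 1366 is not a multiple of 8, so drivers snap to the nearest widths
# that are: 1360 (6 px short, i.e. 3 empty pixels per side) or
# 1368 (2 px over, i.e. one pixel of overdraw per side).
def nearest_multiples_of_8(width):
    lower = width - width % 8
    return lower, lower + 8

print(nearest_multiples_of_8(1366))  # (1360, 1368)
```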

My main beef is: are they spending more money to give us that unnecessary extra resolution? (And I don't mean 1080p, I mean some in-between resolution like 1366x768.) Or perhaps it has something to do with the manufacturing process itself; if that is the case, maybe they need a 36.8" LCD panel instead of 37".


Well, apparently they're spending less: it's cheaper for them to produce panels that are basically the same as 1024x768 panels, just "longer". But I'm only speaking from what I've read in some articles. It would certainly explain why LCD screens have come down in price so much.


If they use the 1366x768 resolution then you end up having to scale everything, and that seems silly to me. Using an exact 720p panel would mean less scaling, because at least one type of signal would fit it natively. When the jump to 1080p is made they will actually use that resolution on the panel, from what I know, but perhaps by then people will be making 1120p panels instead, who knows :)

As i said many times, on PC inputs there will be no scaling, so there's no problem. On video input, it really is a non-issue.
1080p will really help because the scaling will be a bit more "normal". From SD (640x480), it is exactly 6.75 times the pixel count, which is a much "nicer" number than the roughly 3.415x you get going from 640x480 to 1366x768...
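Those ratios are pixel-count ratios, taking 640x480 as the SD frame; they are easy to check:

```python
# Pixel-count ratio between two resolutions, each given as (width, height).
def pixel_ratio(src, dst):
    return (dst[0] * dst[1]) / (src[0] * src[1])

sd = (640, 480)
print(pixel_ratio(sd, (1920, 1080)))  # 6.75 exactly
print(pixel_ratio(sd, (1366, 768)))   # 3.415, the awkward one
```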
 
Interesting thread :)

A few observations:

1366 panels
I agree with LB that the 1366x768 panels are cheaper, as the tooling is already in place for this "extension" of 1024x768, so it comes down to economies of scale. Why spend millions re-tooling for 1280x720 when in the near future you will need to re-tool again for 1920x1080?

Input Resolution
Many LCD TVs only accept 720p, 1080i or 1080p through HDMI, so if you want 1:1 pixel matching on a 1366x768 panel you have to use analogue VGA :(

Scalers
Scaling has come a very long way since LCDs first hit the market. I remember back in '98 using a 21" ($4K) NEC LCD monitor which was OK in its native res (I think 1280x1024), but at any other res (1024x768, 800x600) even the desktop was unacceptable, with blurring and ghosting.

Taking a 720p input to a 1366x768 panel and scaling it to 768p is apparently quite a simple job, which would explain why most LCD TVs look very good once set up. The difficult part is deinterlacing interlaced source material such as PAL and 1080i. Currently nearly all LCD scalers do a simple "bob" hack, which results in 1080i being displayed with a vertical resolution of just 540 lines :rolleyes:
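To illustrate why that hack halves the vertical resolution: a toy "bob" deinterlacer (frames modelled as plain lists of rows, purely hypothetical) keeps only one 540-line field and line-doubles it back up to 1080 rows:

```python
# Toy "bob" deinterlace: keep one field (every other line) of a
# 1080-line interlaced frame, then double each line to fill 1080 rows.
# The other field's 540 lines of detail are simply thrown away.
def bob_deinterlace(frame, field=0):
    kept = frame[field::2]                        # 540 of the 1080 rows
    return [row for row in kept for _ in (0, 1)]  # line-doubled output

frame = [f"line{i}" for i in range(1080)]
out = bob_deinterlace(frame)
print(len(out))       # 1080 rows of output...
print(len(set(out)))  # ...but only 540 unique lines of detail
```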

Apparently, if you want fully resolved 1080i on a 1366x768 panel (or indeed a 1280x720 panel) you need a rather special video scaler from the likes of Lumagen or similar, which retail in the region of £1K+ :(
I haven't read anything about the scalers in the few 1080p panels available in the UK so far, but I would expect they are not that great either, with the high cost being down to the panel.

NOTE: I do not actually own an LCD TV yet, but I have been doing some research for the last 6 months or so, and this is what I have gathered from people who are much more knowledgeable than me in these matters ;)

Personally I think I'm going to hold out until the 40" Sony X series comes out; also, the soon-to-be-released Toshiba 40WLT66 and 40WL66 are supposed to be 1080p panels, but whether they are capable of accepting 1080p input is unknown.

Just my 2p worth.
 
sir doris said:
Input Resolution
Many LCD TVs only accept 720p, 1080i or 1080p through HDMI, so if you want 1:1 pixel matching on a 1366x768 panel you have to use analogue VGA :(

Yep, true. Most LCD TVs only accept "video" signals over HDMI (720p, 1080i etc.), meaning all the normal PC resolutions don't work properly.
VGA input is very good on all LCD TVs though. On my Samsung (not even top of the range), it's absolutely perfect at 1360x768.

NOTE: I do not actually own an LCD TV yet, but I have been doing some research for the last 6 months or so, and this is what I have gathered from people who are much more knowledgeable than me in these matters ;)


Heh, I bought my LCD TV in February and I did so much research I thought I was going crazy!! :D

Personally I think I'm going to hold out until the 40" Sony X series comes out; also, the soon-to-be-released Toshiba 40WLT66 and 40WL66 are supposed to be 1080p panels, but whether they are capable of accepting 1080p input is unknown.

Oh, I'm definitely selling my set when 1080p panels come down in price, and maybe even earlier than that. I feel like I need a bigger set... 32" just doesn't cut it, even though I sit quite close to it... :LOL:
 
LB: quick question re. your Sammy; have you tried connecting your PC via HDMI, and if so, what was the effect of the internal scaler on 720p games and the desktop?

Thanks
 