Panel Resolution Upscale Spinoff

I figure that 1366x768 was settled on by all parties for a wide range of reasons, not the least of which being the goal shared by TV makers and end users alike: to obtain the best results they can while spending as little as reasonably possible. The 1366x768 display resolution fits into this nicely because that is just about as much resolution as 20/20 eyesight can resolve when seated at a normal distance from a given size of display.

For example, with a viewing distance of around 10', a person will likely choose a display size of about 50". At that distance from that size of display, 20/20 vision will resolve a bit more than 720p, but anything beyond 768p at 10' and beyond on a 50" display will effectively be downsampled by the eyes anyway. And of course the same holds true for smaller displays, which people will generally sit closer to, and larger ones, which people tend to sit further back from. Hence, 768p makes a practical display resolution for a wide range of display sizes.
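If anyone wants to check that, here's a quick back-of-the-envelope sketch in Python; the 1-arc-minute-per-pixel figure for 20/20 vision and the 16:9 aspect ratio are my assumptions, not gospel:

Code:
import math

# How many vertical lines can 20/20 vision (~1 arc minute per pixel)
# distinguish on a 50" 16:9 screen from 10 feet away?
diagonal_in = 50.0
distance_in = 10.0 * 12.0  # 10 feet in inches
aspect_w, aspect_h = 16.0, 9.0

# Screen height from the diagonal and the aspect ratio
height_in = diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

# Angle subtended by the screen height, converted to arc minutes
height_arcmin = math.degrees(2 * math.atan(height_in / (2 * distance_in))) * 60

print(round(height_arcmin))  # ~700 lines, i.e. right around the 720p/768p mark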
 
Here, Al, I moved your post into quotes:

Are those 768p displays usually good with upscaling 720p content?
They have to be good at scaling in general to look good at all, since many don't even accept their native resolution through any input, and most sources won't output that resolution anyway. Granted, some have better scaling than others, and 720p scaling sometimes gets less attention because 1080i and 1080p signals are far more common. That said, most displays today, 768p or otherwise, have respectable scaling from 720p and any other supported resolution. Proper deinterlacing of certain 1080i content is far less common, but that is a separate topic.
It's one of the things I worry about with the consoles because of all the scaling that *could* happen. Even the VGA cable for the 360 doesn't have 1366x768. The closest is 1280x768 IIRC.
Actually, it supports 1360x768 as well, but as long as your display has a decent scaler you'll likely not notice any difference on anything but the very few 360 games which are rendered above 720p.
So there's the framebuffer resolution, then scaling through component, and then another scaling for the TV? :|
Sure, and again, even when the output resolution matches the display resolution most TVs will still do a bit of scaling to avoid showing viewers the ugly things that often happen on the edges of various content.
Psychologically for me, I would want to just get the native 1080p display and not worry about all that (even if it isn't a big deal at all), you know? :(
I don't know personally, but you are far from the first person I've seen who suffers from the condition. I suggest it be dubbed scalaphobia: the irrational fear of image scaling. :p
 
Thanks for the reply :)
I don't know personally, but you are far from the first person I've seen who suffers from the condition. I suggest it be dubbed scalaphobia: the irrational fear of image scaling. :p

:LOL:
 
Shifty, could you move my post too? I think I was in the middle of posting when you moved a bunch of things. :oops:
I would do, but I didn't move the thread (it was moved while I was posting!) and I don't have rights to make changes in this forum.
 
1080p ought to be enough for everybody.

I'm joking a bit, but for video signals I guess it would be fine, except when "home cinema" is taken in a more literal way.
To build on Kyleb's first comment, pixel resolution is only one metric. There's not only the color accuracy, contrast, etc. of the display; the source's quality and bitrate are also major factors. Back in the early DivX days (around half a dozen years ago), I used to curse bad "hi-res" (640x480) DivX files which required more processing power and looked worse than good 352x288 ones. :)

And really, scaling works well for video. Basically all video playback on the PC has been upscaled forever, with the exception of old CD-ROM games, fixed-size "multimedia CD-ROM" titles, and web video.
 
1080p ought to be enough for everybody.

I'm joking a bit, but for video signals I guess it would be fine, except when "home cinema" is taken in a more literal way.

1920x1080 is pretty close to diminishing returns.

Given a one arc-minute angular resolution (the limit of the fovea in the human eye) and a 36 degree FOV (the upper end of the THX specification), you get 2160 pixels horizontally on a curved screen, or 2234 pixels horizontally on a flat panel.

For reference, a 36 degree FOV is approximately a 100 inch screen viewed from 10 feet away.
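Here's a little Python sketch reproducing that arithmetic, for anyone who wants to play with the numbers; the 1 arc minute per pixel and the 36 degree FOV are the assumptions stated above:

Code:
import math

fov_deg = 36.0  # upper end of the THX spec

# Curved screen: every pixel subtends the same 1 arc minute
curved_px = fov_deg * 60  # 2160

# Flat panel: edge pixels subtend less angle than center pixels,
# so matching 1 arc minute at the center takes a few more of them
half_width = math.tan(math.radians(fov_deg / 2))  # half-width at distance 1
px_pitch = math.tan(math.radians(1 / 60))         # 1 arc minute at distance 1
flat_px = 2 * half_width / px_pitch

print(round(curved_px), round(flat_px))  # 2160 2234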

Cheers
 
Except that since we have these stupid screens on the desktop hardly 20 inches from our eyes, we'll see the drive for higher resolutions continue anyway. I wouldn't be surprised if TVs go along with it, maybe affording some nice upscaling options for 1080p signals or something, but probably mostly just because they can.

I have a 1600x1200 22" screen in front of me right now and I can still see pixels reasonably well. But I'm fairly confident that at 3200x2400, that's going to go away for all intents and purposes.
 
I can imagine we'll get to high-dpi displays. Text will get much more readable, editing or displaying 5 megapixel photos will be better, and simple pixel doubling will still allow us to look at web pictures and other bitmaps. Still mainly meant for computer use.

BTW, the 22" 1680x1050 screens have quite low dpi (as do the 19" 1280x1024 and 27" 1920x1200 ones).
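Quick check of those figures, using just the sizes and resolutions from the post:

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal length
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1680, 1050, 22), (1280, 1024, 19), (1920, 1200, 27)]:
    print(f'{d}" {w}x{h}: {ppi(w, h, d):.0f} ppi')  # 90, 86 and 84 ppi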
 
Not only that, film is getting scanned at 4k.

Look at how far consumer displays have come in the past 5 years, from sub-720p. Even though 1080p isn't supported by current broadcast standards, the market is converting to 1080p.

Don't count out 2160p within 10 years.
 
Not only that, film is getting scanned at 4k.

Look at how far consumer displays have come in the past 5 years, from sub-720p. Even though 1080p isn't supported by current broadcast standards, the market is converting to 1080p.

Past 5 years? It has taken 40 years to go to HD from regular PAL and NTSC resolutions. It's not just the panels; it's the entire toolchain of TV production that has to change to support a higher-resolution format. The investment is substantial.

My guess is it won't happen for another 40 years.

Cheers
 
I'm talking about the production and distribution of HD displays.

Five years ago, we had mostly CRTs, and those offered sub-1080i resolution. Now it seems they're at least advertising more 1080p displays than lower-resolution displays.

Yes, PAL and NTSC have reigned for half a century. The big change was going from analog to digital, with countries allocating spectrum for digital transmissions.

That's a paradigm change, whereas going to higher resolutions is more incremental.

I don't know; you would think that in the production chain they have higher-than-1080p equipment?

Like I said, they've been scanning at higher than 1080p. I'm not saying they're ready to broadcast in greater-than-1080p resolution soon.

But it wouldn't be as big a jump to extend storage media like Blu-Ray to support higher resolution.

I'd rather see them push higher data-density. The alternative would be to go to online distribution, which means going to smaller file sizes, lower bitrates, lower data-density.
 
I don't know; you would think that in the production chain they have higher-than-1080p equipment?

They don't.

A friend of mine has his own small production company; they just took the plunge and went HD. As an example, a colour-calibrated reference monitor (5 inch diagonal) sets you back $12,000, compared to $3,000 for the PAL version. Cameras have similar price differentials, although they are coming down in price (and optics can be reused in most cases). Then there's the editing equipment: computer hardware upgrades, new software licenses, etc. All in all, a non-trivial investment.

The movie industry might very well use much higher fidelity standards/equipment, but it's broadcasting that is pushing the current wave of panel upgrades, not the movie industry (just look at the poor Blu-ray/HD-DVD penetration).

What is important in broadcasting is that formats are interoperable. A Champions League fixture is transmitted to hundreds of different broadcasting operations all over the world, which all need to accept and push the feed onward. And today they are already locked into 720p or 1080i (for HD).

Cheers
 
But 1080p panels seem to be selling well, even if Blu-Ray/HD-DVD aren't yet. What would be driving sales of those displays?

A lot of those sales are to people who already have HDTVs, so they've had a taste of HDTV and want more, with 1080p displays and 1080p sources.

DirecTV is launching a whole bunch of HD channels this month and through the rest of the year. Not more than 3 years ago, the TV industry was bemoaning the investment it was going to require to upgrade their studios and infrastructure to support just 720p or 1080i.

Again, it's a paradigm shift from analog to digital but surely incremental increases in resolution are going to pale compared to converting from analog to digital?

I think Europe is also behind NA in the transition to digital, so maybe the costs are still relatively high.
 
To the topic:

The PS3 with HDMI should look much better than the 360 with component.

The component connection has quantization errors which introduce gray noise across the entire image. HDMI is nearly noise-free and produces beautiful colors and deep blacks. I noticed this while testing with my HDTVs (LCD and plasma) using videos and games.
 
Could you please expand on your comment? I'm not even sure what you are referring to when suggesting a connection has quantization errors. Regardless, I've seen very little difference in switching between HDMI and component with my PS3, so little that I've never even bothered to try my new 360 with HDMI.
 
But 1080p panels seem to be selling well, even if Blu-Ray/HD-DVD aren't yet. What would be driving sales of those displays?

BS marketing?

People want to be "HD ready". Full HD must be better than not-so-full-HD.

A lot of those sales are to people who already have HDTVs, so they've had a taste of HDTV and want more, with 1080p displays and 1080p sources.

People are sheep. It's easy to market resolution since it's a one dimensional metric, and more *must* be better.

You'd need to sit closer than 6 feet to a 50 inch 1920x1080 display to appreciate any increase in resolution.
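Same 1-arc-minute acuity assumption as earlier in the thread; this sketch just solves for the distance at which one pixel of a 50" 1080p panel shrinks below what the eye can resolve:

Code:
import math

diagonal_in = 50.0
width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
pitch_in = width_in / 1920                       # one pixel, in inches

# Distance at which that pixel subtends exactly 1 arc minute
distance_ft = pitch_in / math.tan(math.radians(1 / 60)) / 12
print(round(distance_ft, 1))  # ~6.5 feet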

Contrast, colour reproduction and response time are more important than resolution at current HD resolutions, but that is a lot harder to explain to the average consumer, especially since there's no clear standard way to report these things (ANSI contrast/absolute contrast/dynamic contrast and grey-to-grey response time vs white-to-black response time etc.)

Again, it's a paradigm shift from analog to digital but surely incremental increases in resolution are going to pale compared to converting from analog to digital?
But again, your even-higher-def provider needs to buy the programming somewhere. HD (1080i30 and 720p60) *is* the new standard and will be so for the next 40 years, IMO. Which is good, because going beyond 1920x1080 is just money out the window.

I think Europe is also behind NA in the transition to digital, so maybe the costs are still relatively high.

Oh, definitely. GB is doing reasonably well with BBC HD, Sky HD, etc., but the rest is lagging badly.

Cheers
 
Correct me if I am wrong, but an analog component video connection between a DVD player and an HDTV needs two conversions: one D/A (digital-to-analog) and one A/D (analog-to-digital). Both are subject to quantization errors: http://en.wikipedia.org/wiki/Quantization_error

My guess is there are two other things that may play a more important role: the difference in calibration between the two converters (the DAC and the ADC), and noise, which may introduce many more errors.

I used "quantization error" as a general term encompassing all three effects.
Sorry for my mistake.
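For what it's worth, here's a rough sketch of why the quantization step itself should be nearly invisible, assuming the usual 8-bit video DACs/ADCs (my assumption); that points the finger at the calibration mismatch and analog noise instead:

Code:
bits = 8          # typical consumer video converters
levels = 2 ** bits

# Ideal N-bit converter: signal-to-quantization-noise ratio
# SQNR = 6.02*N + 1.76 dB
sqnr_db = 6.02 * bits + 1.76
print(levels, "levels, ~%.0f dB SQNR" % sqnr_db)  # 256 levels, ~50 dB

# A DAC followed by an ADC roughly doubles the error power
# (~3 dB worse in the worst case) -- still far too clean to show
# up as visible gray noise on its own.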

Anyway, we tested our new Panasonic plasma HDTV (1024x768) with analog component video (Sony DVD player) and HDMI (PC with a DVI-to-HDMI cable). The difference is big. The analog connection introduces some kind of gray noise in the image.

I did the following test: we watched an entire DVD movie over component video, and after that we watched part of it again, this time over HDMI. Everybody had the same opinion: HDMI is much better.

Also, some people here tested VGA against HDMI, and the difference is big too.
 