Isn't 1080i REALLY just 720i on MOST HDTVs?

BenQ

1080i is a resolution of 1920 X 1080, but displayed interlaced ( so 1920 X 540 per refresh ).
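To make the line counts concrete, here's a rough sketch of what I mean ( just numpy for illustration, obviously not how a TV actually works inside ):

```python
# A 1080i signal carries a 1920 X 1080 picture as two alternating 1920 X 540 fields.
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # the full 1920 X 1080 picture

top_field    = frame[0::2]   # even-numbered lines -> one field
bottom_field = frame[1::2]   # odd-numbered lines  -> the other field

print(top_field.shape, bottom_field.shape)   # (540, 1920, 3) (540, 1920, 3)
```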

But don't you still NEED a display of 1920 X 1080 to properly display 1080i?

Pretty much ANY HDTV you look at will have the 1080i sticker on it, but look a little closer and you'll find that the actual resolution of that HDTV is 1280 X 720 ( or something close, such as 1366 X 768 ).

Except for THE LATEST HDTVs, which are touting 1080p, I don't see ANY HDTVs with 1920 X 1080 resolutions.... yet essentially ALL HDTVs have that 1080i sticker on them anyway.

But how can a true 1080i signal be displayed on a canvas of 1280 X 720? It's obviously going to have to be scaled down.

And when it gets scaled down, I'm assuming its format remains interlaced, so by the limitation of the HDTV, wouldn't the ACTUAL display wind up being 720i ( 1280 X 360 per refresh )?

???
 
I think you already answered your own question.

The thread title is confusing though because 1080i != 720i.
1080i has more lines being displayed.

The (i) inherently means that you're only getting half the lines displayed at a time.
 
seismologist said:
I think you already answered your own question.

The thread title is confusing though because 1080i != 720i.
1080i has more lines being displayed.

The (i) inherently means that you're only getting half the lines displayed at a time.

I understand that 1080i has more lines than 720i would.

I also understand the difference between interlaced and progressive scan.

Look at it this way....

Any standard TV = 640 X 480.

480p = a FULL 640 X 480 every refresh
480i = 640 X 240 every refresh ( but even 480i has the full 640 X 480 screen to play with as it alternates between fields ).

1080i on the other hand doesn't have that same luxury on pretty much ANY HDTV.

Doesn't a TRUE 1080i signal NEED a screen with a resolution of 1920 X 1080? It's only going to be displaying 1920 X 540 every refresh, but it still needs the full 1920 X 1080 screen.

But HDTVs DON'T have screens with 1920 X 1080 resolution ( except the newest most expensive ones that tout 1080p ).

So what exactly is happening to a TRUE 1080i signal as it's being displayed on a screen that's only 1280 X 720?

It obviously has to be downsampled, and am I wrong to assume that the format remains interlaced?

So it seems to me that in the end, when you take a true 1080i signal and display it on an HDTV with 1280 X 720 resolution, you're only actually getting 720i ( 1280 X 360 per refresh ).

Have I made a mistake somewhere?
 
Oh, I see what you're saying. I think what's happening is the horizontal resolution is downsampled

1920-> 1280

and the vertical res is scaled up

540-> 720

That's just a guess

The interlacing creates the illusion that you're seeing 1080 lines at the same time (I think you knew that though), so you're getting more resolution than you would see from a 720p or 720i source.
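To put rough numbers on that guess ( assuming a 1280 X 720 panel, nothing official ):

```python
# One 1080i field vs. a hypothetical 1280 X 720 panel
field_w, field_h = 1920, 540
panel_w, panel_h = 1280, 720

print(field_w / panel_w)   # 1.5    -> horizontal resolution scaled DOWN by 1.5x
print(panel_h / field_h)   # 1.33.. -> vertical resolution scaled UP by about 1.33x
```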
 
BenQ said:
I understand that 1080i has more lines than 720i would.

I also understand the difference between interlaced and progressive scan.

Look at it this way....

Any standard TV = 640 X 480.

480p = a FULL 640 X 480 every refresh
480i = 640 X 240 every refresh ( but even 480i has the full 640 X 480 screen to play with as it alternates between fields ).

1080i on the other hand doesn't have that same luxury on pretty much ANY HDTV.

Doesn't a TRUE 1080i signal NEED a screen with a resolution of 1920 X 1080? It's only going to be displaying 1920 X 540 every refresh, but it still needs the full 1920 X 1080 screen.

But HDTVs DON'T have screens with 1920 X 1080 resolution ( except the newest most expensive ones that tout 1080p ).

So what exactly is happening to a TRUE 1080i signal as it's being displayed on a screen that's only 1280 X 720?

It obviously has to be downsampled, and am I wrong to assume that the format remains interlaced?

So it seems to me that in the end, when you take a true 1080i signal and display it on an HDTV with 1280 X 720 resolution, you're only actually getting 720i ( 1280 X 360 per refresh ).

Have I made a mistake somewhere?

Most CRT HDTVs have 1080i native resolution.

With regards to converting from 1080i to 720p, I remember reading that most HDTVs will just use one 1920x540 field and convert that to one 1280x720 frame. It's no longer interlaced.

Edit: this is exactly what seismologist described. Beat me to it :p

It seems like you may be a bit confused on what exactly interlacing is. 1080i and 540p are not the same thing. With an interlaced picture, it alternates between drawing the even and odd scanlines. You end up with information from two different moments in time on the screen at the same time, since the even lines haven't faded when the odd lines are being drawn and vice versa. This is why some people prefer 720p to 1080i. Obviously they both kick 480i's butt.
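Here's a quick sketch of that "one field -> one frame" idea ( numpy and nearest-neighbour scaling assumed; real scalers use better filtering and often blend both fields ):

```python
import numpy as np

field = np.zeros((540, 1920, 3), dtype=np.uint8)           # one 1920 X 540 field from a 1080i signal
panel_h, panel_w = 720, 1280                                # hypothetical native panel resolution

rows = (np.arange(panel_h) * field.shape[0]) // panel_h     # map 720 output lines onto 540 source lines
cols = (np.arange(panel_w) * field.shape[1]) // panel_w     # map 1280 output columns onto 1920 source columns

frame_720p = field[rows][:, cols]                           # one progressive 1280 X 720 frame
print(frame_720p.shape)                                     # (720, 1280, 3)
```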
 
Shark Sandwich said:
BenQ said:
I understand that 1080i has more lines than 720i would.

I also understand the difference between interlaced and progressive scan.

Look at it this way....

Any standard TV = 640 X 480.

480p = a FULL 640 X 480 every refresh
480i = 640 X 240 every refresh ( but even 480i has the full 640 X 480 screen to play with as it alternates between fields ).

1080i on the other hand doesn't have that same luxury on pretty much ANY HDTV.

Doesn't a TRUE 1080i signal NEED a screen with a resolution of 1920 X 1080? It's only going to be displaying 1920 X 540 every refresh, but it still needs the full 1920 X 1080 screen.

But HDTVs DON'T have screens with 1920 X 1080 resolution ( except the newest most expensive ones that tout 1080p ).

So what exactly is happening to a TRUE 1080i signal as it's being displayed on a screen that's only 1280 X 720?

It obviously has to be downsampled, and am I wrong to assume that the format remains interlaced?

So it seems to me that in the end, when you take a true 1080i signal and display it on an HDTV with 1280 X 720 resolution, you're only actually getting 720i ( 1280 X 360 per refresh ).

Have I made a mistake somewhere?

Most CRT HDTVs have 1080i native resolution.

With regards to converting from 1080i to 720p, I remember reading that most HDTVs will just use one 1920x540 field and convert that to one 1280x720 frame. It's no longer interlaced.

Edit: this is exactly what seismologist described. Beat me to it :p

It seems like you may be a bit confused on what exactly interlacing is. 1080i and 540p are not the same thing. With an interlaced picture, it alternates between drawing the even and odd scanlines. You end up with information from two different moments in time on the screen at the same time, since the even lines haven't faded when the odd lines are being drawn and vice versa. This is why some people prefer 720p to 1080i. Obviously they both kick 480i's butt.

So you're saying that a 1080i signal gets converted into, more or less, a 720p image?

If that's true then why do SO MANY HDTVs have that 1080i sticker on them? Or does that 1080i sticker simply mean that the HDTV can "accept" a 1080i signal? :?
 
BenQ said:
So you're saying that a 1080i signal gets converted into, more or less, a 720p image?

If that's true then why do SO MANY HDTVs have that 1080i sticker on them? Or does that 1080i sticker simply mean that the HDTV can "accept" a 1080i signal? :?

That's exactly right. Most LCD HDTVs, for example, have a native resolution of 1280x720 (or 1366x768), so they'll convert all 1080i signals into their native format (720p). Most CRT displays, by contrast, will convert all 720p material to 1080i.
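For the CRT direction, a rough sketch of turning a 720p frame into a pair of 1080i fields ( again just numpy with nearest-neighbour scaling as an assumption ):

```python
import numpy as np

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)

rows = (np.arange(1080) * 720) // 1080      # map 1080 output lines onto 720 source lines
cols = (np.arange(1920) * 1280) // 1920     # map 1920 output columns onto 1280 source columns
frame_1080 = frame_720p[rows][:, cols]      # upscaled 1920 X 1080 picture

top_field, bottom_field = frame_1080[0::2], frame_1080[1::2]   # the two 1920 X 540 fields
print(top_field.shape, bottom_field.shape)   # (540, 1920, 3) (540, 1920, 3)
```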
 
BenQ said:
So you're saying that a 1080i signal gets converted into, more or less, a 720p image?

Exactly.

If that's true then why do SO MANY HDTVs have that 1080i sticker on them? Or does that 1080i sticker simply mean that the HDTV can "accept" a 1080i signal? :?

Yup that's how it works. As far as I know, there aren't any HDTVs out there that can support both 1080i and 720p natively. They have to first convert a signal to their native resolution.
 
Well, the truth wound up being somewhat disappointing, but thanks for clearing that up for me guys :D
 
Shark Sandwich said:
BenQ said:
So you're saying that a 1080i signal gets converted into, more or less, a 720p image?

Exactly.

If that's true then why do SO MANY HDTVs have that 1080i sticker on them? Or does that 1080i sticker simply mean that the HDTV can "accept" a 1080i signal? :?

Yup that's how it works. As far as I know, there aren't any HDTVs out there that can support both 1080i and 720p natively. They have to first convert a signal to their native resolution.

There are plenty of HDTVs out there that support 1080i and 720p natively (i.e. actually output two different resolutions, not scale one of them); my Toshiba is one of them. Whether it can actually resolve it 100% is another story, but it's like on my 19" CRT monitor - you can't actually make out every little pixel at 1600x1200, but that is the resolution it's displaying, and it looks a darn sight better than 1280x960 (which is what it CAN resolve entirely).

There's some info about it here.

CRTs can be told to display very high resolutions, but they don't really have fine enough phosphor dots to do a whole lot more than 72 DPI. A "19 inch" monitor with a 36cm horizontal dimension will be running out of phosphor dots above 1280 by 960...

That article's a little out of date btw, HDTV CRTs have improved a lot since then :)

But the basic idea is that although they don't have the phosphor units to display each pixel exactly, it still looks better. 1080i looks better for moving images than 720p on my set (but for computing, 720p looks better, because you can notice the interlacing in Windows at 1080i).
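To put rough numbers on that ( just the arithmetic, using the 36 cm width from the quoted article ):

```python
# Back-of-the-envelope dots-per-inch for a "19 inch" tube with ~36 cm of visible width
visible_width_in = 36 / 2.54                 # ~14.2 inches of phosphor

for horizontal_pixels in (1280, 1600):
    print(horizontal_pixels, round(horizontal_pixels / visible_width_in), "DPI")
    # 1280 -> ~90 DPI (roughly what the tube can fully resolve), 1600 -> ~113 DPI (beyond it)
```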

God, anyone remember when computer monitors used to display interlaced images? I think my 14" displayed 1024x768 interlaced :)
 
_leech_ said:
That's exactly right. Most LCD HDTVs, for example, have a native resolution of 1280x720 (or 1366x768), so they'll convert all 1080i signals into their native format (720p). Most CRT displays, by contrast, will convert all 720p material to 1080i.

I wonder which is worse. Both 720p and 1080i are roughly 60M dots/s. 720p to 1080i is no problem on the horizontal with a CRT, but the lines become both stretched and interlaced.
Vice versa, compressing the horizontals is ugly on LCD. Then you have several ways to go about making a whole picture from interlaced lines.
Many recordings are only 24 fps, so you could motion-blur or use more advanced techniques on the frames in between.
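For reference, the arithmetic behind that 60M dots/s figure ( assuming 60 fields or frames per second ):

```python
print(1280 * 720 * 60)   # 720p : 55,296,000 pixels per second
print(1920 * 540 * 60)   # 1080i: 62,208,000 pixels per second (one 540-line field per refresh)
```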
 