CES 2006 News & Announcements

Titanio said:
I would have thought if it was encoded, for example, at 1080p/24, you'll get 24 frames a second regardless of how you are viewing it (720p, 1080i, 1080p), but with the progressive formats, you get a full frame, not a "half-frame"? So it certainly would look different..
I don't know how it works, but I assume it works like this...

1080i, 30 FPS source
Hz 1 shows 1920x540 pixels, odd lines of frame 1
Hz 2 shows 1920x540 pixels, even lines of frame 1
Hz 3 shows 1920x540 pixels, odd lines of frame 2
Hz 4 shows 1920x540 pixels, even lines of frame 2
Hz 5 shows 1920x540 pixels, odd lines of frame 3
Hz 6 shows 1920x540 pixels, even lines of frame 3

As LCDs and plasmas are progressive displays anyway, I suppose they cache the two interlaced fields and render them non-interlaced, top to bottom progressively. So what the viewer sees is 30 frames per second drawn top to bottom at 1920x1080 resolution, the same as 1080p.

The difference comes at 60 FPS. For material recorded at 60 FPS like sports, 1080p will show 60 1920x1080 images a second, whereas 1080i will show 60 1920x540 images a second with a one line offset. Dunno how that works on panels. Do they double up the vertical lines, or interleave the current frame with the previous frame?

Quality-wise I don't think 1080p has any advantage over 1080i at <= 30 fps.
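The field sequence described above amounts to splitting each frame into two half-height fields and, on a progressive panel, weaving a matched pair back together. A toy Python sketch of that idea (function names are mine, and the "frame" is a tiny 8x4 grid for readability, not real 1920x1080 video):

```python
# Split a progressive frame into its two interlaced fields, then
# "weave" a matched pair back into one full frame, as a progressive
# panel could do when both fields come from the same moment in time.

def split_into_fields(frame):
    """Return (top_field, bottom_field): alternating scanlines."""
    top = frame[0::2]      # lines 0, 2, 4, ... (the "odd" lines, 1-indexed)
    bottom = frame[1::2]   # lines 1, 3, 5, ...
    return top, bottom

def weave(top, bottom):
    """Interleave two matched fields back into a full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

frame = [[y * 10 + x for x in range(8)] for y in range(4)]
top, bottom = split_into_fields(frame)
assert weave(top, bottom) == frame  # lossless when both fields share a frame
```

The round trip is lossless, which is why static or same-moment content looks identical over 1080i and 1080p; the interesting cases are when the two fields do not come from the same instant.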
 
Shifty Geezer said:
I can't see where this is announced. Is this in reference to Buena Vista BRD movies? How can they be authoring 1080p without a 60 fps source? Is this just PR talk, while the actual res and frame rate is no better than 1080i? :???:
I'm not too sure how the whole thing works, but both SPHE and Buena Vista have announced 1080p content. I don't know what the other studios are doing.
And that is from here:
http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/01-05-2006/0004243866&EDATE=

Blu-ray Disc is the next generation home entertainment medium featuring
storage capacity 5 times greater than current DVDs, a picture compressed and
authored in full high-definition 1080p resolution (1920 x 1080) and lossless
audio (bit-for-bit directly from the sound stage) allowing for the first time
a pristine digital audio and video home entertainment experience equal or
superior to theatrical feature presentations. In addition to film, the Blu-
ray Disc will also support music, gaming and other digital entertainment.
 
Shifty Geezer said:
I don't know how it works, but I assume it works like this...

1080i, 30 FPS source
Hz 1 shows 1920x540 pixels, odd lines of frame 1
Hz 2 shows 1920x540 pixels, even lines of frame 1
Hz 3 shows 1920x540 pixels, odd lines of frame 2
Hz 4 shows 1920x540 pixels, even lines of frame 2
Hz 5 shows 1920x540 pixels, odd lines of frame 3
Hz 6 shows 1920x540 pixels, even lines of frame 3

As LCDs and plasmas are progressive displays anyway, I suppose they cache the two interlaced fields and render them non-interlaced, top to bottom progressively. So what the viewer sees is 30 frames per second drawn top to bottom at 1920x1080 resolution, the same as 1080p.

The difference comes at 60 FPS. For material recorded at 60 FPS like sports, 1080p will show 60 1920x1080 images a second, whereas 1080i will show 60 1920x540 images a second with a one line offset. Dunno how that works on panels. Do they double up the vertical lines, or interleave the current frame with the previous frame?

Quality-wise I don't think 1080p has any advantage over 1080i at <= 30 fps.


Shifty, i already explained it to you...

Have you seen the difference between the typical 30fps game running at 480p on current gen consoles, and the same game running at 480i? That's the same difference between 1080i and 1080p. No interlacing-related flaws and generally a cleaner image.

To me it's not a huge difference, and at such high resolutions it will be even less noticeable, but it definitely doesn't mean that there is no difference at 30fps. Otherwise there would be no point today in having the vast majority of Xbox games at 480p. And most of them run at 30fps.

The problem with your example is that at 1080i, when things start moving (whether at 30, 60, 15 fps or whatever), they move as "fields": first the even lines, then the odd ones, at different intervals. Even at 60Hz that is still not quick enough for the eye not to see the flaws, and we all know it creates interlacing artifacts. With a progressive scan image the next frame comes up in its entirety, whether the video runs at 30fps or not.
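The combing described in the post above can be shown with a toy sketch: weave together two fields captured at different moments of a moving object and alternating scanlines disagree. Purely illustrative Python, with a made-up one-pixel-wide "bar" standing in for the moving object:

```python
# When the two woven fields come from different moments in time,
# alternating lines show the object in two positions: the classic
# "comb" artifact of interlaced video on moving edges.

def render(width, height, bar_x):
    """Toy scene: a vertical bar of 1s at column bar_x on a field of 0s."""
    return [[1 if x == bar_x else 0 for x in range(width)] for y in range(height)]

def weave(top, bottom):
    out = []
    for t, b in zip(top, bottom):
        out += [t, b]
    return out

f0 = render(8, 4, bar_x=2)   # scene when the first field is captured
f1 = render(8, 4, bar_x=4)   # scene 1/60 s later, bar has moved
top = f0[0::2]               # odd lines from the earlier moment
bottom = f1[1::2]            # even lines from the later moment
combed = weave(top, bottom)
# Alternating rows now show the bar at x=2 and at x=4: a combed edge.
```

With a progressive signal both halves of every frame come from the same instant, so this mismatch simply cannot happen.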
 
Shifty Geezer said:
Quality-wise I don't think 1080p has any advantage over 1080i at <= 30 fps.
A stable image?

I haven't got an HDTV yet, but since people always bitch about the quality of 1080i and say 720p is preferable since it's progressive (for TV, not just games), I assumed progressive 1080p must be a pretty big deal (the 2x quality leap that 1080i affords over 720p, without the IQ issues).
 
Shifty Geezer said:
The difference comes at 60 FPS. For material recorded at 60 FPS like sports, 1080p will show 60 1920x1080 images a second, whereas 1080i will show 60 1920x540 images a second with a one line offset.

I'm not entirely sure, this is as much a question for myself now as well, but is this not true regardless of the framerate? i.e. at 24fps, 1080p gives you 24 1920x1080 images, 1080i gives you 24 1920x540 images? So surely the former is better than the latter?
 
london-boy said:
Have you seen the difference between the typical 30fps game running at 480p on current gen consoles, and the same game running at 480i?
No. The only interlaced displays I see are standard TVs, and the only progressive displays I see are PC monitors. The obvious difference is no flicker from an LCD panel, which has nothing to do with interlaced or progressive signals.

That's the same difference between 1080i and 1080p. No interlacing-related flaws and generally a cleaner image.
Well I'm not saying there isn't, per se, as I've no experience, but from a purely theoretical POV, which is the only one I can argue from, I don't understand why there should be a difference. Given two identical 1920x1080 res TVs side by side, showing the same 24 fps movie, only one with a 1080i signal and the other with a 1080p signal, what is it about the 1080p progressive signal that'll give it a better display?

I've a friend (the one eyeing the Pioneer) who has an LCD TV and progressive-scan DVD player which I haven't seen yet (in his bedroom. Living room is a large Sony CRT and older DVD player) but keep meaning to. I'll try and get a glimpse this weekend and maybe compare progressive vs non-progressive DVD output.
 
After I posted how slightly underwhelmed I was with the HD trailers I downloaded from LIVE, I went back to them, downloaded both the 720p and 480p versions and tried to compare them. Don't get me wrong, the difference is noticeable, but it's nowhere near as wide as I expected. The 480p version of Narnia, as you would expect, has a DVD look to it, very crisp and clear. There's one scene in the Narnia trailer where the eagle lands and speaks, and this looks incredible at 480p. I checked out the 720p and yeah, it's more detailed, but nowhere near the jump we got from VHS to DVD, if that's what people are expecting.

What I suppose I'm trying to say is that I think the HD DVD and Blu-ray manufacturers are going to have a job on their hands persuading people to make the leap from cheap DVD players and cheap films, which many casual buyers are lapping up, to HD, when many of them, from what I have encountered so far in terms of HD film, will have trouble noticing the difference. By the way London, I watch all my DVDs in 720p or 1080i, obviously upscanned, and again I can't tell the difference between them.
 
Shifty Geezer said:
No. The only interlaced displays I see are standard TVs, and the only progressive displays I see are PC monitors. The obvious difference is no flicker from an LCD panel, which has nothing to do with interlaced or progressive signals.

Well I'm not saying there isn't, per se, as I've no experience, but from a purely theoretical POV, which is the only one I can argue from, I don't understand why there should be a difference. Given two identical 1920x1080 res TVs side by side, showing the same 24 fps movie, only one with a 1080i signal and the other with a 1080p signal, what is it about the 1080p progressive signal that'll give it a better display?

I've a friend (the one eyeing the Pioneer) who has an LCD TV and progressive-scan DVD player which I haven't seen yet (in his bedroom. Living room is a large Sony CRT and older DVD player) but keep meaning to. I'll try and get a glimpse this weekend and maybe compare progressive vs non-progressive DVD output.

Oh right, well obviously you need to see these things before judging ;) (which is the nicest way i can say "u dont know what u're talking about" to you.. err...).
 
Shifty Geezer said:
No. The only interlaced displays I see are standard TVs, and the only progressive displays I see are PC monitors. The obvious difference is no flicker from an LCD panel which is nothing to do with interlaced or progressive signals.

At the very least, you'll be getting more information / a smoother framerate, as each progressive frame is a complete refresh, whereas a frame off an interlaced signal will only ever have half the information refreshed (odd or even lines) at a given refresh point. A progressive signal will result in a more fluid transition between frames and fewer artefacts.
 
Titanio said:
I'm not entirely sure, this is as much a question for myself now aswell, but is this not true regardless of the framerate? i.e. at 24fps, 1080p gives you 24 1920x1080 images, 1080i gives you 24 1920x540 images? So surely the former is better than the latter?
Both should be encoded at 60 fps, so the TV is receiving 60 frames a second. In less-than-60 fps material, frames are being duplicated. 1080p gives your TV 60 1920x1080 images a second, with 24 different images. 1080i gives you 60 1920x540 images a second. Now I guess at the encoding phase, on a 24 fps movie, maybe interlaced scans record alternating fields, so you will get a sort of interlace-shake thing going on, whereas a progressive scan won't have that. Though I don't see any reason why you can't scan the material progressively: basically have 24 digital stills for each second and then encode the video signal as either whole frames or interleaved half-frames. Or maybe the interlacing can't alternate fields from current and previous frames at 24 fps nicely, so you'd get some cross-over between frames that would/could mess up the quality a bit. :???:
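The standard trick for fitting 24 fps film into a 60 field-per-second interlaced stream is 2:3 ("3:2") pulldown: film frames alternately contribute 2 and 3 fields, so 4 film frames become 10 fields. A toy sketch of just the cadence (not any real encoder API; names are my own), which also shows the cross-over the post above guesses at, since some fields fall on a frame boundary:

```python
# 2:3 pulldown cadence: 24 film frames/s -> 60 fields/s.
# Each film frame contributes alternately 2 or 3 fields, with
# top/bottom parity simply alternating down the field stream.

def pulldown_23(frames):
    """Yield (frame_id, 'top'/'bottom') pairs in 2:3 cadence.

    Assumes an even number of input frames for simplicity.
    """
    fields = []
    parity = 'top'
    for i, count in enumerate([2, 3] * (len(frames) // 2)):
        for _ in range(count):
            fields.append((frames[i], parity))
            parity = 'bottom' if parity == 'top' else 'top'
    return fields

fields = pulldown_23(['A', 'B', 'C', 'D'])
# 4 frames -> 10 fields: A,A, B,B,B, C,C, D,D,D
assert [f for f, _ in fields] == list('AABBBCCDDD')
```

Note how frame B contributes top, bottom, top, so the following bottom field belongs to frame C: displayed naively as woven pairs, some "frames" mix lines from two different film frames, which is exactly the cross-judder a good deinterlacer has to detect and undo.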
 
london-boy said:
Oh right, well obviously you need to see these things before judging ;) (which is the nicest way i can say "u dont know what u're talking about" to you.. err...).
Well I admit as much, but AFAIK my theory is sound. I'm sure progressive can look better, but I'd like to understand why. Vocalising my thoughts in the previous post shows where I'm coming from.
 
Pugger said:
By the way London, I watch all my DVDs in 720p or 1080i, obviously upscanned, and again I can't tell the difference between them.

I know, that's why i said in the past that these upscaling DVD players are really just a new way big manufacturers found to keep selling DVD players at a certain price point, as they hardly ever make a difference, and never make a big difference. ;)

I think the difference between VHS and DVD can be quite large, or it can be small. I mean, some old ruined VHS tapes look like crap mostly because they're old and the tape is f**ked. Obviously with optical formats you always get the same image (if you damage the disc, you don't get an image at all). So it's easier to quantify differences between optical formats than between analog and optical, as analog vastly depends on the state of the tape.
 
Shifty Geezer said:
Well I admit as much, but AFAIK my theory is sound. I'm sure progressive can look better, but I'd like to understand why. Vocalising my thoughts in the previous post shows where I'm coming from.

london-boy said:
The problem with your example is that at 1080i, when things start moving (whether at 30, 60, 15 fps or whatever), they move as "fields": first the even lines, then the odd ones, at different intervals. Even at 60Hz that is still not quick enough for the eye not to see the flaws, and we all know it creates interlacing artifacts. With a progressive scan image the next frame comes up in its entirety, whether the video runs at 30fps or not.

There... :D
 
Nicked said:
A stable image?

I haven't got an HDTV yet, but since people always bitch about the quality of 1080i and say 720p is preferable since it's progressive (for TV, not just games), I assumed progressive 1080p must be a pretty big deal (the 2x quality leap that 1080i affords over 720p, without the IQ issues).

I've seen a 1080i TV broadcast signal on a Sony SXRD 1080p HDTV and the image quality was incredible. The majority of TV signals are broadcast in 1080i not 720p. I believe even Fifa 2006 is being broadcast in 1080i not 720p.

1080p will provide an improved picture over 1080i, but 1080i still looks fantastic.
 
london-boy said:
Okay, I get it now. I guess in theory there's nothing preventing an interlaced signal being buffered and deinterlaced, but that'd need fields to be matched with frame numbers and displayed as whole frames, which I guess there's a method for.
 
Shifty Geezer said:
Okay, I get it now. I guess in theory there's nothing preventing an interlaced signal being buffered and deinterlaced, but that'd need fields to be matched with frame numbers and displayed as whole frames, which I guess there's a method for.

Good boy.

Some LCD sets have very good deinterlacers inside, so things would look good even when fed interlaced images, but it is still preferred, especially with LCD screens, to feed them a progressive scan image, simply because even the best deinterlacers can get it wrong sometimes and show artifacts.
Some other LCDs are just hopeless, and feeding them an interlaced video signal looks awful.

It really all depends on the countless configurations you might find in a house after all, but a progressive scan image will always be preferred to the interlaced one, at any refresh rate, as long as your TV can display it, obviously.
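On the "double up the lines" option asked about earlier in the thread: that is "bob" deinterlacing, where each 540-line field is stretched to full height on its own instead of being woven with its partner. It avoids combing on motion at the cost of halved vertical resolution per refresh. A toy sketch (function names are mine):

```python
# "Bob" deinterlacing: expand a single field to full frame height by
# line doubling. Real deinterlacers interpolate the missing lines
# instead of copying, but the structure is the same.

def bob(field, is_top):
    """Expand one half-height field to a full-height frame."""
    frame = []
    for line in field:
        if is_top:
            frame += [line, line[:]]   # real line, then a copy below it
        else:
            frame += [line[:], line]   # a copy above, then the real line
    return frame

field = [[1, 1], [2, 2], [3, 3]]       # a 3-line field
full = bob(field, is_top=True)
assert len(full) == 6                  # doubled to full height
```

Good sets pick between bob and weave per region of the image (motion-adaptive deinterlacing), which is why results vary so much between panels.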
 
Pugger said:
I checked out the 720p and yeah, it's more detailed, but nowhere near the jump we got from VHS to DVD, if that's what people are expecting.

Don't think "they" are expecting that. But for big screens the higher resolution really helps.

Did you watch it on a big screen or on a computer monitor?

On my 720p PJ it does make a difference. DVD seems soft and blurred when I make an A/B comparison.
 