TVs and HD resolutions of Nextgen consoles

DemoCoder said:
What happened to HDTV in Europe, why is it so behind Japan/US/Canada/Australia/Korea/et al?


DirecTV in the US just launched its Spaceway satellite, which will provide 500 MPEG-4 local HD channels in 2005. In 2007 they are launching 2 more, which will provide 500 national HD channels and an additional 1,000 local channels.

That will bring the total up to 2,000 channels of HD content, although in each local market it will end up being about 600 HD channels. In other words, by 2007, pretty much all of the continental US will have HD on every satellite channel.

I expect Cable to follow. For all intents and purposes, 2007 will make the US an HD-nation.

Yep, Europe is a long way behind that. Europe tried to establish an HDTV standard of some form or another in the early nineties, and it went belly up. I don't think that helped. All the PAL regions seem to have been slower to adopt HDTV - the fact that so many countries and so many TV companies are involved has also been an issue, I suspect.

In the UK, the BBC has made a commitment to shift over to producing everything in HDTV by the end of the decade, but no word on distribution yet. The "Freeview" terrestrial broadcast system that the BBC owns (as well as distributing over satellite and cable) is somewhat bandwidth limited, and I'm not sure how going HD would affect the number of channels they could use. Sky TV is going to be trying out an HDTV service sometime this year, I think, but again there's been a lack of hard details - bandwidth may again be an issue if they wish to make all channels HD.

I'm sure that 1080p isn't going to be an option for the majority of TV content here any time soon.
 
It seems to me that rather than adopting their own standard, Europe should just take ATSC or Japan's ISDB and make it fit their own needs. Perhaps part of the slowdown is a "not invented here" syndrome, where they are trying via committee or consortium to brew up their own solution.
 
DemoCoder said:
It seems to me that rather than adopting their own standard, Europe should just take ATSC or Japan's ISDB and make it fit their own needs. Perhaps part of the slowdown is a "not invented here" syndrome, where they are trying via committee or consortium to brew up their own solution.

Heh sounds about right... I mean it can't be THAT difficult if you guys have hundreds of HD channels, and so do many other countries...
 
Hey Demo,
Thanks for the feedback. I should have known better than to just read one review then fork out a few thousand dollars.

You said there was a huge difference ("lifting of the veil") in going to 720p. Since then, have you experienced going from 720p to 1080p? How much of a difference is there between the two HD formats? What's it like?

Also, you mention good 1080p sets coming in June. Are these front or rear projection? My biggest issue with rear projection is the loss in contrast when you view the picture standing up. Has this been resolved?

Thanks
 
Yes, there is a compelling difference between 720p and 1080p, but again, it's something that has to be demoed. At CES'05, they had a "split screen" demo showing 720p and 1080p side by side, and also 1080i vs 1080p. The problem is, there is next to zero content authored for 1080p.

The viewing cone angle problem has gotten better and better, but I won't say they've eliminated it yet. It's mostly caused by the high-gain screens they are using. If you use a matte white screen, you don't have this problem, but you lose brightness overall. But you'd never notice it anyway, so the psychological effect of knowing that you lost some brightness would never arise.

Sony has a new screen technology, a BLACK screen that only reflects RGB wavelengths from the projector and not ambient light, which may change all that.

http://www.gizmodo.com/archives/images/sony_blackscreen.jpg

This is a picture of the screen, used as a front projection screen, in very bright lighting conditions at CES. On the left is a traditional screen.


Now they just need to adapt it for rear projection to reject incoming ambient light, but transmit projector wavelengths.
 
"Things" are rarely black and white to the extent that mere numbers suggest.

DemoCoder said:
How does one arrive at a figure of 10%? How did he measure that? It's bollocks. A 50% reduction in pixels, and he claims 10%?

We don't necessarily perceive resolution as an "area" function, hence citing a % improvement based on a simple height x width figure can be misleading. Perhaps an equally relevant way to quantify a resolution improvement is to simply stick to citing x% of vertical resolution and y% of horizontal resolution. It gives a less impressive number for marketing, but it keeps you more honest as well. So it works out to more like a 22% reduction in pixels in each orthogonal direction. Also, analog SD TVs have a resolution limit that, like typical analog limits, is not a brick wall but something that "ramps down", so you observe a (relatively) gradual taper of resolution at higher frequencies of detail content. So 22% ==> 10% does not seem so out of line as a reflection of the performance loss that occurs in an analog system.
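Just to put rough numbers on the area-vs-linear distinction, here's a quick back-of-the-envelope Python sketch using the 1280x720 and 854x480 figures from the quoted experiment below (the exact percentages obviously depend on which pair of resolutions you compare):

[code]
# Back-of-the-envelope comparison of pixel-count vs. per-axis resolution loss.
# The resolutions are just the example figures used in this thread.
def compare(hi, lo):
    (hw, hh), (lw, lh) = hi, lo
    kept = (lw * lh) / (hw * hh)   # fraction of total pixels kept
    print(f"{hw}x{hh} -> {lw}x{lh}")
    print(f"  pixels kept:     {kept:.0%} (about a {1 - kept:.0%} cut in raw pixel count)")
    print(f"  horizontal kept: {lw / hw:.0%}")
    print(f"  vertical kept:   {lh / hh:.0%}")

compare((1280, 720), (854, 480))
# Prints roughly: 44% of the pixels kept (a ~56% cut), but each axis only drops
# to about 67% -- which is why a "% better/worse" claim depends heavily on
# whether you count area or linear resolution, before any analog roll-off is
# even considered.
[/code]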

If you want to approximate the effect, go find a hi-res photo at 1280x720 and display it on a 1280x1024 monitor. Next, resize the image in Photoshop to 854x480, put your monitor into 640x480 mode and look at the photo.

This is not exactly representative when it is extended to mpeg content. With jpeg, you can achieve a pretty impressive level of sharpness and detail compared to the source image. With mpeg, there is already a sacrifice in detail as a consequence of the additional work of motion compensation. It's pretty hard to achieve an image in mpeg comparable to what you can get in a single jpeg snapshot. To extend the example, take the very best hi-res jpeg snapshot you've got and compare it to the very best mpeg snapshot from a motion sequence. You will NEVER see the mpeg achieve the detail performance of the jpeg. It's also arguable that a computer monitor still remains a FAR more accurate and detail resolving device than an HDTV. If you put a 1280x720 image of computer text on a computer monitor and on an hdtv, the hdtv won't even compare. THAT is where the rubber hits the road when it comes to real resolution of high detail. I realize these examples don't exactly match up with your example point for point. I just wanted to illustrate that there is a bit of ambiguity in the statement given the particular devices involved. More simply, you can't exactly submit a visual example using computer monitors and then imply that is exactly how it will work on your hdtv. Extending the example from a jpeg scenario to an mpeg scenario (hdtv) is a similarly strained comparison.
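(As an aside, for anyone who wants to try the downscale experiment quoted above without Photoshop, here's a minimal Pillow sketch. The filenames are placeholders, and note it only captures the resolution loss, not the jpeg-vs-mpeg codec losses discussed in this post.)

[code]
# Rough stand-in for the Photoshop experiment: downscale a 1280x720 image to
# 854x480, scale it back up, and eyeball (or crudely measure) what was lost.
# Requires Pillow (pip install pillow); "photo_720p.jpg" is a placeholder file.
from PIL import Image, ImageChops, ImageStat

src = Image.open("photo_720p.jpg").convert("RGB").resize((1280, 720), Image.LANCZOS)

down = src.resize((854, 480), Image.LANCZOS)     # the "SD" version
back = down.resize((1280, 720), Image.LANCZOS)   # upscaled again for comparison

down.save("photo_480p.jpg", quality=95)
back.save("photo_480p_upscaled.jpg", quality=95)

# Crude measure of what the round trip threw away (not a perceptual metric):
diff = ImageChops.difference(src, back)
print("mean absolute pixel difference per channel:", ImageStat.Stat(diff).mean)
[/code]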

Or better yet, go buy an HDTV tuner card, put it in your PC and watch HD broadcasts at 1280x1024 and low res. Then return the card after you're done demoing it.

Naturally, the HD will look better than the SD, but you should be sure to note that even the HD looks not so sharp on a computer monitor (nowhere close to what we are typically accustomed to seeing from actual computer-generated material sent straight to the monitor). That brings us to a quandary. The HD feed's resolution is nearly that of the computer desktop, so why the soft focus (or seeming lack of "HD detail")? The answer is, mpeg perceptual compression still takes a hefty toll out of the source, regardless of the presented resolution.

On a true 720p or 1080i display, when watching HD content, you'll see way more detail in people's faces. Every lit acne pit, dimple, and piece of sweat glistening.

Maybe, maybe not, and the actual lens settings on the camera will have a great deal to do with this, rather than whether it is an SD or an HD camera. Just take a survey when you do see these sweat-gland-face detail kinds of scenes. What does the background of the room look like behind him/her? It's blurred to oblivion, right? This is a clear sign it is a camera lens effect rather than a unique quality of HD-level performance. You can also choose a lens combo that gives medium detail quality on the subject's face while retaining medium focus on the background as well (essentially, this is the "SD look" we are pretty familiar with by now). You can achieve a similar look either way, just with the lens combo, with an SD feed. It may not ultimately be as sharp as HD, but the whole skin-imperfection detail scene is hardly a unique consequence of HD. It's a camera lens effect. Make no mistake, this effect works great to really sell the hdtv point. It's just not exactly truthful to the whole story if you go so far as to say only HD can achieve that effect.

When downscaled, a lot of that detail is lost. That's the "removing the veil" experience I had when I saw my first real 1080i broadcast. It was a simple talk show, but everything looked way better than the best 480p DVDs I had.

I'm not denying that more resolution typically will achieve more detail. However, there are also additional veils here that will sway what you see one way or another - mpeg quality, scaling/smoothing algorithms, native resolution vs. scaled resolution, etc. To be sure, the modern hdtv is far from the direct-in/direct-out kind of device that allows quick black-and-white comparisons. By virtue of all the things it does to the signal before actually displaying it to your eyes, you are viewing a relatively heavily doctored image rather than exactly the digital image that flowed down that wire from the wall. This is not to say it doesn't look good. It's just not exactly representative of the source as it existed before entering your hdtv.

One quick experiment you can do is to buy T2 Extreme Edition and watch 1080p at 720p or 480p.

Native resolution and upconversion/scaling/smoothing algorithms will have a big impact on this scenario.

As for 480p looking better for DVDs because of upscaling needed on 720p, that's more nonsense. If you have a good scaler (like a DVDO iScan HD), it will actually look the same or better when upscaled. Sometimes it looks better because DVDO does a good job at removing artifacts and enhancing imagery, not to mention a slight anti-aliasing effect from the interpolation.

The logic extends that not everybody will have a DVDO, and the mere premise of a DVDO being "superior" also suggests that not all scalers (the ones built into almost all hdtv's sold today) are created equal, either.

No amount of talking can convince one tho. You must look at two TVs side by side playing back a true HD program. IMHO, I can see way more detail in people's faces, in rocks, on the floor, etc. on a real HD display vs an SD one.

Well, that isn't really up for debate. Of course you will see more. The question is how much more (...and does that increase justify the cost of the upgrade to hdtv?). Be careful- as discussed above already, just the plain resolution numbers don't tell the whole story. Think of it as analogous to the "theoretical GFLOPs of a console vs. the actual GFLOPs achieved in practice" idea.

If HD content only looked 10% better on HD displays, we wouldn't need HD displays at all.

I thought the comment that spawned that notion was referring to viewing HD content on an SD device? How much more performance HD offers over SD (on their respective devices) is truly a whole new can-of-worms topic that could eclipse the one we are discussing now.
 
Now that the PS3 has been unveiled with the emphasis on 1080p, I thought this thread deserved one more post.

I'm glad Sony have gone the extra mile with their system.

As an aside, after a lot of browsing on the web, I discovered that in Australia they have gone with 576p as their HD TV format. It seems like they've been conned.
 
About 1080p...

A question for Deano and those actually working on the games. How realistic should our expectations for 1080p games be? Is it just an exaggerated PR bullet point? Will there really be many games released for that resolution? What I'm asking is - should we expect 1080p games on PS3 to be in the minority or majority? Thanks!

I'm asking 'cause I plan on getting a TV next year. :p
 
It's also arguable that a computer monitor still remains a FAR more accurate and detail resolving device than an HDTV. If you put a 1280x720 image of computer text on a computer monitor and on an hdtv, the hdtv won't even compare. THAT is where the rubber hits the road when it comes to real resolution of high detail.

What the hell are you talking about? HDTVs, whether DLP, LCD, PDP, LCoS, D-ILA, SXRD, or OLED, all feature discrete pixels. Most HDTVs on the market *ARE* the equivalent of oversized computer monitors. If I plug a computer into an HDTV, there is no MPEG involved. Most HDTVs are DVI/HDMI *MONITORS*.



Or better yet, go buy an HDTV tuner card, put it in your PC and watch HD broadcasts at 1280x1024 and low res. Then return the card after you're done demoing it.

The answer is, mpeg perceptual compression still takes a hefty toll out of the source, regardless of the presented resolution.

That is entirely dependent on the codec and the codec parameters used. Speak for yourself when making these claims. Do you actually own an HDTV and tuner card? Do you own T2 Extreme Edition? Blu-Ray discs will be encoded at very high bitrates, with far fewer artifacts and far less information thrown away.

At CES'05, Sony was showing Spider-Man 2 on an SXRD based HDTV off of Blu-Ray that blew the pants off of anything I've seen before.



On a true 720p or 1080i display, when watching HD content, you'll see way more detail in people's faces. Every lit acne pit, dimple, and piece of sweat glistening.

Maybe, maybe not, and the actual lens settings on the camera will have a great deal to do with this, rather than whether it is an SD or an HD camera. Just take a survey when you do see these sweat-gland-face detail kinds of scenes. What does the background of the room look like behind him/her? It's blurred to oblivion, right? This is a clear sign it is a camera lens effect rather than a unique quality of HD-level performance.

I am an amateur photographer and I know all about zoom lenses and open apertures, but bzzzt, wrong again. Have you actually taken screen captures of HD content? I used an ATI HDTV Wonder to compare an NBC 720p broadcast with a scaled-down NTSC 480i broadcast (they were using the same camera, same lens). The video was of a talk show stage set. It was zoomed out, everything in focus.

You sound like someone who has never watched real HDTV. You can watch identical Discovery Channel HD content on both an SDTV and an HDTV; as well, you can watch an SD version of the same program on an SD set. No matter how you slice it, the HD version on an HD set has the "lifting the veil" effect, and the HD/SD version on an SD set does not.

You remind me of people who used to be in denial about the benefits of 480p vs 480i ("black interlace lines? flicker? I can't see em. Image difference? Looks only marginally different!")



One quick experiment you can do is to buy T2 Extreme Edition and watch 1080p at 720p or 480p.

Native resolution and upconversion/scaling/smoothing algorithms will have a big impact on this scenario.

No, they won't. T2 Extreme Edition is mastered at 1080p. I own 2 HDTVs, one a WXGA Sanyo projector, the other a Samsung DLP. Both 720p. Different scalers. I also own T2 Ultimate Edition (SDTV Widescreen).

The scenario is this: T2 Extreme Edition looks WAY better. T2 Extreme Edition at 480p looks much worse. Same codec, same disc, looks much worse on SDTV. The SDTV specific disc (Ultimate) looks worse too.



The logic extends that not everybody will have a DVDO, and the mere premise of a DVDO being "superior" also suggests that not all scalers (the ones built into almost all hdtv's sold today) are created equal, either.

The most popular scaler built into almost every device sold today is DCDi Faroudja, which does a great job.
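(For the curious, the general idea behind directional deinterlacers like DCDi can be sketched in a few lines of Python. This is a generic edge-directed line-doubling toy of my own, not Faroudja's actual algorithm; it just shows what "directional" interpolation means in this context.)

[code]
# Toy edge-directed deinterlacing: fill each missing line by averaging whichever
# of three candidate directions (diagonal-left, vertical, diagonal-right) has the
# smallest difference between the lines above and below. Loosely in the spirit
# of directional deinterlacers, but nothing like a production implementation.
import numpy as np

def line_double(field):
    """field: 2D grayscale array holding one field's lines."""
    h, w = field.shape
    frame = np.zeros((h * 2, w), dtype=float)
    frame[0::2] = field                          # keep the lines we actually have
    for y in range(h - 1):
        top, bot = field[y].astype(float), field[y + 1].astype(float)
        for x in range(1, w - 1):
            candidates = [
                (abs(top[x - 1] - bot[x + 1]), (top[x - 1] + bot[x + 1]) / 2),  # diagonal \
                (abs(top[x]     - bot[x]),     (top[x]     + bot[x])     / 2),  # vertical |
                (abs(top[x + 1] - bot[x - 1]), (top[x + 1] + bot[x - 1]) / 2),  # diagonal /
            ]
            frame[2 * y + 1, x] = min(candidates)[1]
        frame[2 * y + 1, 0]  = (top[0]  + bot[0])  / 2   # plain average at the edges
        frame[2 * y + 1, -1] = (top[-1] + bot[-1]) / 2
    frame[-1] = field[-1]                        # last missing line: just repeat
    return frame
[/code]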

Be careful- as discussed above already, just the plain resolution numbers don't tell the whole story.

That's right, what tells the whole story is actually owning an HDTV, not sitting there philosophizing about it. It looks substantially better. "10% better"? Complete nonsense.
 
DemoCoder said:
No, it's complete BS. Never buy anything based on reviews. Go see demos for yourself to make a decision. How does one arrive at a figure of 10%? How did he measure that? It's bollocks. A 50% reduction in pixels, and he claims 10%?

If you want to approximate the effect, go find a hi-res photo at 1280x720 and display it on a 1280x1024 monitor. Next, resize the image in Photoshop to 854x480, put your monitor into 640x480 mode and look at the photo.

Or better yet, go buy an HDTV tuner card, put it in your PC and watch HD broadcasts at 1280x1024 and low res. Then return the card after you're done demoing it.

On a true 720p or 1080i display, when watching HD content, you'll see way more detail in people's faces. Every lit acne pit, dimple, and piece of sweat glistening. When downscaled, a lot of that detail is lost. That's the "removing the veil" experience I had when I saw my first real 1080i broadcast. It was a simple talk show, but everything looked way better than the best 480p DVDs I had.

One quick experiment you can do is to buy T2 Extreme Edition and watch 1080p at 720p or 480p.

As for 480p looking better for DVDs because of upscaling needed on 720p, that's more nonsense. If you have a good scaler (like a DVDO iScan HD), it will actually look the same or better when upscaled. Sometimes it looks better because DVDO does a good job at removing artifacts and enhancing imagery, not to mention a slight anti-aliasing effect from the interpolation.

No amount of talking can convince one tho. You must look at two TVs side by side playing back a true HD program. IMHO, I can see way more detail in people's faces, in rocks, on the floor, etc. on a real HD display vs an SD one.

If HD content only looked 10% better on HD displays, we wouldn't need HD displays at all.

I am totally pro 1080p (see my comments about BluRay/HDDVD and how they *better* support 1080p)

Do we really want to see acne pits in HD? ;)
 
DemoCoder said:
It's also arguable that a computer monitor still remains a FAR more accurate and detail resolving device than an HDTV. If you put a 1280x720 image of computer text on a computer monitor and on an hdtv, the hdtv won't even compare. THAT is where the rubber hits the road when it comes to real resolution of high detail.

What the hell are you talking about? HDTVs, whether DLP, LCD, PDP, LCoS, D-ILA, SXRD, or OLED, all feature discrete pixels. Most HDTVs on the market *ARE* the equivalent of oversized computer monitors. If I plug a computer into an HDTV, there is no MPEG involved. Most HDTVs are DVI/HDMI *MONITORS*.

There is the extra matter of how the scaling/image enhancement/blending/smoothing stage would affect the image. It isn't exactly direct in/direct out like a traditional computer monitor. I just don't think it is wise to automatically consider hdtv's and computer monitors on the same level when it comes to rigid performance criteria, even though the spec'd resolution is more or less comparable. You could argue that you could feed pure DV direct to your hdtv to observe optimal image quality, but since virtually zero hdtv broadcast material is available w/o having been through a perceptual encoder, it's a pretty moot point.

That is entirely dependent on the codec and the codec parameters used. Speak for yourself when making these claims. Do you actually own an HDTV and tuner card?

The notion of "if you own it, then you will see it, and if you don't see it, then you are blind" is really a circular, self-affirming argument.

Do you own T2 Extreme Edition?

Everyone can see it just by downloading the samples off of the MS website. They aren't that stellar, imo. They aren't ugly at all. It's probably a step above DVD, I can agree... but viewing it on a computer monitor (which makes some seriously good visuals when it comes to hi-res desktops) really didn't seem so great.

Blu-Ray discs will be encoded at very high bitrates, with far fewer artifacts and far less information thrown away.

Artifacts aren't really that much of a problem to begin with on DVD (except under extenuating scene circumstances, perhaps). As far as mpeg-encoded material goes, there will always be a level of detail loss, even at relatively generous data rates. It's just inherent to the way it works and how it interacts with motion. I'm not really inclined to believe differently until I see some samples showing otherwise...

As an amateur photographer, I'm sure you have taken some stellar snapshots output to jpeg, right? I think you can only agree that if you took that same snapshot and made a short video of it with various slow pans, pullbacks, and zooms, the mpeg output would not quite compare to the jpeg when it comes to retaining image detail. It's just the nature of a video codec.

At CES'05, Sony was showing Spider-Man 2 on an SXRD based HDTV off of Blu-Ray that blew the pants off of anything I've seen before.

I'm sure the technology can look quite impressive when the stakes are there for the presentation. Whether or not that translates into comparable performance when movies are released in a consumer-level format, that's what I am concerned about. This has always been a factor when it comes to the content owner deciding how much quality to give you in this retail copy. Given how trivially quality can be scaled once mpeg is involved, it's always a source of suspicion for me.

I am an amateur photographer and I know all about zoom lenses and open apertures, but bzzzt, wrong again. Have you actually taken screen captures of HD content? I used an ATI HDTV Wonder to compare an NBC 720p broadcast with a scaled-down NTSC 480i broadcast (they were using the same camera, same lens). The video was of a talk show stage set. It was zoomed out, everything in focus.

Naturally, you can expect a general increase in detail all around. Just the same, I have seen the scenario I described just as often, undoubtedly to really hit home the "detail" effect of hdtv. Also, you should account for the fact that broadcasters rarely, if ever, put any serious effort into quality digital SD broadcasts (why would they, when the agenda favors showing off hdtv). Many people just assume that is literally all SD can offer, but I have observed that these broadcasts are considerably compromised compared to even their analog counterpart (assuming best signal conditions, of course). So people get a false sense of where SD is at, only because it is done so poorly most of the time. Naturally, it would never be as good as HD, but the differences are emphasized far more by the implementation than the resolution numbers alone suggest. (As an example, consider a music track that has been mastered to both CD and SACD for comparison - it's actually happened where they doctor up slightly different copies of the performance such that the CD is slightly compromised and the SACD is poked up a bit just to sell the "SACD superiority" point that much more- after all, if you hear a "difference", then it's gotta be true, right?).

You sound like someone who has never watched real HDTV.

This is circular logic, again. The idea being that if you can't reflexively speak highly of hdtv, then you must not have seen any. This ignores that it is entirely possible that a forthcoming format simply fails to wow on its own merits under a more scrutinizing eye. Please do note that I am not saying you are absolutely wrong and I am absolutely right. I'm just saying that there is another side to the story, and, granted, a lot of this really comes down to subjective impressions.

You can watch identical Discovery Channel HD content on both an SDTV and an HDTV; as well, you can watch an SD version of the same program on an SD set. No matter how you slice it, the HD version on an HD set has the "lifting the veil" effect, and the HD/SD version on an SD set does not.

Yeesh! Discovery HD is a real walking contradiction when it comes to PQ. Yes, it does have that "HD resolution", but it also has frequent bouts of digital artifacts when it comes to difficult moving scenes. So you trade away the "resolution veil" only to substitute in the "macroblocking veil". It's really hard to say, imo, that a clear improvement was achieved when taking all parameters into account.

You remind me of people who used to be in denial about the benefits of 480p vs 480i ("black interlace lines? flicker? I can't see em. Image difference? Looks only marginally different!")

Well, there is always the distinction between a small difference and a blinding difference. When it comes to emerging technologies, the temptation is always great to say something is a revolutionary, massive improvement, when it may merely be an incremental difference... but let's dispense with the "marginalize your opponent" technique, 'kay? It's really jumping the shark once you play that card.

No, they won't. T2 Extreme Edition is mastered at 1080p.

This means relatively little, since you can master a big gray block at 1080p - does that mean it will look "HD"? What you get is highly influenced by what the source/provider feels justified in giving you, rather than by a number designation such as "1080p".

I own 2 HDTVs, one a WXGA Sanyo projector, the other a Samsung DLP. Both 720p. Different scalers. I also own T2 Ultimate Edition (SDTV Widescreen).

The scenario is this: T2 Extreme Edition looks WAY better. T2 Extreme Edition at 480p looks much worse. Same codec, same disc, looks much worse on SDTV. The SDTV specific disc (Ultimate) looks worse too.

Myriad factors can be involved there, not just the resolution. "Direct feed"/native scan/scaled material issues in the display device are easily suspect (if not suspect by default). I would expect the HD version to look better as well, but how much? How well you can normalize the other factors really makes that a challenge to put a finger on.

The most popular scaler built into almost every device sold today is DCDi Faroudja, which does a great job.

That doesn't obviate the notion that every manufacturer will employ their own tweaks and settings to maximize their respective performance metrics. A scaler isn't exactly a "non-adjustable" device. It blends and enhances according to how its internal adjustments are tweaked.

That's right, what tells the whole story is actually owning an HDTV, not sitting there philosophizing about it. It looks substantially better. "10% better"? Complete nonsense.

Again, the "10% better" comment was to describe how much better an HD feed viewed on an SD TV would be, no? It was not claiming that HD on an hdtv is only 10% better than SD on an sdtv. By all means, most hdtv advocates would freely say there should only be 0% improvement when showing HD on an sdtv, because after all, it is SD, and SD sucks no matter what, right? "10%" is really no big deal. All it really means is that there was a bit of resolution capability left in plain ole sdtv technology that SD broadcasts (even more so, considering digital SD) were not exploiting.
 
randycat99 said:
There is the extra matter of how the scaling/image enhancement/blending/smoothing stage would affect the image. It isn't exactly direct in/direct out like a traditional computer monitor.

Yes it is. You are confused. You are confusing the terms "computer monitor" and CRT. HDTVs have direct DVI/HDMI inputs. I have an HTPC hooked up to mine. It functions exactly like any LCD monitor. CRTs have their own version of scaling; the fact that it's an analog process doesn't make it superior.

The notion of "if you own it, then you will see it, and if you don't see it, then you are blind" is really a circular, self-affirming argument.

Your posts are like trying to argue with someone who has never seen the Mona Lisa, but has only read about it, over how good it can look.


It's just inherent to the way it works and how it interacts with motion. I'm not really inclined to believe differently until I see some samples showing otherwise...

All of which is irrelevant, since the topic of discussion is whether HD looks better than SD, not whether video compression is lossy. An HD MPEG video mastered at 1080p from source material at a high bit-rate will simply look better on an HD set than an SD one, period.

Naturally, you can expect a general increase in detail all around. Just the same, I have seen the scenario I described just as often, undoubtedly to really hit home the "detail" effect of hdtv. Also, you should account for the fact that broadcasters rarely, if ever, put any serious effort into quality digital SD broadcasts (why would they, when the agenda favors showing off hdtv)

The shows are filmed with the same camera and scaled down. Look, I can reproduce the effect myself. I own a Sony HDV. I have already produced NTSC versions of my videos (baby birth) using the best available conversions to play them at relatives' houses who don't own HDTVs. If I film the same scene with my Sony TRV900 directly with the same settings, it's the same. Video content encoded for 480p with MPEG-2 or *scaled* from uncompressed video to 480p both crush out details that I can see on hi-res displays.

CD is slightly compromised and the SACD is poked up a bit just to sell the "SACD superiority" point that much more- after all, if you hear a "difference", then it's gotta be true, right?).

The reason why DVD-A and SACD sound better is because of 5.1 support, plain and simple. On my RX-Z9 receiver, even the best-mastered CDs simply don't sound as good. In stereo mode, of course, the difference is harder to hear (except that the bass is separated better), but in 5.1 mode there is no comparison, because the various surround decoders just can't achieve the same separation.



Yeesh! Discovery HD is a real walking contradiction when it comes to PQ. Yes, it does have that "HD resolution", but it also has frequent bouts of digital artifacts when it comes to difficult moving scenes. So you trade away the "resolution veil" only to substitute in the "macroblocking veil". It's really hard to say, imo, that a clear improvement was achieved when taking all parameters into account.

Well, at least you now admit that there is a resolution veil. Yes, quality on HD varies, and it does on SD digital cable and satellite too. (UPN used to be TERRIBLE). Of course, broadcast has its own reception quality issues. But things are changing. We'll see after DirecTV gets Spaceway up and running.


This means relatively little, since you can master a big gray block at 1080p - does that mean it will look "HD"? What you get is highly influenced by what the source/provider feels justified in giving you, rather than by a number designation such as "1080p".

Pedantic. Most specialty DVDs like T2 Ultimate are mastered as well as possible for that format (e.g., Superbit-like techniques). The difference with the 1080p version is primarily due to the resolution upgrade.

As for showing HD content on SD, it will be slightly better than SD on SD, if you use a good scaler, and bypass NTSC. If your TV only takes old NTSC inputs (not component), your quality is doomed.
 
Randycat99... I must say that you don't seem to know what you are talking about. Anyway, we don't care about watching hdtv in this discussion; we care about games. And the jump from a 480p game to 720p is huge, as is going from 480i to progressive. Jumping from 480p to 720p you get over 2x the resolution and it shows, just like taking a PC game from 640x480 to (well, the closest 4:3 equivalent would be) 1024x768 - the difference is huge, and saying otherwise just sounds crazy.

Also, 720p movies look substantially better than normal 480p movies; the difference can easily be seen on a computer monitor and even more clearly on a big hdtv. Bad TV broadcasts aren't the real issue here.
 
DemoCoder said:
Yes it is. You are confused. You are confusing the terms "computer monitor" and CRT. HDTVs have direct DVI/HDMI inputs. I have an HTPC hooked up to mine. It functions exactly like any LCD monitor. CRTs have their own version of scaling; the fact that it's an analog process doesn't make it superior.

You think an LCD doesn't do any scaling from the source to its native resolution??? The fact is that all hdtv's employ scaling whether or not you are using the DVI/HDMI inputs. It's an integral part of the process. Otherwise, 720p would only appear as a small image on your set and presumably 1080 would fill the screen. One of the chief purposes of an hdtv is to accept a feed in any one of the possible standard hdtv formats and deinterlace/scale/"enhance" (as appropriate) to match the native resolution of the screen. So if you are attempting to compare a 1080i/p program on a typical LCD hdtv, you aren't even seeing the original program material. It has to be downscaled to 720-ish p or whatever the native resolution of the LCD screen is. Same thing with making a comparison from a 480p source. You aren't seeing the original material by virtue of the display device itself - it has to be scaled up to fill the entire native resolution of the screen. So in comparing these 2 feeds, you are looking at the processing result of more than just the effect of different resolutions. This has got nothing to do with how "direct" your digital input is or how "analog" a CRT monitor is. You simply cannot make the comparison you are implying on a fixed-pixel device (such as an LCD panel) without seriously accounting for the additional effects inherent in making the process possible at all.
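(Just to make the "everything gets resampled to the panel's native grid" point concrete, here's a trivial Python sketch of the scale factors involved, assuming a 1366x768 panel as the native resolution and ignoring aspect-ratio correction, overscan, and non-square pixels.)

[code]
# None of the standard broadcast formats maps 1:1 onto a typical fixed-pixel
# panel, so the set has to resample every input to its native grid.
NATIVE = (1366, 768)   # assumed native panel resolution for this example
SOURCES = {"480p": (720, 480), "720p": (1280, 720), "1080i/p": (1920, 1080)}

for name, (w, h) in SOURCES.items():
    sx, sy = NATIVE[0] / w, NATIVE[1] / h
    direction = "upscaled" if sx > 1 else "downscaled"
    print(f"{name:8s} {w}x{h} -> {NATIVE[0]}x{NATIVE[1]}: "
          f"{direction} by {sx:.2f}x horizontally, {sy:.2f}x vertically")
[/code]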

Your posts are like trying to argue with someone who has never seen the Mona Lisa, but has only read about it, over how good it can look.

It's based on more than that, but I'm thinking you immediately discredit/dismiss someone on this topic if their answer is anything other than "it's great, great, great!" Which do you think is more accurate here? I'm simply saying it is good, but not outright rosy. You are saying it is super-awesome, period. Who sounds more extreme here?

All of which is irrelevant, since the topic of discussion is whether HD looks better than SD, not whether video compression is lossy. An HD MPEG video mastered at 1080p from source material at a high bit-rate will simply look better on an HD set than an SD one, period.

It's all relevant, because anyone could just say it looks better, period. If it didn't, with the increased resolution capability, then it would truly be a dismal flop. What matters even more is: how much better is it? This little question seems to send the average hd fan into rages. It's a simple, honest question for anyone interested in seeing honest improvements in the genre, but it is like kryptonite to someone who cannot possibly even imagine that hdtv is anything other than as fabulous as sliced bread.

The shows are filmed with the same camera and scaled down. Look, I can reproduce the effect myself. I own a Sony HDV. I have already produced NTSC versions of my videos (baby birth) using the best available conversions to play them at relatives' houses who don't own HDTVs. If I film the same scene with my Sony TRV900 directly with the same settings, it's the same. Video content encoded for 480p with MPEG-2 or *scaled* from uncompressed video to 480p both crush out details that I can see on hi-res displays.

A consumer-level mpeg2 encoding??? Yeah, who knows what and how much compromise is occurring.

The reason why DVD-A and SACD sound better is because of 5.1 support, plain and simple. On my RX-Z9 receiver, even the best-mastered CDs simply don't sound as good. In stereo mode, of course, the difference is harder to hear (except that the bass is separated better), but in 5.1 mode there is no comparison, because the various surround decoders just can't achieve the same separation.

Multichannel sound is definitely a big bonus, but it's not the foremost metric of quality I hear cited by the DVD-A/SACD fanatics. They are adamant that the sound quality itself is fundamentally better. When you move to double-blind tests on a plain stereo encoding, a lot of that mystique disappears. That's the shocker of it all.

Well, at least you now admit that there is a resolution veil.

I admitted there are a number of "veils", not the least of which is technical resolution, which makes no distinction between the most hi-res material you've ever seen and one big gray block on the screen.

Yes, quality on HD varies, and it does on SD digital cable and satellite too. (UPN used to be TERRIBLE). Of course, broadcast has its own reception quality issues. But things are changing. We'll see after DirecTV gets Spaceway up and running.

This is always the fallback discussion point - wait till they get bird #xyz into the sky... Meanwhile, DirecTV has been offering this false hope for the better part of a decade of digital broadcasting. It's all going toward increasing the # of broadcast channels (==> more revenue), not increasing video quality (via bandwidth). It's been this way since the 90's, and I see no reason that they will suddenly adopt a different strategy. I'd love to be wrong on this one, but I'm content to wait and watch them actually pull this off, rather than hold my breath.


Pedantic. Most specialty DVDs like T2 Ultimate are mastered as well as possible for that format (e.g., Superbit-like techniques). The difference with the 1080p version is primarily due to the resolution upgrade.

If you found out it was only a "10% increase" in HF content, you'd really be incensed, eh? ;) The technical resolution is just an indication of release format, rather than any sort of guarantee of greater detail content. They give you a bit more, just to placate, but the real goal is to simply sell you yet another "special deluxe" copy of the same movie, if they can.

As for showing HD content on SD, it will be slightly better than SD on SD, if you use a good scaler, and bypass NTSC. If your TV only takes old NTSC inputs (not component), your quality is doomed.

...not so. It's more information going in, hence it is only natural to expect the TV's maximum technical limits to be better saturated. I'm not saying it will be optimal, just not nearly as bad as you let on. I'm not even suggesting SD over HD. [gasp!] It's just not as great a loss as some make it out to be, just as going to HD is not as great a gain as some make it out to be. It all comes down to the distinction between what is merely an incremental improvement vs. a huge improvement. It's completely common for people to be seduced into the latter observation more out of emotional response than plain, rational observation. We see this enough with console x vs. console y, don't we? HDTV's are no different.
 
randycat99 said:
You think an LCD doesn't do any scaling from the source to its native resolution???

That's exactly my point. LCDs *are* computer monitors. You claimed computer monitors don't scale inputs and HDTVs do, because you mistakenly assume that computer monitor = CRT, which is a false assumption. HDTVs do no more and no less scaling than today's computer monitors do.

Secondly, the scaling argument is a canard. Analog CRTs are subject to scaling too; the only difference is that the scaling process is noisy and lossy and done with analog circuitry instead of digital. You can pretend that CRTs have a continuous resolution function, but they do not. Data starts out digital in the framebuffer of your device, has to go through a DAC and be modulated into an analog signal, which is then fed to your CRT, demodulated, filtered, and scaled again by analog circuitry.

Don't presume to lecture me on how these displays work; I know exactly how they work and what transformations are going on. Maybe you should try looking at an analog CRT's output vs the source on an oscilloscope to see the inherent quality loss.

What matters even more is: how much better is it? This little question seems to send the average hd fan into rages. It's a simple, honest question for anyone interested in seeing honest improvements in the genre, but it is like kryptonite to someone who cannot possibly even imagine that hdtv is anything other than as fabulous as sliced bread.

It can't be quantified mathematically; it's a subjective response. But at least I am willing to see with my own eyes rather than sticking my head in the sand. The simple, honest question is: is going and evaluating a few dozen HDTVs on various source material kryptonite for you, or do you simply wish to remain ignorant?

If you want to conduct science, you have to perform experiments. How impressive HDTV is to a person can't be DEDUCED from logic.


A consumer-level mpeg2 encoding??? Yeah, who knows what and how much compromise is occurring.

Dude, no matter what, you're going to question everything to wriggle out of the silly hole you've dug for yourself. If I told you I used a professional-quality PRO HW MPEG-2 converter to master a DVD, then you'd just question the quality of my HDCAM as "prosumer, not pro".

Moreover, in the context of this console discussion, the compression and scaling arguments are completely irrelevant, as the consoles will output in native format with HDMI signaling.

If you're not willing to sit and experiment with the actual HW yourself and perform real measurements (see AVSFORUM for how PROs evaluate HD quality), the discussion is pointless. You cannot deduce from first principles, spec sheets, and standards how beautiful HD will look. At best, you can argue it isn't going to look worse, and stands a big chance of looking a lot better. But there is only one way to see for sure, and that's to go see one.

On specs alone, you will never be able to deduce how much better the HDR LCD+LED displays look in person, for example. They have to be seen to be believed.
 
The biggest assumption you've made here is that I have not gone and made some actual viewing observations. Those have given me the bulk of my viewpoints on the matter, just as they have for you. That we come to different conclusions does not automatically make one of us right and one of us wrong. This whole "if you don't think hdtv is awesome, then you haven't seen one" reasoning just doesn't get off the ground, imo. You have asserted that I am making claims of PQ based purely on number specs, but I think if you go back and review, it was me who was precisely saying that numbers don't tell you the whole story. So which is it, huh? Is it really me or you that has taken a bit too much confidence in the marketing specs?

As for the CRT monitors bit, it's also my own opinion that computer LCD panels have unfortunately slipped a few steps backwards in PQ (despite the increased level of sophistication) and resolution compared to when CRTs had reached their pinnacle. I don't really know if it is a matter of the technology not being up to it, or of manufacturers having discovered they can push a whole lot more LCD panels at a price point by making measured sacrifices in PQ. On any given day, I can go look at numerous LCD panels, computer or hdtv, and I'm just not impressed with the output. It's not a "bad" look, per se, but the thought comes to mind, "Is this really what people are crazy about???"

As for the oscilloscope bit, yeah, try comparing signals after a pass through an mpeg encoder and scaler- now you're talking about some serious "damage". ;)
 