DemoCoder said:
It's also arguable that a computer monitor still remains a FAR more accurate and detail-resolving device than an HDTV. If you put a 1280x720 image of computer text on a computer monitor and on an HDTV, the HDTV won't even compare. THAT is where the rubber hits the road when it comes to real resolution of high detail.
What the hell are you talking about? HDTVs, whether DLP, LCD, PDP, LCoS, D-ILA, SXRD, or OLED, all feature discrete pixels. Most HDTVs on the market *ARE* the equivalent of oversized computer monitors. If I plug a computer into an HDTV, there is no MPEG involved. Most HDTVs are DVI/HDMI *MONITORS*.
There is the extra matter of how the scaling/image enhancement/blending/smoothing stage affects the image. It isn't exactly direct in/direct out like a traditional computer monitor. I just don't think it is wise to automatically consider HDTVs and computer monitors on the same level when it comes to rigid performance criteria, even though the spec'd resolution is more or less comparable. You could argue that you could feed pure DV direct to your HDTV to observe optimal image quality, but since virtually zero HDTV broadcast material is available w/o having been through a perceptual encoder, it's a pretty moot point.
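To put a rough number on that scaling stage, here's a quick back-of-the-envelope sketch. The 1366x768 panel resolution is just an assumed example; actual native resolutions vary from set to set.

```python
# Rough sketch: why a 720p signal isn't pixel-for-pixel on many "720p" HDTVs.
# Assumes a hypothetical 1366x768 panel; actual native resolutions vary by set.

source_w, source_h = 1280, 720      # broadcast/source resolution
panel_w, panel_h = 1366, 768        # assumed native panel resolution

scale_x = panel_w / source_w        # ~1.067
scale_y = panel_h / source_h        # ~1.067

print(f"Horizontal scale factor: {scale_x:.4f}")
print(f"Vertical scale factor:   {scale_y:.4f}")

# A 1-pixel-wide text stroke in the source lands on a fractional number of
# panel pixels, so the scaler has to interpolate (i.e. soften or re-sharpen) it:
stroke_width_on_panel = 1 * scale_x
print(f"A 1-px source stroke covers ~{stroke_width_on_panel:.2f} panel pixels")
```

A computer monitor driven at its native resolution maps 1:1 and skips that interpolation stage entirely, which is the "direct in/direct out" distinction I'm getting at.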
That is entirely dependent on the codec and the codec parameters used. Speak for yourself when making these claims. Do you actually own an HDTV and tuner card?
The notion of "if you own it, then you will see it, and if you don't see it, then you are blind" is really a circular, self-affirming argument.
Do you own T2 Extreme Edition?
Everyone can see it just by downloading the samples off of the MS website. They aren't that stellar, imo. They aren't ugly at all. It's probably a step above DVD, I can agree... but viewing it on a computer monitor (which makes some seriously good visuals when it comes to hi-res desktops) really didn't seem so great.
Blu-Ray discs will be encoded at very high bitrates, with far fewer artifacts and far less information thrown away.
Artifacts aren't really that much of a problem to begin with on DVD (except under extenuating scene circumstances, perhaps). As far as MPEG-encoded material goes, there will always be a level of detail loss, even at relatively generous data rates. It's just inherent to the way it works and how it interacts with motion. I'm not really inclined to believe differently until I see some samples showing otherwise...
As an amateur photographer, I'm sure you have taken some stellar snapshots output to JPEG, right? I think you can only agree that if you took that same snapshot and made a short video of it in various slow pans, pullbacks, and zooms, the MPEG output would not quite compare to the JPEG when it comes to retaining image detail. It's just the nature of a video codec.
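Just to put rough numbers on that. The bitrates and file size below are illustrative assumptions, not measurements of any particular disc or broadcast.

```python
# Back-of-the-envelope bit budgets: a high-quality JPEG still vs. a frame of
# MPEG-style video. All bitrates/sizes here are illustrative assumptions.

def bits_per_pixel_video(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame at a given stream bitrate."""
    return bitrate_bps / (width * height * fps)

def bits_per_pixel_still(file_size_bytes, width, height):
    """Bits per pixel actually spent on a compressed still image."""
    return file_size_bytes * 8 / (width * height)

# Hypothetical high-quality JPEG: a 1920x1080 still around 600 KB
print(f"JPEG still:         {bits_per_pixel_still(600_000, 1920, 1080):.2f} bits/pixel")

# Assumed video bitrates (ballpark figures, for illustration only)
print(f"SD video @ 5 Mbps:  {bits_per_pixel_video(5_000_000, 720, 480, 24):.2f} bits/pixel")
print(f"HD video @ 8 Mbps:  {bits_per_pixel_video(8_000_000, 1920, 1080, 24):.2f} bits/pixel")
print(f"HD video @ 20 Mbps: {bits_per_pixel_video(20_000_000, 1920, 1080, 24):.2f} bits/pixel")

# The video codec claws some of this back with motion compensation (reusing
# detail from previous frames), but during pans and zooms every block is
# changing, which is exactly when the per-frame budget gets squeezed.
```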
At CES'05, Sony was showing Spider-Man 2 on an SXRD based HDTV off of Blu-Ray that blew the pants off of anything I've seen before.
I'm sure the technology can look quite impressive when the stakes are there for the presentation. Whether or not that translates into comparable performance when movies are released in a consumer-level format is what I am concerned about. This has always been a factor when it comes to the content owner deciding how much quality to give you in the retail copy. Given how trivially quality can be scaled once MPEG is involved, it's always a source of suspicion for me.
I am an amateur photographer and I know all about zoom lenses and open apertures, but bzzzt, wrong again. Have you actually taken screen captures of HD content? I used an ATI HDTV Wonder to compare an NBC 720p broadcast with a scaled-down NTSC 480i broadcast (they were using the same camera, same lens). The video was of a talk show stage set. It was zoomed out, everything in focus.
Naturally, you can expect a general increase in detail all around. Just the same, I have seen the scenario I described just as often, undoubtedly to really hit home the "detail" effect of HDTV. Also, you should account for the fact that broadcasters rarely, if ever, put any serious effort into quality digital SD broadcasts (why would they, when the agenda favors showing off HDTV?). Many people just assume that is literally all SD can offer, but I have observed that these broadcasts are considerably compromised even compared to their analog counterpart (assuming best signal conditions, of course). So people get a false sense of where SD is at, only because it is done so poorly most of the time. Naturally, it would never be as good as HD, but the differences are emphasized far more by the implementation than the resolution numbers alone would suggest. (As an example, consider a music track mastered to both CD and SACD for comparison: it has actually happened that they doctor up slightly different copies of the performance, such that the CD is slightly compromised and the SACD is punched up a bit, just to sell the "SACD superiority" point that much more. After all, if you hear a "difference", then it's gotta be true, right?)
You sound like someone who has never watched real HDTV.
This is circular logic, again. The idea being that if you can't reflexively speak highly of HDTV, then you must not have seen any. This ignores that it is entirely possible that a forthcoming format simply fails to wow on its own merits under closer scrutiny. Please do note that I am not saying you are absolutely wrong and I am absolutely right. I'm just saying that there is another side to the story, and granted, a lot of this really comes down to subjective impressions.
You can watch identical Discovery Channel HD content on both an SDTV and an HDTV, and you can also watch an SD version of the same program on an SD set. No matter how you slice it, the HD version on the HD set has the "lifting the veil" effect, and the HD/SD version on the SD set does not.
Yeesh! Discovery HD is a real walking contradiction when it comes to PQ. Yes, it does have that "HD resolution", but it also has frequent bouts of digital artifacts in difficult moving scenes. So you trade away the "resolution veil" only to substitute in the "macroblocking veil". It's really hard to say, imo, that a clear improvement was achieved when taking all parameters into account.
You remind me of people who used to be in denial about the benefits of 480p vs 480i ("black interlace lines? flicker? I can't see em. Image difference? Looks only marginally different!")
Well, there is always the distinction between a small difference and a blinding difference. When it comes to emerging technologies, the temptation is always great to call it a revolutionary, massive improvement when it may merely be an incremental one... but let's dispense with the "marginalize your opponent" technique, kay? It's really jumping the shark once you play that card.
No, they won't. T2 Extreme Edition is mastered at 1080p.
This means relatively little, since you can master a big gray block in 1080p; does that mean it will look "HD"? What you get is highly influenced by what the source/provider feels justified in giving you, rather than by a number designation such as "1080p".
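If you want to see that in concrete terms, here's a minimal sketch using Pillow and NumPy. The noise frame is just a stand-in for real film detail, and the JPEG quality setting is an arbitrary example value.

```python
# Quick illustration of the "big gray block in 1080p" point: a frame's pixel
# count says nothing about how much information is actually in it.
import io
import numpy as np
from PIL import Image

def jpeg_size(frame: np.ndarray, quality: int = 90) -> int:
    """Return the compressed size in bytes of a frame saved as JPEG."""
    buf = io.BytesIO()
    Image.fromarray(frame).save(buf, format="JPEG", quality=quality)
    return buf.getbuffer().nbytes

h, w = 1080, 1920

# A flat gray 1080p frame: technically "full HD", almost no information.
gray_block = np.full((h, w, 3), 128, dtype=np.uint8)

# A 1080p frame full of fine detail (random noise as a stand-in for texture).
detailed = np.random.randint(0, 256, size=(h, w, 3), dtype=np.uint8)

print(f"Flat gray 1080p frame: {jpeg_size(gray_block):,} bytes")
print(f"Detailed 1080p frame:  {jpeg_size(detailed):,} bytes")
# Same resolution label, wildly different information content. That's why the
# bitrate the provider is willing to spend matters more than the "1080p" label.
```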
I own 2 HDTVs, one a WXGA Sanyo projector, the other a Samsung DLP. Both 720p. Different scalers. I also own T2 Ultimate Edition (SDTV Widescreen).
The scenario is this: T2 Extreme Edition looks WAY better. T2 Extreme Edition at 480p looks much worse. Same codec, same disc, looks much worse on SDTV. The SDTV specific disc (Ultimate) looks worse too.
Myriad factors can be involved there, not just the resolution. "Direct feed"/native scan/scaled material issues in the display device are easily suspect (if not by default). I would suspect the HD version should look better as well, but how much?... How well you can normalize the other factors makes that a real challenge to put a finger on.
The most popular scaler built into almost every device sold today is DCDi by Faroudja, which does a great job.
That doesn't change the fact that every manufacturer will employ their own tweaks and settings to maximize their respective performance metrics. A scaler isn't exactly a "non-adjustable" device. It blends and enhances according to how its internal adjustments are tweaked.
That's right, what tells the whole story is actually owning an HDTV, not sitting there philosophizing about it. It looks substantially better. "10% better"? Complete nonsense.
Again, the "10% better" comment was to describe how much better an HD feed viewed on an SD TV would be, no? It was not claiming that HD on an hdtv is only 10% better than SD on an sdtv. By all means, most hdtv advocates would freely say there should only be 0% improvement when showing HD on an sdtv, because after all, it is SD, and SD sucks no matter what, right? "10%" is really no big deal. All it really means is that there was a bit of resolution capability left in plain ole sdtv technology that SD broadcasts (even more so, considering digital SD) were not exploiting.