Will Warner support Blu-ray?

DemoCoder said:
Even my friend's girlfriend, not technically savvy in the least, knows her Sony WEGA CRT is not a true HDTV, but merely "HD Ready" (can accept signal, but resolution not as good as HDTV spec)

What might blow your mind right now is that 'HD Ready' is the branding policy implemented in Europe to denote an HDTV with at least 720p resolution and HDCP support through either HDMI or DVI. Summary here.

It's sort of a consolation prize for being so late to the HDTV table: at least there's a reliable policy which will ensure that a much higher percentage of TV purchasers will be able to use HD-DVD and Blu-ray compared to the American market.
 
Yep. HD Ready in Europe means the set can display (properly) at least 720p at both 50 and 60Hz, and that it has HDMI; I think component is also a requirement.

Can't say I'm unhappy with the "standard". A bit late, but at least we're clear.

HD Compatible (in Europe) is the name for what DemoCoder is talking about in the US: the set accepts all kinds of resolutions but can only display a maximum of 480p/i.
 
london-boy said:
HD Compatible (in Europe) is the name for what DemoCoder is talking about in the US: the set accepts all kinds of resolutions but can only display a maximum of 480p/i.

You would need an HD tuner with an HD Compatible TV, right? (as an add-on)

I'm wondering: I have a Samsung 27" HDTV (Dynaflat) and it's 1080i native (or something along those lines). It doesn't have HDMI (it has DVI). So if (theoretically) there is a converter from HDMI to DVI that would be acceptable to Blu-ray, and I use this TV to watch HD movies, would the image look bad or fuzzy? Reason I ask is because it can take 720p signals, but this TV would probably upscale that image to 1080i. So this would be detrimental to a lot of TVs in the US that are NOT 720p native.

I think that really sucks.
 
BlueTsunami said:
You would need an HD tuner with an HD Compatible TV, right? (as an add-on)

I'm wondering: I have a Samsung 27" HDTV (Dynaflat) and it's 1080i native (or something along those lines). It doesn't have HDMI (it has DVI). So if (theoretically) there is a converter from HDMI to DVI that would be acceptable to Blu-ray, and I use this TV to watch HD movies, would the image look bad or fuzzy? Reason I ask is because it can take 720p signals, but this TV would probably upscale that image to 1080i. So this would be detrimental to a lot of TVs in the US that are NOT 720p native.

I think that really sucks.

Well, I can't speak for your TV, but I'm sure it can accept a 720p signal just fine. It will just display it at whatever resolution it prefers.
You will only be able to watch Blu-ray/HD-DVD if your DVI input accepts HDCP. If it doesn't, you're screwed: you will either get a 480p image or a "no signal" message. Still not too clear.
There are cables with an HDMI connector on one end and DVI on the other (they are basically the same thing, just different connectors, and HDMI carries digital audio too), but HDCP-protected material will only work if your DVI port accepts HDCP (think of it as a software HD version of Macrovision).
 
My Samsung 27" HDTV

That's it right there. It's funny that the official Samsung site lists shit about the specs of the TV (I hate that)

EDIT: It's actually there... I had to download the -Detailed Specs Sheet- :blush:

I looked at the -Detailed Specs Sheet- and yes! It does have DVI with HDCP copy protection

Samsung Spec Sheet said:
DVI-HDTV Interface with HDCP Copy Protection

I guess I lucked out with that one, because HDCP wasn't even a factor when I bought this TV. The next HDTV I get, though, I want to be 720p native. I heard, though, that there's not a lot of 720p sets out there.

Note: I carried that shit up into my house from my car. That was fricken fun, the thing is so heavy!
 
BlueTsunami said:
My Samsung 27" HDTV

That's it right there. It's funny that the official Samsung site lists shit about the specs of the TV (I hate that)

EDIT: It's actually there... I had to download the -Detailed Specs Sheet- :blush:

I looked at the -Detailed Specs Sheet- and yes! It does have DVI with HDCP copy protection



I guess I lucked out with that one, because HDCP wasn't even a factor when I bought this TV. The next HDTV I get, though, I want to be 720p native. I heard, though, that there's not a lot of 720p sets out there.

Um, not sure about the US, but most HDTVs in Europe are only 720p native, so all signals (1080i too) get up/downscaled to 720p.
But yeah, you were definitely lucky on that one!!

This will most likely be my choice http://www.homecinemachoice.com/cgi-bin/displayreview.php?reviewid=6128
 
london-boy said:
Um, not sure about the US, but most HDTVs in Europe are only 720p native, so all signals (1080i too) get up/downscaled to 720p.
But yeah, you were definitely lucky on that one!!

I'm guessing Europe is MUCH more organized with the HD implementation. Reason being, over here... high-def TVs are pretty much a cluster-f*$&%. So many different brands, types, styles. But I was reading up recently, and the trend over here is that 720p is only now really being used in newer HDTVs. 480i/480p/1080i are pretty much the norm.

I believe the TV CAN accept 720p signals (haven't tried at all; the only things I've pretty much used on it are Halo 2, Star Ocean: TtEoT and my computer). Although with my computer it displays 720p flawlessly... so I wonder if it DOES upconvert images (or if that signal is just being forced).
 
BlueTsunami said:
I'm guessing Europe is MUCH more organized with the HD implementation. Reason being, over here... high-def TVs are pretty much a cluster-f*$&%. So many different brands, types, styles. But I was reading up recently, and the trend over here is that 720p is only now really being used in newer HDTVs. 480i/480p/1080i are pretty much the norm.

I believe the TV CAN accept 720p signals (haven't tried at all; the only things I've pretty much used on it are Halo 2, Star Ocean: TtEoT and my computer). Although with my computer it displays 720p flawlessly... so I wonder if it DOES upconvert images (or if that signal is just being forced).

Well, HDTV in Europe seems to finally be getting some well-deserved time.
Our HD-Ready standard requires: HDMI (or DVI+HDCP), 720p or more, component (not sure it's a requirement), and it has to accept both 50Hz and 60Hz HD signals.
If a set is missing any of those, it's not "HD-Ready". It's not an HDTV.
So yeah, it's kinda easier for the consumer. If one buys an "HD-Ready" TV, he's "protected": if one of those things isn't there, he can return the TV for "false advertising" and get a proper one.

One thing to remember, though, is that Sky made it clear what they'll be using, and manufacturers have acted accordingly. They said HDCP will be used for some content, and that they will probably be broadcasting at 720p.

One gets the "HD-Ready" sticker only if the set will work flawlessly with SkyHD.

EDIT: Woah that was my 14999th post... Head over to the General Forum for teh parteee!!! :LOL:
 
BlueTsunami said:
I'm guessing Europe is MUCH more organized with the HD implementation.
Well, if you keep bickering about standards, over the years the rest of the world sorts out its messes, and you can just grab the best solution from there. That's why being last to get new technologies isn't such a bad thing. :mrgreen:
 
Some early HDTVs in the US supported 720p. These were CRT-based rear-projection sets. However, it's apparently more costly to support a high-resolution progressive signal path than an interlaced one.

So most CRTs supported 1080i natively but not 720p. Now with the advent of flat panels and digital displays like DLP, 720p came back into the picture because the early LCDs and DLPs were closer to 720p than anything else.

Going forward, 1080p seems to be the path for DLP, LCoS, LCD and probably SED. These will carry premium pricing for a while.

Unless they radically change the OTA infrastructure, there probably won't be many 1080p sources. ESPN has talked about moving to 1080p, but I don't think cable or satellite has any plans to support that yet.

So the best chance for 1080p sources would be from discs and maybe games. Of course, Microsoft would say VC-1 could deliver 1080p at around 10 Mbps. Maybe even less.
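For a rough sense of what that bitrate claim implies for disc formats, here's a back-of-the-envelope sketch (the 10 Mbps figure is from the post above; the two-hour runtime is just an assumption for illustration):

```python
# Rough storage estimate for a 1080p stream at the quoted VC-1 bitrate.
bitrate_mbps = 10           # assumed average video bitrate, in megabits/s
runtime_s = 2 * 60 * 60     # assumed two-hour movie, in seconds

total_megabits = bitrate_mbps * runtime_s        # 72,000 megabits
total_gigabytes = total_megabits / 8 / 1000      # megabits -> gigabytes

print(f"~{total_gigabytes:.0f} GB")              # ~9 GB for the video stream
```

At around 9 GB for the video alone, such a stream would fit comfortably on either next-gen disc format, which is presumably the point of the VC-1 pitch.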
 
Yup that's the thing.

In Europe we don't have CRT HDTVs (or none worth remembering; I seriously have never seen one in Europe).

All HDTVs are either LCD, plasma or DLP. All of those tend to have 720p as their native res. There are exceptions, obviously, but the vast majority of HDTVs here have 720p as native res.
 
LCD and DLP don't support interlacing, so only *p formats will be native. Thus, flatpanels will be either 720p or 1080p. No 1080i LCDs or DLPs.
 
DemoCoder said:
LCD and DLP don't support interlacing, so only *p formats will be native. Thus, flatpanels will be either 720p or 1080p. No 1080i LCDs or DLPs.

Well, there's always that ALiS thing (or whatever it's called...), the 1024x1024 interlace-only displays... Old, but they were out a while ago... :smile:
 
DemoCoder said:
LCD and DLP don't support interlacing, so only *p formats will be native. Thus, flatpanels will be either 720p or 1080p. No 1080i LCDs or DLPs.
Surely they can buffer the input and output a full frame one field late? 30 frames/s progressive vs 60 fields/s interlaced. That is...

Receive frame 1a
Store frame 1a
Receive frame 1b
Output frame 1a and 1b interleaved
Receive frame 2a
Store frame 2a
Receive frame 2b
Output frame 2a and 2b interleaved...
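The buffering scheme described above amounts to a simple field weave; here's a minimal sketch in Python (field and scanline names are purely illustrative):

```python
def weave(field_a, field_b):
    """Interleave two half-height fields into one full progressive frame.

    field_a holds the even scanlines, field_b the odd scanlines.
    60 interlaced fields/s in -> 30 progressive frames/s out.
    """
    frame = []
    for even_row, odd_row in zip(field_a, field_b):
        frame.append(even_row)   # scanline from the first (stored) field
        frame.append(odd_row)    # scanline from the just-received field
    return frame

# Receive and store field 1a, receive field 1b, then output them interleaved:
field_1a = ["1a-line0", "1a-line2"]
field_1b = ["1b-line1", "1b-line3"]
print(weave(field_1a, field_1b))
# ['1a-line0', '1b-line1', '1a-line2', '1b-line3']
```

This only works cleanly when both fields come from the same original frame; with true 60-fields/s motion, the two fields capture different instants, so a naive weave shows combing artifacts - one reason real de-interlacers are more involved.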
 
DemoCoder said:
Passable? What are you talking about?

"Passable" as in reasonable for what is expected for a clean, sharp HD image. It's by no means pushing the technical envelope for what the "taken for granted" resolution specs suggest, and it is sure as hell not devoid of blemish if you really take some time to look into the picture.

OTA HDTV is superior to DVD,...

In technical resolution, yes. In keeping down compression artifacts, no. Disc formats seem to hold the standard so far, in passing instances of proper encoding. It doesn't matter if it is SD or HD (i.e., being "HD" doesn't magically imbue it with a superior encoding result). I'm sure not having to encode in realtime has a lot to do with it, but there you are. That you fail to make any distinctions or qualifications for this when making your comparison speaks volumes about how attentive you really are in determining PQ.

...and Cable/Satellite HD transmissions.

No argument there. It was cited as such in my post.

OTA HDTV has significantly less artifacting than DVD, DirecTV HD, or Comcast HD. The same applies to OTA SDTV vs SDTV on satellite/cable.

...not so on the DVD comparison. It simply doesn't work out that way.

SD ends up highly variable whether it is OTA, satellite, or cable. What seems to be quite universal is that digital SD in any of its "broadcast" incarnations looks pretty bad, even by optimal analog SD standards. I know that sounds counter-intuitive, but SD has really been abused quite heinously by the digital age. The standard should be where the best examples of analog SD were when its use began to fade. Digital SD has been just a poor, poor, poor facsimile of where analog SD left off (audio follows much the same trends, as well). That's why it is soooo easy to point out how HD looks sooooo much better than the generally available digital SD. How could it not (aside from the obvious resolution advantage)??? Digital SD has been dumbed down so badly these days that it looks just plain bad to begin with - far worse than it should be, given what SD is actually capable of.

Why should this matter, since SD is on the way out anyway? It matters big time, actually. When you see how badly digital SD was compromised in use, you had better be damned concerned that they don't pull the same crap with digital HD. As it is now, there does not seem to be anyone worried. Hence there is no one to call "them" on it when they slowly, subtly taper back quality levels because it suits the bottom dollar somewhere in the chain. Be vigilant, people. Don't think for a second they won't do this as far as they can get away with it, w/o it being obvious how fubarred they have made the feed.

There is a simple reason why. Comcast and DirecTV own hundreds of channels. They tune their compression codecs to permit more channels, instead of a fewer number of higher-quality channels. That's why the DirecTV Spaceway satellites, which are going to offer over 1,000 HD channels, will most likely be crappy quality just like their SDTV channels. Cable and satellite are interested in maximizing the number of channels they have available.

At least you are able to acknowledge this threat. Now imagine all the people who think what they shovel out looks absolutely stellar on their brand new HDTV simply because it ends up looking better than what they had before on average... People just fail to recognize a crap signal as soon as it has an HD label slapped on it.

In contrast, local OTA broadcast stations own a single frequency allotment for their station broadcasts. They are prohibited by law/FCC regulations from subdividing those frequencies for other uses. Thus, if your local NBC/ABC/CBS affiliate is switching to HDTV, they have no incentive not to utilize the complete bandwidth they are allotted, since they can't resell extra bandwidth if they opted for a crappier compression codec.
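To put rough numbers on that trade-off: an ATSC channel carries about 19.39 Mbps of MPEG-2 transport payload, and that fixed budget either goes entirely to one HD feed or gets split among several feeds, with every split lowering the bitrate left for the HD one (the subchannel bitrates below are purely illustrative):

```python
# Fixed transport payload of one 6 MHz ATSC broadcast channel, in Mbps.
ATSC_PAYLOAD_MBPS = 19.39

# A station spending its whole allotment on a single HD feed:
hd_alone_mbps = ATSC_PAYLOAD_MBPS

# The same allotment with two SD subchannels carved out (illustrative bitrates):
sd_subchannels_mbps = [3.0, 3.0]
hd_shared_mbps = ATSC_PAYLOAD_MBPS - sum(sd_subchannels_mbps)

print(f"HD feed alone:       {hd_alone_mbps:.2f} Mbps")
print(f"HD with subchannels: {hd_shared_mbps:.2f} Mbps")
```

The same arithmetic applies to a cable or satellite multiplex, which is why packing in more channels means fewer bits, and more artifacts, per channel.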

Wonderful that they have this check and balance, yet they still only achieve "passable". It tells you something. Hopefully, you are right that the actual transmission is solid, and the occasional flurry of artifacts is simply from young-generation HD camera and broadcast equipment adding their own layer of new-age "noise". Suffice to say there are *a LOT* of links in this chain, and just being "HD" isn't going to magically make the impact of a few bad links disappear. It's more reasonable to expect that the times when everything is working at peak performance, giving you what HD is really spec'd to deliver, will actually be quite rare.

The difference between HD and SD broadcasts is more than some minor sharpness improvement. If you watch HD broadcasts on a daily basis, and then switch to SD, the difference is striking. SD looks shitty. Blurry as hell.

Your test makes no effort to rigidly control SD as a reference (it's like comparing HD to VHS - well, what exactly did you expect? VHS was never the paragon of even SD delivery). As explained earlier, digital SD does look quite bad, perhaps even by design and purpose. It certainly is no measure of the best that SD as a format was capable of (but since no one is delivering "full-out" SD anymore, who cares, right?). I'm not saying it could ever be confused with HD, but if you took a real reference, you would most certainly realize what a short step forward HD is delivering at this moment. You put anything on a big enough screen and it's going to get "blurry". Blurry isn't the issue here. It's already known that blurry is an inherent consequence of dpi. What most people fail to realize or acknowledge are the artifacts and the loss of detail under movement. These are significant and relevant measures of PQ, as well. You don't forgive everything else just because one component in the "larger picture" has made a step forward.

I've been watching the Lost Season 1 DVD on my HDTV to catch up. Tonight, I watched the OTA HDTV broadcast. *Vastly* superior to upconverted 480p DVD. (And please, don't start this ignorant nonsense about upconversion again. The same result can be had by watching on a CRT.)

Once again, if you blow anything up on a big enough screen, it's going to get blurry. Adding more pixels is certainly going to help, but ignoring the other "troubles" that have seemingly come along with it hardly makes the determination of "vastly" meaningful. It looks better in some ways, not so good in others, I'd gather. This is hardly an overture for SD to live on. What this is, is a conscious push for HD that can be all that it can be. What we have now is "passable" in its best state, and downright embarrassing at its worst. Simplistic generalizations of "vast" improvement are just plainly not seeing the forest for the trees.

You make a habit in these forums of being an HDTV denier. I suggest you get your eyes checked.

That you are unable to see the artifacts suggests that it is really you who needs your eyes checked (or at least take the rose-colored glasses off for once). Things are not as simple as labeling somebody a "denier" and using that to establish the genuineness of your own position (it does the complete opposite, actually). You seem to feel comfortable with the belief that HD is all roses, but you don't realize how that has affected your ability to really "see" it critically and objectively. HD has a green light, but that doesn't mean that all lights are green. That's all I'm trying to say. Stamping out awareness of quality standards may serve your ego, but it most certainly will not set the industry on a favorable course. You can bet on that.
 
All I have to say is that I watch baseball, football and basketball in HD. I personally (NO MATTER what anyone says) see a noticeable difference in picture quality. I don't know how you can say that what we're seeing isn't a noticeable difference, or that the tradeoff isn't worth it. I know what I see; my eyes detect something better. The image is much clearer and sharper. Also, you're imposing your opinion... saying that there isn't a BIG difference. Well... to some people there IS a big difference. To me there is, and to DemoCoder there is.

That's all there is to it. The post above was just a superiorly long-winded opinion about HD. I enjoy it, I feel it's worth it, and it's one of my more anticipated features of the 360, PS3 and possibly the Revolution.
 
No one said there is no noticeable difference- read closely, don't scan.

The value of the trade-off is certainly up for debate. I'm of the position that a trade-off was not even a necessity (it should be better *all around*, not just take some/give some), while others are seemingly willing to consider trade-offs.

As for watching sports: l o o k  a t  t h e  g r a s s . . . I know, who cares about grass, right? Rightly, it isn't a necessity, but HD should be capable of presenting the entire image in HD w/o discrimination. That's the whole point of the technology. Don't just marvel at the edges of things. Look at the textures of things. How well do the textures hold up? Are they stable and detailed, or do they seemingly cycle between moments where they gel and then scatter, and vice versa? If you are familiar with JPEG artifacts, then finding them in HD is a no-brainer.
 
randycat99 said:
where they gel and then scatter, and vice versa? If you are familiar with JPEG artifacts, then finding them in HD is a no-brainer.

That's the thing, YOU KNOW TOO MUCH!!! My eyes are not trained to pick those things up. Going forward with my fascination with HD... they probably will be. Until then, what I see now is fine with me. It will probably fade (the fascination with HD), but till then I'm riding that hype train straight to wherever it's going... at a reasonable pace... yeah. But the US botched its HD implementation (I'm talking hardware-wise)... so I wouldn't doubt that the cable and satellite companies want to cash in on this as quickly as possible, without giving the best form of HD.
 
That simply leads us back to my earlier post- most people have little grasp over what constitutes real PQ. All that's needed to get the "pass" is something a bit sharper, a blast of color saturation, and some not so subtle verbal suggestion (you're a blind idiot if you don't see the improvement, right?).

Break it down critically and objectively (but not unfairly), and it becomes more and more apparent that this is not the TKO that most people simply assume it is. I'll reiterate that this is not a bash against all forms of HD. This is hard-hitting criticism of broadcast HD. It's really straddling a fine line right now. An HD disc format will certainly take HD to another level, though it won't necessarily be the complete panacea. HD CGI (from consoles) will be a real high point, as finally we get to take video compression out of the picture altogether (no pun intended).
 
randycat99 said:
That simply leads us back to my earlier post- most people have little grasp over what constitutes real PQ. All that's needed to get the "pass" is something a bit sharper, a blast of color saturation, and some not so subtle verbal suggestion (you're a blind idiot if you don't see the improvement, right?).

Break it down critically and objectively (but not unfairly), and it becomes more and more apparent that this is not the TKO that most people simply assume it is.

It is unsettling (if that's what you're getting at) that HD implementation is being half-assed. What would you rather have done, though? (I admit I kinda scanned over your post; I'll be reading it in the morning, though.) Would you rather have regular SD resolutions with better filters, or HD resolutions that are implemented better?
 