Can I still get 1080i on the PS3 without using HDMI?

!eVo!-X Ant UK said:
Yeah, component can carry 1080p just fine, and as for the receivers, I think they need 100 MHz component inputs. In fact, most of the new ones have them... Denon, Yamaha, Harman Kardon... all have 100 MHz inputs.

Which would be too low for 1080p.

It is entirely possible that people would need to upgrade more than just their display to cope with 1080p.
 
MrWibble said:
Which would be too low for 1080p.

Is it?

I've seen calculations saying 1080p/60 needs roughly 148.5 MHz and others that claim only 76 MHz.

Which is it?
 
Ty said:
Is it?

I've seen calculations saying 1080p/60 needs roughly 148.5 MHz and others that claim only 76 MHz.

Which is it?

I've not seen an estimate that low. I'm no expert on analogue signals, but I'd say your signal path would need to cope with a frequency high enough to encode all your pixels, a bit of slop for the blanking/sync areas, and a reasonable multiplier to avoid aliasing if you're near the upper limit.

1920x1080x60 is over 124,000,000 pixels/s. You can divide by two because your pixels would be encoded on alternating peaks and troughs in the waveform, but you'd really want to then multiply up by maybe 3x to get a reliable result (even at 2x you've blown the 100 MHz limit quite comfortably). That's >186 MHz, and I haven't even factored in the slop...

Really, I'd think you'd want a system capable of maybe 200 MHz for a 1080p signal. Maybe you'd get away with less, but I wouldn't expect the results of limiting to 100 MHz to be very pretty.
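
For what it's worth, here's that same back-of-the-envelope arithmetic written out as a quick Python sketch. The divide-by-two and the 2x-3x safety margin are assumptions carried over from the post above, not figures from any standard:

Code:
# Rough back-of-the-envelope check of the numbers above. The /2
# (pixels on alternating peaks and troughs) and the 2x-3x safety
# margin are the post's assumptions, not a broadcast standard.
active_pixel_rate = 1920 * 1080 * 60          # ~124.4 million pixels/s
nyquist_mhz = active_pixel_rate / 2 / 1e6     # ~62.2 MHz bare minimum

print(f"active pixel rate: {active_pixel_rate / 1e6:.1f} Mpixels/s")
print(f"bare Nyquist limit: {nyquist_mhz:.1f} MHz")
for margin in (2, 3):
    print(f"with {margin}x margin: {nyquist_mhz * margin:.1f} MHz")
# Output is roughly 124.4 Mpixels/s, 62.2 MHz, then 124.4 MHz and
# 186.6 MHz - both margins sail past a 100 MHz input, and blanking
# overhead isn't even included yet.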
 
Chroma compression may ease back on the analog requirements to support a given resolution. Additionally, consider that one of the benefits of analog is that the bandwidth response does not need to follow the hard limits that digital typically does for it to work. Analog response may taper off as it approaches the bandwidth limit. It's not the best signal at that point, but there is still information in the signal that will appear on the screen and that pertains to that bandwidth.

(For the record, I wouldn't recommend buying an "HDTV" that doesn't at the very least support component and DVI inputs at this point in time... not without a very steep discount, at the least, and then only with the understanding that purchasing said unit will only be a transitional move to some better purchase down the road.)
 
randycat99 said:
Chroma compression may ease back on the analog requirements to support a given resolution. Additionally, consider that one of the benefits of analog is that the bandwidth response does not need to follow the hard limits that digital typically does for it to work. Analog response may taper off as it approaches the bandwidth limit. It's not the best signal at that point, but there is still information in the signal that will appear on the screen and that pertains to that bandwidth.

(For the record, I wouldn't recommend buying an "HDTV" that doesn't at the very least support component and DVI inputs at this point in time... not without a very steep discount, at the least, and then only with the understanding that purchasing said unit will only be a transitional move to some better purchase down the road.)

Well, my calculations weren't taking that into account, but luminance doesn't get the luxury of throwing half the pixels away (and really, we shouldn't be tolerating that kind of thing anyway), so it'd still need a good amount of bandwidth.

Analog will degrade a bit more gracefully than digital (though if your digital signal has a clock it ought not to need so much headroom to work correctly), but I'd rather my signal wasn't having to be degraded just to get from A to B on the signal path...

Anyway, my numbers may be off, but they don't seem a million miles away from estimates I've found looking around a few other sites. Whatever the reality, I suspect that 1080p @ 60 is going to push most people's home setups pretty hard.
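
To put a rough number on that luma/chroma split, here's a minimal sketch. It assumes the colour-difference channels only need to carry about half the horizontal detail of luma, which is an illustrative figure rather than a quote from any particular spec:

Code:
# Minimal sketch of the luma-vs-chroma point. Assumes Pb/Pr carry
# roughly half the horizontal detail of Y - an illustrative figure,
# not a number taken from any particular spec.
def channel_bandwidth_mhz(h_active, v_active, fps,
                          detail_fraction=1.0, blanking_overhead=1.1):
    """Crude Nyquist-style bandwidth estimate for one analog channel."""
    pixel_rate = h_active * v_active * fps * blanking_overhead
    return pixel_rate * detail_fraction / 2 / 1e6

luma = channel_bandwidth_mhz(1920, 1080, 60)          # full detail
chroma = channel_bandwidth_mhz(1920, 1080, 60, 0.5)   # ~half detail
print(f"luma  : ~{luma:.0f} MHz")    # ~68 MHz before any safety margin
print(f"chroma: ~{chroma:.0f} MHz")  # ~34 MHz
# Subsampling helps the Pb/Pr channels, but the Y channel still has to
# pass the full figure - which is the objection in the post above.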
 
This is a good question, actually. Sony have no reason to have HDCP on game content.

There's no technical reason 1080i HD content cannot be displayed over component. However, it may be disabled in software due to DRM. AFAIK the same applies to 1080p as well.

I personally wouldn't buy a set today without HDMI, purely because it won't be an easy job to get HD movies to play on it.
 
Make sure it's HDMI 1.3 though, otherwise it will not do 1080p either, so be very careful.

HDMI 1.1 does not cut it.

And Sony were running component direct to the TV for 1080p.


This is from HDTV Supply: http://www.hdtvsupply.com/coca10te1.html

"Premium Component Video Cables that are 1080p Tested. The next generation of HDTV's will have 1080p resolution, nearly twice what we have today...and todays component video cables will not be able to transfer the 70-Mhz signal needed to accomodate this demanding resolution."
 
MrWibble said:
Well, my calculations weren't taking that into account, but luminance doesn't get the luxury of throwing half the pixels away (and really, we shouldn't be tolerating that kind of thing anyway), so it'd still need a good amount of bandwidth.

That's the thing with analog: it is never as cut and dried as "throwing half the pixels away". High-frequency detail attenuates gradually as you approach the limit. It doesn't just cease to exist after a brick-wall frequency point. So the detail is still there - just less strongly distinct as something that successfully covers the desired bandwidth. It's not ideal, to be sure, but it also isn't as pathological as it sounds to some. The pathological case also assumes that the program material fills that entire bandwidth with HF detail, which often it does not. The amount of softening you get from the MPEG layer alone pretty much ensures the program isn't exercising that very last pixel with HF detail. So a myriad of factors conspire to ensure that "very last octave" of bandwidth isn't really doing much for most people. It's great when the program material and the particular program source enable such detail to make it to the consumer, but it's really a small issue when it comes to most people (who own merely small to medium-size screens), most programs, and most sources of program material.
 
I saw 1080i on the weekend. It was an LG 42" LCD with a demo HD setup showing 1080i pictures. It did look very good - similar, I'd say, to my 14" Trinitron in picture quality, only over the entire size of the screen, and in stark contrast to the usual HDTV demo pics of SD sources upscaled and blotchy.

Viewed from a distance I think 1080p output might actually be worthwhile, and I give some kudos to Sony for having it as an option.
 
!eVo!-X Ant UK said:
Yeah, component can carry 1080p just fine, and as for the receivers, I think they need 100 MHz component inputs. In fact, most of the new ones have them... Denon, Yamaha, Harman Kardon... all have 100 MHz inputs.

But HDCP is still an issue. I wouldn't buy a TV without at least two HDMI inputs now, and as there aren't many of those, I'm still waiting.
 
I have a question. . .

Why would you need a 1080p input in the first place? Movies are 24 frames per second and video is 30 frames per second. You should be able to de-interlace the 60 fields per second 1080i signal and end up with 1080p30 for video and 1080p24 for film, in exactly the same way current HDTVs and progressive scan DVD players do now.

The only reason I could see for a 1080p60 input would be for video games that actually manage to run at 60 FPS at 1920x1080 resolution. I don't know how many of those we'll see.

Am I missing something?
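
For anyone wanting to see the field-to-frame bookkeeping being described, here's a toy sketch. It only shows the frame-rate arithmetic - real deinterlacers and inverse-telecine filters obviously do far more than this:

Code:
# Toy sketch of the 1080i-to-progressive frame-rate arithmetic above.
# Real deinterlacing / inverse telecine does far more work than this.
FIELDS_PER_SECOND = 60

# Video-sourced 1080i: weave each top/bottom field pair into one frame.
video_frames = FIELDS_PER_SECOND // 2                      # 30 frames/s

# Film-sourced 1080i: 24 film frames are spread over 60 fields using a
# 3:2 cadence (2 fields, then 3, repeating), so reversing the cadence
# recovers the original 24 frames/s.
fields_per_two_film_frames = 2 + 3
film_frames = FIELDS_PER_SECOND * 2 // fields_per_two_film_frames  # 24

print(f"1080i60, video source -> 1080p{video_frames}")  # 1080p30
print(f"1080i60, film source  -> 1080p{film_frames}")   # 1080p24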
 
It's a good point. Really, the only milestones that need to be covered are 1080p24 (or any interlaced standard that fits in that envelope, as you have surmised) and 1080i60 (and also 1080p30, by extension). 1080p60 is still kind of a pipe dream, the pinnacle of capability that video hardware geeks would like to see (not saying that in a bad way). Absolutely nothing current is planned to feed such a format to Joe Consumer, but there is always the possibility that something might, later on down the line. So therein lies the desire to have that capability ready to go, should the time come - the "uber" format that has yet to show its face, as it seems. Maybe it could be videogame graphics, maybe not. Time will tell...
 
1080p isn't just for movies, but any media - e.g. sports on satellite. At the moment the capturing/broadcast hardware (not sure where the limits are) can manage 720p @ 60 FPS, or 1080i @ 60 FPS. In the not-too-distant future they'll be able to record material at 1080p @ 60 FPS. So just as now you can watch 720p 60 FPS sports on your HD set and then watch HD movies at a lowly 24 fps, in the near future you'll be able to watch 1080p material at 60 fps and then watch your lowly 24 FPS HD movies (unless they finally up the specs for movies, which looks possible with the shift to digital).
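
To put rough numbers on that jump (active pixels only, ignoring blanking and compression - purely an illustration):

Code:
# Raw active-pixel rates for the formats mentioned above, to show the
# step up that 1080p60 represents (blanking and compression ignored).
formats = {
    "720p60":  (1280, 720, 60),
    "1080i60": (1920, 1080, 30),   # 60 fields/s = 30 full frames/s
    "1080p60": (1920, 1080, 60),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} Mpixels/s")
# ~55.3, ~62.2 and ~124.4 Mpixels/s - 1080p60 roughly doubles the raw
# data rate of anything being broadcast today.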
 
1080p 61" Samsung HDTV?

I'm not getting this TV (wish I could), but what do you guys think of the specs?

http://reviews.cnet.com/Samsung_HL_R6168W/4507-6484_7-31335744.html?tag=sub

Edit: Gah... I just read the review of its bigger brother (they're supposed to be identical, just different in size) and the 1080p aspect can ONLY be used through the VGA input on the TV (basically for PC use). Of course, reading through this thread you would realize that the HDMI ports wouldn't be able to resolve the 2 million+ pixels of 1080p :(. Getting closer though.
 