MS's "secret weapon" against the PS3 (Arstechnica)

Actually, what Shifty is getting at is the fact that 1080i displays don't resolve anywhere near 1920 horizontal rows of resolution. Sony's SuperFine tubes were advertised as displaying about 1400 rows and the standard 1080i tubes average a mere 800 rows, but those are just average values as phosphors are not confined to fixed pixels and hence the effective resolution varies with their intensity.
Finally, someone gets it! I wasn't talking about the specifications of 1080i versus 720p, but the actual 1080i displays people are using. These 1080i sets that people are complaining about aren't the latest 1920x1080 native resolution sets. Those sets are brand new. The current 1080i sets that are causing Sony a headache are not displaying true 1080i images. They're displaying downscaled 1080i images to fit the image width of the display. Kyleb's suggesting an average sort of 800x1080 resolution. So in the argument of whether 720p is better than 1080i or not, bear in mind that 1080i is not 1920x1080 resolution because no 1080i set has 1920 horizontal lines. Any TV that has 1920 horizontal lines supports 1080p! Thus, when folks were sold those HD sets, there's an argument that they were diddled as those sets aren't capable of displaying 1280x720 pixels or greater, where 1280x720 is the smallest resolution of an HD image.

Or illustrating it another way, if there's a CRT with 1080i vertical lines and 300 horizontal lines in a 16:9 aspect ratio, and a 1920x1080 image is displayed across it, is that an HD display? Or would it look like poo? 1080 interlaced vertical lines isn't enough to constitute a real-world HD format. You need a suitably high horizontal resolution too. Sticking an HD label on such a TV would be diddling the consumer who bought it. Then if that TV also can't display the HD image standards from a console, some might say 'well, it's hardly a real HD set is it, and you were a chump to have bought it.'

As I said before, it's all a mess of standards and salesmanship and balls-upped support for the mess of consumer devices that Sony did help contribute to.
 
The vast majority of existing HDTV CRTs, especially cheaper ones, can in no way display 1920x1080i. They will *accept* a 1080i signal, but they cannot resolve 1920 pixels per scanline. The best one I've seen can reach 1400, most are <1000.

Frankly, I'm sick of everything having a scaler. I think scalers should be in one of two places: the display device or an external specialist scaler.

Why?

Because I am paying multiple times over for scalers, and in some cases they cause headaches when chained together, requiring me to fiddle with multiple configurations.

For example: Upscaling DVD -> Upscaling A/V Receiver -> iScan scaler -> Display Scaler

Now, the key thing to remember is that scalers don't just scale; most also try to do video processing as well: noise filters, color correction, et al., and they are turned on by default. You need to go to each device in the chain and disable video processing, else you get "Cinema Mode" or "Dynamic Mode" applied multiple times. Hell, on my AV receiver, I can't just select "Normal Mode" because even that does video processing; instead, I need to either totally disable the video processor, which is sometimes an issue in itself, or create a custom config and move all the slider bars for correction processing to zero.

Worse, some devices don't allow you to disable video processing at all. The most infamous example is Samsung's PDP, which has DNIe turned on permanently.

But perhaps the biggest issue is having a 2-year-old toddler who likes to press buttons, which can lead you on an adventure to find out which scaler he borked.

I'd rather have unscaled, unprocessed raw data sent through my N devices, and let either the display or an external scaler do it. The external scaler often will be very flexible, and the built-in display scaler will often have additional video processing tuned for the native display. For example, the XBOX360 doesn't let me choose 1366x768 (my projector's panel resolution), so I can end up with double scaling if the XB360 isn't generating a true 720p frame.
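As a rough illustration of the double-scaling point (just a sketch using nearest-neighbour resampling; real scalers filter, but the principle is the same): resampling a scanline once picks different source pixels than resampling it twice through an intermediate resolution, which is part of why a chained scale never looks as clean as a single pass.

```python
# Sketch: direct scale vs. chained scale of one 1280-pixel scanline.
# Nearest-neighbour only; the point is that each extra pass re-decides
# which source samples survive, so a 1280 -> 1920 -> 1366 chain ends up
# different from (and generally worse than) a single 1280 -> 1366 scale.

def resample(line, new_width):
    """Nearest-neighbour resample of a 1-D list of pixel values."""
    old_width = len(line)
    return [line[x * old_width // new_width] for x in range(new_width)]

source = list(range(1280))                          # one 720p-width scanline

direct  = resample(source, 1366)                    # console outputs 720p, display scales once
chained = resample(resample(source, 1920), 1366)    # console scales to 1080, display scales again

mismatches = sum(1 for a, b in zip(direct, chained) if a != b)
print(f"{mismatches} of 1366 output pixels come from a different source pixel")
```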

Lots of PDPs sold on the cheap have weird resolutions like 1024x1024, 1024x768, etc., which implies scaling on the display no matter what.

Maybe my setup is atypical, but I'm kinda sick of a bazillion embedded Faroudjas in my house that have to be paid for, but switched off.
 
Have I had it completely backwards all this time?

kyleb said:
Actually, what Shifty is getting at is the fact that 1080i displays don't resolve anywhere near 1920 horizontal rows of resolution. Sony's SuperFine tubes were advertised as displaying about 1400 rows and the standard 1080i tubes average a mere 800 rows, but those are just average values as phosphors are not confined to fixed pixels and hence the effective resolution varies with their intensity.

TV HD resolution specifications refer to the horizontal rows, right?

A 1080i display may not display anywhere near 1920 vertical columns / vertical lines of resolution. A 1080i display should by definition display 1080 horizontal rows of resolution.

The quote above and Shifty's response just made my head spin a little. :)
 
Maybe my setup is atypical, but I'm kinda sick of a bazillion embedded Faroudjas in my house that have to be paid for, but switched off.
Exactly. I've never understood the mess of digital. You have fixed images, sent as exact data, to a display, which then displays them pixel for pixel. All this scaling is an unnecessary evil, except for the old PITA that is backwards compatibility with poor initial standards. If HD hadn't been released until displays were 720p native, there'd be no need for scalers in 90% of situations anyway. Then offer a scaling device for 1080 sources to shrink to those 720p sets, while the new 1080-native sets include a scaler to expand 720p signals. That's all you'd need. Unfortunately CE companies have a great knack for screwing up even the simplest of ideas. It's like having a dozen video and audio codecs all doing the same job.

To be honest, I'd rather all new technology formats had to be passed by an international committee. There would be no HD until there was one standard for HD. There would be one or two codecs. Everything would be simple and lovely, and with everyone's life made easier, there'd be less war and more time to spend on farming that'd eliminate famine.
 
Have I had it completely backwards all this time?
TV HD resolution specifications refer to the horizontal rows, right?

A 1080i display may not display anywhere near 1920 vertical columns / vertical lines of resolution. A 1080i display should by definition display 1080 horizontal rows of resolution.

The quote above and Shifty's response just made my head spin a little. :)
You're right. Kyleb used rows where he shouldn't. I used lines (at least I intended to!) rather than rows and columns, 'coz it irrationally sounds odd to me talking about columns on a display :p
 
Sounds like in an attempt to make things easier, the companies have made things more complicated and inefficient.
 
Slightly OT, but can anybody tell me how my set compares as far as 1080i support goes?

Sanyo HT32744 32" 4:3 HDTV CRT

I know it's not a great HDTV, but it was the best for what I could afford at the time. I can definitely tell differences between 480i, 480p, 720p, and 1080i on OTA transmissions and my Xbox 360.

Tommy McClain
 
Finally, someone gets it! I wasn't talking about the specifications of 1080i versus 720p, but the actual 1080i displays people are using. These 1080i sets that people are complaining about aren't the latest 1920x1080 native resolution sets. Those sets are brand new. The current 1080i sets that are causing Sony a headache are not displaying true 1080i images. They're displaying downscaled 1080i images to fit the image width of the display. Kyleb's suggesting an average sort of 800x1080 resolution. So in the argument of whether 720p is better than 1080i or not, bear in mind that 1080i is not 1920x1080 resolution because no 1080i set has 1920 horizontal lines. Any TV that has 1920 horizontal lines supports 1080p! Thus, when folks were sold those HD sets, there's an argument that they were diddled as those sets aren't capable of displaying 1280x720 pixels or greater, where 1280x720 is the smallest resolution of an HD image.

OK, I see what you are saying. 800 is a bit low, though. I'd say 900-1000 are more typical. Anyone with a lower-quality CRT device would have likely found some reason to replace it by now. Still, it's a valid point. But, I have a counterpoint.

Extra detail is useless unless you are capable of perceiving it. Check this chart: http://www.carltonbale.com/wp-content/uploads/resolution_chart.png

Replace the vertical resolution numbers with horizontal ones (this is perfectly valid as your ability to perceive horizontal resolution is equal to your ability to perceive vertical resolution). See how much more visible the difference between 1080 vertical lines vs 768 (typical FP resolution) would be at normal screen sizes and normal viewing distances than that between 900 or 1000 horizontal lines vs 1366?

In a nutshell, you are much more likely to hit the limit of your ability to resolve the horizontal resolution of a widescreen image than the vertical resolution. Therefore, the vertical resolution is much more important. This is a concept I was aware of, but never fully understood the why of until now.
 
That isn't what I was saying at all, they do display 1080 line frames, by means of alternating two 540 line fields. There is a huge difference between that and 540p, <...>
Holy ostrich!
You said 1080i sends 540 line fields that can be merged into a 1080 line frame, which is true, to which I replied that the TVs we're talking about can't resolve* those frames, because they can't. They can resolve* 540 line fields. They can jitter them as expected for interlaced video because they can. "That's HD" you said, implying a connection where there is none.

And before anyone dares to return with mention of deinterlacing and frame reconstruction: the TVs we're talking about can't do it.
kyleb said:
<...>just as on an SDTV you will see a huge difference between 480i and something downsampled to 240p.
All SDTVs that can still function today can resolve* 480 lines. Your analogy is irrelevant.

*"resolve" as in "resolution"; meaning the ability to display images in a manner that allows one to discern source pixels, given high enough contrast. Somehow smearing together many source pixels into one colorful blob isn't exactly what "resolution" is about and flies in the face of all the extra expense we go to in transmitting these soon-to-be-blobbed-together pixels separately. If this drags on any longer I might even produce some test images for field experiments; you'd just need a way to hook them up to one of 'em 1080i-only TVs.
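For the curious, a minimal sketch of the kind of test image in question (purely illustrative: alternating one-pixel black/white columns written out as a plain binary PGM). On a set that genuinely resolves 1920 columns you'd see fine vertical stripes; on one that can't, they smear into flat grey.

```python
# Rough sketch of a horizontal-resolution test image: 1920x1080,
# alternating one-pixel black/white columns, saved as a binary PGM.
# Hypothetical filename; any PGM-capable viewer or player can open it.

WIDTH, HEIGHT = 1920, 1080

with open("hres_test.pgm", "wb") as f:
    f.write(f"P5 {WIDTH} {HEIGHT} 255\n".encode("ascii"))   # PGM header
    row = bytes(255 if x % 2 else 0 for x in range(WIDTH))   # one striped scanline
    f.write(row * HEIGHT)                                    # repeat for every line
```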
 
Extra detail is useless unless you are capable of perceiving it. Check this chart: http://www.carltonbale.com/wp-content/uploads/resolution_chart.png
First of all, who made this chart? It certainly is at odds with published data about the eye's ability to distinguish line or spot pairs as distinct. 0.35 arc-minutes is the pixel spacing needed to reach the eye's limit.

Replace the vertical resolution numbers with horizontal ones (this is perfectly valid as your ability to perceive horizontal resolution is equal to your ability to perceive vertical resolution). See how much more visible the difference between 1080 vertical lines vs 768 (typical FP resolution) would be at normal screen sizes and normal viewing distances than that between 900 or 1000 horizontal lines vs 1366?

In a nutshell, you are much more likely to hit the limit of your ability to resolve the horizontal resolution of a widescreen image than the vertical resolution. Therefore, the vertical resolution is much more important. This is a concept I was aware of, but never fully understood the why of until now.
Umm, what?

HD resolutions have square pixels on 16:9 screens. There's no difference for the eye between horizontal and vertical. That chart is labelled using the standard screen resolutions and diagonal screen size. For example, it claims that at 10 feet, a 50 inch screen is needed to fully see the benefit of 720p. Whether vertical or horizontal, that's 0.034 inches between pixels (or 1 arc-minute at 10 feet).

EDIT: I just found out 20/20 vision corresponds to 1 arc-minute features, so maybe that chart isn't as bad as I thought.
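For anyone who wants to check the chart's numbers themselves, the arithmetic is simple enough to sketch (the values below are just the 50-inch / 10-foot / 720p example from above):

```python
import math

def pixel_pitch_arcmin(diagonal_in, horizontal_pixels, distance_in, aspect=16/9):
    """Angle subtended by one pixel, in arc-minutes (square pixels assumed)."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width from the diagonal
    pitch_in = width_in / horizontal_pixels                   # same pitch vertically, since pixels are square
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

# The example from the post: 50" 16:9 screen, 720p (1280 pixels across), viewed from 10 feet.
print(round(pixel_pitch_arcmin(50, 1280, 120), 2))   # ~0.98 arc-minutes, i.e. roughly the 1' limit of 20/20 vision
```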
 
Have I had it completely backwards all this time?

TV HD resolution specifications refer to the horizontal rows, right?

A 1080i display may not display anywhere near 1920 vertical columns / vertical lines of resolution. A 1080i display should by definition display 1080 horizontal rows of resolution.

The quote above and Shifty's response just made my head spin a little. :)
I'm sorry, I think of the vertical resolution as lines of resolution and the horizontal as columns or rows, but I did phrase that poorly.

Holy ostrich!
You said 1080i sends 540 line fields that can be merged into a 1080 line frame, which is true, to which I replied that the TVs we're talking about can't resolve* those frames, because they can't. They can resolve* 540 line fields. They can jitter them as expected for interlaced video because they can. "That's HD" you said, implying a connection where there is none.

And before anyone dares to return with mention of deinterlacing and frame reconstruction: the TVs we're talking about can't do it.
All SDTVs that can still function today can resolve* 480 lines. Your analogy is irrelevant.

*"resolve" as in "resolution"; meaning the ability to display images in a manner that allows one to discern source pixels, given high enough contrast. Somehow smearing together many source pixels into one colorful blob isn't exactly what "resolution" is about and flies in the face of all the extra expense we go to in transmitting these soon-to-be-blobbed-together pixels separately. If this drags on any longer I might even produce some test images for field experiments; you'd just need a way to hook them up to one of 'em 1080i-only TVs.
It's not irrelevant by any means; 1080i TVs display 1080 line frames the same way 480i TVs display 480 line frames, as two interlaced fields.
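To put the mechanics concretely (a minimal sketch, not any particular TV's circuitry): the two fields carry alternate scanlines, and weaving them back together reconstructs the full-height frame.

```python
# Tiny sketch of interlacing: a 1080-line frame is transmitted as two
# 540-line fields (alternate scanlines); weaving them back together
# recovers the full frame with the line order intact.

frame = [f"line {i}" for i in range(1080)]

top_field    = frame[0::2]      # 540 lines: 0, 2, 4, ...
bottom_field = frame[1::2]      # 540 lines: 1, 3, 5, ...

woven = [None] * 1080
woven[0::2] = top_field
woven[1::2] = bottom_field

assert woven == frame           # weaving the fields recovers the original frame
```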
 
zeckensack said:
It really should just be part of TRC
I agree, but given the current turn of events, I don't expect it to happen. PS2 was in almost exactly the same situation 6 years ago, and that was never handled with any mandated standards either.

It was my understanding that most of the "scaling" on PS2 wasn't actual scaling but just slowing down the RAMDAC scanout to make lines with fewer pixels.
There were variations, but commonly front and back buffers were of different dimensions (and even color depths), hence the buffer flip also performed a scale with the GS. Exact parameters are too many to mention (and really, there was no standard of any kind).
The CRTC would sometimes be used to do a horizontal resize like you described (e.g. 512 -> 640, or 640 -> 1920 for GT4), but even that depended on the phases of the moon and whether programmers had PMS during development of a particular title, and it also only worked on interlaced outputs.

hence the issues with PS2 games playing on PS3; they are scaled now, unlike before, but with primitive point-sampling. All wrong?
Right, just that there's an actual error in the horizontal scaling of PS2 titles on PS3 - point-sampling would still look "correct", albeit a bit more jaggy and much less blurry than your typical HDTV scaler.
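For reference, a point-sampled horizontal stretch like the CRTC cases above (e.g. GT4's 640 -> 1920) is about the simplest resize there is; a minimal sketch, just to show why it stays sharp and jaggy rather than blurry:

```python
# Sketch of a point-sampled (nearest-neighbour) horizontal stretch.
# A 640 -> 1920 resize just repeats every source pixel three times, so
# hard edges stay hard; a filtering scaler would blend neighbours and blur them.

def point_sample_stretch(scanline, out_width):
    in_width = len(scanline)
    return [scanline[x * in_width // out_width] for x in range(out_width)]

edge = [0, 0, 255, 255]                       # a hard edge in a 4-pixel source line
print(point_sample_stretch(edge, 12))         # [0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 255, 255]
```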
 
going back to the original subject....

http://forums.xbox-scene.com/index.php?showtopic=577032

Taken from the other thread. It seems that the Zephyr 360 has a new scaler chip named "Hana" :eek:

No, it's no retail Xbox360, it's a prototype board ... but Microsoft had to design a new scaler chip named 'HANA' (replacing the (analog-only?) 'ANA' chip found in current retail and dev Xboxes), I doubt they'll do that effort for nothing. You will also notice several other changes to the motherboard.

Microsoft always refused to answer any question about a possible HDMI-cable for the Xbox360 (even when asked for explicitly), but this HANA chip probably confirms the current 360 can only output an analog signal, so an HDMI-cable for the current 360 probably won't be possible (sure they could design a cable/box that reconverts the analog output to digital, but that makes no sense and is not the point).

Any thoughts?
 
http://forums.xbox-scene.com/index.php?showtopic=577032

Taken from the other thread. It seems that the Zephyr 360 has a new scaler chip named "Hana" :eek:
<...> Microsoft always refused to answer any question about a possible HDMI-cable for the Xbox360 (even when asked for explicitly), but this HANA chip probably confirms the current 360 can only output an analog signal, so an HDMI-cable for the current 360 probably won't be possible <...>
Any thoughts?
Yes. I like it when reality reaffirms common sense.
 
HD resolutions have square pixels on 16:9 screens. There's no difference for the eye between horizontal and vertical. That chart is labelled using the standard screen resolutions and diagonal screen size. For example, it claims that at 10 feet, a 50 inch screen is needed to fully see the benefit of 720p. Whether vertical or horizontal, that's 0.034 inches between pixels (or 1 arc-minute at 10 feet).

EDIT: I just found out 20/20 vision corresponds to 1 arc-minute features, so maybe that chart isn't as bad as I thought.

The chart is in line with other published representations I've seen. You're right, though. Because the screen is wider than it is tall, its lines are spread over a wider area. So you should be able to perceive the additional resolution of the horizontal component of a widescreen image over the vertical. Guess I didn't understand as well as I thought I did. :oops:

There definitely is a tendency in video circles to prioritize the vertical resolution (leading to anamorphic images). I haven't seen any explanation as to why, though.
 
There definitely is a tendency in video circles to prioritize the vertical resolution (leading to anamorphic images). I haven't seen any explanation as to why, though.
I think it just has to do with the history of how video signals are transmitted. Analog cable TV and component outputs transmit signals in scanlines, and CRTs display them that way also. Those lines are clearly separated in the signal. Horizontal resolution depends on the quality of both the signal and the analog electronics.

I think even today many HD broadcasts won't give as much horizontal resolution. Maybe it has to do with recording methods, as all DV variants are interlaced-scanline based AFAIK.
 