MS's "secret weapon" against the PS3 (Arstechnica)

It'd probably be better from a PR perspective if Sony didn't also sell a bunch of the TVs in question. It's not really relevant, but it is kind of funny in a way.
 
What gave you that idea?
If those TVs could resolve 720 lines, doncha think they'd accept it as an input resolution?
1080i is 540 lines plus per-frame jitter, which is a trivial retrofit on tubes designed for widescreen PAL.

It's a TV. It's widescreen. But it's not an HDTV by any stretch. If a salesman tells you so, smack him.
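To make the field/frame mechanics concrete, here's a minimal Python sketch (names and numbers are illustrative only) of how two 540-line fields between them cover the 1080 scanline positions of a frame, one field at a time:

Code:
LINES_PER_FRAME = 1080
LINES_PER_FIELD = LINES_PER_FRAME // 2  # 540

def field_lines(field):
    """Scanline positions lit during one field (0 = even lines, 1 = odd lines)."""
    return range(field, LINES_PER_FRAME, 2)

even = list(field_lines(0))  # lines 0, 2, ..., 1078 -- first field
odd = list(field_lines(1))   # lines 1, 3, ..., 1079 -- drawn 1/60s later

assert len(even) == len(odd) == LINES_PER_FIELD
assert sorted(even + odd) == list(range(LINES_PER_FRAME))  # full frame over two fields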
 
Indeed I wasn't - I had no idea what I wrote could be misinterpreted so badly by anyone. I'll invoke my not-being-a-native-English-speaker excuse and all that :p


Well maybe visual improvements aren't really that noticeable, and performance tends to be better at native SDTV. Or maybe it's just a cunning conspiracy to force people to upgrade to HDTV. :p

My apologies for misunderstanding then! :oops:
 
Well to be clear, aren't the standards a bit...'iffy'? I mean, by this standard, which I understand is official, GT4 did true HD rendering. I doubt you'll find many people who'll claim interlaced 640x540 fields constitute real HD rendering though. GT4 is invariably referred to as a 'hack'. Likewise there's no official 640x1080 HD resolution. The resolutions that make up real HD are 1280x720 and 1920x1080. Any TV that's rendering lower than those resolutions (the lowest applicable HD res being 1280x720) isn't rendering an HD resolution, no? I mean, would 320x1080 be classed as an HD image?

Given a host of peculiar-resolution displays like 1024x1024 panels scaling HD resolutions to fit, the official definition of 1080 lines is the only one that can be applied to displays. And thus a 640x1080 interlaced CRT counts as an HD set. But from a user's POV, anything that can't resolve at least 1280x720 pixels isn't truly an HD display. I mean, if I were to take a 1920x1080 image and shrink it to fit on a 640x480 TV, is that TV HD, because it's rendering an HD image shrunk to fit? No. Then how so is a 1024x1024 panel shrinking images to fit?

It's a muddle of definitions and standards and I can see both POVs. From an official standpoint, I can see how Sony can say reasonably fairly that sets which can't display 720p aren't a real concern as they're not real HD sets, but likewise as they have sold such sets to people, it's bad form not to support them.
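For what it's worth, the raw pixel counts make the point on their own. A quick back-of-the-envelope calc in Python (the 640-wide mode below is the disputed GT4-style framebuffer, not any official format):

Code:
# Pixels per full frame for the resolutions discussed above.
modes = {
    "1280x720 (720p)":  1280 * 720,   # 921,600 px
    "1920x1080 (1080)": 1920 * 1080,  # 2,073,600 px
    "640x1080":         640 * 1080,   # 691,200 px -- fewer than 720p
    "320x1080":         320 * 1080,   # 345,600 px
}
for name, px in sorted(modes.items(), key=lambda kv: -kv[1]):
    print(f"{name:18} {px:>9,} pixels")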
 
there was some marketing involved... like folks saying 720p and 1080i weren't "true" HD. Some tried to dither and modify their statement to say "full HD." HD runs the gamut from 720p on the "low end" (because some people do not consider interlace field techniques to be a valid mode for HD consideration) to 1080p.

Beyond that, determining HD is a joint proposition of having a valid HD source and a valid HD display. If the two don't meet, you may not be getting your money's worth....
 
Well to be clear, aren't the standards a bit...'iffy'? I mean, by this standard, which I understand is official, GT4 did true HD rendering. I doubt you'll find many people who'll claim interlaced 640x540 fields constitute real HD rendering though. GT4 is invariably referred to as a 'hack'. Likewise there's no official 640x1080 HD resolution. The resolutions that make up real HD are 1280x720 and 1920x1080. Any TV that's rendering lower than those resolutions (the lowest applicable HD res being 1280x720) isn't rendering an HD resolution, no? I mean, would 320x1080 be classed as an HD image?

I particularly enjoy your GT4 argument :D It seems to me some people forgot their stance on the matter from the previous generation ;)
 
If those TVs could resolve 720 lines, doncha think they'd accept it as an input resolution?
1080i is 540 lines plus per-frame jitter, which is a trivial retrofit on tubes designed for widescreen PAL.

It's a TV. It's widescreen. But it's not an HDTV by any stretch. If a salesman tells you so, smack him.

I apologize for the one-liner response previously. I didn't really have time to elaborate at the time, but couldn't let that post go by without comment.

Let's go over the applications for an HDTV and see how a 1080i-only set stacks up to a 720p-only set, shall we?

Displaying HDTV broadcasts from OTA/Cable/Satellite:

3 networks I am aware of broadcast in 720p: ABC, Fox, and ESPN. There may be some others, but the great majority are 1080i. Therefore, the 720p set will be down-rezzing 1080 to 720 more often than the 1080i set will be losing temporal resolution going from 720p (60fps) to 1080i (30fps). Advantage 1080i set.

Displaying Video Game/PC-generated content:

This will nearly always favor the 720p set, since this content by nature benefits from the higher refresh rate. There may be edge cases where the increased resolution is more of a benefit, but this is an Advantage 720p.

HD-Disc playback:

This is almost exclusively going to be film-sourced material, and that is 24fps. That means that the 30fps of 1080i is enough to resolve all of the motion in an HD movie. The 720p set, however, cannot resolve all 1080 lines of resolution of the movie. Therefore Advantage 1080i.

Standard-def DVD playback is a wash: see above, but subtract the need for 1080 lines of resolution and therefore the disadvantage for 720p.

That's why your statement was incorrect.
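To put rough numbers on the broadcast and film cases above, here's a quick sketch. The throughput comparison and the 3:2 pulldown cadence are just standard arithmetic, not figures from the post itself:

Code:
# Spatial x temporal throughput of the two formats:
px_per_sec_720p60  = 1280 * 720 * 60    # 55,296,000 px/s
px_per_sec_1080i30 = 1920 * 1080 * 30   # 62,208,000 px/s (60 fields = 30 frames)

# Film is 24fps; a 60-field 1080i signal carries it via 3:2 pulldown --
# each film frame is held for alternately 3 fields and 2 fields:
fields = sum(3 if i % 2 == 0 else 2 for i in range(24))
assert fields == 60  # all 24 film frames fit per second, so no motion is lost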
 
GT4 officially supported HD output, but all the same the rendering resolution was below that of any HD standard. Confusion comes from the fact that we have separate HD broadcast standards, HD signal standards, and HD display standards; each for their own reasons, but people often fail to understand the distinctions between them. GT4 was less HD than a PAL DVD on an "HD upconverting" DVD player outputting 1080i, but those 1080i outputs meet the HD signal standards all the same.
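As a toy illustration of that split, consider a sub-HD framebuffer being carried in a standards-compliant 1080i signal. The 640x540 field size is the GT4-style figure discussed in this thread, used purely as an example:

Code:
render_w, render_h = 640, 540    # assumed per-field rendering resolution
signal_w, signal_h = 1920, 540   # active size of one field in a 1080i signal

scale = signal_w / render_w      # 3.0 -- each rendered pixel spans 3 signal pixels
print(f"horizontal upscale factor: {scale:.1f}x")

# The output meets the 1080i *signal* standard either way; the standard
# constrains line counts and timing, not how much detail the source
# actually rendered -- which is the distinction made above.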
 
Displaying HDTV broadcasts from OTA/Cable/Satellite:

3 networks I am aware of broadcast in 720p: ABC, Fox, and ESPN. There may be some others, but the great majority are 1080i. Therefore, the 720p set will be down-rezzing 1080 to 720 more often than the 1080i set will be losing temporal resolution going from 720p (60fps) to 1080i (30fps). Advantage 1080i set.
Forgive me if I'm missing something here, but AFAIK 1080p native-resolution sets only came on the market all of a year ago, at crazy prices and with technical support for all HD resolutions. Any TV older than that which accepts 1080i doesn't display anything like 1080i resolution, so I don't see how they're any better in the resolution stakes than a 720p set displaying native 720p content or downsampling 1080i content.

HD-Disc playback:

This is almost exclusively going to be film-sourced material, and that is 24fps. That means that the 30fps of 1080i is enough to resolve all of the motion in an HD movie. The 720p set, however, cannot resolve all 1080 lines of resolution of the movie. Therefore Advantage 1080i.
Same as above. Unless your set displays higher than 720p resolutions, the fact you can read a 1080i source over a 720p source is neither here nor there.
Standard-def DVD playback is a wash: see above, but subtract the need for 1080 lines of resolution and therefore the disadvantage for 720p.
Now this I can agree with, as the vertical resolution is simply doubled, so no sampling artefacts are expected.

In summary, I don't think there's a clear-cut superior, 'more HD', format based on installed tech. Most 1080i displays have a lower native resolution than the 720p displays, so they display fewer pixels showing 1080i content than 720p sets do showing 720p content. Right?
 
In summary, I don't think there's a clear-cut superior, 'more HD', format based on installed tech. Most 1080i displays have a lower native resolution than the 720p displays, so they display fewer pixels showing 1080i content than 720p sets do showing 720p content. Right?

Not quite. 720p has a higher temporal resolution than 1080i. That means if your source material runs at greater than 30fps you lose some of the motion of the scene if it is displayed on the 1080i set. But 1080i is still really 1080 lines of resolution.
 
They are 540 lines per field, 1080 lines per frame, and that is HD.
Bingo.
A 1080i-only TV can't display frames with 1080 lines, not anywhere near it in fact, because if it could, it could display 720p, which it by definition can't.
 
Not quite. 720p has a higher temporal resolution than 1080i. That means if your source material runs at greater than 30fps you lose some of the motion of the scene if it is displayed on the 1080i set. But 1080i is still really 1080 lines of resolution.
I'm talking about the displayed image of 1080i-only TV sets, which as I understand it isn't anywhere close to 1920x1080 in resolution. What sort of actual display resolutions do these common, cheap 1080i CRTs and the like have?
 
I'm talking about the displayed image of 1080i-only TV sets, which as I understand it isn't anywhere close to 1920x1080 in resolution. What sort of actual display resolutions do these common, cheap 1080i CRTs and the like have?

The important point to understand is that there is both spatial and temporal resolution (resolution over time). 1080i sets do display a 1920x1080 spatial resolution, but only show half the image at a time, so they have a lower temporal resolution. Your eye still sees a full 1920x1080 image, or whatever the TV actually displays (over 90% with my CRT).
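Putting quick numbers on that spatial/temporal trade-off (illustrative arithmetic only):

Code:
FIELDS_PER_SEC = 60
LINES_PER_FIELD = 540

frames_per_sec_1080i = FIELDS_PER_SEC // 2              # 30 complete 1080-line frames
lines_per_sec_1080i = FIELDS_PER_SEC * LINES_PER_FIELD  # 32,400 lines drawn per second
lines_per_sec_720p  = 60 * 720                          # 43,200 lines drawn per second

# 1080i: more lines per frame (spatial), fewer complete frames per second
# (temporal). 720p is the reverse.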
 
No matter how you personally define "High Definition", you can't change the fact that there exists a subset of TVs that are limited to 480p on the PS3 but are capable of displaying at higher definitions using the Xbox 360.
 
Bingo.
A 1080i-only TV can't display frames with 1080 lines, not anywhere near it in fact, because if it could, it could display 720p, which it by definition can't.
That isn't what I was saying at all: they do display 1080-line frames, by means of alternating two 540-line fields. There is a huge difference between that and 540p, just as on an SDTV you will see a huge difference between 480i and something downsampled to 240p.
The important point to understand is that there is both spatial and temporal resolution (resolution over time). 1080i sets do display a 1920x1080 spatial resolution, but only show half the image at a time, so they have a lower temporal resolution. Your eye still sees a full 1920x1080 image, or whatever the TV actually displays (over 90% with my CRT).
Actually, what Shifty is getting at is the fact that 1080i displays don't resolve anywhere near 1920 pixels of horizontal resolution. Sony's SuperFine tubes were advertised as resolving about 1400 lines and the standard 1080i tubes average a mere 800, but those are just average values, as phosphors are not confined to fixed pixels and hence the effective resolution varies with their intensity.
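Translating those quoted figures into fractions of the full 1920-pixel signal is simple arithmetic:

Code:
SIGNAL_H_RES = 1920  # horizontal pixels in the 1080i signal
for name, resolved in [("SuperFine tube (advertised)", 1400),
                       ("typical 1080i tube (average)", 800)]:
    print(f"{name}: {resolved}/{SIGNAL_H_RES} = {resolved / SIGNAL_H_RES:.0%}")
# -> roughly 73% and 42% of the signal's horizontal detail actually resolved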
 