Connected 360 via component on an SDTV

mrboo

Regular
As I don't own an HDTV :( I have to play my 360 on an SDTV, but the games are still jaggy. I thought you get supersampling when viewing the 360 in SDTV mode?

Am I doing something wrong?

Help would be appreciated.
 
The only difference between playing a 360 game on an SDTV vs an HDTV will be the resolution. At a higher resolution jaggies will be less noticeable, but they are still there.
The only benefit to running a 360 through an SDTV with component video cables (as opposed to composite or S-video) is better colour separation. You might notice a slight improvement in IQ due to a cleaner picture, better blacks and more vibrant colours.
 
ninzel said:
The only difference between playing a 360 game on an SDTV vs an HDTV will be the resolution. At a higher resolution jaggies will be less noticeable, but they are still there.
This is silly though, and against what we've been led to expect. All games are supposed to be rendered at HD resolutions and downscaled for SDTV. This should give better sampling at a cost of fidelity, and result in fewer jaggies at lower resolution. Take a 720p screenshot and downscale it via both point sampling and a bilinear resize, and you'll see the downsampled image gains a degree of AA.

[Attached image: GOW_PR.jpg]

[Attached image: GOW_DS.jpg]

Downsampling should improve texture and edge antialiasing. It won't get rid of aliasing altogether, though. It depends on what sort of aliasing mrboo is seeing as to whether he's expecting too much from downsampling or whether the games aren't being resized. If they're being rendered at 640x480, your XB360 is working perhaps as little as a third as hard on the graphics front as in HD, which is a shocking waste of resources.
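
If anyone wants to try that comparison themselves, here's a rough Python/Pillow sketch (the filename and the straight 640x480 target are just placeholders, and a proper SD resize would also handle the aspect ratio):

Code:
# Rough sketch: compare a point-sampled vs a filtered downscale of a 720p shot.
# "gow_720p.png" is a placeholder name; any 1280x720 screenshot will do.
from PIL import Image

src = Image.open("gow_720p.png")                # 1280x720 source
sd_size = (640, 480)                            # SD target (aspect ratio ignored here)

point = src.resize(sd_size, Image.NEAREST)      # point sampling: the jaggies carry over
filtered = src.resize(sd_size, Image.BILINEAR)  # filtered resize: neighbouring pixels
                                                # get averaged, so edges pick up a
                                                # degree of AA

point.save("gow_sd_point.png")
filtered.save("gow_sd_filtered.png")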
 
Shifty Geezer said:
This is silly though, and against what we've been led to expect. All games are supposed to be rendered at HD resolutions and downscaled for SDTV. This should give better sampling at a cost of fidelity, and result in fewer jaggies at lower resolution. Take a 720p screenshot and downscale it via both point sampling and a bilinear resize, and you'll see the downsampled image gains a degree of AA.

Downsampling should improve texture and edge antialiasing. It won't get rid of aliasing altogether, though. It depends on what sort of aliasing mrboo is seeing as to whether he's expecting too much from downsampling or whether the games aren't being resized. If they're being rendered at 640x480, your XB360 is working perhaps as little as a third as hard on the graphics front as in HD, which is a shocking waste of resources.


You're right. I keep thinking in PC terms when changing resolutions. With these new HD consoles it's really rendered at one res and then converted, correct?
Edit: So does that mean that if the game is done at 720p, and I play it on a TV that displays 1080i, the AA will actually appear worse?
 
Shifty Geezer said:
If they're being rendered at 640x480, your XB360 is working perhaps as little as a third as hard on the graphics front as in HD, which is a shocking waste of resources.
Well, that is apparently what happens. Anything with a 480i/p output target set in the dashboard will have a 640x480 framebuffer. Several games have exhibited performance improvements when played in SD resolutions compared to running in HD.

There are other motivations to reduce the framebuffer too, like with HUD visibility, which would be somewhat impaired when the text/overlay is downsampled.
 
Mmmkay said:
Well, that is apparently what happens. Anything with a 480i/p output target set in the dashboard will have a 640x480 framebuffer. Several games have exhibited performance improvements when played in SD resolutions compared to running in HD.

There are other motivations to reduce the framebuffer too, like with HUD visibility, which would be somewhat impaired when the text/overlay is downsampled.


This is something I argued about with people when they said HD would be better for everyone, even non-HDTV owners. I argued that if all the resources could be used to make a game at 480p, the graphics would actually be better than making a game for 720p and then converting it for SDTV, but HD advocates would hear nothing of it.
 
mrboo said:
I've just bought Kameo and the jaggies are horrible, I really gotta get an HDTV ASAP.
Buy a VGA cable if you have a decent monitor. You will be impressed.
 
ninzel said:
Edit: So does that mean that if the game is done at 720p, and I play it on a TV that displays 1080i, the AA will actually appear worse?
Depends on the scaler. A fixed-pixel resize on a fixed-pixel display will look kinda rough, but you should be getting interpolation. The result will be somewhere between the same jaggies and slightly fewer, but with the whole screen having less clarity (assuming a 1080-native display, and that the 1080i support isn't on a 720p panel).
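
For a rough feel for the two cases, here's a quick Python/Pillow sketch of a 720p frame going to a 1080-line panel (the filename is a placeholder, and a real TV scaler will use fancier filtering than plain bilinear):

Code:
# Rough sketch of a 720p frame being scaled to a 1080-line panel.
# "gow_720p.png" is a placeholder 1280x720 screenshot.
from PIL import Image

src = Image.open("gow_720p.png")
hd_size = (1920, 1080)

# A crude fixed-pixel resize just repeats pixels, so the jaggies scale up with them.
rough = src.resize(hd_size, Image.NEAREST)

# An interpolating scaler blends neighbours: edges stay about the same or look
# slightly smoother, but the whole picture loses a little clarity.
soft = src.resize(hd_size, Image.BILINEAR)

rough.save("gow_1080_point.png")
soft.save("gow_1080_interp.png")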
 
Mmmkay said:
Well, that is apparently what happens. Anything with a 480i/p output target set in the dashboard will have a 640x480 framebuffer. Several games have exhibited performance improvements when played in SD resolutions compared to running in HD.
Well, if it stays that way, PS3 (and maybe even Wii) could look a lot better on SD sets, certainly enough to make it an influence on the purchasing decision. If you had a choice between NFL with current-gen-style jaggies and shimmer, or NFL with a minimal extra 2x AA, which machine would you prefer it on? Though with shops showing off HD sets, if there is a quality difference on SD sets I guess punters wouldn't know it.
Mmmkay said:
There are other motivations to reduce the framebuffer too, like with HUD visibility, which would be somewhat impaired when the text/overlay is downsampled.
Potentially, depending on how the HUD is laid out. But if you use a HUD designed for SDTV and work that upwards for HDTV it shouldn't be an issue, or render the HUD differently for HD and SD resolutions but render the rest of the game at 720p and downscale.
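
Something like this, roughly, using Pillow again just to illustrate the idea (the filenames and the 640x480 HUD are made-up placeholders): downscale only the 3D scene, then composite a HUD authored at the SD resolution on top so the text stays crisp.

Code:
# Rough illustration of "scale the scene, not the HUD".
# scene_720p.png is the rendered game frame at 1280x720; hud_640x480.png is a
# HUD overlay authored directly at 640x480 with an alpha channel (both placeholders).
from PIL import Image

scene_720p = Image.open("scene_720p.png")
hud_sd = Image.open("hud_640x480.png").convert("RGBA")

# Downscale only the 3D scene; the filtering is what gives the edge/texture AA benefit.
scene_sd = scene_720p.resize((640, 480), Image.BILINEAR).convert("RGBA")

# Composite the native-resolution HUD on top, so its text isn't blurred by the resize.
final = Image.alpha_composite(scene_sd, hud_sd)
final.convert("RGB").save("frame_sd.png")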
 