1080i support for XBox360. How?

Shogmaster

Regular
I wonder how MS will make the devs support 1080i in addition to 720p. Will devs be required to code the game to output both 720p and 1080i, or will MS let devs just code for 720p and have some kind of scaler chip inside the X360 output 1080i from 720p?

I ask this because I assume there's a huge framerate difference between coding for progressive and for interlaced. Progressive output games can have fluctuating framerates, while interlaced games will have to be locked to either 15, 30, or 60 fps. No fluctuating framerates. Right?

Any thoughts people?
 
Without knowing too much... they'll "support" it in the same way they supported 1080i in the XBox1. It can do it, for sure, but it may not - it'll be entirely up to the devs.

Regarding framerate, it's not really an issue, since it's a closed box - devs will not make a game 1080i if it runs like crap, since it's common sense. Again, the same as there were 1080i games this gen, but mostly they were the less visually pleasing of the lot, since games that struggled at 480p just couldn't get the same framerate at 1080i.

I'm guessing this is the same way Sony are supporting all HD modes, 480p, 720p, 1080i and 1080p - I say this because they haven't confirmed any res supported as a MINIMUM level (to my knowledge), meaning it's up to the devs.
 
PARANOiA said:
Without knowing too much... they'll "support" it in the same way they supported 1080i in the XBox1. It can do it, for sure, but it may not - it'll be entirely up to the devs.

MS said 720p and 1080i are STANDARD OUTPUT for X360 games.

Regarding framerate, it's not really an issue, since it's a closed box - devs will not make a game 1080i if it runs like crap, since it's common sense. Again, the same as there were 1080i games this gen, but mostly they were the less visually pleasing of the lot, since games that struggled at 480p just couldn't get the same framerate at 1080i.

I don't think you get what I'm getting at. IIRC, optimising a game for progressive output versus interlaced makes a big difference from a framerate point of view, because with interlaced only, you have to lock it down to either 30 or 60 (25 or 50 for PAL). If you start out with a progressive scan game, you can have a fluctuating framerate without a care, and the interlaced version could just throw out even or odd scanlines.
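
To illustrate what I mean about throwing out scanlines, here's a toy sketch (Python, purely illustrative, obviously not how a real console pipeline is written):

Code:
# A progressive render feeding an interlaced output by keeping
# alternate scanlines per field.

def fields_from_progressive(frames):
    """frames: iterable of progressive frames, each a list of scanlines.
    Yields interlaced fields: even lines one field, odd lines the next."""
    for n, frame in enumerate(frames):
        parity = n % 2  # alternate even/odd lines every field
        yield [row for i, row in enumerate(frame) if i % 2 == parity]

# A 60fps progressive render becomes 60 fields/sec this way, without
# the renderer itself ever drawing "interlaced".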
 
MS said 720p and 1080i are STANDARD OUTPUT for X360 games.

That's news to me... I was under the impression that only 720p was supported in all games. Can you back your quote with a source?

because with interlaced only, you have to lock it down to either 30 or 60 (25 or 50 for PAL).

So why do games on current systems (and PS1/Saturn) suffer framerate issues from time to time? Not saying you're wrong, but if you're right, again it's news to me.
 
The xbox 360 video output chip has a scaler built into it.

I assume that if a developer wants, they can support 1080p natively and let the scaler bump it down to 720p or 1080i, or support 720p natively and let the scaler bump it up to 1080i.

I doubt anyone is going to render in an interlaced mode natively.
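
For illustration, here's a toy sketch (Python, nearest-neighbour only, and the function name is mine; a real scaler chip filters far better than this) of what bumping a 720p frame up to one 1080i field conceptually involves:

Code:
def scale_720p_to_1080i_field(frame, parity=0):
    """frame: 720 rows of 1280 samples. Returns one 540-line, 1920-wide field."""
    in_h, in_w = len(frame), len(frame[0])         # 720, 1280
    field = []
    for y in range(540):
        src_y = ((2 * y + parity) * in_h) // 1080  # field line -> source line
        row = frame[src_y]
        field.append([row[(x * in_w) // 1920] for x in range(1920)])
    return field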
 
PARANOiA said:
MS said 720p and 1080i are STANDARD OUTPUT for X360 games.

That's news to me... I was under the impression that only 720p was supported in all games. Can you back your quote with a source?

"All games supported at 16:9, 720p, and 1080i, anti-aliasing"

http://www.xbox.com/en-US/xbox360/factsheet.htm

PARANOiA said:
because with interlaced only, you have to lock it down to either 30 or 60 (25 or 50 for PAL).

So why do games on current systems (and PS1/Saturn) suffer framerate issues from time to time? Not saying you're wrong, but if you're right, again it's news to me.

My guess is that the games with framerate fluctuations were processed internally as progressive before outputting interlaced.

aaaaa00 said:
The xbox 360 video output chip has a scaler built into it.

I assume that if a developer wants, they can support 1080p natively and let the scaler bump it down to 720p or 1080i, or support 720p natively and let the scaler bump it up to 1080i.

I doubt anyone is going to render in an interlaced mode natively.

Thank you sir. :)
 
Looking at Dave's article http://www.beyond3d.com/articles/xenos/index.php?p=05

1080i with 2x FSAA will fit nicely into 2 tiles. It requires 15.8 megs.

So I can see 720p 4x FSAA and 1080i 2x FSAA.

1080i would require 4 tiles vs the 3 tiles of 720p 4x. They are saying around a 5% hit for 4x at 720p, so I'm guessing another tile would add around another 5%, bringing it to maybe a 10% hit for 2x FSAA at 1080i? I don't see many devs doing that.
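
Rough version of that math (Python), assuming 4 bytes colour + 4 bytes Z per AA sample and a 10 MiB eDRAM tile; that's my reading of the article, not an official formula:

Code:
from math import ceil

EDRAM = 10 * 2**20  # 10 MiB per tile pass

for label, w, h, aa in [("720p 4x", 1280, 720, 4),
                        ("1080i field 2x", 1920, 540, 2),
                        ("1080 frame 2x", 1920, 1080, 2)]:
    size = w * h * (4 + 4) * aa  # colour + Z per sample
    print(label, round(size / 2**20, 1), "MiB ->", ceil(size / EDRAM), "tiles")

# 720p 4x        28.1 MiB -> 3 tiles
# 1080i field 2x 15.8 MiB -> 2 tiles
# 1080 frame 2x  31.6 MiB -> 4 tiles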
 
According to the TCRs, devs only need to target 720p. The outboard display chip is responsible for scaling the image. It can scale up to 1080i or down to 480i/p. Microsoft made this decision in order to support ALL HDTVs without requiring specific developer support for each display mode, as was the case with XBox 1. Many older HD displays are limited to supporting only 1080i or only 720p.

Because it does its scaling in the analog domain, this display chip is incapable of digital outputs like DVI/HDMI or supporting 1080p. It was designed by Microsoft's WebTV group. Right now it is limited to composite, s-video, component, and VGA. Since it is separate from Xenos, the chip can be re-engineered later to support digital out, but that could create consumer confusion as to which XBox units do what.
 
Rockster said:
Devs only need to target 720p. The outboard display chip is responsible for scaling the image, and it does so in the analog domain. This is the reason that the 360 is incapable of digital outputs like DVI/HDMI. Right now it is limited to composite, s-video, component, and VGA. Since it is separate from Xenos, the chip can be re-engineered later to support digital out, but that could create consumer confusion as to which units do what.

Very interesting. It does simplify things for devs, but won't it kinda suck for 1080i-only CRT HDTVs? Wouldn't the scaling from 720p to 1080i ruin the image quality?

Maybe I'll get a 46"+ 720p DLP RP, but blacks suck on anything other than CRTs for dark games like Doom 3, so I'm probably just going to stick to my 19" CRT monitor and play X360 games @ 720p with letterboxing on it.
 
Shogmaster said:
Very interesting. It does simplify things for devs, but won't it kinda suck for 1080i-only CRT HDTVs? Wouldn't the scaling from 720p to 1080i ruin the image quality?

If you watch any HDTV program on ABC, it is broadcast in 720p. In my experience, they look just fine on a 1080i TV.

If you watch any HDTV program on FOX or CBS, it is broadcast in 1080i. In my experience, they look just fine on a 720p TV.

I think you will have to be a real nitpicker to complain about 720p <-> 1080i conversions.

(As for 1080p, realistically, the difference between 720p/1080i and 1080p will be smaller to the average person than the difference between 480i and 720p/1080i.)
 
Shogmaster said:
I don't think you get what I'm getting at. IIRC, optimising a game for progressive output versus interlaced makes a big difference from a framerate point of view, because with interlaced only, you have to lock it down to either 30 or 60 (25 or 50 for PAL). If you start out with a progressive scan game, you can have a fluctuating framerate without a care, and the interlaced version could just throw out even or odd scanlines.

Logically it wouldn't make any sense to lock down to 30 interlaced instead of doing progressive scan. Progressive would simply yield much higher framerates with less effort.
Typically, expect the lowest framerate in a shooter to be half the average. You don't want to be cut off at a low of 30 fps.
 
Shogmaster said:
IIRC, optimising a game for progressive output versus interlaced makes a big difference from a framerate point of view, because with interlaced only, you have to lock it down to either 30 or 60 (25 or 50 for PAL). If you start out with a progressive scan game, you can have a fluctuating framerate without a care, and the interlaced version could just throw out even or odd scanlines.


Well, that is the first time I've ever heard that one.

By framerate issues/fluctuating mess, do you mean tearing? Tearing happens with both interlaced and progressive images.

Whether the output is P or I, you will have framerate issues if the hardware can't handle it. If vsync is enabled, both P and I will have to sync to either 30 or 60. If vsync is disabled, both P and I will have tearing if the framerate fluctuates.
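
A quick illustration (Python) of why vsync quantises the framerate like that: a frame that misses the ~16.7 ms refresh window waits for the next vblank, so effective rates snap to 60, 30, 20, 15... fps instead of varying freely.

Code:
from math import ceil

def vsynced_fps(render_ms, refresh_hz=60):
    period_ms = 1000.0 / refresh_hz
    return refresh_hz / ceil(render_ms / period_ms)

print(vsynced_fps(16.0))  # 60.0
print(vsynced_fps(17.0))  # 30.0 -- just missing one refresh halves the rate
print(vsynced_fps(34.0))  # 20.0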
 
VSync should always be enabled IMO. The Amiga had fantastically smooth, crisp images because of this. I hate tearing and dropped frames. Devs should aim for a fixed-framerate, synchronised update.
 
The tearing issue was solved millennia ago with triple buffering. Why some devs still use double buffering, heaven knows.
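
Conceptually it works something like this sketch (Python, illustrative only): the renderer always has a spare buffer to draw into, and the display only flips to the newest completed frame at vblank, so you get no tearing and no stall.

Code:
class TripleBuffer:
    def __init__(self):
        self.front = 0      # buffer being scanned out to the display
        self.back = 1       # buffer the renderer is drawing into
        self.ready = None   # newest completed frame awaiting vblank

    def frame_done(self):
        # Renderer finished a frame; with double buffering it would now
        # have to stall until vblank (or tear). Here it just keeps going.
        if self.ready is None:
            self.ready, self.back = self.back, 3 - self.front - self.back
        else:
            self.ready, self.back = self.back, self.ready  # drop stale frame

    def vblank(self):
        # Display refresh: flip only between whole frames, never mid-scan.
        if self.ready is not None:
            self.front, self.ready = self.ready, None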
 
I put it down to laziness. Anyway, fixed frames are bad in this age of 3d games. The hardware will be idling more often, and load-balancing levels costs extra time and effort.
 
I put it down to laziness. Anyway, fixed frames are bad in this age of 3d games. The hardware will be idling more often, and load-balancing levels costs extra time and effort.
If by "fixed frames," you're referring to vsync, then that's just stupid. There is not now, nor has there ever been, nor will there ever be a 3d console game that doesn't vsync. TVs are made to scan at specific rates no matter what people might prefer. Hell, I'd like it if my TV refreshed at 85 Hz... won't happen within my lifetime.

Your little dream of "load balancing" levels so that they will always naturally run at a constant framerate is nonsense... Maybe in a game where the camera angles are always fixed, you're always looking at the same level of complexity in everything (render geometry, collision geometry), there are always the exact same number of NPCs and exactly identical levels of activity at any given moment... and everybody playing the game is limited to performing the exact same actions... yeah, then I can see it happening.

Load balancing so that you never drop *below* playable framerates is fairly normal, though. The only thing that prevents that is deadlines. There's no studio in the world that wouldn't do it given the time... for that matter, it's a requirement for the 360 (I think they say a test run from start to finish can never drop below 30 fps (or 25 for PAL, I guess?) at any point).
 
Maybe I'll get a 46"+ 720p DLP RP, but blacks suck on anything other than CRTs for dark games like Doom 3, so I'm probably just going to stick to my 19" CRT monitor and play X360 games @ 720p with letterboxing on it.


Black levels on a DLP aren't too bad. Brightness is fantastic, as is contrast.

Letterboxing on a 19" screen?

Seems a bit small to me.
 