Question about 1080i/p and Xenos

Hardknock

Veteran
Are 1080i games rendered @ 1080p on Xenos and then converted to 1080i with the scaler chip?

The reason I ask is that DOA4 is 1080i/60fps, and I'm thinking that with the upcoming fall update this could be the first native 1080p 360 game?
 
Is DOA4 actually 1080i, or are you just reading what's on the box? Usually, the output chip interlaces the signal so the TV understands it.

Typically, games aren't rendered interlaced.
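Conceptually the interlacing step is simple: take a progressive framebuffer and send alternate scanlines as two fields. A toy C++ sketch of the idea (buffer layout and names are invented, not the actual 360 output hardware):

```cpp
#include <cstdio>
#include <cstdint>
#include <vector>

// Toy model of a progressive 1920x1080 framebuffer. Purely illustrative;
// not the real Xenos / output-chip data path.
struct Frame {
    int width = 1920;
    int height = 1080;
    std::vector<uint32_t> pixels;                       // row-major, height*width
    Frame() : pixels(static_cast<size_t>(width) * height, 0) {}
};

// An interlaced output just sends every other scanline per 60Hz tick:
// field 0 = even lines, field 1 = odd lines (1920x540 each).
std::vector<uint32_t> extractField(const Frame& f, int field) {
    std::vector<uint32_t> out;
    out.reserve(static_cast<size_t>(f.width) * (f.height / 2));
    for (int y = field; y < f.height; y += 2)
        out.insert(out.end(),
                   f.pixels.begin() + static_cast<size_t>(y) * f.width,
                   f.pixels.begin() + static_cast<size_t>(y + 1) * f.width);
    return out;
}

int main() {
    Frame frame;                                 // pretend the game rendered this
    auto evenField = extractField(frame, 0);     // sent on one tick
    auto oddField  = extractField(frame, 1);     // sent on the next
    std::printf("field sizes: %zu and %zu pixels\n",
                evenField.size(), oddField.size());
    return 0;
}
```

If it works that way, the rendering cost is the same either way; only what gets sent over the cable each tick changes.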
 
Yep, looking at it on my HDTV I'm pretty sure it's 1080i. Also Gamespot mentioned it in one of their previews:

http://www.gamespot.com/features/6132968/p-3.html

DOA4'S visuals are, in a word, outstanding. While the high visual quality of the previous games has left DOA4 a big pair of shoes to fill, the game is well on its way to doing just that. The work-in-progress version we saw was running at 1080i and needs to be seen on an HDTV to be believed. The environments featured an amazing level of detail that just popped off the screen thanks to that insanely high resolution. Liberal use of bump mapping, lighting, and particle effects are outstanding and bring the environments to impressive life. The shiniest Ultimate environment, the crazy nightclub that was awash in lighting and particles, pales in comparison to what we saw. However, in many ways you can see counterparts to those stages taken to the next level in DOA4.
 
Personally I wish game developers would render at 720p and use the extra power for visual effects, rather than rendering at 1080i/p.
 
Personally I wish game developers would render at 720p and use the extra power for visual effects, rather than rendering at 1080i/p.

Choice is a terrible thing...

Why on earth have we ended up with two and a half (i.e. 720P, 1080i, 1080P) incompatible HDTV standards anyway - surely both the TV and games industries would be tons better off if there was a single standard. I'm certainly not buying an HDTV set until one resolution emerges as the winner - I'm buggered if I'm spending eight hundred quid only to view everything through a crappy scaling filter. :devilish: Grrrrrr!
 
Choice is a terrible thing...

Why on earth have we ended up with two and a half (i.e. 720P, 1080i, 1080P) incompatible HDTV standards anyway - surely both the TV and games industries would be tons better off if there was a single standard. I'm certainly not buying an HDTV set until one resolution emerges as the winner - I'm buggered if I'm spending eight hundred quid only to view everything through a crappy scaling filter. :devilish: Grrrrrr!


Do you drink coffee out of your beer mug? Do you sip champagne from your coffee mug? It's all about selecting the right cup for the right job.
 
Yep, looking at it on my HDTV I'm pretty sure it's 1080i. Also Gamespot mentioned it in one of their previews:

http://www.gamespot.com/features/6132968/p-3.html

Fair enough. If you search for ERP's, Faf's, Deano's, and DeanA's posts, you'll find some explanations of field rendering, and perhaps why it sees very limited use. (Recall: Kung Fu Chaos)

edit: ah... decided to search myself after all. :p


In fact you would likely render to the full 1080-line framebuffer, since the deinterlacing circuit will use both fields.

You could technically render in fields (like early PS2 games did for 480i), but at that point you would have to maintain 60fps. Dropping frames would produce ugly artifacts.

Yup.. Kung-Fu Chaos used interlaced rendering. So, that was 640x240 for NTSC, or 640x288 for PAL. The reason for it was straightforward, in that it halved our fillrate cost. And we had lots of effects/particles that ate fillrate for breakfast. It also halved the amount of memory we needed for the display buffers. Of course, it meant that if you ever dropped out of 60Hz (50Hz PAL), the screen would look like crap, as you'd end up with the current field being duplicated (so it'd look like the resolution had halved for the period where you were out of a frame). But then I believe it was in our contract with MS that the game ran in a frame, so no real problem. Ahh.. so many memories.. :/

I'm still pretty convinced that the reason Kung-Fu Chaos isn't on the X360 backwards compatibility list is because we used this.

For PS3 I'd say that the rendering cost of 1080 is likely to be identical, regardless of whether the display mode is 1080i or 1080p. I'm guessing that you'd always need to render the full 1080 lines... because you wouldn't be able to guarantee that someone wouldn't add some cost that pushed your frame time over the 60Hz budget by bringing up an OS component of some sort. Kinda like the way framerate is affected on the X360 if you hit the X360 guide button during a game... you'll notice it gets choppy while it has that blade stuff on screen (for video anyway.. most games go into pause mode). So, I think the days of rendering-half-your-buffer hacks like we used in Kung-Fu Chaos are long gone.

Dean


Good thread here with a fair bit of discussion and more links, via Vysez.
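For anyone curious, here's roughly what the field-rendering trick Dean describes looks like, as a toy C++ sketch (all names invented, no real console API, and definitely not the actual Kung-Fu Chaos code):

```cpp
#include <cstdio>

// Toy sketch of field rendering a la Kung-Fu Chaos (NTSC: 640x240 per field).
// Each 60Hz tick renders only half the lines, alternating even/odd fields,
// which halves fillrate and display-buffer memory.
constexpr int kFieldWidth  = 640;
constexpr int kFieldHeight = 240;

void renderField(int parity) {
    // A real implementation would offset the projection by half a scanline
    // so the even and odd fields interleave correctly on the display.
    std::printf("rendered %s field (%dx%d)\n",
                parity ? "odd" : "even", kFieldWidth, kFieldHeight);
}

int main() {
    int parity = 0;
    for (int vsync = 0; vsync < 6; ++vsync) {    // pretend 6 ticks at 60Hz
        bool hitFrameRate = (vsync != 3);        // simulate one missed frame
        if (hitFrameRate)
            renderField(parity);
        else
            // Miss a frame and the previous field gets shown again, so the
            // image looks like its vertical resolution just halved.
            std::printf("missed 60Hz: previous field repeated\n");
        parity ^= 1;                             // alternate fields every tick
    }
    return 0;
}
```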
 
wasn't that MotoGP3? I haven't kept up with that AVSforums thread...

edit: found it...having a look through it at the moment.

Interesting on MotoGP. There was a bit of bickering about the GPU on that title (unstable framerate, etc.), yet if it is 1280x1024 that is roughly 42% more pixels than 720p, which tends to indicate the framerate issues were not GPU-related in general.

Anyhow, I have heard about a couple other games at 1280x1024 (like DoA4). A while back we had some guys test their VGA cables, and a number of games departed from the norm in how they displayed on CRTs (PGR3, DoA4, I believe MotoGP, a couple others...), which may be an indication of them using a "non-720p" framebuffer. It would be interesting to see if there is a correlation.

Anyhow, I am disappointed that some games output at 480p are also rendered at 480p internally :devilish: Why not render at 720p and downscale??? Increased AA + better texture fidelity!
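The render-high-then-downscale idea is basically just supersampling; here's a toy box-filter downscale in C++ purely to show the principle (a real scaler would use a nicer filter, and the buffer layout here is made up):

```cpp
#include <algorithm>
#include <vector>

// Toy box-filter downscale of a grayscale buffer, e.g. 1280x720 -> 640x480.
// Each output pixel averages the source pixels it covers, so edges get
// antialiased and texture detail survives better than rendering at 480p
// directly. Illustrative only; not any real console scaler.
std::vector<float> downscale(const std::vector<float>& src, int sw, int sh,
                             int dw, int dh) {
    std::vector<float> dst(static_cast<size_t>(dw) * dh, 0.0f);
    for (int y = 0; y < dh; ++y) {
        int y0 = y * sh / dh;
        int y1 = std::max(y0 + 1, (y + 1) * sh / dh);
        for (int x = 0; x < dw; ++x) {
            int x0 = x * sw / dw;
            int x1 = std::max(x0 + 1, (x + 1) * sw / dw);
            float sum = 0.0f;
            for (int sy = y0; sy < y1; ++sy)
                for (int sx = x0; sx < x1; ++sx)
                    sum += src[static_cast<size_t>(sy) * sw + sx];
            dst[static_cast<size_t>(y) * dw + x] =
                sum / float((y1 - y0) * (x1 - x0));
        }
    }
    return dst;
}
```

Usage would be something like downscale(frame, 1280, 720, 640, 480). The catch, of course, is that you still pay the full 720p rendering cost to produce a 480p image.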
 
Do you drink coffee out of your beer mug? Do you sip champagne from your coffee mug? It's all about selecting the right cup for the right job.

Yes, but coffee growers and vineyards don't get to make decisions about what cup I use. Also, I don't pay hundreds of pounds for the cup or fifty quid for the coffee (or even champagne).
 
Anyhow, I am disappointed that some games output at 480p are also rendered at 480p internally :devilish: Why not render at 720p and downscale??? Increased AA + better texture fidelity!


Hm... just curious, are these games mostly single-player? And are there significant framerate disparities?
 
Hm... just curious, are these games mostly single-player? And are there significant framerate disparities?

I don't think it is limited to single-player games, and there are framerate discrepancies, mostly in framerate stability. I believe GRAW is an example.
 
I'm pretty sure DOA4 is rendered at 1280x1024.

It looks like this is incorrect. This post sounds somewhat 'insiderish'. It seems whatever resolution(s) are listed on the back of the game box are what the 360 renders at internally (that's why Dead Rising has such a huge problem with text: it renders only at 720p and lets the scaler do the rest).

http://www.avsforum.com/avs-vb/showthread.php?p=8527384&&#post8527384

In your post about capturing, you said you were using the VGA cable and selecting 1280x1024. I wouldn't expect DOA (and other multi-res games) to allocate a 1920x1080 framebuffer in that case. Unless you actually pick 1920x1080, the game would likely pick a framebuffer size less than or equal to the desired output resolution. If you look on the back of the DOA4 box, it lists the formats it supports. Now the 360 will output anything from 640x480i to 1920x1080i (soon to be p), but those numbers on the back of the box represent the framebuffer sizes the GAME presents to the Xbox for display. Some only say "720P" while others will say "480P", "720P", and "1080i". So if you had a game that only said "720P" on the back of the box, it will still work fine with a display that only accepts 480i or 1080i signals, it just means there is scaling done by the 360 that is external to the game code.
So why don't games just pick one resolution and internally render at that? Lots of reasons. You may have seen some magazine reviews complaining that on non-HD sets, some of the text in certain 360 games is hard to read. Invariably, it is because a game does only render at 720P (the one resolution that a 360 game must support natively) and relies on the scaler to take care of the rest. Text and HUD elements are items you want to, if possible, place and render uniquely at a given resolution. Slight scaling up or down on text looks bad as it can become either blurry (when scaling up) or unreadable. HUD elements that line the edge of the screen are in particular danger on NTSC displays (480i) of being in the overscan region. You also will want to do different HUD arrangements for 4:3 displays than for 16:9 displays. Needless to say, UI work on games is tedious and unfortunately invisible if done properly. When was the last time you saw a magazine review say "Man, that onscreen text is the da' bomb!"? It is only when it doesn't look perfect (like illegible text on a non-HD display) that people notice the UI at all.
Some other poster had said Bethesda replied to his query that Oblivion supported 1920x1080. Oblivion is likely more CPU bound (for things like AI and physics simulations) than GPU bound, so the bump in resolution probably doesn't affect their framerate. Note that Oblivion on the PC is one of the more scalable games out there. It is able to run on relatively "ghetto" graphics cards. To max out Oblivion on a PC you need both a fast graphics board (to do HDR and anti-aliasing) and a very fast CPU.
Now, just because a game presents the XBox OS with a framebuffer of a given size (like 1920x1080), that doesn't mean it is rendering absolutely everything at that resolution. As some people have noted, the PGR3 guys admitted they were rendering to an intermediate buffer that was smaller than 1280x720 and rendering that to the final framebuffer. That may seem like cheating, but it is very common for games that have lots of rendering effects applied to the 3D scene. Things like motion blur, bloom lighting, HDR, toon shading, etc. require an intermediate buffer of the raw 3D scene. Because the individual raw 3D pixels get so processed anyway, rendering the intermediate buffer at a higher resolution before the post-effects doesn't buy them a better-looking final result, or at least not one good enough to justify the memory hit.
Now one could be cynical and say it is possible that all those games that list 1080i on the back are really rendering at 720P and then, within the game, scaling that to a 1920x1080 framebuffer, but that would be silly. The OS would do that for them automatically and invisibly to game code (less complex code == good). They'd just be wasting framebuffer memory and GPU time. One could be cynical and say MS is lying and those games don't really support that resolution natively. That is a nice theory, but if MS was going to be dishonest, why wouldn't they put it on the back of every 360 game box? Many games like Kameo and PGR3 only say "720P". Why would MS allow such a deception for third-party games like Call of Duty 2 and Oblivion, but be honest about first-party games? And wouldn't it be at least consistent for a given publisher? If so, then why do only some EA games say they support "720P" while others include "1080i"? Hmm, maybe the number means something specific to the game itself...
Anyway, I think this has gotten way off course of the original thread. Stick a fork in it.
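To make the intermediate-buffer trick from that post concrete, here's a hypothetical C++ sketch (stubs only; none of these types or calls are a real console API): the raw 3D scene goes into a sub-720p target, and the post-process pass that already touches every pixel writes into the full-size framebuffer the game presents to the OS.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical sketch of rendering the scene to a smaller intermediate
// buffer and letting post-processing fill the full-size framebuffer.
// All stubs; not a real console API.
struct RenderTarget {
    int width, height;
    std::vector<float> pixels;
    RenderTarget(int w, int h)
        : width(w), height(h), pixels(static_cast<size_t>(w) * h, 0.0f) {}
};

void renderScene(RenderTarget& rt) {                    // geometry pass (stub)
    std::printf("raw 3D scene rendered at %dx%d\n", rt.width, rt.height);
}

void postProcess(const RenderTarget& in, RenderTarget& out) {
    // Bloom / motion blur / tone mapping already resample every pixel,
    // so the upscale to the output size rides along in this pass (stub).
    std::printf("post-processed %dx%d into %dx%d framebuffer\n",
                in.width, in.height, out.width, out.height);
}

int main() {
    RenderTarget sceneBuffer(1024, 600);   // reduced-res scene target (the thread
                                           // mentions ~1024x600 for PGR3)
    RenderTarget frameBuffer(1280, 720);   // what the game hands to the OS
    renderScene(sceneBuffer);
    postProcess(sceneBuffer, frameBuffer);
    return 0;
}
```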
 
some guy Hardknock found on a forum said:
As some people have noted, the PGR3 guys admitted they were rendering to an intermediate buffer that was smaller than 1280x720 and rendering that to the final framebuffer. That may seem like cheating, but it is very common for games that have lots of rendering effects applied to the 3D scene. Things like motion blur, bloom lighting, HDR, toon shading, etc. require an intermediate buffer of the raw 3D scene. Because the individual raw 3D pixels get so processed anyway, rendering the intermediate buffer at a higher resolution before the post-effects doesn't buy them a better-looking final result, or at least not one good enough to justify the memory hit.

oh, so that 'may seem like cheating, but is very common'... i see.

that was the most shameless spin on a simple, well-known fact: a launch title like PGR3 did not have a full-sized framebuffer because they could not take the tiling hit. and not for 'the effects', but for their bread'n'butter scene content. and even after Bizarre stepped up and admitted to the cheat, there comes some 'faithful soul' to explain how things were actually not like that.

holy fanboyism, Batman.
 
Interesting on MotoGP. There was a bit of bickering about the GPU on that title (unstable framerate, etc.), yet if it is 1280x1024 that is roughly 42% more pixels than 720p, which tends to indicate the framerate issues were not GPU-related in general.

If I remember right, their problem was draw call count... which is a bit strange really; it wasn't exactly the type of game you'd expect to have hundreds or thousands of draw calls. I can't remember where the article was, but it gave the impression they were doing things like rendering each and every grass billboard separately.
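If draw call count really was the bottleneck, the usual fix is batching: pack all the billboards into one buffer and issue a single draw instead of thousands. A rough C++ sketch (submitDraw is a stand-in, not any real graphics API):

```cpp
#include <cstdio>
#include <vector>

// Sketch of why per-billboard draws hurt: every draw call carries fixed CPU
// overhead, so thousands of tiny draws cost far more than one batched draw.
// submitDraw() stands in for a real graphics API submission.
struct Vertex { float x, y, z, u, v; };

void submitDraw(const std::vector<Vertex>& vertices) {
    std::printf("draw call with %zu vertices\n", vertices.size());
}

// Naive: one draw per grass billboard -> thousands of calls per frame.
void drawGrassNaive(const std::vector<Vertex>& quad, int count) {
    for (int i = 0; i < count; ++i)
        submitDraw(quad);
}

// Batched: copy every billboard into one buffer and issue a single draw.
void drawGrassBatched(const std::vector<Vertex>& quad, int count) {
    std::vector<Vertex> batch;
    batch.reserve(quad.size() * static_cast<size_t>(count));
    for (int i = 0; i < count; ++i)
        batch.insert(batch.end(), quad.begin(), quad.end());
    submitDraw(batch);
}

int main() {
    std::vector<Vertex> quad(6);   // two triangles for one billboard
    drawGrassNaive(quad, 3);       // three draw calls
    drawGrassBatched(quad, 3);     // one draw call
    return 0;
}
```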
 
Those Xbox 360 units given out to the media that are capable of taking framebuffer dumps: can't you tell the internal rendering res by looking at the res of the dump? I thought that was how they found out the internal res of PGR3 was 720p for the menus but only 1024x600 for the actual game.
 
oh, so that 'may seem like cheating, but is very common'... i see.

that was the most shameless spin on a simple, well-known fact: a launch title like PGR3 did not have a full-sized framebuffer because they could not take the tiling hit. and not for 'the effects', but for their bread'n'butter scene content. and even after Bizarre stepped up and admitted to the cheat, there comes some 'faithful soul' to explain how things were actually not like that.

holy fanboyism, Batman.

Why not take it the opposite (and more optimistic) way: the PGR team at least admitted what they were doing, and this guy can explain why (although I'm sure other devs have "cheated" too, as the good DeanA explained)... as opposed to cheating and pretending... sheesh. I think you are projecting ******ism here, but that could just be me...
 
wasn't that MotoGP3? I haven't kept up with that AVSforums thread...

edit: found it...having a look through it at the moment.

Doh, I overlooked this thread for over a week. Anyway, yeah, I'm quite certain about MotoGP since I've seen the framebuffer grabs in that AVSforum thread. But as for DOA4, as I said I'm pretty sure it's 1280x1024 as well; and by that I mean I recall someone I consider reliable mentioning that the framebuffer grabs of it came out at that resolution, which jibes with what I see on my display.
 
Why is DOA4 1280x1024? Isn't it widescreen? That's a strange resolution for a console game, as it's neither 1080i nor 720p... I'm confused...
 