Question about 1080i/p and Xenos

IIRC I read a postmortem on MotoGP where they very clearly state they were CPU limited (specifically by drawcalls).

Yes, but I was referring to the pre-mortem bickering before we had any real information, i.e. the "Unstable framerates?! Must mean crappy GPU!" line of thinking. Of course that reasoning has followed almost every game with framerate issues.

Anyhow, MS has seemed slow in some regards in getting API features out to developers. I thought that the Xbox had its API tweaked pretty early on to reduce draw call overhead. I guess that's the price you pay for getting hardware late and releasing first.
 
Splinter Cell: Double Agent & DOA4 Are Basically 1080p Games

According to a Sr. Software Engineer for Xbox:

http://ozymandias.com/archive/2006/10/21/Clarifying-Thoughts-on-High-Definition-Game-Rendering.aspx

1.0x: that’s how much harder it is for a game engine to render a game in 1080p as compared to 1080i—the number of pixels is identical so the cost is identical. There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.

1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i—interlaced flicker is not a good thing—but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It’s just different settings on the display chip.
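
To put the quoted claim in concrete terms: a 1080i field is literally just every other row of the same 1920x1080 buffer the GPU already rendered. A quick Python sketch of the idea (hypothetical names, purely illustrative, not the actual display hardware logic):

Code:
# One 1080-line frame buffer rendered by the GPU; identical for 1080i and 1080p.
WIDTH, HEIGHT = 1920, 1080
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]  # placeholder pixel rows

def progressive_scanout(fb):
    # 1080p: the display chip sends all 1080 rows each refresh.
    return fb

def interlaced_scanout(fb, odd_field):
    # 1080i: the display chip sends every other row (540 rows per field).
    return fb[1::2] if odd_field else fb[0::2]

# The renderer's cost is the same either way: one 1920x1080 buffer.
assert len(progressive_scanout(framebuffer)) == 1080
assert len(interlaced_scanout(framebuffer, odd_field=True)) == 540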

Of course these games can't advertise as such, because the 360's output chip won't support 1080p until the fall update.

Looks like I was right guys ;)
 
Yeah, because of the scaler in the 360, its games can render at more or less any resolution the developer picks. Once the fall update rolls around, every game will *magically* support 1080p, probably even X1 games.
 

That means Halo 1 was rendered at 1080p internally on the Xbox1? The truth finally comes out! :oops:

;)
 
darkblu said:
what's interesting is that while 1st parties could go with convenient resolutions, other non-1st parties did not enjoy such conveniences.
MS having a case of Nintendo-itis? ;)
Anyway, favouritism isn't uncommon in the industry - certain TRC violations are "allowed" depending on who you work for and what the project is (the best example being fantastic releases like Enter the Matrix, which got very blatant special treatment on every platform it released on).
This resolution thing reminds me of V-Sync on PS2, where the TRC explicitly stated a title should never run without it, but some special games eventually got around it (MGS2 was the first I remember), and later it became negotiable altogether (personally I thought the rule had its benefits - one nice differentiation console games used to have over PC was the absence of tearing. Nowadays it's gotten so bad that 2 out of 3 360 titles are tearing all over the place :cry:).

acert said:
I thought that the Xbox had its API tweaked pretty early on to reduce draw call overhead.
Yeah, while I couldn't say about the higher-level API, I was led to believe they had low-level access to pushbuffers from the start, so I never imagined this could have been an issue.
 
Yeah, and our eyes are more sensitive to vertical resolution as well, so using a 5:4 or 4:3 resolution to render a widescreen image often looks notably better than using more pixels in a widescreen format. For instance, I play some PC games at 1280x1024 with a 16:9 image rather than using 1600x900; even though there are over 100,000 more pixels in the latter, I simply find that with quality scaling the former hides aliasing better.
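
For what it's worth, the raw pixel counts back that up (quick arithmetic, nothing more):

Code:
# Pixel counts for the two modes mentioned above.
pixels_5_4 = 1280 * 1024   # 1,310,720 pixels (5:4 mode, widescreen image scaled)
pixels_16_9 = 1600 * 900   # 1,440,000 pixels (native 16:9 mode)
print(pixels_16_9 - pixels_5_4)  # 129,280 -- "over 100,000 more pixels" indeed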

Cheers! I didn't know that...

So that's another nice loophole for the PR guys.

The frame buffer is 1080 pixels tall (and presumably 1920 wide)...

Not that I care. I even like PGR3. ;)
 
According to a Sr. Software Engineer for Xbox:
1.0x: that’s how much harder it is for a game engine to render a game in 1080p as compared to 1080i—the number of pixels is identical so the cost is identical. There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.

1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i—interlaced flicker is not a good thing—but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It’s just different settings on the display chip.
That's only true for 30 fps games. 1080p at 60 fps requires 1920x1080 pixels to be rendered every 60th of a second; 1080i at 60 fps requires only 1920x540 pixels per 60th of a second. Well, you could render whole frames and only output half the data, but that'd be a stupid waste of half your graphics power! You'd also get interlace artefacts with 1080i @ 60 fps if you're updating the engine every field. 1080i is ideally suited to 30 fps, where the alternating fields show the same frame. To date most games are 30 fps, so it could be argued that's the assumed frame rate and why it makes no difference whether you're 1080i or 1080p, but there are upcoming games that are 1080p at 60 fps.
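
Putting rough numbers on that (a back-of-the-envelope sketch, assuming the engine updates every field/frame at 60 Hz):

Code:
# Pixels the renderer must produce per second in each mode.
pixels_1080p60 = 1920 * 1080 * 60  # 124,416,000 pixels/s (full frame each refresh)
pixels_1080i60 = 1920 * 540 * 60   #  62,208,000 pixels/s (one field each refresh)
pixels_1080p30 = 1920 * 1080 * 30  #  62,208,000 pixels/s -- same load as 1080i60
print(pixels_1080p60 // pixels_1080i60)  # 2: 1080p60 is twice the fill of 1080i60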
 