When does output resolution affect framerate?

Hi,

This is my first post, so greetings to everyone.

I already have an HDTV capable of displaying 720p and I'm considering upgrading to a new one with 1080p. The point is that higher framerates are more important to me than image quality. The main thing I'm asking myself is whether or not I should set my Xbox 360's output resolution to 1080p once I have the new TV.

From a theoretical point of view, when would a higher output resolution impact framerate? The more details (especially regarding the 360) the better. I am a (primarily non-games) programmer, so if you could provide a fake scenario that illustrates the point, that would be nice as well.

The other thing is, has anyone made comparisons showing which games' framerates drop when the output resolution is upped? I only know that NBA Homecourt drops to half, but that is because in this particular case changing the output resolution also changes the buffer resolution the game renders at. The more info the better.

Thanks in advance
P.
 
You can set the 360's output to 1080p and in 99% of cases there will be no discernible drop in frame rate. Only games like NBA Homecourt - which render more pixels at the expense of refresh rate - are adversely affected.

The 360 doesn't have a big catalogue of 1080p titles to be honest. Super Street Fighter II Turbo HD Remix appears to be downscaling from 1080p to 720p if you have the dash set to the latter resolution. I could be wrong but I think Geometry Wars is another.

Whether the Xenos scaler affects frame rate or not is a test that I've undertaken, but not that seriously. Basically I captured the intro sequence of Call of Duty 4 at 1080p and it maintained the exact same refresh rate as it does at 720p. However, you still hear the odd person swearing that running at 720p causes fewer frame drops or 'less tearing' than running at 1080p.

Based on the test I did, I am not convinced. If anyone wants to suggest a more fool-proof testing method than the CoD4 captures I did, then suggest away!
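
In case it helps, the analysis itself boils down to counting unique frames in the capture: a frame identical to the one before it means the game didn't present anything new that refresh. A rough sketch of the idea in Python (the filename is made up, and it assumes a lossless 60Hz capture; a lossy capture would need a small difference threshold instead of an exact comparison):

    import cv2
    import numpy as np

    # Count unique frames in a 60Hz capture: a frame identical to the
    # previous one means no new image was presented that refresh.
    cap = cv2.VideoCapture("cod4_intro_capture.avi")  # hypothetical filename
    prev = None
    unique = 0
    total = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        total += 1
        if prev is None or not np.array_equal(frame, prev):
            unique += 1
        prev = frame
    cap.release()
    if total:
        print(f"{unique}/{total} unique frames, "
              f"~{unique / total * 60:.1f} fps average at 60Hz")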
 
Based on the test I did, I am not convinced. If anyone wants to suggest a more fool-proof testing method than the CoD4 captures I did, then suggest away!

Well, the fool-proof method, for some games, would be to ask a dev how long Xenos takes to scale. NDAs could be an impediment, though.

As you say, other games render at higher resolutions, so it really has to be handled on a case-by-case basis. Perhaps we should have a list of rendering resolutions at both output resolutions for all the games that get tested by the resident experts. :) The only way to tell whether it's better to set the output resolution to 720p is to count how many of the games that interest you use a higher rendering resolution.

Still, if you consider games where you _know_ the scaling to 1080p is handled by the hardware and the rendering is done at 720p... your CoD4 test should give the answer for all such titles.
 
I thought COD4 runs at a sub-HD resolution.
But anyway, output resolution will affect framerate in the vast majority of titles; the only cases where it wouldn't are (rough sketch with made-up numbers after the list):

A/ CPU limited, definitely possible
B/ geometry limited, unlikely in the vast majority of titles
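
Something like this toy model, with completely invented numbers, is what I mean: treat a frame as finished when both the CPU and GPU work are done, and scale only the pixel-bound part of the GPU work with resolution.

    # Toy frame-time model; every number here is invented for illustration.
    def frame_time_ms(cpu_ms, gpu_pixel_ms, gpu_fixed_ms, width, height,
                      base_width=1280, base_height=720):
        # gpu_pixel_ms scales with pixel count (fill rate / bandwidth);
        # gpu_fixed_ms does not (geometry, setup).
        pixel_ratio = (width * height) / (base_width * base_height)
        return max(cpu_ms, gpu_pixel_ms * pixel_ratio + gpu_fixed_ms)

    # Pixel-bound game: more pixels shows up directly in the frame rate.
    for w, h in [(1280, 720), (1920, 1080)]:
        t = frame_time_ms(cpu_ms=10, gpu_pixel_ms=25, gpu_fixed_ms=3,
                          width=w, height=h)
        print(f"pixel-bound {w}x{h}: {1000 / t:.0f} fps")

    # CPU-bound game: the GPU finishes early either way, so nothing changes.
    for w, h in [(1280, 720), (1920, 1080)]:
        t = frame_time_ms(cpu_ms=30, gpu_pixel_ms=10, gpu_fixed_ms=3,
                          width=w, height=h)
        print(f"cpu-bound   {w}x{h}: {1000 / t:.0f} fps")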
 
The original poster is asking about frame rate impact of running his 360 at 1080p over 720p. In 99% of cases, this is going to involve Xenos scaling 720p to 1080p, and if that's the case, I personally don't think there's any performance difference based on my experiences and the test I ran. But as I said, a lot of people swear that running in 1080p does impact performance. If so, let me know the game and I'll put it to the test.

That said, as most Xbox games are 720p, you're not going to magically get better picture quality by running on a 1080p set beyond what's explained by newer/better technology in the 1080p display itself.

In terms of native 1080p games, there are very few on 360. FIFA Street and NBA Homecourt are the only ones - I think - that take a frame-rate hit compared to the other 1080p titles.
 
That said, as most Xbox games are 720p, you're not going to magically get better picture quality by running on a 1080p set beyond what's explained by newer/better technology in the 1080p display itself.

Might there be confusion stemming from the TV set applying different processing features at specific resolutions, with respect to judder/pulldown or the like (e.g. 720p60 vs 1080p30, etc.)?
 
That said, as most Xbox games are 720p, you're not going to magically get better picture quality by running on a 1080p set beyond what's explained by newer/better technology in the 1080p display itself.

I know. The new TV doing 1080p is a byproduct of buying it for many other reasons (24p, screen size, my parents' TV being broken and me giving them my old one, and hoping Samsung is better than Toshiba).

I guess that setting the output resolution to 1080p for that TV will result in better scaling and less lag before the TV displays the picture. I am trying to find out about the downsides of it.

Obviously there are games that theoretically could render at a much higher framerate than their framerate cap most of the time (Devil May Cry?, Virtua Fighter), and even if a higher output resolution needs more processing time, it won't be visible as long as the game doesn't drop below the cap. Call of Duty 4 also seems to mostly hit its framerate cap, as far as I remember.
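
Toy numbers (all invented) for what I mean by the cap hiding the cost:

    # All numbers invented; the point is only whether any extra cost stays under the cap.
    cap_ms = 1000 / 30        # a 30 fps cap allows ~33.3 ms per frame
    extra_ms = 1.0            # guessed extra cost of the higher output resolution

    for game, render_ms in [("lots of headroom", 22.0),
                            ("already at the limit", 33.0)]:
        hidden = render_ms + extra_ms <= cap_ms
        print(f"{game}: extra cost hidden by the cap -> {hidden}")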

I am more interested in games like Halo 3, which only seems to reach 30 fps when the planets are aligned, or pretty much any Ubisoft game: Rainbow Six, Assassin's Creed, Splinter Cell. I haven't played Far Cry 2 yet, but I doubt it has a steady framerate most of the time.
 
Isn't the scaling done in hardware by the dedicated avivo decoder on the GPU?
Why would there be any difference if that's the case?
 
Yeah, it's scaled on dedicated hardware. I suppose it is bound to take some fraction of a second to accomplish, but it apparently doesn't have much of an effect, as I've never seen any difference in framerates between output resolutions other than those caused by changes in rendering resolution.
 
Well, dedicated hardware or not, the data will still have to be fetched from RAM. If multisampling is going on, then that will take up bandwidth. I have no clue how this works in the XB360 or PS3; however, as Xenos is doing the scaling "on the fly", it is the video output device, so if it requires more access to RAM to do the scaling, then it could have some effect on the performance of the machine as a whole. Or am I totally off the mark here?
 
I'm guessing the data fetched from RAM would be the unscaled frame, and then the scaled image is output to the RAMDAC, so there is no extra load on the system bus.
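
Even if it does read the unscaled frame from main RAM every refresh, a rough back-of-envelope suggests the cost is small (assuming a 1280x720 front buffer at 4 bytes per pixel, 60Hz scan-out, and the commonly quoted ~22.4 GB/s for the 360's main memory):

    # Back-of-envelope: cost of reading an unscaled 720p frame every refresh.
    width, height = 1280, 720
    bytes_per_pixel = 4                 # assuming 8-bit RGBA
    refresh_hz = 60
    scanout_gb_s = width * height * bytes_per_pixel * refresh_hz / 1e9
    main_memory_gb_s = 22.4             # commonly quoted figure for the 360's GDDR3
    print(f"{scanout_gb_s:.2f} GB/s, "
          f"~{scanout_gb_s / main_memory_gb_s * 100:.0f}% of main memory bandwidth")

That comes out at roughly a fifth of a GB/s, around 1% of the main memory bandwidth.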
 
I can't see it fetching the entire front buffer, but it wouldn't need to, as it would only require a certain number of lines in a FIFO. However, if it is doing this, where is it holding them while it does the sampling? It would not require that much storage space, but still, it would have to be held somewhere. It would be nice to hear the views of someone who actually knows how the XB360 works in this regard.

Also the PS3. I am supposing that the way it is done there is that they have, say, a 1920x1080 front buffer when outputting at that resolution, and they copy the 960x1080 back buffer into it, scaling it up to the full 1920 width? Am I correct that the PS3's scaling is not done on video output but rather on the buffer itself before output?
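
Conceptually, the copy/scale step I'm imagining would be something like this (nearest-neighbour horizontal stretch only, just to illustrate the guess; real hardware would presumably do proper filtered scaling):

    import numpy as np

    # My guess, illustrated: stretch a 960x1080 back buffer horizontally into
    # a 1920x1080 front buffer before scan-out. Nearest-neighbour for brevity.
    back = np.zeros((1080, 960, 4), dtype=np.uint8)   # rows x columns x RGBA
    src_col = (np.arange(1920) * 960) // 1920         # source column for each output column
    front = back[:, src_col, :]                       # shape (1080, 1920, 4)
    print(front.shape)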
 