360 individual console specific framerate fluctuations

mrdarko said:
from what i have read about Call of Duty 2, opinions range from "runs smooth as silk" to "runs like crap!"
i don't think that can be put down to two people's perception.
That does come down to perception. Some people can watch games drop well below 30fps and say it is smooth as can be just because the game doesn't hitch and stutter; while for others, anything less than a constant 60fps simply doesn't qualify as silk, and framerate drops into the 20s are outright crap. So whether you appreciate the performance of CoD2 on your 360 is going to come down to your own perspective. Personally, I have taken to turning my 360 down to 480p to play the game; mostly because it runs much better that way, and the addition of AA when rendering at the lower resolution is nice as well.
 
kyleb said:
That does come down to perception. Some people can watch games drop well below 30fps and say it is smooth as can be just because the game doesn't hitch and stutter; while for others, anything less than a constant 60fps simply doesn't qualify as silk, and framerate drops into the 20s are outright crap. So whether you appreciate the performance of CoD2 on your 360 is going to come down to your own perspective. Personally, I have taken to turning my 360 down to 480p to play the game; mostly because it runs much better that way, and the addition of AA when rendering at the lower resolution is nice as well.

what i can't get my head round is how does down scaling to 480p make a difference if the game is internally rendered at 720p?

in what way does it run "better"?
 
Exactly, MrDarko, and being that COD2 is probably not fillrate limited at 720p on Xenos, I find it hard to believe that 480p mode runs smoother, even if it is rendered at 480p.
 
Luminescent said:
Exactly, MrDarko, and being that COD2 is probably not fillrate limited at 720p on Xenos, I find it hard to believe that 480p mode runs smoother, even if it is rendered at 480p.

I think MrDarko's point was that even when you set the game to 480p on the X360, internally the game is still rendered at the same resolution. There is no logical reason to assume that changing the output resolution on the X360 will have any effect on its performance -- it's all done through scaling. It isn't going to render at anything but 720p (or 1080i if they so choose, depending on the game, I suppose).
 

Doesn't this have something to do with people running games that target 30fps (60Hz) at 25fps (50Hz)?

Because my brother had that problem with PGR3, and it turned out he had to upgrade his LCD TV's firmware because its component input could only support 50Hz at the time.
Now it's fixed, but he already uses the DVI-I port (through a DVI-to-VGA converter with the VGA cable), which has always supported 60Hz.
 
Bobbler said:
I think MrDarko's point was that even when you set the game to 480p on the X360, internally the game is still rendered at the same resolution. There is no logical reason to assume that changing the output resolution on the X360 will have any effect on its performance -- it's all done through scaling. It isn't going to render at anything but 720p (or 1080i if they so choose, depending on the game, I suppose).

Nope. Games can do different things depending on the dashboard settings. PGR3 does. GW does. I'm not sure about the other games, but why wouldn't they?

Besides, 16:9 vs 4:3 often has an impact on the HUD, fonts, etc.
 
pipo said:
Nope. Games can do different things depending on the dashboard settings. PGR3 does. GW does. I'm not sure about the other games, but why wouldn't they?

Besides, 16:9 vs 4:3 often has an impact on the HUD, fonts, etc.

Hrmm... Everything I've read says the internal rendering resolution never changes (I'll admit, it isn't very much), only the scaler's output resolution. I thought the point of the scaler chip was to take care of such things so the game didn't need to bother rendering in several different modes.

You have any links or anything that says otherwise? Or can anyone else corroborate his claim?
 
I can't find the link, but Bizarre talked about using more AA on lower resolutions.

On top of that, don't you agree the internal rendering resolution should at least be different for the different aspect ratios?
 
pipo said:
I can't find the link, but Bizarre talked about using more AA on lower resolutions.

I'd assumed they were talking about the AA that comes when you resize the frame for the lower resolution (supersampling).

On top of that, don't you agree the internal rendering resolution should at least be different for the different aspect ratios?

It could be, but there's no reason why it would have to be unless the video output chip only took certain resolutions for certain aspect ratios (can't see why this would be the case).
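For what it's worth, the "free AA from downscaling" effect being discussed is just box-filtered supersampling: averaging each 2x2 block of a higher-resolution render into one output pixel pulls edge pixels toward intermediate coverage values. A toy sketch with plain Python lists and made-up values, purely for illustration:

```python
# Illustrative only: box-filter a 2N x 2N grayscale "render" down to
# N x N, the same averaging that makes a downscaled frame look
# anti-aliased (ordered-grid supersampling).

def downscale_2x(image):
    """Average each 2x2 block of a 2N x 2N image into one pixel."""
    n = len(image) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

# A hard diagonal edge rendered at 4x4 (1.0 = covered, 0.0 = empty):
hi_res = [
    [1.0, 0.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0, 1.0],
]
# The 2x2 output gets partial-coverage greys (0.75) along the edge
# instead of the hard 1.0/0.0 transition.
```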
 
pipo said:
On top of that, don't you agree the internal rendering resolution should at least be different for the different aspect ratios?

That would make sense... I guess I didn't really think about it. So there are likely at least two resolutions the game has to render at (widescreen and standard), possibly more it seems (480p WS, 480p STD, 720p)... gone are the days of only worrying about having a game run perfectly in one res?
 
Bobbler said:
Gone are the days of only worrying about having a game run perfectly in one res?
What about screen refreshes? I had a nasty shock on PC when the program I'm writing, which I thought I had locked regardless of screen refresh, actually wasn't in some aspects (physics, as it happens). I guess monitors are happy to drop down to 60 Hz so the devs needn't worry, and 50 Hz systems just run slower, right? Though how does a 60 Hz LCD cope with a 50 Hz PAL system?
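The usual answer to this is a fixed-timestep update decoupled from the render rate, so the simulation advances at the same speed whatever the display refresh. A minimal Python sketch (the step size, function names, and numbers are all made up for illustration):

```python
# Minimal sketch: physics advances in fixed DT slices regardless of
# how long each rendered frame took, so the simulation runs at the
# same speed on a 50 Hz or a 60 Hz display.

DT = 1.0 / 100.0  # fixed physics step, arbitrary for this example

def run(frame_times, position=0.0, velocity=10.0):
    """Advance a 1-D body through a list of rendered frame durations."""
    accumulator = 0.0
    for frame_dt in frame_times:
        accumulator += frame_dt
        while accumulator >= DT:
            position += velocity * DT  # physics only ever sees fixed DT
            accumulator -= DT
    return position

# One second of game time at 50 Hz and at 60 Hz display rates:
one_second_50hz = [1.0 / 50.0] * 50
one_second_60hz = [1.0 / 60.0] * 60
```

Either list advances the body by roughly the same distance (within one leftover DT of residue), which is why a 50 Hz system doesn't have to run the game slower.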
 
mrdarko said:
what i can't get my head round is how does down scaling to 480p make a difference if the game is internally rendered at 720p?

in what way does it run "better"?
It isn't internally rendered at 720p when using 480p or 480i; it is rendered at the lower resolution with anti-aliasing, and maintains a higher framerate when doing so. I can take some pics to illustrate the visual difference if you like.
 
kyleb said:
It isn't internally rendered at 720p when using 480p or 480i; it is rendered at the lower resolution with anti-aliasing, and maintains a higher framerate when doing so. I can take some pics to illustrate the visual difference if you like.
It seems like that would defeat the purpose behind the scaler. But it would also explain why the games tell you what resolutions they support on the back of the box.
 
kyleb said:
It isn't internally rendered at 720p when using 480p or 480i; it is rendered at the lower resolution with anti-aliasing, and maintains a higher framerate when doing so. I can take some pics to illustrate the visual difference if you like.

Are you sure it's not the SSAA combination that gives it that effect at 480? Because I know exactly what you mean, having played it in both, and found it much better at 480 than at 720 the first time.

It's strange, for example, that PGR3 doesn't do the same at 480, and I don't buy that they didn't have time to do so either.
 
I don't believe the 360 is rendering at 480p if it's set to 480p.

The way I see it, everything is rendered at 720p and then downscaled; in the case of 480 4:3, the scaler would only display the 4:3 area.

Don't forget that 16:9 on a SDTV is the same resolution as 4:3 just one is anamorphic.

If the GPU is rendering at different resolutions then MS have made a stupid mistake.
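The anamorphic point above is just arithmetic: the same SD frame buffer can carry a 4:3 or a 16:9 picture, and only the displayed width of each pixel changes. A quick illustrative calculation (the function name and numbers are mine, not from any spec):

```python
# Illustrative only: pixel aspect ratio (PAR) is how wide each stored
# pixel must be displayed relative to its height so that the frame
# fills the intended display aspect ratio.

def pixel_aspect_ratio(width, height, display_aspect):
    """Displayed width of one pixel relative to its height."""
    return display_aspect / (width / height)

# The same 640x480 buffer, shown two ways:
par_43 = pixel_aspect_ratio(640, 480, 4 / 3)    # square pixels
par_169 = pixel_aspect_ratio(640, 480, 16 / 9)  # pixels stretched wider
```

Same storage resolution either way; the 16:9 case just displays each pixel a third wider, which is what "anamorphic" means here.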
 
Diamond.G said:
It seems like that would defeat the purpose behind the scaler. But it would also explain why the games tell you what resolutions they support on the back of the box.
Nah, the back of the box isn't even worth paying attention to; Perfect Dark says 1080i on the box, but the Special Edition says 720p, and I assure you both render at 720p regardless of whether the 360 is set to output 1080i or 720p. And CoD2 does use the scaler to upscale to 1080i when selected; it just renders differently for 480p and 480i, and thankfully so, as I much appreciate the improved framerate.

overclocked said:
Are you sure it's not the SSAA combination that gives it that effect at 480..
Yeah, I'm sure; it is quite clearly using MSAA when set to 480p or 480i.
 
xboxyde has a video of the first 10 minutes of Full Auto.

There are some pretty harsh words about its frame rate.


That said....

When I watch it, I don't notice the frame rate until:

a) it hitches,
b) it gets to a low detail area and jumps to ~60fps instead of 30
c) time gets out of sync slightly*

* what I mean by this, and I've seen (and coded) this badly, is when the game timing isn't accurate enough to account for the changing frame rate. I've seen this happen with performance counters on PCs when under heavy load, where the results get inaccurate, or, more commonly, when the delta-time value is calculated as an average from the last X frames.
What effectively results is that an event might occur that slows the game a lot, and those frames slow, but over the next X frames the game 'catches up'. It's technically not a frame rate drop, but to me it's really, really distracting and I can't stand it.
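That averaged-delta-time catch-up artifact can be sketched like this (hypothetical numbers and names, not any shipped game's code): a single slow frame is under-counted at first, then "paid back" over the next few frames, so game time lags wall-clock time and then catches up.

```python
# Illustrative only: advance game time using a moving-average delta,
# the pattern described above, and watch it lag behind wall-clock
# time during a hitch before catching back up.

from collections import deque

def advance_with_averaged_dt(frame_times, window=4):
    """Return cumulative game time per frame using a moving-average dt."""
    history = deque(maxlen=window)
    game_time = []
    total = 0.0
    for dt in frame_times:
        history.append(dt)
        total += sum(history) / len(history)  # averaged, not actual, dt
        game_time.append(total)
    return game_time

# Steady 60 fps with one 100 ms hitch in the middle:
frames = [1 / 60] * 10 + [0.1] + [1 / 60] * 10
```

On the hitch frame the averaged clock only advances a fraction of the real 100 ms, so the game visibly lags; over the following frames the inflated average overshoots until the two clocks agree again.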

The Full Auto video has significant problems with this. Even some of the menu screens exhibit it (when the camera spins around the car during setup, a motion blur effect is applied which appears to do this).

Besides those issues I personally think that game looks quite awesome, and I'd be very proud to have been on the team that created it. It just seems a minor problem has crept in somewhere that is going to ruin the experience for a lot of people. Although that said the actual pauses that occur in the video are inexcusable.

I do wonder if differences occur between a gold and a preproduction game (for example, in the way the DVD is written... something as simple as this could easily play havoc with a game's content streaming code...)

Unrelated,

One thing that occurs in the video is the car driving through extremely thick smoke. Now maybe it was the eDRAM here, but I really expected the frame rate to tank when this happened; nope, if anything it got better, yet there was no sign of draw distance reduction, etc.
 
function said:
I'd assumed they were talking about the AA that comes when you resize the frame for the lower resolution (supersampling).

I'm pretty sure it was something along the lines of '2xAA in HD and 4xAA in SD'. But I can't find the source (hmm... - I'll have a look at the interview in Edge tonight).
 
Shifty Geezer said:
I guess monitors are happy to drop down to 60 Hz so the devs needn't worry, and 50 Hz systems just run slower, right?
Games aren't timed to the framerate anymore; 50 or 60Hz should matter as little to the game speed as it does on a PC, where framerates vary wildly depending on the hardware configuration of the host system.

Though how does a 60 Hz LCD cope with a 50 Hz PAL system?
Pretty much all LCDs ought to support down to 50Hz without a problem; besides, 720p apparently outputs at 60Hz even on PAL 360s, so it won't be an issue.

DoA4, for example, even states on the back of the box that it only supports PAL60 (for SDTV, I'm sure they mean, though it isn't specifically mentioned), so there won't be any 50Hz seen anywhere in that particular game. :)
 