1080i support for XBox360. How?

Sandwich said:
Try any shooter you like really. You can lower the refresh of your monitor to 60 Hz under the Windows display properties panel.
Then when you've played a while at somewhere around 100 fps, increase some AA / AF levels until you are around 60 fps.
You should be able to tell the difference between 60@60 and 100@60. 100 fps feels a lot smoother.

Try that with Doom 3, where the game logic always runs at 60 Hz, and the frames are capped at 60 fps. That would be a good example of the new games around the corner.

Edit: and I would prefer a real minimum of 30 fps over any high maximum as well.
 
DiGuru said:
Sandwich said:
Try any shooter you like really. You can lower the refresh of your monitor to 60 Hz under the Windows display properties panel.
Then when you've played a while at somewhere around 100 fps, increase some AA / AF levels until you are around 60 fps.
You should be able to tell the difference between 60@60 and 100@60. 100 fps feels a lot smoother.

Try that with Doom 3, where the game logic always runs at 60 Hz, and the frames are capped at 60 fps. That would be a good example of the new games around the corner.

If you want, you can remove the cap in single player: create an autoexec.cfg and set com_fixedTic to -1.
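Something like this, for illustration (assuming the file goes in Doom 3's base folder; the exact path depends on your install):

Code:
// autoexec.cfg -- dropped into Doom 3's base folder
seta com_fixedTic "-1"   // removes the 60 fps render cap in single player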
You obviously can't test 100fps with a 60fps cap in place. That destroys the entire comparison.

Edit: and I would prefer a real minimum of 30 fps over any high maximum as well.
All the more reason to have a higher average framerate. That way the minimum framerate increases as well.
 
Sandwich said:
All the more reason to have a higher average framerate. That way the minimum framerate increases as well.

But it doesn't work that way!

"Average frame rate" doesn't guarantee anything, at all. Different gamers will get a different "average" frame rate (assuming no vsync here) depending on how long they spend in one area, where they look, how enemies respond to them, which weapons they use, which view they use etc. And a game with an "average" frame rate of 100 fps might very well have a lower "lowest" frame rate than a game with an average of 30 fps.

Processing load varies so greatly between parts of a game like an FPS that chasing an "average" figure is a senseless way of trying to fix issues with minimum frame rate!
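To put some toy numbers on it (these frame times are made up purely to illustrate the point, not measured from any game):

Code:
# Two made-up frame-time traces (ms): both average 50 fps,
# but their worst single frame is very different.
smooth = [20, 20, 20, 20]      # steady 20 ms per frame
spiky  = [10, 10, 10, 50]      # mostly fast, one nasty hitch

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def min_fps(times_ms):
    return 1000.0 / max(times_ms)   # fps during the slowest frame

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    print(name, round(avg_fps(trace)), "fps average,",
          round(min_fps(trace)), "fps minimum")
# smooth: 50 fps average, 50 fps minimum
# spiky:  50 fps average, 20 fps minimum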
 
Okay, so you want 120 fps when you play the game on your CRT. But how then does that make playing on your LCD acceptable to you? Especially if you up the detail levels and get only 75 fps?
I don't mean to criticize you or anything, but I don't understand you. 120 fps on the LCD would still be better.

Because the monitor can't display anything more than 75 Hz. So why would I want more? As long as it's a steady 75 fps, that is all that matters. Of course it would be nice to have 500 fps, but it doesn't really matter after 75 fps. As I said, the main problem is the drops in frames and how it can quickly go down to a few fps from many hundreds.



A shot at the center of the target would *miss*, because the target has already moved half his width again before the frame is finally displayed on screen.
Now I know that in the real game, the sniper would lead the target, matching the speed of his crosshair to the target, so the problem is partially alleviated, but it should be clear the lag makes a difference.
The player is looking in the past.

You naturally adjust. As I said, I'd rather have a steady 75 fps with no drops than be targeting someone at 200 fps, pressing fire and watching it drop to 2 fps, totally messing up my shot and my reactions. That's not really fun. I agree that if you can give me 200 fps sustained with no drops, I'd much rather have that than 75 fps. But it's not going to happen anytime soon. Especially not in the PC space.
 
function said:
Sandwich said:
All the more reason to have a higher average framerate. That way the minimum framerate increases as well.

But it doesn't work that way!

"Average frame rate" doesn't guarantee anything, at all. Different gamers will get a different "average" frame rate (assuming no vsync here) depending on how long they spend in one area, where they look, how enemies respond to them, which weapons they use, which view they use etc. And a game with an "average" frame rate of 100 fps might very well have a lower "lowest" frame rate than a game with an average of 30 fps.

Processing load varies so greatly between parts of a game like an FPS that chasing an "average" figure is a senseless way of trying to fix issues with minimum frame rate!
I'm not talking guarantees or anything.

It really depends on how you look at it: if I have this game and I want better performance, the first thing I'd look into is lowering the visual quality one way or the other.
This will increase performance overall, though I realize I am generalizing here and there may be exceptions.

As a rough guideline I always assume the minimum framerate will be about half the average framerate.
Your mileage may vary. A game may adjust the detail levels when frames drop low, etc.
This does not mean a developer cannot target a desired performance. Just don't expect it to be a very exact science. It's estimates, averages, guesses.

That is what I mean when I talk about aiming at 90 fps.
 
jvd said:
Because the monitor can't display anything more than 75 Hz. So why would I want more? As long as it's a steady 75 fps, that is all that matters.

Personally I'd want more because of the lag or latency I described before, and because higher averages can mean shallower declines as well.

A shot at the center of the target would *miss*, because the target has already moved half his width again before the frame is finally displayed on screen.
Now I know that in the real game, the sniper would lead the target, matching the speed of his crosshair to the target, so the problem is partially alleviated, but it should be clear the lag makes a difference.
The player is looking in the past.

You naturally adjust.

True. I'm not so poor a player that I cannot play the game with "merely" 60 fps, but the advantage of higher fps is still there.
 
Personally I'd want more because of the lag or latency I described before, and because higher averages can mean shallower declines as well.
Not if there are still dips lower than what you want. That will mess you up more than the lower refresh rate. In CS: Source I can feel when someone tosses a nade, as it spikes my fps down by about 20 fps. It messes up my tracking.

True. I'm not so poor a player that I cannot play the game with "merely" 60 fps, but the advantage of higher fps is still there.
Any advantages will be negated by the dips in framerate. So as I said, unless it's sustained it doesn't matter.


The problem then becomes: do I want to still be playing games with Quake 3-level graphics so I can get those few hundred fps, or Half-Life 2 and Doom 3, or soon Unreal Engine 3 games, with those 60-75 fps?
 
jvd said:
The problem then becomes: do I want to still be playing games with Quake 3-level graphics so I can get those few hundred fps, or Half-Life 2 and Doom 3, or soon Unreal Engine 3 games, with those 60-75 fps?

Yes. The hardest part is knowing where to strike a balance between visual quality and performance.
For myself I know I also want much better performance from the next gen console games, compared to this generation.
The limitation of the TV display is not, and should not be, a hard limit.
 
Sandwich said:
As a rough guideline I always assume the minimum framerate will be about half the average framerate.
Your mileage may vary. A game may adjust the detail levels when frames drop low, etc.
This does not mean a developer cannot target a desired performance. Just don't expect it to be a very exact science. It's estimates, averages, guesses.

That is what I mean when I talk about aiming at 90 fps.

Ok. Last try.

There are three different things here.

1. The game logic processing the input and moving stuff around on the screen.
This will very likely be fixed in most upcoming games to a specific value, like 60Hz. When you're aiming and there is a dip in the frame rate, it will have no effect on your accuracy whatsoever. The distance moved, your aim and the movement of everything around you will always be processed at the same rate. The screen will not jerk around at a different speed, depending on the frame rate. The movement will stay exactly the same.

2. The number of frames rendered.
This will vary, according to lots of things. But unlike the "old" games, the amount and distance of the changes between two frames do not depend on the speed at which frames are rendered, but on the steady clock of the game logic. So, instead of slowing everything down when there is much to be rendered, or just skipping parts of the action, everything goes along at the same speed. So, the main difference with slow frame rates is that you see fewer frames. All the action continues at the same speed at all times.

3. The refresh of the monitor.
When the game logic runs independently of the rendering process, there is nothing whatsoever speeding up when you render more frames than your monitor can display. Nothing at all. The excess frames are just discarded when you use vsync and a fixed refresh of 60 Hz. Which is what you do on a console that uses a TV for display.

So. Even if you use a PC, run a game like Doom 3 and disable the maximum frame rate, the only thing that changes is that you can use it as a benchmark. Because everything else going on runs at a fixed speed, and will not change.
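In rough Python, the kind of loop I mean looks like this (just a sketch of the idea, not any particular engine's code; ToyGame and draw are stand-ins):

Code:
import time

TICK_RATE = 60                  # the game logic always runs at 60 Hz...
TICK_DT   = 1.0 / TICK_RATE     # ...in fixed-size steps

class ToyGame:                  # stand-in for the real game state
    def __init__(self):
        self.ticks = 0
    def process_input(self):
        pass                    # poll the pad / keyboard here
    def update(self, dt):
        self.ticks += 1         # physics, AI, movement: always per 1/60 s

def draw(game):
    pass                        # with vsync this would wait for the retrace

def run(seconds=2.0):
    game = ToyGame()
    previous = time.perf_counter()
    end = previous + seconds
    lag = 0.0
    while time.perf_counter() < end:
        now = time.perf_counter()
        lag += now - previous
        previous = now
        # 1. logic: consume the elapsed time in fixed 1/60 s steps, however
        #    fast or slow the renderer happens to be
        while lag >= TICK_DT:
            game.process_input()
            game.update(TICK_DT)
            lag -= TICK_DT
        # 2. rendering: draw whatever state exists right now; a slow GPU only
        #    means you see fewer of the 60 logic steps, not slower movement
        draw(game)
    print(game.ticks, "logic ticks in", seconds, "seconds")   # ~120 for 2 s

run()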

When looking at lag (and not network lag), you might have a problem when the framerate drops very low, so you don't see what is going on and start to over-compensate. Therefore, it would be nice to have a fixed minimum frame rate.

If the computer has enough memory, developers are starting to load stuff seamlessly, before it is needed. That means that there are no level loads, but you might get a serious dip in your frame rate when the new content is loaded. The same thing happens when you enter a building, or when a lot of stuff comes around the corner. If you're trying to do something at that moment, you perceive lag. So, it would be very nice if game developers kept the frame rate above a lower limit, like 30 fps. That way, the game won't freeze on you and there is no lag to mess up your aiming at any time.
 
DiGuru said:
Ok. Last try.

There are three different things here.

1. The game logic processing the input and moving stuff around on the screen.
This will very likely be fixed in most upcoming games to a specific value, like 60Hz. When you're aiming and there is a dip in the frame rate, it will have no effect on your accuracy whatsoever. The distance moved, your aim and the movement of everything around you will always be processed at the same rate. The screen will not jerk around at a different speed, depending on the frame rate. The movement will stay exactly the same.
I blame myself for being such a poor writer, but at least jvd gets it, though he disagrees with me on the severity of the issue.

With a separate game logic thread, and if and only IF you can attain *sustained* performance, you take care of the physics and the user input, but you still don't touch on the visual latency.
Visual latency does have an impact on accuracy as I showed earlier.

2. The number of frames rendered.
This will vary, according to lots of things. But unlike the "old" games, the amount and distance of the changes between two frames do not depend on the speed at which frames are rendered, but on the steady clock of the game logic.
So, instead of slowing everything down when there is much to be rendered, or just skipping parts of the action, everything goes along at the same speed. So, the main difference with slow frame rates is that you see fewer frames. All the action continues at the same speed at all times.
No disagreement here, until we get to point 3:

3. The refresh of the monitor.
When the game logic runs independently of the rendering process, there is nothing whatsoever speeding up when you render more frames than your monitor can display. Nothing at all.
And that's where you are wrong.
I'll try one more time. Remember this doesn't relate to controls or physics, just pure visuals. I'm going to do this example step by step.

Assume a 60 Hz display, triple buffering + vsync.
60 fps means it takes 17ms to render the frame.

procedure: t=0 begin frame n+1; retrace; finish n+1; t=17 begin n+2; retrace; finish n+2, ...

The latency between what you see on screen and when it happened varies between 17ms and 34ms.
Not only do we have a visual lag of up to 2 frames, a sprite moving from left to right at an internally constant speed does not move at a constant speed on screen.

compare to 100 fps:
t=0 begin n+1; retrace; finish n+1; t=10 begin n+2; retrace; finish n+2; t=20 begin n+3; finish n+3; t=30 begin n+4; retrace; finish n+4, ...

Latency now varies between 10 and 20ms. Also movement on screen is displayed at a more constant speed.

There's no denying this is better. How much you are willing to give up for better performance is another question, but the advantage is there.
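If you'd rather check the numbers than take my word for it, here's a quick Python sketch of the bookkeeping above. It assumes a 60 Hz retrace and that, with triple buffering, the most recently finished frame is the one shown at each retrace:

Code:
def latency_range(render_ms, refresh_hz=60.0, sim_ms=2000.0):
    """Min/max age (ms) of the frame on screen at each retrace."""
    period = 1000.0 / refresh_hz
    # frames are rendered back to back: frame i starts at i*render_ms and
    # finishes one render time later (triple buffering, the GPU never waits)
    starts, t = [], 0.0
    while t < sim_ms:
        starts.append(t)
        t += render_ms
    latencies = []
    retrace = period
    while retrace < sim_ms:
        # at each retrace the newest *finished* frame is the one displayed
        done = [s for s in starts if s + render_ms <= retrace]
        if done:
            latencies.append(retrace - done[-1])
        retrace += period
    return min(latencies), max(latencies)

for render_ms in (17, 10):   # the two cases above: ~60 fps and 100 fps
    lo, hi = latency_range(render_ms)
    print(f"{render_ms} ms/frame on a 60 Hz display: {lo:.0f}-{hi:.0f} ms behind")
# prints about 17-34 ms for 17 ms frames and 10-17 ms for 10 ms frames;
# the exact spread depends on how the frame completions line up with the retrace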

BTW Have you actually tried the experiment for yourself yet?
 
Sandwich, the main problem is you don't take into account dips or spikes in the fps.

As I've been saying, sure, 100 fps is better than 75. But if that 75 is silky smooth, where it's 75 fps every second that I play the game, I would take that over a 100 fps that changes from frame to frame. Heck, I'd take a locked 75 fps over 800 fps non-locked, because once you hit a spot where your frames drop off you're screwed.
 
jvd said:
Sandwich, the main problem is you don't take into account dips or spikes in the fps.

As I've been saying, sure, 100 fps is better than 75. But if that 75 is silky smooth, where it's 75 fps every second that I play the game, I would take that over a 100 fps that changes from frame to frame. Heck, I'd take a locked 75 fps over 800 fps non-locked, because once you hit a spot where your frames drop off you're screwed.

I do this to keep the explanation simple. The thing is, your 75 fps is probably going to change from frame to frame as well.
My bet is on the 800 fps having shallower drops than the 75 fps, at least in absolute values above a desired minimum.

Just because you can or might have slow frames is no reason why the entire game experience should be slow, is what I'm thinking.

I bet that if you tried to synchronize the frames and keep a constant 75 fps, this would cost you more cycles than a straightforward 1000 fps with occasional declines to 100 fps.
 
Well, let's take this for example.


I'm playing a game on the Xbox 360 and it's locked in at 60 fps. The devs tweaked it, worked on it, and it never drops below 60 fps and it never goes above 60 fps.


You have the latest PC: a tricked-out system with a dual-core Athlon 64 4800+, a PPU, and 3 ATI R520s.


Now say this game has smoke nades. Smoke is a bitch on the systems. Now we are playing deathmatch and we enter a room with tons of smoke all over the place; I'm staying steady at 60 fps and you're going from 1-200 fps, so at any given second things change drastically for you. Who do you think is going to leave that room alive?

I will bet you it would be me with the locked system.

That is why when I play online I use the lowest possible visual settings, so that my valleys aren't that low.
 
jvd said:
Well, let's take this for example.


Now say this game has smoke nades. Smoke is a bitch on the systems. Now we are playing deathmatch and we enter a room with tons of smoke all over the place; I'm staying steady at 60 fps and you're going from 1-200 fps, so at any given second things change drastically for you. Who do you think is going to leave that room alive?

I will bet you it would be me with the locked system.

That is why when I play online I use the lowest possible visual settings, so that my valleys aren't that low.

Sure, I'd take a magical fixed 60 fps over 1-200 fps.

But the thing is, at a minimum (say >90%) of 60 fps, you get peaks of *at least* 200 fps for FREE. Capping the peaks doesn't give you anything.

For this fixed 60 fps you would have to have very good load balancing and in-game detail adjustment in the first place. Then you'd have to aim at something like 120 fps without caps, before you finally cap it.
 
Consider monitor refresh at 50 Hz, and screen updates at 200 Hz. Consider an FPS, looking down an alley, and you turn. The screen will show four distinct bands where the front buffer has updated 4 times during the screen refresh, the walls being fractured in four places. Lag between frames in the top quarter is still 1/50th of a second. Lag for action happening in the bottom quarter is still 1/50th of a second. So if you've got a plane flying past in the sky there's still the visual lag to contend with, limited by the monitor refresh rate, not the game draw rate. But that's not a problem, because watching the plane over a series of frames your brain predicts where it will be in x amount of time, so you'll pull the trigger between screen refreshes if that's the right time to shoot. Though remember also that humans aren't that accurate in the main, and there'll be a few ms either way between when you intend to shoot and when you actually signal your muscles to press the button and they respond.

Visual lag is tied to monitor refresh, irrevocably. Updating the screen faster than the refresh causes image tearing, but doesn't reduce visual lag for different areas of update. As visual lag is so restricted by the monitor, one may as well lock to the monitor refresh rate to eliminate visual tearing, at no penalty to the player, who's still going to get a regular visual update to what's happening at the speed his monitor runs.

For less visual lag you need a monitor that can support higher refresh rates.
 
Shifty Geezer said:
Visual lag is tied to monitor refresh, irrevocably. Updating the screen faster than the refresh causes image tearing, but doesn't reduce visual lag for different areas of update. As visual lag is so restricted by the monitor, one may as well lock to the monitor refresh rate to eliminate visual tearing, at no penalty to the player, who's still going to get a regular visual update to what's happening at the speed his monitor runs.
Nope. Triple buffering + vsync. We can draw as fast as we like without having to worry about either tearing or waiting for vsync.

For less visual lag you need a monitor that can support higher refresh rates.

You want both. The effects of a slow refresh and slow fps don't overlap. They add up.
 
I'm confused. How can you draw 200 fps on a monitor that only refreshes at 75 Hz?

Ah....I see (I think :p ). You're saying render frames at 200 fps, and the moment one frame is finished it can grab the next one from the buffer. Which means no hanging around. Can't see that having any benefit though.

Code:
ms   X    Y

0    a    a
5        
10        b
15   b
20        c
25
30   d    d
35   
40        e
45   e
50        f
55
60   g    g
65
70        h
75   h
80        i
85   
90   j    j
95   
100       k
Trying to understand, I've just compiled the above table. Column X is tied to monitor refresh (every 15 ms) and column Y is tied to drawing rate (10 ms/frame). Seems to me this method would cause an irregular elapsed time between frames. Time elapsed between frames a and b in game time is 10ms, whereas between frames b and d it's 20 ms. The player will be seeing graphics moving at alternating rates between frames. Whereas a fixed framerate would keep the same uniform time interval for game progression. And data like that rendered for frame c would be wasted and contribute nothing.
 
Shifty Geezer said:
I'm confused. How can you draw 200 fps on a monitor that only refreshes at 75 Hz?

Ah....I see (I think :p ). You're saying render frames at 200 fps, and the moment one frame is finished it can grab the next one from the buffer. Which means no hanging around. Can't see that having any benefit though.

Code:
ms   X    Y

0    a    a
5        
10        b
15   b
20        c
25
30   d    d
35   
40        e
45   e
50        f
55
60   g    g
65
70        h
75   h
80        i
85   
90   j    j
95   
100       k
Trying to understand, I've just compiled the above table. Column X is tied to monitor refresh (every 15 ms) and column Y is tied to drawing rate (10 ms/frame). Seems to me this method would cause an irregular elapsed time between frames. Time elapsed between frames a and b in game time is 10ms, whereas between frames b and d it's 20 ms.

Yup, there is an irregularity in the speed, you'll always have that. The irregularity becomes even bigger when the fps approaches the refresh rate (RR). See my examples above.

Now try to draw the same table with fps *almost* equal to the RR. What you will see is that when you finish rendering the frame a little early, the latency is low, but when you finish a tad too late you get double the latency.

And don't tell me the fps and RR are exactly the same and completely in sync. That just doesn't happen. You'd have to keep your GPU running 99% idle, waiting for vsync, to accomplish that.
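If anyone wants to play with it, here's a little Python sketch that draws Shifty's kind of table for any draw time and refresh period (the 16 ms / 15 ms numbers below are just illustrative). It assumes the frame shown at each refresh is the most recently finished one, and that frame 'a' is ready at t=0:

Code:
def frame_table(draw_ms, refresh_ms, total_ms=120):
    # frames a, b, c, ... finish every draw_ms milliseconds
    finish_times = {0: 'a'}
    t, label = draw_ms, ord('b')
    while t <= total_ms:
        finish_times[t] = chr(label)
        t += draw_ms
        label += 1
    rows = []
    refresh = 0
    while refresh <= total_ms:
        # at each refresh, show the newest finished frame
        ready = [ft for ft in finish_times if ft <= refresh]
        rows.append((refresh, finish_times[max(ready)]))
        refresh += refresh_ms
    return rows

# fps *almost* equal to the refresh rate: 16 ms frames on a 15 ms refresh.
# Every few refreshes the same frame is shown twice -- double the latency.
for t, shown in frame_table(draw_ms=16, refresh_ms=15):
    print(f"{t:3d} ms   {shown}")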
 
:eek: LOL! :LOL:
Finally I understand what DiGuru is trying to do! :D

DiGuru said:
1. The game logic processing the input and moving stuff around on the screen.
This will very likely be fixed in most upcoming games to a specific value, like 60Hz. When you're aiming and there is a dip in the frame rate, it will have no effect on your accuracy whatsoever. The distance moved, your aim and the movement of everything around you will always be processed at the same rate. The screen will not jerk around at a different speed, depending on the frame rate. The movement will stay exactly the same.
I'll summarize part of our discussion thus far:

Me: Whaa!! 60fps is too slow I want 100fps!
60fps kills my visual latency, the internal physics engine and my keyboard readout!

DG: Here's a solution: let's separate the game logic and then artificially limit it to 60 Hz, so you don't have to worry about the visual latency from having only 60 fps!

Me: *rants on about visual latency*

OMG! :oops:
 