1080i support for XBox360. How?

A well-written game can have its logic buzzing through as fast as possible, but reduce graphical errors by locking the framerate to vsync on the monitor. Objects are positioned and input evaluated at 200 fps, but the output the user sees is rendered perfectly in sync with the monitor, with no tearing, no flickering, no slowdown, at the optimum rate the display can output the visuals.
 
Shifty Geezer said:
A well-written game can have its logic buzzing through as fast as possible, but reduce graphical errors by locking the framerate to vsync on the monitor. Objects are positioned and input evaluated at 200 fps, but the output the user sees is rendered perfectly in sync with the monitor, with no tearing, no flickering, no slowdown, at the optimum rate the display can output the visuals.

200fps on a vsynced 75Hz monitor is 75fps. It might be "more responsive", but in the end it's 75fps to our eyes.
 
Shifty Geezer said:
A well-written game can have its logic buzzing through as fast as possible, but reduce graphical errors by locking the framerate to vsync on the monitor. Objects are positioned and input evaluated at 200 fps, but the output the user sees is rendered perfectly in sync with the monitor, with no tearing, no flickering, no slowdown, at the optimum rate the display can output the visuals.

Maybe. But I say use triple buffering + vsync, then draw every frame as fast as possible, with no messing around with stalling for retrace etc. Simple, fast, direct.
 
The only reason old games like Quake (1-3) benefit from high fps is that they use the free-running game loop model. They consist of a single loop that is iterated as fast as possible. The moment you start using separate threads for the game logic and the rendering process, it becomes moot.
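Roughly like this (just a sketch of the decoupled model, with made-up function names, not any particular engine's code): the logic runs at a fixed tick rate in its own loop, and rendering happens at whatever rate the display allows, so rendering extra frames buys the simulation nothing.

```cpp
#include <chrono>

// Hypothetical hooks into the rest of the game; names are made up.
bool game_running();
void read_input();
void update_simulation(double dt_seconds);   // physics, AI, game logic
void render_frame(double interpolation);     // 0..1 between the last two ticks

// Sketch of a fixed-tick loop: logic always advances in 1/60s steps,
// while rendering happens once per pass and (with vsync on) simply
// blocks until the next refresh without touching the simulation rate.
void run_game()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> tick(1.0 / 60.0);

    auto previous = clock::now();
    std::chrono::duration<double> accumulator(0.0);

    while (game_running())
    {
        auto now = clock::now();
        accumulator += now - previous;
        previous = now;

        while (accumulator >= tick)          // catch up in fixed steps
        {
            read_input();
            update_simulation(tick.count());
            accumulator -= tick;
        }

        render_frame(accumulator / tick);    // draw the newest state
    }
}
```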

What's more, with a fixed 30/60 fps output, you waste a large amount of your resources rendering excess frames that are never displayed and don't contribute to anything else. Use those resources to speed up the game logic and do better physics, instead of rendering frames that get thrown away.

Part of the reason that in FPS games the focus is mostly on graphics while everything else gets very few resources is that this is needed to get acceptable frame rates on current hardware. But we now see that the disparity between low-end and high-end graphics cards is so huge that for the low-end cards the GPU is the bottleneck, while for the high-end ones the CPU is. So it's a wasteful model, especially for computers that have multiple processor cores.

And if you do all that (which you should, on all future consoles and PCs), the best way to get snappy responses in an online FPS is to reduce the network lag (ping) as much as possible.
 
Alstrong said:
Doom 3's fixed 60Hz tick rate seems good enough. :)

Yes, for Doom 3 it shouldn't matter.

It might be better to ask: do we want the extra resources to be used to render more frames, or do we want to use them for things like better physics and better-looking graphics?
 
DiGuru said:
The only reason old games like Quake (1-3) benefit from high fps is that they use the free-running game loop model. They consist of a single loop that is iterated as fast as possible. The moment you start using separate threads for the game logic and the rendering process, it becomes moot.
Good point. Separate threads may complicate the issue.

What's more, with a fixed 30/60 fps output, you waste a large amount of your resources rendering excess frames that are never displayed and don't contribute to anything else. Use those resources to speed up the game logic and do better physics, instead of rendering frames that get thrown away.

On the contrary, for consoles there's all the more reason to output frames faster: the TV signal is 60Hz, or 50Hz for PAL.
There's a reason why people with TFT screens complain more about tearing than gamers with proper CRTs. 60Hz is too low.
If I get 60fps on my 120Hz CRT, not only do I perceive less lag, but the screen is also refreshed twice for every rendered frame. Thus I can disable vsync and tearing won't be as obvious as on the TFT.

Aiming for 60fps, as is common for console shooters, is really the worst-case scenario.
Example: vsync on. Begin retrace. Generating the new frame takes 17ms, so it misses the retrace by a hair and has to wait another 17ms.
What you are seeing on screen is what occurred up to two frames ago.

What you should actually want is a big disparity between fps and refresh rate (RR). On the one hand, rendering excess frames reduces lag. On the other hand, low fps with a fast refresh hides tearing and minimizes lag (you can up the detail even further and use double buffering with no vsync for extra efficiency).
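Rough numbers for the missed-retrace example above (my own toy model: a 60Hz display, vsync on, rendering assumed to start right at a refresh, buffering details ignored):

```cpp
#include <cstdio>

// Toy model of the missed-retrace case: with vsync on, a frame can only
// be shown at the next refresh after it finishes, so finishing just
// after a refresh costs almost a whole extra refresh of waiting.
int main()
{
    const double refresh_ms = 1000.0 / 60.0;   // 60Hz display

    const double render_times_ms[] = { 10.0, 16.0, 17.0, 33.0 };
    for (double render_ms : render_times_ms)
    {
        int refreshes_waited = static_cast<int>(render_ms / refresh_ms) + 1;
        double shown_at_ms = refreshes_waited * refresh_ms;
        std::printf("render %5.1f ms -> on screen after %5.1f ms (%4.1f ms spent waiting)\n",
                    render_ms, shown_at_ms, shown_at_ms - render_ms);
    }
    return 0;
}
```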
 
Sandwich said:
Example: vsync on. Begin retrace. Generating the new frame takes 17ms, so it misses the retrace by a hair and has to wait another 17ms.
What you are seeing on screen is what occurred up to two frames ago.
I would say write the game so that you never miss that refresh. Keep to the limits of the hardware, so that even in the most complicated situations you never go beyond 16ms to calculate and render a frame. It's really a dev choice: push the boat out and have nicer visuals but a fractured framerate, or rein in the eye-candy for a stable framerate, and mostly I prefer the latter.
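One way to police that 16ms budget at runtime would be something like this (only a sketch, with made-up thresholds and a hypothetical detail knob; the real work is budgeting the content up front):

```cpp
#include <algorithm>
#include <chrono>

// Sketch of a frame-budget guard: measure how long the last frame took,
// and if it creeps up on the 16ms budget, shed optional eye-candy before
// a refresh is actually missed. 'detail_level' is a hypothetical knob
// (LOD bias, particle counts, shadow quality, whatever).
void scale_detail_to_budget(std::chrono::duration<double, std::milli> frame_time,
                            double& detail_level)        // 0.0 .. 1.0
{
    const double budget_ms = 16.0;                       // one 60Hz refresh

    if (frame_time.count() > budget_ms * 0.95)
        detail_level = std::max(0.0, detail_level - 0.05);   // back off quickly
    else if (frame_time.count() < budget_ms * 0.75)
        detail_level = std::min(1.0, detail_level + 0.01);   // win it back slowly
}
```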
 
It gets even more complicated than that. If the CPU is the worst bottleneck you can mess around a bit, but when it's more likely the GPU (fillrate or bandwidth limited, or whatever), there's little you can do except keep feeding it.
 
Sandwich said:
On the contrary, for consoles there's all the more reason to output frames faster: the TV signal is 60Hz, or 50Hz for PAL.
There's a reason why people with TFT screens complain more about tearing than gamers with proper CRTs.

If you turn on vsync, you will have no tearing. And most of the time, there is tearing because triple buffering isn't used and too many frames are rendered.

So I don't see why a framerate that is much higher than the monitor refresh would reduce tearing. It doesn't. It increases tearing, but makes it less visible, as the changes between frames get smaller.

60Hz is too low.
If I get 60fps on my 120Hz CRT, not only do I perceive less lag, but the screen is also refreshed twice for every rendered frame. Thus I can disable vsync and tearing won't be as obvious as on the TFT.

If the game uses a separate thread for the game logic, that's all the lag you'll get: perceived lag. Especially on consoles, where some auto-aiming is used to give you a chance of actually hitting something, the only perceptible lag will be in over-correcting or in the movement of the cursor by the auto-aiming.

Aiming for 60fps, as is common for console shooters, is really the worst-case scenario.
Example: vsync on. Begin retrace. Generating the new frame takes 17ms, so it misses the retrace by a hair and has to wait another 17ms.
What you are seeing on screen is what occurred up to two frames ago.

That's why frames are buffered. That doesn't mean that the next frame isn't displayed, or that the computer won't start rendering the next frame. But you might skip a frame every once in a while.
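Rough numbers to show the difference (my own toy model, not a real swap chain; it ignores CPU/GPU overlap): with double buffering and vsync the renderer stalls until the flip, so the frame interval snaps to whole refreshes, while with triple buffering it keeps rendering and the display just shows the newest finished frame, occasionally skipping one.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Toy comparison of double vs triple buffering with vsync on a 60Hz
// display: double buffering rounds the frame interval up to a whole
// number of refreshes, triple buffering is limited only by render time
// (capped by the refresh rate on screen).
int main()
{
    const double refresh_ms = 1000.0 / 60.0;

    const double render_times_ms[] = { 14.0, 17.0, 20.0, 34.0 };
    for (double render_ms : render_times_ms)
    {
        double double_buffered = std::ceil(render_ms / refresh_ms) * refresh_ms;
        double triple_buffered = std::max(render_ms, refresh_ms);
        std::printf("render %4.1f ms: double buffering ~%4.1f fps, triple buffering ~%4.1f fps\n",
                    render_ms, 1000.0 / double_buffered, 1000.0 / triple_buffered);
    }
    return 0;
}
```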

What you should actually want is a big disparity between fps and refresh rate (RR). On the one hand, rendering excess frames reduces lag. On the other hand, low fps with a fast refresh hides tearing and minimizes lag (you can up the detail even further and use double buffering with no vsync for extra efficiency).

No, using the resources for things other than rendering excess frames reduces lag and tearing, and gives you more detailed visuals, simply because the resources aren't wasted.

And by upping the detail by hand, you reduce the fps in any case. You're only saying that if you have excess frames, you can reduce them and get better visuals by changing the settings yourself.

But if you leave frame rate management to the game, it can make sure you always get the minimum you want, instead of a highly varying frame rate where you would want to change the detail settings every minute.
 
DiGuru said:
Sandwich said:
On the contrary, for consoles there's all the more reason to output frames faster: the TV signal is 60Hz, or 50Hz for PAL.
There's a reason why people with TFT screens complain more about tearing than gamers with proper CRTs.
If you turn on vsync, you will have no tearing. And most of the time, there is tearing because triple buffering isn't used and too many frames are rendered.
Could be a particular game that only works with double buffering.

So I don't see why a framerate that is much higher than the monitor refresh would reduce tearing. It doesn't. It increases tearing, but makes it less visible, as the changes between frames get smaller.
A CRT doesn't increase the framerate, obviously. I'm pointing to the higher refresh rate of good CRTs. With twice the refresh rate, torn images alternate with complete frames. Much better.

Aiming for 60fps, as is common for console shooters, is really the worst-case scenario.
Example: vsync on. Begin retrace. Generating the new frame takes 17ms, so it misses the retrace by a hair and has to wait another 17ms.
What you are seeing on screen is what occurred up to two frames ago.
That's why frames are buffered. That doesn't mean that the next frame isn't displayed, or that the computer won't start rendering the next frame. But you might skip a frame every once in a while.
Of course, I'm not saying that the fps decreases, but you still get the lag. Triple buffering does little about the lag, but unlike with double buffering you don't get your fps halved as well.
You still get a lag of between one and two frames, but with the advantage that slow frames can also get skipped, so on average it's <1.5.

What you should actually want is a big disparity between fps and refresh rate (RR). On the one hand, rendering excess frames reduces lag. On the other hand, low fps with a fast refresh hides tearing and minimizes lag (you can up the detail even further and use double buffering with no vsync for extra efficiency).
No, using the resources for things other than rendering excess frames reduces lag and tearing, and gives you more detailed visuals, simply because the resources aren't wasted.
Slower frames don't reduce lag, because the time it takes to render a frame itself IS lag, but you get less retrace lag if the refresh rate is much higher. You get the maximum additional lag when fps and refresh rate are equal.
And by upping the detail by hand, you reduce the fps in any case. You're only saying that if you have excess frames, you can reduce them and get better visuals by changing the settings yourself.
I'm saying faster is better. Speed versus visuals is a trade-off.
But if you leave frame rate management to the game, it can make sure you always get the minimum you want, instead of a highly varying frame rate where you would want to change the detail settings every minute.
You always get variation. Nothing prevents that except the pointless capping of frames.

It's just pathetic for a console shooter to assume less than 60 fps will somehow be enough. 60 fps should be the bare minimum. Consoles should be about blazingly fast action. That's what a console is supposed to be good at.
I'm not even a hardcore gamer, but even I can clearly see how sluggish console shooters are compared to the PC.

For a shooter you shouldn't aim for 60fps just because the display is 60Hz.
Aim higher; 90fps will do.
You get less lag, which many gamers *will* notice at these low speeds.
The minimum framerate would also increase, which is always a good thing.
 
Sandwich said:
It's just pathetic for a console shooter to assume less than 60 fps will somehow be enough. 60 fps should be the bare minimum.

According to whom? The 7 million or so people that have bought Halo 2 (making it the best selling FPS ever, IIRC, and giving it the highest attach rate of any game this generation)? The scores of professional reviewers who have stated Forza to be the best racing sim-u-game ever?

Don't get hung up on a number without considering the delicate balancing act that is making a game.

Consoles should be about blazingly fast action.

The speed of the action and the rate at which new frames are displayed are two very different things.

For a shooter you shouldn't aim for 60fps just because the display is 60Hz.
Aim higher; 90fps will do.
You get less lag, which many gamers *will* notice at these low speeds.
The minimum framerate would also increase, which is always a good thing.

If it weren't for frame rate caps, the frame rate of any "30 fps" game would probably vary between the hundreds and 30fps, or < 30fps when you get the drops you actually notice. Where you're pulling these figures from I have no idea, but the work done in generating a frame isn't static at around one level casually chosen by the developer.

It's also an absolutely crazy way to go about game development, purposefully planning on having a minimum frame rate far beyond your ability to display (vsync enforces a cap at your refresh rate) in the hope that this will somehow magically prevent frame rate drops (it probably won't). Throwing away a machine's power on things that offer no benefit is no substitute for fixing a game so as to minimise or remove frame rate drops.
 
Sandwich, I don't follow what you are saying about latency, since the screen only refreshes so fast and the frame takes the same time to render regardless. So sure, the portion of the image under the tears is updated sooner than with vsync, but the top part of the refresh is still no newer than with vsync and triple buffering.
 
function said:
Sandwich said:
It's just pathetic for a console shooter to assume less than 60 fps will somehow be enough. 60 fps should be the bare minimum.

According to whom? The 7 million or so people that have bought Halo 2 (making it the best selling FPS ever, IIRC, and giving it the highest attach rate of any game this generation)? The scores of professional reviewers who have stated Forza to be the best racing sim-u-game ever?
Poll some time ago at b3d:
http://www.beyond3d.com/forum/viewtopic.php?t=17734&highlight=poll+minimum

Apparently a sizeable group here wants better than 60fps.
I guess some console gamers just don't know any better. Yeah, it's still an opinion, but I take the PC as an example of how it could have been done better. We can have 200+ fps when we want to, if we've got the hardware. Maybe we don't need that kind of power, but we sure as hell can do better than 60fps, and it does make a big difference in a deathmatch.

Consoles should be about blazingly fast action.

The speed of the action and the rate at which new frames are displayed are two very different things.
I see a relation between the two. Fast movement requires more visual updates.

If it weren't for frame rate caps, the frame rate of any "30 fps" game would probably vary between the hundreds and 30fps, or < 30fps when you get the drops you actually notice. Where you're pulling these figures from I have no idea, but the work done in generating a frame isn't static at around one level casually chosen by the developer.
Certainly, the frame time isn't static; I merely used an average of 60 fps as an example. Fluctuating numbers don't change the principle.

It's also an absolutely crazy way to go about game development, purposefully planning on having a minimum frame rate far beyond your ability to display
Average, not minimum. Anyway, you do get to see the results of drawing faster than you can display, even if you discard most frames.

(vsync enforces a cap at your refresh rate) in the hope that this will somehow magically prevent frame rate drops (it probably won't). Throwing away a machine's power on things that offer no benefit is no substitute for fixing a game so as to minimise or remove frame rate drops.
It isn't about "throwing away power". The game will still be faster if you exceed the refresh rate.

Ideally you would have no lag and every frame displayed on screen would be "now". That is not possible, but you can reduce lag by rendering more frames.
The faster you can draw a frame, the better.
Try it for yourself. Take an older shooter you know your PC can run at 100-200fps, then lower the monitor's refresh rate to 60Hz and test whether you can notice the difference between 60fps and 100fps. I can, and I think most here can.
 
Apparently a sizeable group here wants better than 60fps.
I guess some console gamers just don't know any better. Yeah, it's still an opinion, but I take the PC as an example of how it could have been done better. We can have 200+ fps when we want to, if we've got the hardware. Maybe we don't need that kind of power, but we sure as hell can do better than 60fps, and it does make a big difference in a deathmatch.
I like 75fps or more. Reasons? That is the native refresh rate of my LCD. For my CRT I want 120fps, as that is the refresh rate of that monitor at my target res.

Most PC gamers want more than the target framerate, because in PC games you can go from 200fps to 10fps just by turning around.

On a fixed platform, a locked 60fps (meaning no drops) is much more enjoyable than a platform that gives you 1-200fps in an instant.
 
jvd said:
I like 75fps or more. Reasons? That is the native refresh rate of my LCD. For my CRT I want 120fps, as that is the refresh rate of that monitor at my target res.

Okay, so you want 120fps when you play the game on your CRT. But how then does that make playing on your LCD acceptable to you? Especially if you up the detail levels and get only 75fps?
I don't mean to criticise you or anything, but I don't understand you. 120fps on the LCD would still be better.


I'm going to thrown in another example.

Target moving at 10 m/s in an open field. Let's say he's 180cm (6 ft) tall and 50cm wide.
Then we have the sniper at 100 yards away. Should be an easy kill.

60 fps @ 60Hz would mean an average latency of 25ms (see the previous example).
10 m/s × 25 ms = 25 cm

A shot at the center of the target would *miss*, because the target has already moved half his width again before the frame is finally displayed on screen.
Now I know that in the real game, the sniper would lead the target, matching the speed of his crosshair to the target, so the problem is partially alleviated, but it should be clear the lag makes a difference.
The player is looking in the past.
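The same arithmetic for a few other latencies, just to put numbers on how far into the past the player is looking:

```cpp
#include <cstdio>

// Simple kinematics, nothing more: how far a 10 m/s target drifts while
// the frame you are aiming at is still the one on screen.
int main()
{
    const double target_speed_mps = 10.0;
    const double latencies_ms[] = { 8.0, 25.0, 50.0, 100.0 };

    for (double latency_ms : latencies_ms)
    {
        double drift_cm = target_speed_mps * (latency_ms / 1000.0) * 100.0;
        std::printf("%5.1f ms of lag -> the target has moved %5.1f cm\n",
                    latency_ms, drift_cm);
    }
    return 0;
}
```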
 
People don't react to computer games in such situations, but preempt. If the fella's walking, you time your trigger pull for when he's about to walk into the line of fire, not when he's exactly there. Same for racing. You don't need to be exactly at the visual turn-in point to make the adjustments. As you approach, the visual information provides clues for you to respond to. In such situations the limiting factor is lag on the controls (several milliseconds between pressing the button and getting the response) and not lack of refresh. Brains interpolate animations to know where you will be within a fraction of a second, and then it's the fidelity of the controls that determines whether you make that jump just at the edge of the platform, or go tumbling into the abyss.
 
Sandwich said:
Try it for yourself. Take an older shooter you know your PC can run at 100-200fps, then lower the monitor's refresh rate to 60Hz and test whether you can notice the difference between 60fps and 100fps. I can, and I think most here can.

So you're still talking about old games, not about the new games for the new consoles at 1080i. Old, single-threaded games. The new games are going to be different. Not that you have any way to change the refresh rate on a 1080i TV anyway, but whatever.
 
DiGuru said:
Sandwich said:
Try it for yourself. Take an older shooter you know your PC can run at 100-200fps, then lower the monitor's refresh rate to 60Hz and test whether you can notice the difference between 60fps and 100fps. I can, and I think most here can.

So you're still talking about old games, not about the new games for the new consoles at 1080i. Old, single-threaded games. The new games are going to be different. Not that you have any way to change the refresh rate on a 1080i TV anyway, but whatever.

Try any shooter you like, really. You can lower the refresh rate of your monitor to 60Hz under the Windows display properties panel.
Then, when you've played a while at somewhere around 100fps, increase the AA/AF levels until you are around 60fps.
You should be able to tell the difference between 60@60 and 100@60. 100fps feels a lot smoother.
 
Shifty Geezer said:
In such situations the limiting factor is lag on the controls (several milliseconds between pressing the button and getting the response) and not lack of refresh. Brains interpolate animations to know where you will be within a fraction of a second, and then it's the fidelity of the controls that determines whether you make that jump just at the edge of the platform, or go tumbling into the abyss.

I'm not really convinced of that. I think we're done with the theory for now. The best way to know for yourself is to play a game.
 