Fixed framerate: 60 fps is enough

60 Hz max and the best IQ that can be done at that?

  • Just no
  • 30 Hz minimum and higher IQ would be better
  • I don't care

Total voters: 226

Frank

With the latest generation(s) of GPUs, I think we have hit some kind of threshold. I think it starts to make more and more sense not to produce as many frames as possible and spend the "excess" framerate on AA/AF, but rather to target a fixed rate and spend the headroom on the best IQ that can be produced at that speed.

So, it might make more sense to try to hit a fixed framerate of 60 fps (with occasional dips on scene changes, and as long as there is no virtual-memory thrashing and the like), while adjusting the image quality to the best that can be achieved at that speed.

All that is needed to do that is available with the NV4x/R3x0/R4x0: shaders to offload and balance the amount of work, the different rendering models that have to be supported anyway, normal maps, detail textures, direct support for application-controlled AA/AF, better switching of render states, and so on.

The highest IQ possible would be good as well, so it might even be better to make the framerate adjustable between two thresholds, say 30-60 by default. That way, developers can limit it to their liking, and so can you.

What do you all think?



Edit: that should be fps in the poll as well, instead of Hz. Sorry!
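
Roughly what I have in mind, as a minimal sketch. The QualitySettings knobs, the control constants, and all the names here are invented for illustration; a real engine would drive its own settings:

    // Sketch of a quality governor that watches frame time and trades image
    // quality against it to hold a fixed target rate. All knobs and constants
    // here are invented placeholders, not a real API.
    #include <algorithm>

    struct QualitySettings {
        float viewDistance = 1000.0f;   // world units
        int   aaSamples    = 4;         // antialiasing sample count
    };

    class FrameRateGovernor {
    public:
        explicit FrameRateGovernor(float targetFps) : target_(targetFps) {}

        // Call once per frame with how long the last frame took.
        void update(float frameSeconds, QualitySettings& q) {
            const float targetSeconds = 1.0f / target_;
            // Proportional control on the cheapest continuous knob: grow the
            // view distance when there is headroom, shrink it when there isn't.
            const float error = (targetSeconds - frameSeconds) / targetSeconds;
            q.viewDistance = std::clamp(q.viewDistance * (1.0f + 0.1f * error),
                                        100.0f, 4000.0f);
            // Only touch the coarse knobs when we are badly off target.
            if (frameSeconds > 1.5f * targetSeconds && q.aaSamples > 1)
                q.aaSamples /= 2;       // shed AA before anything else
        }

    private:
        float target_;
    };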
 
I think the minimum could be higher. I'd want to be sure that everyone has a framerate they would like. A lot of people see 60 Hz as a bare minimum, and there may be a few cases and types of scenes where human perception might pick up on jumpiness at that frame rate.

Perhaps a bit more margin might help, and I wouldn't mind getting more frames than the minimum.
 
3dilettante said:
I think the minimum could be higher. I'd want to be sure that everyone has a framerate they would like. A lot of people see 60 Hz as a bare minimum, and there may be a few cases and types of scenes where human perception might pick up on jumpiness at that frame rate.

Perhaps a bit more margin might help, and I wouldn't mind getting more frames than the minimum.

It might be that a lot of people who think like that do so because they (think they) can get an advantage in certain multiplayer FPS games from an extremely high frame rate, not because they actually see the difference.
 
digitalwanderer said:
60 fps is just peachy, 60Hz gives me headaches though.

Good point. I should have said fps, but I always try to avoid the confusion with FPS. I'll change it.
 
First thing I thought of was, "How could anyone be advocating 60Hz as a good thing?!?! :oops:". :LOL:

I figured it out after reading the thread.

30 fps isn't always enough, but 60 fps hits that magic smooth point for me for sure. (It's actually somewhere around 45-50 fps that usually qualifies as "smooth!" for me, but it does vary from game to game a tad.)
 
Hertz is just frequency; he could be talking about the frequency of frames. I don't think he meant the frequency of a CRT screen's refresh :p
 
60 Hz with best possible IQ, a comprehensive* and congruent* set of eye candy. On the CPU side of things: great AI, physics, etc. No stutters/slowdown in framerate whatsoever. I would really respect a game that polished.

*Provided by a robust engine that takes various lighting models into consideration [at the pixel level], has a complete shadowing solution corresponding to those models, sports a diverse set of material effects, and pushes enough geometry to round out polygonal edges while objects are in motion

*A convincingly consistent and homogeneous set of visuals
 
I think we need to move to console gaming to see that kind of polish for a new game ;)
 
digitalwanderer said:
First thing I thought of was, "How could anyone be advocating 60Hz as a good thing?!?! :oops:". :LOL:

I figured it out after reading the thread.

30 fps isn't always enough, but 60 fps hits that magic smooth point for me for sure. (It's actually somewhere around 45-50 fps that usually qualifies as "smooth!" for me, but it does vary from game to game a tad.)

Yes, after you said it, I had to admit it was stupid. Especially here, where everyone knows the difference. Too bad I can't change the text of the poll anymore. Ah, well.

I can stand even lower framerates, as long as the game doesn't start to freeze on me. Although I agree that it depends on the game. Morrowind, for example, is acceptable to me from about 15 fps. Some other games from about 20-25 fps. But I do value nice effects, so I generally turn on as much as I can, even if I have to turn off AA/AF sometimes. That depends on the game as well.

For Morrowind, there is a frame rate optimizer that tries to give you the maximum view distance and the most effects it can, on the fly, while making sure your frame rate doesn't drop below the minimum you specify. And it will try to hit your preferred fps while doing so. I think that is superb and the way to go.

With the co-development of a lot of new titles for PC and consoles, and with Doom 3 (and probably HL2 as well) putting a cap on the game logic, so that all frames over the maximum just repeat the same state until the next one is scheduled, this seems like a new trend. But I know most people value high frame rates, so I was really curious about this.
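
A minimal sketch of what that kind of cap looks like, assuming a classic fixed-timestep loop; simulate(), render() and the timing helper are placeholders for the engine's own routines:

    // Minimal sketch of the "capped game logic" idea: simulate in fixed
    // 60 Hz steps and render as often as the hardware allows, so frames
    // rendered between two ticks simply repeat the same world state.
    #include <chrono>

    void simulate(double dt);   // placeholder: advance game state by dt seconds
    void render();              // placeholder: draw the current game state

    double now_seconds() {
        using namespace std::chrono;
        return duration<double>(steady_clock::now().time_since_epoch()).count();
    }

    void game_loop() {
        const double tick = 1.0 / 60.0;     // fixed logic rate
        double accumulator = 0.0;
        double previous = now_seconds();

        for (;;) {
            const double current = now_seconds();
            accumulator += current - previous;
            previous = current;

            while (accumulator >= tick) {   // catch up in whole fixed steps
                simulate(tick);
                accumulator -= tick;
            }
            render();                       // may run far more often than 60 Hz
        }
    }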
 
This has been done over and over and over again. Blah.

Anyway, the answer is very simple: more is always better, but it's not going to get really good unless good motion blur is implemented. An ideal motion blur implementation would essentially render the scene an infinite number of times between successive frames and sum up the contributions. This basically allows the screen to display the data which falls in between refreshes of the monitor.

Obviously this is unrealistic, but there has been research into means of effective motion blur. The problem is that they all take up lots and lots of fillrate, so it'll be a while before it happens.
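
For what it's worth, the brute-force version of the idea, sketched here with OpenGL's accumulation buffer (render_scene_at() is a stand-in for the application's own draw code), makes it obvious why the fillrate cost scales with the sample count:

    // Brute-force temporal supersampling with OpenGL's accumulation buffer:
    // render N sub-frames spread across the frame interval and average them.
    #include <GL/gl.h>

    void render_scene_at(double t);   // placeholder: draw the scene as of time t

    void render_motion_blurred(double frameStart, double frameSeconds, int samples) {
        for (int i = 0; i < samples; ++i) {
            const double t = frameStart + frameSeconds * (i + 0.5) / samples;
            render_scene_at(t);
            // The first pass loads the accumulation buffer, later passes add
            // to it, each weighted 1/N so the final image is a plain average.
            glAccum(i == 0 ? GL_LOAD : GL_ACCUM, 1.0f / samples);
        }
        glAccum(GL_RETURN, 1.0f);     // write the averaged image back
    }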
 
I play a lot of UT, and my aim is very sensitive to the smoothness of the game. I find anything over 85 is overkill; however, if it drops under 85 I feel a difference.
 
hovz said:
I play a lot of UT, and my aim is very sensitive to the smoothness of the game. I find anything over 85 is overkill; however, if it drops under 85 I feel a difference.
Maybe you're running your refresh rate at 85Hz?
 
Chalnoth said:
Obviously this is unrealistic, but there has been research into means of effective motion blur. The problem is that they all take up lots and lots of fillrate, so it'll be a while before it happens.
Indeed, why would we want a motion-blurred title if the technique incurs more of a performance hit than it could justify by blurring over the choppiness? "Wow, the blur gives me the illusion that my fps doubled! Fraps is reporting 1 fps, so effectively I feel I'm playing at a blistering 2."
 
Luminescent said:
Indeed, why would we want a motion-blurred title if the technique incurs more of a performance hit than it could justify by blurring over the choppiness? "Wow, the blur gives me the illusion that my fps doubled! Fraps is reporting 1 fps, so effectively I feel I'm playing at a blistering 2."
Now that's an ignorant statement. Motion blur will never be used to drop the framerate. What will happen is that if a game decides to use motion blurring, it will drop detail elsewhere to keep performance roughly the same. The issue is that we don't yet have enough spare fillrate for motion blurring to make sense.
 
Chalnoth, I was just pointing out some extreme situational irony. I know that game engines are smarter than that and modern cards have fillrate to spare relative to their bandwidth constraints.

We can easily conclude that motion blur will be most effective at relieving visual choppiness when the bottleneck is not fillrate and there is enough fillrate to spare for the effect to run at a decent framerate (movies get by at 24 fps). The problem is that "decent enough" and "sufficient" have to be determined case by case, and, frankly, PC engines can only be so scalable and driver compilers only so intelligent.
 
Chalnoth said:
hovz said:
I play a lot of UT, and my aim is very sensitive to the smoothness of the game. I find anything over 85 is overkill; however, if it drops under 85 I feel a difference.
Maybe you're running your refresh rate at 85Hz?

100
 
Chalnoth said:
Now that's an ignorant statement. Motion blur will never be used to drop the framerate. What will happen is that if a game decides to use motion blurring, it will drop detail elsewhere to keep performance roughly the same. The issue is that we don't yet have enough spare fillrate for motion blurring to make sense.
Is there such a significant penalty on fillrate to using the accumulation buffer? Or are you talking about other techniques, since most non-ATI cards don't support it in hardware?
 
Ostsol said:
Is there such a significant penalty on fillrate to using the accumulation buffer? Or are you talking about other techniques, since most non-ATI cards don't support it in hardware?
Well, I think the accumulation buffer implementation is a pretty simplistic one, and would only be useful in removing the limitation of a monitor's refresh rate on framerate.

There are techniques that I've seen that attempt to use more elegant methods to get more correct motion blurring with less processing power, but they still eat up lots of fillrate.

One way that you could do it, for example, is have a velocity vector associated with each vertex, and use a combination of pixel shader and vertex shader math to "smudge out" each triangle. Since this would be a linear interpolation, it wouldn't be perfect, and would look pretty bad for anything changing direction significantly when displayed at low framerates, but it should work okay most of the time. A second order approach could also attach an acceleration vector to improve this (with little increase in processing requirements). This approach would eat a lot of pixel processing power, and a relatively small amount of vertex processing power.
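
As a sketch of the per-vertex math involved (Vec3 and all the names here are made up), the position fed to the rasterizer would be swept over the shutter interval:

    // Per-vertex "smudge" math: each vertex carries a velocity (and optionally
    // an acceleration), and its position is swept along the motion path over
    // the shutter interval. Vec3 stands in for the shader framework's vector type.
    struct Vec3 { float x, y, z; };

    Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

    // t runs over the shutter interval, e.g. [0, shutterSeconds].
    Vec3 smudged_position(Vec3 p, Vec3 velocity, Vec3 acceleration, float t) {
        // The linear term alone is the first-order approximation that breaks
        // down under fast direction changes; the acceleration term is the
        // cheap second-order correction.
        return p + velocity * t + acceleration * (0.5f * t * t);
    }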

Another thing you can do is sample the multiple images that make up the final frame unevenly. Similar to how, with a large number of FSAA samples, it makes sense to start using stochastic sampling, one could randomly sample between, say, 10 frames rendered per one displayed. This may be a bit cheaper on fillrate than a plain accumulation buffer, but would be much more expensive in geometry processing and less efficient in memory bandwidth.
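
A sketch of picking those sample times stochastically, here with stratified jitter so coverage stays even while the regular pattern is broken up (all names invented):

    // Stratified stochastic sampling of sub-frame times: one random sample
    // inside each of `samples` equal slices of the frame interval, instead
    // of a fixed even spacing.
    #include <random>
    #include <vector>

    std::vector<double> jittered_sample_times(double frameStart, double frameSeconds,
                                              int samples, std::mt19937& rng) {
        std::uniform_real_distribution<double> jitter(0.0, 1.0);
        std::vector<double> times(samples);
        for (int i = 0; i < samples; ++i)
            times[i] = frameStart + frameSeconds * (i + jitter(rng)) / samples;
        return times;
    }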
 