I went from a steady 60 fps to something more like 130.
What was your refresh rate?
Strangely, Quake 3 at 1000 fps seems worse than at 60.
Well, off the top of my head, there are two reasons not to use high frame rates, and both are cost related.

You sure? No IMAX movie that I've seen is 60 fps.
Looks like it was short lived and abandoned.
So what's better:
30 fps on a single card, or 50 fps with SLI/Crossfire? (average fps)
Of course slowdowns are more noticeable. But 30 fps, even when constant, is noticeable too. There's nothing wrong with a constant 30 fps, but a fluid 60 fps would be even better.

It's still the case, though, that slowdowns have the much more dramatic impact on gameplay.
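For a quick sense of why 60 fps feels more fluid than 30, it helps to look at the per-frame time budget each rate implies. A minimal sketch (frame rates taken from the posts above):

```python
# Per-frame time budget at the frame rates discussed in this thread.
# 1000 ms divided by the frame rate gives the time each frame is on screen.
for fps in (30, 50, 60, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: at 30 fps each frame lingers
# twice as long, which is why constant 30 still reads as less smooth.
```

This also shows why a drop from 60 to 30 is so jarring: the frame time doubles, rather than changing by some small fraction.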
I'm just giving an example of what the demo looks like. I thought it would be obvious that you'd have to see it in motion to notice the difference in framerate. The demo and glide wrapper together are only a few MB; seeing the demo would dispel any doubts some of you may have about the differences in framerate. It is clearly evident when a game is at 30 fps versus 60 fps; it's not even difficult to distinguish. The difference is especially pronounced when switching on progressive scan on last-gen console games, since 480p allows for full-frame 60 fps.

There's no motion blur in those pics.
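A back-of-envelope way to see why progressive scan helps: a rough sketch of scanlines delivered per second for 480i vs 480p (simplified; NTSC's actual 59.94 Hz timing is ignored):

```python
# Simplified 480i vs 480p comparison (assumes a flat 60 Hz, not 59.94).
FIELD_RATE = 60          # fields (interlaced) or frames (progressive) per second
LINES_480I = 480 // 2    # each interlaced field carries only half the scanlines
LINES_480P = 480         # each progressive frame carries every scanline

interlaced_lines = FIELD_RATE * LINES_480I    # lines of picture per second, 480i
progressive_lines = FIELD_RATE * LINES_480P   # lines of picture per second, 480p
print(interlaced_lines, progressive_lines)
```

So 480p delivers twice the picture information per second at the same refresh, which is why a full-frame 60 fps game looks so much cleaner in progressive scan.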
Considering TVs have a 60 Hz interlaced refresh, there won't be any difference. You'd need some sort of motion blur tech to see any.

Haven't tried a 120 fps vs 60 fps comparison on a big-screen TV yet, though I think it might just be my next project after I get back from vacation.