scooby_dooby said: I just have to ask the question, if we can't even believe that the PC Editor of 1UP.com is capable of determining the difference between 60FPS and low 30's, 40's and 50's, then why the hell does it matter that every game be at 60FPS?
I mean, it's either noticeable or it's not. It's either important, or it's not. This guy is as close as you can get to an expert in the field, and if he is incapable of telling the difference who IS capable? And why does it matter?
This is a little OT; I'm just wondering how some people can claim 60FPS is "essential for next-gen gaming" and at the same time say that nobody, not even PC game reviewers, can tell the difference between 60FPS and lower.
Frankly I would say it's not all that important on consoles, as long as the average is high enough that the minimum under heavy loads doesn't dip below 30fps.
Slower framerates are much more noticeable on the PC though, where mouse control is much more sensitive to lag.
And regarding your question about not believing the devs: if you can show me a statement from the devs explicitly stating that the game never, ever drops below 60fps, then I will accept that for the sake of argument (though I wouldn't bet my life on it being the truth). To me, however, a "stable 60fps" does not imply that the game never drops below 60fps. I could read that to mean that the average framerate is 60fps and the minimum is still high enough that the game's perceived framerate remains stable.
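The average-vs-minimum distinction is easy to show with numbers. Here's a minimal sketch in Python using made-up frame times (all values are hypothetical, just for illustration): a capture can average above 60fps while its worst single frames fall to 20fps.

```python
# Hypothetical frame times in milliseconds: 57 fast frames plus 3 heavy ones.
frame_times_ms = [15.0] * 57 + [40.0, 45.0, 50.0]

# Average fps is derived from the mean frame time over the whole capture.
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# Minimum fps is set by the single slowest frame.
min_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# average: 60.6 fps, minimum: 20.0 fps
```

So a benchmark reporting "60fps average" can hide dips that a player would feel as hitching, which is why the minimum (or the frame-time spikes) matters more than the average.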