Insomniac says no to 60 frames per second

What they need is an overhaul of their lighting and shadowing; Resistance 2 ran at 30fps and its lighting and shadowing were still seriously lacking. Gameplay-wise, R2's scenarios and boss fights were lame, and framerate wouldn't change that at all. All of that boils down to Insomniac Games trying to pump out one game a year, which probably isn't enough time to really include significant improvements to the tech or even to play-test new gameplay ideas.

Whooa whoa whooa.... lol

I actually thought Resistance: Fall of Man looked great, and so does R2. They definitely weren't subpar in any way, and I think they are great examples of what a solid 30fps can do. The gameplay for me is just as solid as CoD4's (less scripted, though); by gameplay I mean moving and shooting. They did well with their wide open spaces in those games.
 
Well, CoD4 also sold millions despite all the ruckus over the sub-HD rez. Are we to infer resolution doesn't matter? Anecdotally, I did see plenty of mentions of the smooth framerate in CoD4 reviews. Even in CoD5 reviews I saw mentions of how the game is smooth because of the CoD4 heritage.

There was a wildfire on that? Wasn't the res equal on both platforms?
 
That's not entirely true. Different people have different degrees of sensitivity and perception. We all have the same ability to experience pain, but where some people can have a dental filling without any anaesthetic, many can't. But that isn't really my point. I certainly don't believe hardcore gamers all have better visual acuity or faster perceptual response than casuals!

Put 30fps next to 60fps and perhaps lots of people will notice; if you've done some tests, that's data on the matter. But if you put someone in front of a 30fps game, will they lament the lower framerate and wish for a faster one, or will they not even notice unless they have something to compare it to? As some have said here, once they adjust to watching lower framerates they're okay with it, although it's jarring coming off a faster refresh. Some of us are always aware of framerate and know something's not smooth when we see it, without needing a control to compare against. If most people are like the former, and really don't notice any difference between 30 and 60 unless you show them there is a difference, then 30fps is definitely the better choice.

I suppose my question shouldn't be phrased 'can they see it?' but instead 'are they sensitive to it?'

I agree. Given that (unless I'm mistaken) the vast majority of console games are 30 FPS, most gamers, regardless of whether they are aware of the concept of FPS or not, will be conditioned to 30 FPS. For most it's not 30 FPS vs. 60 FPS but falling substantially below 30 FPS that bothers them or becomes noticeable.

You can take the same game, make it 30FPS on one setup and 60FPS on another, and a lot of gamers may readily be able to tell the difference and feel 60FPS is better. The problem with this scenario, however, is how often in the real world your average gamer is going to experience it. Furthermore, 30 versus 60 is not a game breaker. A crappy 30FPS game doesn't suddenly become a good game just by upping the frame rate.
 
I'd just test 30 versus 60 with identical content, just to see if people can recognise it. When playing Booty on PS3, if you switch down to 720p you get a huge increase in framerate. To me this is a significant advantage, but my friends had real trouble spotting it and didn't care. I'd like to know if the general populace can actually perceive the difference, at least notably. To me, 60fps adds a sense of class or quality, and an easiness in viewing the game. I think most people don't notice any benefit, though, which makes it wasted effort to pursue.

That's easy to do. Just get a game that has a solid 30fps framerate and turn on the motion enhancer on a 120Hz TV. I've done it with Uncharted 2 and it looks absolutely stunning compared to running it at 30fps, mostly because everything is so clear. Too bad it was laggy :(

IMO, the problem with better visuals at 30fps is that if you play on an LCD TV the improved details get lost because of the inherent motion blur. 60fps significantly reduces motion blur and for me actually improves IQ. 30fps looks better when you're not moving (or moving very slowly), but in motion 60fps is the winner.
 
There was a wildfire on that? Wasn't the res equal on both platforms?

It's the HD generation. Anything found not to run at 720p causes a stir, especially if it is a popular title. People were mad to find out that the res didn't improve in MW2.
 
I also hate that the "professional" reviewers don't seem to care about the framerate... Does anyone remember Mass Effect for the 360? It was hailed for its graphics, but had a terrible framerate.

I am really terrified seeing how people "love" this news...
Would you love it if developers suddenly dropped V-sync to give you better looks?
 
That's easy to do. Just get a game that has a solid 30fps framerate and turn on the motion enhancer on a 120Hz TV. I've done it with Uncharted 2 and it looks absolutely stunning compared to running it at 30fps, mostly because everything is so clear. Too bad it was laggy :(

IMO, the problem with better visuals at 30fps is that if you play on an LCD TV the improved details get lost because of the inherent motion blur. 60fps significantly reduces motion blur and for me actually improves IQ. 30fps looks better when you're not moving (or moving very slowly), but in motion 60fps is the winner.

I was under the impression that the lack of motion blur and the perception of visual gaps between frames is what makes 30fps less smooth. 60fps overcomes this by reducing the visual gaps perceived by the gamer, making for smoother gameplay.
 
It's not motion blur exactly, it's judder... your eyes are moving but the frame is still. The end effect is still that it does not look sharp when you try to follow something with your eyes instead of the camera.
 
There was a wildfire on that? Wasn't the res equal on both platforms?

I was talking generally about game reviews, where CoD4 had one of the lowest resolutions of the titles shipping at the time. But platform-wise, knowing that, why didn't the PC version gather all the graphical praise, since it wasn't using sub-HD rez? Why is CoD6 going to break all sales records despite using a vertical resolution I used back in 1997?

Anyway, this is a bit of devil's advocacy, my point being that just because you don't hear any praise for framerate in reviews doesn't mean it's not important. Humans often just complain a lot more than they compliment. Also, reviewers probably have top-of-the-range systems (for PC reviews) where any observed good framerates are taken for granted and probably not sufficiently appreciated.

Same reason why "EXCLUSIVE!!!11!!1(one)!!" reviews often don't point out bugs/speed problems: they aren't based on final code, and there's always the excuse that the retail version is going to be faster. So on consoles, too, framerate remarks, or the lack thereof, can't be a strong argument for or against fps as they pertain to review scores.

Or to put it in [strike]pedantic[/strike] scientific research terms: the Unit of Analysis is being compromised by the Unit of Observation.
 
I was under the impression that the lack of motion blur and the perception of visual gaps between frames is what makes 30fps less smooth. 60fps overcomes this by reducing the visual gaps perceived by the gamer, making for smoother gameplay.

The motion blur I'm talking about is from the sample-and-hold effect of LCD displays. 60fps has less blur than 30fps because each frame is onscreen for only half the duration. Motion interpolation and 120Hz are used to reduce blur from sample and hold.
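To put rough numbers on that (a back-of-the-envelope sketch, assuming an ideal sample-and-hold panel, a 600 px/s pan, and an eye tracking the object perfectly):

[code]
#include <cstdio>

// Perceived smear on an ideal sample-and-hold display: while the eye
// tracks a moving object, each static frame is held for 1/fps seconds,
// so the image slides across the retina by (speed * hold time) pixels.
int main() {
    const double speed = 600.0;        // assumed pan speed, px/s
    const int rates[] = { 30, 60, 120 };
    for (int hz : rates) {
        double hold_ms = 1000.0 / hz;  // frame hold duration
        double smear_px = speed * hold_ms / 1000.0;
        printf("%3d fps: held %5.1f ms -> ~%4.1f px of smear\n",
               hz, hold_ms, smear_px);
    }
    return 0;
}
[/code]

Halve the hold time and you halve the smear, which is exactly the 30-to-60fps difference you describe.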
 
The motion blur I'm talking about is from the sample-and-hold effect of LCD displays. 60fps has less blur than 30fps because each frame is onscreen for only half the duration. Motion interpolation and 120Hz are used to reduce blur from sample and hold.

Gotcha
 
I'll agree with Insomniac as long as they don't get the same idea about screen tearing :)

It would definitely be strange not playing Ratchet at 60fps, though. I think even the PSP games were 60fps.
 
That's easy to do. Just get a game that has a solid 30fps framerate and turn on the motion enhancer on a 120hz TV. I've done it with uncharted 2 and it looks absolutely stunning over running it at 30fps mostly because everything is so clear. Too bad it was laggy :(

Motion enhancer? Never heard of it. I'm guessing, though, that it just ups the response time of the monitor? Also, why 120Hz? I looked it up but I'm still confused by the process.
 
I've heard the statement that casuals don't notice the difference in framerates, especially on B3D, but frankly, I find it to be a completely retarded statement.

People notice 60fps over 30fps. It doesn't matter if they are hardcore gamers or not! We all have the same ability to see; certainly, hardcore gamers don't have any better vision than non-gamers. The only difference is that gamers KNOW that the framerate makes the game smoother etc., whereas casuals also notice the difference but cannot put it into words.

I only have casual gaming friends in real life. We play FIFA and some CoD. Every single one of them (most of them don't even have consoles at home) notices that when we play FIFA (prior to FIFA 10) on my PS3, it's a lot slower and less responsive than playing it on the X360 at my friend's house. They don't say that the framerate is shitty, because they don't really know what framerate is, but they certainly notice the difference. They might not comment on it, because frankly they don't really care, but everybody is able to see it!

Similarly, it's VERY easy to prove that casuals spot 60/30fps differences. 3dfx did this a decade ago, when they released a demo that rendered half the screen at 60fps and the other half at 30fps. Everybody with a working set of eyeballs saw that 60fps is more fluid. Case closed: casuals do notice the difference between 60 and 30fps.

The real question is not whether casual gamers can see the difference between 60fps and 30fps (they can, beyond any doubt), it's whether they prefer 30fps and better visuals over 60fps and worse visuals. That question, however, is extremely difficult to answer, as framerate has a lot of impacts aside from smoothness of picture that are hard to measure (responsiveness, how fluid the controls feel, etc.), and how important those impacts are will depend completely on the game in question.

Nobody cares if Civilization runs at 20fps, but everybody would cry if CoD suddenly started to run at 20fps ;)


You know, I've absolutely got to go along with this. If you can't see a difference between 30fps and 60fps then you have faulty eyesight. Sure, you may not care for the difference, you may not be able to explain what's causing it, but you absolutely are seeing it, are processing it and are noticing that it is different.

Here's the perfect test: show the panning hills scene in FPSCompare to anybody in split 30/60 mode and I guarantee you nobody will say both sides look exactly the same. Do the same whilst switching quickly between fullscreen 30fps and fullscreen 60fps; don't tell people what you're testing for or what's causing the difference, just ask them if they perceive a difference, and I guarantee they absolutely will. If they don't, then I suggest you direct them to the optician! :LOL:
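If anyone wants to roll their own version of that test, here's a rough sketch of the idea in C++ with SDL2 (my own toy code, not FPSCompare; it assumes a 60Hz display with vsync honoured): both bars move at the same average speed, but the bottom one is only repositioned every other refresh, so you get ~60fps on top versus ~30fps below.

[code]
#include <SDL.h>

// Toy split-screen 30 vs 60 fps comparison, in the spirit of the old
// 3dfx demo: both bars move at the same average speed, but the bottom
// one is only repositioned every other refresh (~30fps on a 60Hz panel).
int main(int argc, char* argv[]) {
    (void)argc; (void)argv;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("30 vs 60", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 800, 400, 0);
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    float x60 = 0.0f, x30 = 0.0f;
    int frame = 0;
    bool quit = false;
    while (!quit) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) quit = true;

        x60 += 4.0f;                      // moves every refresh (60fps)
        if (frame % 2 == 0) x30 += 8.0f;  // moves every other refresh (30fps)
        if (x60 > 760.0f) x60 = 0.0f;
        if (x30 > 760.0f) x30 = 0.0f;

        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
        SDL_Rect top    = { (int)x60,  50, 40, 120 };  // 60fps half
        SDL_Rect bottom = { (int)x30, 230, 40, 120 };  // 30fps half
        SDL_RenderFillRect(ren, &top);
        SDL_RenderFillRect(ren, &bottom);
        SDL_RenderPresent(ren);           // vsync paces the loop
        ++frame;
    }
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
[/code]

On a 120Hz panel you'd halve the step sizes and use every fourth refresh for the slow bar instead.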
 
I also hate that the "professional" reviewers don't seem to care about the framerate... Does anyone remember Mass Effect for the 360? It was hailed for its graphics, but had a terrible framerate.

I am really terrified seeing how people "love" this news...
Would you love it if developers suddenly dropped V-sync to give you better looks?

I can't think of a single review that didn't point out the shortcomings of ME's performance actually.
 
30fps on consoles is fine as long as two things are taken into account:
- consistency: no fluctuations in frame rate.
- an internal simulation tick decoupled from the render rate, as in the sketch below.
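By internal tick I mean running the game logic on its own fixed timestep, regardless of how fast rendering happens to go; roughly the classic fixed-timestep loop (a generic sketch, not any particular engine's code):

[code]
#include <chrono>
#include <cstdio>
#include <thread>

// Fixed-timestep loop: the simulation always advances in exact 1/60 s
// ticks regardless of render speed, so a framerate dip changes only the
// smoothness of what you see, never the behaviour of the game.
int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;   // fixed simulation tick
    double accumulator = 0.0;
    double pos = 0.0;               // toy game state
    const double vel = 10.0;
    auto prev = clock::now();

    for (int frame = 0; frame < 120; ++frame) {  // stand-in for "while running"
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - prev).count();
        prev = now;

        while (accumulator >= dt) { // run 0..N ticks to catch up to real time
            pos += vel * dt;        // simulate one fixed tick
            accumulator -= dt;
        }

        // render(pos) would go here; the sleep stands in for render cost
        std::this_thread::sleep_for(std::chrono::milliseconds(25));
        printf("frame %3d: pos = %6.2f\n", frame, pos);
    }
    return 0;
}
[/code]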
 
Motion enhancer? Never heard of it. I'm guessing, though, that it just ups the response time of the monitor? Also, why 120Hz? I looked it up but I'm still confused by the process.

A motion enhancer actually interpolates frames and inserts the interpolated frames in between the original ones. This increases the framerate, which reduces the blur caused by the sample and hold inherent to LCDs. 120Hz is important because it is a multiple of 24, 30 and 60.
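To make the multiple point concrete (a trivial sketch):

[code]
#include <cstdio>

// Why 120Hz: every common source rate divides it evenly, so each source
// frame occupies a whole number of refreshes, leaving clean slots for
// the interpolated frames (no uneven 3:2-style cadence).
int main() {
    const int panel_hz = 120;
    const int sources[] = { 24, 30, 60 };
    for (int fps : sources)
        printf("%2d fps source: %d refreshes per frame, %d interpolated\n",
               fps, panel_hz / fps, panel_hz / fps - 1);
    return 0;
}
[/code]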

For games running at 30fps, turning the motion enhancer to high will make it look like it's running at 60fps. A 60fps game looks like 120fps. The result is very noticeable because everything is so smooth and clear. The only downside is that any hitch in framerate will throw off the motion enhancer and make things look strange, and it also introduces a noticeable amount of lag.
 
Whooa whoa whooa.... lol

I actually thought Resistance: Fall of Man looked great, and so does R2. They definitely weren't subpar in any way, and I think they are great examples of what a solid 30fps can do. The gameplay for me is just as solid as CoD4's (less scripted, though); by gameplay I mean moving and shooting. They did well with their wide open spaces in those games.

I'm a big fan of RFOM's gameplay, but graphically it wasn't that great in terms of effects, and R2, when compared to other 30fps games like Killzone 2 and Uncharted 2, is lacking both graphically and gameplay-wise. Much of it has to do with their choice to let their lighting and shadowing tech lag behind: no HDR lighting, much of the environment sporting blatantly noticeable baked shadows or lacking shadowing altogether, and water lacking any kind of reflectivity (which carried over to A Crack in Time). But the biggest disappointment for me was how many of the combat arenas felt so incredibly narrow and lacking in combat options; despite Insomniac Games trying to sell the game on scale, the SP mode was the antithesis of gameplay scale.

I think they're trying to shift the focus to 30 vs 60fps while we kind of overlook the fact that their engine simply lacks a lot of the effects featured in most top-tier games. The player doesn't care what's under the hood; the player cares about what he sees.
 
The only downside is that any hitch in framerate will throw off the motion enhancer and make things look strange
If the TV makers wanted to, they could easily adapt to this... the problem is that the circuitry is designed for 24/25/30 Hz interpolation and not for variable-framerate content. Personally I think the console crowd would present a good marketing opportunity; the changes they'd have to make would be pretty minimal.

The extra input lag is a bit of a downer, but if you limit it to an extra 33 msec then it's not so bad in the grand scheme of things. Of course, if TVs can do a decent job of it, the question becomes: why don't the games do it themselves? (It would still add input lag, but no more than triple buffering does to start with.)
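That 33 msec figure falls straight out of the arithmetic: you can't show an in-between frame until the next real frame exists, so the interpolator has to buffer one source frame (a quick sketch, assuming a single buffered frame):

[code]
#include <cstdio>

// Added lag from motion interpolation: showing an in-between frame
// requires the *next* real frame to already exist, so the display sits
// on each frame for one extra source-frame period.
int main() {
    const int sources[] = { 24, 30, 60 };
    for (int fps : sources)
        printf("%2d fps source: ~%.1f ms extra lag (one buffered frame)\n",
               fps, 1000.0 / fps);
    return 0;
}
[/code]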
 