Insomniac says no to 60 frames per second

We need to be careful here. A high Metacritic score may not translate into a 30fps vs 60fps argument. Games look good and play well for a good many reasons.

I don't want to put the cart before the horse. I agree that 60fps is not a must. That is all. v_v

There is still a lot of work, and a lot of variables, after this point.

EDIT: Bah... I should just give Mike more room to express his views without overreacting and overanalyzing. In isolation, I see nothing wrong in 60fps or no 60fps.
 
I don't know. I think he's right. I've seen that sort of thought in tons of sandbox games. I've seen people do crazy stuff in Crackdown just to see how much they could make the otherwise solid engine chug.

I suppose it's a lot to ask of a sandbox game to keep its graphics simple enough to maintain a solid framerate even in the rare event that players put an extremely heavy load on the engine.

Insomniac's games so far have all been built around individual levels, so it should be that much easier for them to maintain a solid framerate.
 
The best selling HD game is COD 4 (I believe). The framerate is mostly 60. Insomniac should take notice.

I am not sure it's the framerate that helps COD4. There is more to the feel of a game than the fps. I will try COD on PC with framerate caps and see if it changes, but I know it was still the same game when I played it at 40fps.

I don't think they are giving up on 60fps, just not putting as much emphasis on it.
 
EDIT: Bah... I should just give Mike more room to express his views without overreacting and overanalyzing. In isolation, I see nothing wrong in 60fps or no 60fps.

But it isn't isolated. Better graphics trend toward better reviews. Better reviews trend toward better sales (all things being equal). 60fps doesn't necessarily mean better graphics; indeed, in many cases I believe that if a developer had both a 60Hz version and a 30Hz version with extra eye candy, the 30Hz version would be considered superior.

There are exceptions (e.g. racing sims, fighting games), but you won't see a huge docking of points (let's say 2-3 points) for not hitting 60fps in those games. How many reviews gave Shift a 5/10 for graphics?
 
But it isn't isolated. Better graphics trend toward better reviews. Better reviews trend toward better sales (all things being equal). 60fps doesn't necessarily mean better graphics; indeed, in many cases I believe that if a developer had both a 60Hz version and a 30Hz version with extra eye candy, the 30Hz version would be considered superior.

There are exceptions (e.g. racing sims, fighting games), but you won't see a huge docking of points (let's say 2-3 points) for not hitting 60fps in those games. How many reviews gave Shift a 5/10 for graphics?
Thank you. I was about to say something very similar, but not as well worded.

Also, does RC really need 60fps? Couldn't their clever artists, programmers, and animators deliver comparable gameplay at a rock-solid 30fps with motion blur?


Off-topic: RC seems like a perfect candidate for SSAO to me.
 
I completely agree with Insomniac.

60FPS is a tool, not a requirement. It is meant to convey something; it is not a standard, as many games on the market demonstrate.

For certain games it is a boon, and for others it's simply there to be there, because it doesn't add more to the experience than other design tradeoffs would have.

As a matter of preference, I can live without 60FPS in exchange for being amazed, and I emphasize that qualifier vociferously. Tradeoffs are done to improve the experience, not to cop your way out of delivering more, or the best experience you can, to the player.

Many, many high-profile games do not run at 60FPS because it's not a distinguishing factor for those games.

I've on more than one occasion hoped that Insomniac would drop 60FPS for a game, because it's so obvious to me what they could accomplish with more frame time.

I am not sure that 30FPS as a standard makes any more sense, because I don't see the value in adhering to anything other than "do what's best for the game." Though I am not sure about that decision (if I understand Mike correctly), I do feel that more often than not potential is wasted on 60FPS when 30FPS + "more of whatever" would make for a better experience in many games.

If gamer response is a rubric, then many, many titles support this conclusion.
 
Of course great graphics tend to correlate with the overall score, and that's because publishers that put a lot of work into visuals tend to put a lot of work into their overall product. There don't tend to be a lot of games that are "WOW" in terms of visuals, yet "Man, this is a POS!!!" in terms of gameplay.

There's no correlation between overall score and 60 FPS because high FPS isn't, in and of itself, a metric of quality development. 60 FPS is an option even for a lazy or untalented dev, because 60 FPS isn't the hard part; it's balancing maximum visuals against 60 FPS that's difficult for just about everybody.

Devs tend to sacrifice 60 FPS because only a fraction of the market can tell the difference. Furthermore, that phenomenon is reinforced because only a fraction of games are released at 60 FPS, so the majority of the market is left without a lot of 60 FPS experiences.

Just about any gamer can discern the difference between framerates and resolutions given the right circumstances. This is why it's important to create studies that mirror reality. And the reality is that people's ability to see or feel a difference in framerate and resolution is determined by how fresh the experiences are in their memory. Most mainstream consumers can easily see the difference between 1080p and 720p in a setting like Best Buy, because that setting allows you to easily compare two different images. But let those same consumers go randomly to their neighbors' homes with HDTVs, and they would be practically unable to tell which set was 720p and which was 1080p.

Same goes for framerates and overall game IQ. If you really want to know whether people can tell the difference between AA settings, space the exposure of the two images to your subjects by two days or more. The point of spacing out the exposures is to prevent subjects from forming a strong baseline, so that they judge each image more independently. A person really sensitive to framerates, AA settings and resolution will tend to be able to judge an image without needing to compare against fresh memories.
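A minimal sketch of the spaced-exposure design described above, as it might be scripted for a study. The function name, two-day gap, and condition labels are illustrative choices, not anything from an actual study:

```python
import random

# Toy schedule for a spaced-exposure comparison: each subject sees one
# condition per session, sessions a couple of days apart, with the order
# randomized so memory of the first image can't anchor judgment of the second.
def make_schedule(subjects, gap_days=2, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    schedule = {}
    for s in subjects:
        order = ["30fps", "60fps"]
        rng.shuffle(order)  # randomize which condition comes first
        schedule[s] = [(0, order[0]), (gap_days, order[1])]
    return schedule

print(make_schedule(["s1", "s2", "s3"]))
```

The point of the randomized order is that any residual memory effect averages out across subjects instead of biasing one condition.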
 
Boooo! :cry:

I disagree. A fast car in a straightish line is easier to watch at 30fps than an FPS where the player is turning. In quite a few games I find myself wanting a better framerate to avoid the jitter, e.g. Fat Princess last night. Sadly, I think I'm in the minority and most people don't perceive the benefit.

I disagree. When you're turning as a person, your whole screen is changing and a lot is going on: there's monitor refresh rate, motion blur, and your eyes' ability to process fast-changing information. For example, move your hand slowly in front of your face and you can probably see it at 100 fps, but move it fast and all you see is a blur. This tells me that the human eye, or at least my eyes, can't process each "frame" separately if a lot has changed from one to the next. At least not at 60 Hz.

When you're driving in a car, your eyes can more easily tell the difference between 30 and 60 fps, since not a lot is different between two consecutive frames and your eyes have an easier time keeping up. I definitely noticed it when playing GRID and NFS Shift; they weren't enjoyable experiences for me, since the car appeared to "jump" between frames, covering more ground in 1/30s than in 1/60s. This is greatly distracting to me. On the other hand, I don't really feel that NGS2 plays better than God of War. And while playing World at War felt a lot smoother right after Killzone 2, I'd still take the extra graphics.
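The "jump" argument above is just arithmetic, and the numbers are easy to check. The 200 km/h speed is a hypothetical racing-game velocity, picked purely for illustration:

```python
# How far does a car "jump" between frames at 30 vs 60 fps?
speed_kmh = 200                       # hypothetical racing speed
speed_ms = speed_kmh * 1000 / 3600    # ~55.6 m/s

jump_30 = speed_ms / 30  # metres covered per frame at 30 fps (~1.85 m)
jump_60 = speed_ms / 60  # metres covered per frame at 60 fps (~0.93 m)

print(f"{jump_30:.2f} m per frame at 30 fps")
print(f"{jump_60:.2f} m per frame at 60 fps")
```

At 30fps each frame shows the car roughly two car-widths further along than at 60fps, which is why fast motion reads as discrete jumps unless motion blur fills the gap.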
 
What they need is an overhaul of their lighting and shadowing; Resistance 2 ran at 30fps and its lighting and shadowing were seriously lacking. Gameplay-wise, R2's scenarios and boss fights were lame, and framerate wouldn't change that at all. All of that boils down to Insomniac Games trying to pump out one game a year; there's probably not enough time to really include significant improvements to the tech, or even to play-test new gameplay ideas.
 
It's a worthwhile discussion to have and I suggest Mike publish more information about their research.
As I said earlier, the industry as a whole should commission some proper research. Maybe Gamasutra could do an investigation if the IGDA won't?
 
Shifty said:
I'd like to know if the general populace can actually even perceive the difference, at least notably. I think most people don't notice any benefit though, which makes it wasted effort to pursue.

I've heard the statement that casuals don't notice the difference in framerates, especially on B3D, but frankly I find it to be a completely ridiculous statement.

People notice 60fps over 30fps. Doesn't matter if they are hardcore gamers or not! We all have the same ability to see; certainly, hardcore gamers don't have any better vision than non-gamers. The only difference is that gamers KNOW that the framerate makes the game smoother, whereas casuals also notice the difference but cannot put it into words.

I only have casual gaming friends in real life. We play Fifa and some CoD. Every single one of them (most of them don't even have consoles at home) notices that when we play Fifa (prior to Fifa 10) on my PS3, it's a lot slower and less responsive than playing it on the X360 at my friend's house. They don't say that the framerate is shitty, because they don't really know what framerate is, but they certainly notice the difference. They might not comment on it, because frankly they don't really care, but everybody is able to see it!

Similarly, it's VERY easy to prove that casuals spot 60/30 fps differences; 3dfx did this a decade ago, when they released a demo that rendered half the screen at 60fps and the other half at 30fps. Everybody with a working set of eyeballs saw that 60fps is more fluid. Case closed: casuals do notice the difference between 60 and 30 fps.
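The idea behind that split-screen demo can be sketched in a few lines. This is not the actual 3dfx demo, just a simulation of the presentation pattern it relied on, assuming a 60 Hz display:

```python
# On a 60 Hz display, the "60fps half" shows a new image every refresh,
# while the "30fps half" holds each image for two refreshes.
refreshes = 60  # one second of display refreshes

left_frames = [r for r in range(refreshes)]        # new image each refresh
right_frames = [r // 2 for r in range(refreshes)]  # each image shown twice

unique_left = len(set(left_frames))    # distinct images per second on the left
unique_right = len(set(right_frames))  # distinct images per second on the right
print(unique_left, unique_right)
```

Side by side, the half with twice as many distinct images per second is what viewers perceive as "more fluid", which is why the comparison is so easy to make in that format.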

The real question is not whether casual gamers can see the difference between 60fps and 30fps (they can, beyond any doubt); it's whether they prefer 30fps and better visuals over 60fps and worse visuals. That question, however, is extremely difficult to answer, as framerate has a lot of impacts aside from smoothness of picture that are hard to measure (responsiveness, how fluid the controls feel, etc.), and how important those impacts are will also depend completely on the game in question.

Nobody cares if Civilization runs at 20fps, but everybody would cry if CoD suddenly started to run at 20fps ;)
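The responsiveness point above also comes down to simple frame-time arithmetic. These are illustrative numbers only; the assumed two-frame input-to-display pipeline is a simplification, and real latency depends on the whole input and display chain:

```python
# Frame time and a rough input-to-display latency estimate at common targets.
for fps in (20, 30, 60):
    frame_ms = 1000 / fps
    # Assume input sampled in one frame and the result displayed the next
    # (a hypothetical 2-frame pipeline):
    latency_ms = 2 * frame_ms
    print(f"{fps} fps: {frame_ms:.1f} ms/frame, ~{latency_ms:.1f} ms input-to-display")
```

Under that assumption, going from 30fps to 60fps roughly halves the control latency, which is why twitch shooters like CoD feel so different from turn-based games like Civilization at low framerates.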
 
What they should really do is a public study into framerates, getting people to view 30fps and 60fps side by side. I imagine the findings would be that most people can't perceive the difference.
As do I. Also I completely agree with Mike's posts about people going wow over visuals and not frame rate. In my opinion, they're absolutely making the right choice.

The best selling HD game is COD 4 (I believe). The framerate is mostly 60. Insomniac should take notice.

COD4 sold because of the content, not the frame rate.
 
As do I. I completely agree with Mike's posts about people going wow over visuals and not frame rate. In my opinion, they're absolutely making the right choice.



COD4 sold because of the content, not the frame rate.

And an important point is the current "hate" towards sub-HD resolution. If a game gets released sub-HD, we get a shitstorm and bad PR.

30Hz instead of 60Hz? Not a peep. Sacrifice 30 frames per second to hit 720p and avoid risking a "public" backlash?

The money would always pick 30Hz.
 
And on that note, I can't believe I'm the first person to suspect this, but Insomniac is toeing the party line here. This whole dialogue reads to me as "the PS3 hardware is pretty much tapped out, and all the new stuff we'd like to do can't be done at 60fps."
 
And on that note, I can't believe I'm the first person to suspect this, but Insomniac is toeing the party line here. This whole dialogue reads to me as "the PS3 hardware is pretty much tapped out, and all the new stuff we'd like to do can't be done at 60fps."

I don't think there's any basis for this.

The same could be said of the PC, X360, or Wii if anyone wanted, but that wouldn't lend any credibility to this theory. The research was about a specific tradeoff, not about what is and is not "tapped out," which encompasses a far larger spectrum than framerates, or visuals alone for that matter.
 
And on that note, I can't believe I'm the first person to suspect this, but Insomniac is toeing the party line here. This whole dialogue reads to me as "the PS3 hardware is pretty much tapped out, and all the new stuff we'd like to do can't be done at 60fps."

That's a bit of a leap, don't you think? All the major PS3 releases have been 30Hz, aside from (again, a genre with a defining framerate) GT5 and whatnot. Aside from the fan crowd, has there really been such a view?

And an important point is the current "hate" towards sub-HD resolution. If a game gets released sub-HD, we get a shitstorm and bad PR.

30Hz instead of 60Hz? Not a peep. Sacrifice 30 frames per second to hit 720p and avoid risking a "public" backlash?

Well, CoD4 also sold millions despite all the ruckus over its sub-HD resolution. Are we to infer that resolution doesn't matter? Anecdotally, I did see plenty of mentions of the smooth framerate in CoD4 reviews. Even in CoD5 reviews I saw mentions of how smooth the game is because of its CoD4 heritage.
 
Every single one of them (most of them don't even have consoles at home) notices that when we play Fifa (prior to Fifa 10) on my PS3, it's a lot slower and less responsive than playing it on the X360 at my friend's house. They don't say that the framerate is shitty, because they don't really know what framerate is, but they certainly notice the difference. ;)
Fifa has been 60fps since Fifa 08 on the PS3, and it didn't slow down much, except in non-playable parts such as pre-game and half-time goal celebrations, where the stands and spectators took up a large part of the screen.
 
People notice 60fps over 30fps. Doesn't matter if they are hardcore gamers or not! We all have the same ability to see...
That's not entirely true. Different people have different degrees of sensitivity and perception. We all have the same ability to experience pain, but where some people can have a dental filling without any anaesthetic, many can't. But that isn't really my point. I certainly don't believe hardcore gamers all have better visual acuity or faster perceptual response than casuals! Put 30fps next to 60fps and perhaps lots of people will notice; if you've done some tests, that's data on the matter. But if you put someone in front of a 30fps game, will they lament the lower framerate and wish for a faster one, or will they not even notice unless they have something to compare it to? As some have said here, once they adjust to watching lower framerates they're okay with it, although it's jarring coming off a faster refresh. Some of us are always aware of framerate and know something's not smooth when we see it, without needing a control to compare against. If most people are like the former, and really don't notice any difference between 30 and 60 unless you show them there is one, then 30fps is definitely the better choice.

I suppose my question shouldn't be phrased 'can they see it?' but instead 'are they sensitive to it?'
 
That makes no sense. Acton had no reason to make the post at all, in that case. All they had to do was make the next game 30 fps and show us fancier graphics.
 