Insomniac says no to 60 frames per second

Still got a 9 in graphics on IGN, for example...

Well, I don't think a bad framerate should lower the graphics score, but the gameplay score instead. In Mass Effect the stuttering didn't even bother me that much; the long elevator rides were more annoying. The game was a gem anyway.
 
I agree. Given that (unless I'm mistaken) the vast majority of console games are 30 FPS, most gamers, regardless of whether they're aware of the concept of FPS or not, will be conditioned to 30 FPS. For most it's not 30 FPS vs. 60 FPS that bothers them, but falling substantially below 30 FPS becoming noticeable.

You can take the same game, run it at 30 FPS on one setup and 60 FPS on another. A lot of gamers may be readily able to tell the difference and feel 60 FPS is better. The problem with this scenario, however, is: how often in the real world is your average gamer going to experience it? Furthermore, 30 versus 60 is not a game breaker. A crappy 30 FPS game doesn't suddenly become a good game just by upping the frame rate.

Nah man. They still notice the difference (read my last post and the one above this one), but they may just not care about it or be able to explain it. If you cannot see the difference between 60Hz and 30Hz, you have faulty eyesight. Everybody can do it, but casuals just don't care. Just like casuals don't give a rat's ass about resolution as long as it looks good.
 
Svensk Viking said:
I also hate that the "professional" reviewers don't seem to care about the framerate... Does anyone remember Mass Effect for the 360? It was hailed for its graphics, but had a terrible framerate.

I can't think of a single review that didn't point out the shortcomings of ME's performance, actually.

I cannot think of a single review that didn't absolutely smash ME for its piss-poor framerate either.
 
liolio said:
If the renderer runs @30fps, it doesn't prevent the physics engine from running way faster (360 updates per second).
Which changes nothing regarding visual updates - the game will still look like 30fps, the same as any other 30fps game.

For the record, there are also a number of high-profile titles out there that render at 60fps but update the simulation at 30 (animation, physics, AI, whatever). They look just as 60fps as those that do it the "standard" way - frankly, as far as compromises go, that's one I can actually get behind.
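The render/simulate decoupling described here is commonly implemented as a fixed-timestep loop (a generic sketch, not any particular engine's code): the simulation always advances in fixed steps, accumulating whatever time the last rendered frame took, so the two rates are independent.

```python
# Generic fixed-timestep loop sketch: the simulation ticks at a fixed
# rate while rendering runs at whatever rate the frame time allows.
def run(n_frames, frame_time, sim_dt):
    """Render n_frames frames of frame_time seconds each, stepping the
    simulation in fixed sim_dt increments. Returns total sim steps."""
    accumulator = 0.0
    sim_steps = 0
    for _ in range(n_frames):
        accumulator += frame_time      # time the last frame took
        while accumulator >= sim_dt:   # catch the simulation up
            sim_steps += 1             # physics/AI/animation tick at fixed dt
            accumulator -= sim_dt
        # ...render here, typically interpolating the last two sim
        # states by the leftover accumulator / sim_dt fraction...
    return sim_steps

# One second rendered at 30 fps with a 120 Hz simulation: ~120 ticks.
print(run(30, 1.0 / 30.0, 1.0 / 120.0))
# One second rendered at 60 fps with only a 30 Hz simulation: ~30 ticks.
print(run(60, 1.0 / 60.0, 1.0 / 30.0))
```

The interpolation step in the comment is what lets a 60fps presentation sit on top of a lower-rate simulation without looking steppy.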


Anyway, back on topic, I can see where Insomniac is coming from, but I don't entirely agree.
One thing I've noticed working on online games is that framerate is a pretty common concern for players (even if the game is not critically dependent on it). Largely they equate it with network lag (and to be fair, the two can be difficult to distinguish, depending on how your network code works), but stable fps is actually held in pretty high regard. And in terms of bearing on commercial success, game performance in this market segment translates to userbase size, so it has a very clear effect (graphics quality, not so much, IME).
 
Those arguing over whether they themselves, hardcore gamers, casual gamers or anyone else can distinguish between 30 Hz and 60 Hz are missing the point - quoting 3dfx demos, comparing Xbox 360 to PS3 versions, and so on.

The question is not whether games should display the exact same content at 30 or 60 Hz. The question is whether the game should display much richer content at 30 Hz, or cut-down content at 60 Hz. And bear in mind that with today's highly parallel machines (starting with even the CPU/GPU parallelism of the simplest single-core PC or the Wii), a 30 Hz image should be more than twice as rich as a 60 Hz one.
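A quick back-of-the-envelope calculation shows why "more than twice" is plausible: each frame's fixed overhead (input polling, sync points, buffer swaps - the 3 ms default below is a made-up figure, and real numbers vary per engine) is paid half as often at 30 Hz, so the usable budget ratio exceeds the raw 2x.

```python
# Rough frame-budget arithmetic: a 30 Hz frame gets twice the wall-clock
# time of a 60 Hz frame, and fixed per-frame overhead is paid half as
# often, so the usable rendering budget more than doubles.
def usable_budget_ms(fps, fixed_overhead_ms=3.0):
    """Milliseconds per frame left for actual rendering work."""
    return 1000.0 / fps - fixed_overhead_ms

ratio = usable_budget_ms(30) / usable_budget_ms(60)
print(f"30 Hz: {usable_budget_ms(30):.1f} ms, "
      f"60 Hz: {usable_budget_ms(60):.1f} ms, ratio: {ratio:.2f}")
```

With the assumed 3 ms overhead the ratio comes out around 2.2, and any parallelism that overlaps frames only pushes it higher.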
 
I cannot think of a single review that didn't absolutely smash ME for its piss-poor framerate either.

Yeah, I remember it being highlighted in lots of reviews back then. They mentioned gameplay almost stalling in larger combat scenes due to very low framerate.


But as for 30 vs 60fps, it really is noticeable. Even those not into graphics say it just seems more 'vivid, real, natural', as long as it doesn't look like the game is running at what could be called fast-forward speed. :smile:
 
30 fps actually looks more cinematic, too, which is the look many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use. That's why it's such a hyped-up feature on camcorders. Even motion blur is not always there in cinema; I've seen panning scenes in movies that clearly used a fast shutter to make each frame crisp.

Aside from hardcore gamers, the conditioning to 30fps has always been there.
 
My issue here is that most games quoted as 60fps merely go up to 60fps, while occasionally dipping below that point. Similarly, games built at 30fps are rarely rock-solid at 30fps. Saying your game will be better looking at 30fps doesn't mean much to me when it will dip into the teens when the action is high (i.e. when I need it most). Graphics negatively impacting gameplay should never happen, regardless of the "bling" factor.

This isn't a shot at Insomniac - their engines are generally quite nice - but a 60fps target is usually a great sign that a game won't drop below a comfortable framerate. When I hear a game is 30fps, I usually expect the engine to chug in parts down to barely playable levels... which has become so common that I sigh and buy it anyway. Maybe I'm the real problem. :)
 
After spending years reading about technology and some of the perennial debates of console generations and gaming in general, I have come to the conclusion that I personally don't give a shit. More power to Insomniac for designing the game primarily around graphics rather than framerate, and more power to any developer that focuses on framerate over graphics. Honestly, if more developers/publishers spent time making a quality title rather than worrying over these "bulletpoints", I'm sure I would enjoy gaming considerably more.

From a business standpoint, I think it goes without saying that a better-looking game will sell more copies than a game with the higher framerate (a generalization, of course).
 
30 fps actually looks more cinematic, too, which is the look many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use.

The video vs film look has much more to do with the piss-poor dynamic range of most electronic image sensors compared to film than with 60 fps vs 24 fps. I've seen many 24 fps shot-on-video movies that look like video, and some 48 fps IMAX films that look like film.

I'm a film buff and movie collector, and 24 fps is the bane of my existence. It's a horrible legacy that I hope the industry moves away from in the era of digital theatre.
 
The video vs film look has much more to do with the piss-poor dynamic range of most electronic image sensors compared to film than with 60 fps vs 24 fps.
I've heard this explanation before, but it's just not true. You see this in low dynamic range indoor scenes as well. Not only do avid videographers love 24p and 30p over 60i/p, but home theatre buffs also find the motion interpolation of 120Hz sets to be distracting when watching film.

For video, the "too smooth" look of 60i just looks so amateur, because that's been the difference between home video and professional work that is broadcast on TV or shown in the theater. For games, it depends on whether you want to have the arcade look or the watching-a-movie look.
 
... For video, the "too smooth" look of 60i just looks so amateur, because that's been the difference between home video and professional work that is broadcast on TV or shown in the theater. For games, it depends on whether you want to have the arcade look or the watching-a-movie look.

When I see a 60fps game, I can see the motion is smoother. However, I've also always had a strange feeling about it. A feeling that I don't understand and couldn't quite put in words. You may have just done that for me. :eek:
 
30 fps actually looks more cinematic, too, which is the look many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use. That's why it's such a hyped-up feature on camcorders. Even motion blur is not always there in cinema; I've seen panning scenes in movies that clearly used a fast shutter to make each frame crisp.

Aside from hardcore gamers, the conditioning to 30fps has always been there.

Maybe, but for me personally, it's something I've hated in cinema - it can be jarring, especially in big panning shots. It bothered me in sections of Planet Earth, for instance. The other day, when I went to see a 3D movie for the first time, one of the things I enjoyed most, apart from the 3D of course, was the smoothness thanks to the high framerate (this was Pixar's Up).

I think you can easily get a cinematic experience in a different way - but often there's just not enough juice left in the engine to do those kinds of filters at 60fps.
 
30 fps actually looks more cinematic, too, which is the look many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use. That's why it's such a hyped-up feature on camcorders. Even motion blur is not always there in cinema; I've seen panning scenes in movies that clearly used a fast shutter to make each frame crisp.
Film has motion blur to go along with the low framerate. When film lacks motion blur, as you mentioned, you get judder, which is something film enthusiasts hate. With 3D rendering at a low framerate you get after-images. And with games you have an input device that benefits from the smoother control of 60fps.
Aside from hardcore gamers, the conditioning to 30fps has always been there.
Oh man, I remember when everyone was striving for 60fps. In my noob days I questioned people's ability to notice 60fps on a forum and got my ass owned in that thread. What's happened? Has people's fanboyism for these consoles made them lower their standards?

Next thing you know, people are going to say listen servers are fine instead of dedicated ones. Oh wait... Or maybe next people will say high pings are fine since games have become so good at masking latency. GGPO is magic now!
 
Most mainstream consumers can easily see the difference between 1080p and 720p in a setting like Best Buy, because that setting lets you compare two images side by side. But send those same consumers at random to their neighbors' homes, and they would be practically unable to tell which HDTV was a 720p set and which was 1080p.
Anecdote: I got a PS3 recently and was playing BioShock. The frame rate was a bit jittery, so I dropped the machine down from 1080p to 720p. On the PS3, this requires stopping and restarting the game. Even with just the one or two minutes needed to switch, I didn't notice the lower resolution at all (certainly not the way I noticed switching from 800x600 to 640x480). I did notice the game running a little more smoothly, though.
 
BioShock doesn't render at 1080p; it upscales to 1080p. The only disc games I know of that render native 1080p are Lair and Virtua Tennis 3.
 
This news truly saddens me, as I've seen Insomniac as one of the last devoted to 60 fps - to "gameplay over graphics", in that sense. To see them go the other way is very disappointing.

I'm quite at a loss for words at the stupid poll that was used to form the argument. A poll without context isn't worth much, and this poll certainly didn't have any beyond framerate. Surely it depends on the type of game whether a 60 fps framerate is beneficial or not. In any game where you have a freely movable camera (technically, any shooter, third-person game or platformer), you can appreciate the higher framerate.

In some games the line is more blurred, as the pace is limited by the movement of the player or the speed at which the camera can be rotated. To imagine a game like Ratchet & Clank reduced to 30 fps is just awful.

Sadly, with all the credit Naughty Dog is receiving for their 30 fps games, it was just a matter of time. The market is turning increasingly shallow, with everything directed at visuals and screenshots. Can't really blame them - if you want to survive in today's market, you'd better have something that looks good.

I really hope console makers enforce a 60 fps rule next generation, leveling the playing field for all. If everyone sticks to the same framerate, there should be no reason to go "lower for better". That framerate, however, should be 60 and not 30.


This move by Insomniac looks to me like a hurt reaction to Ratchet & Clank underwhelming visually and to all the rave over Naughty Dog's Uncharted 2. Looking back at our R&C thread, there do seem to be an awful lot of posters unappreciative of what they are doing. Sadly, I bet an awful lot of that criticism comes from people who have absolutely no intention of playing the game, regardless of the framerate or how the game looks. I just can't believe the market is becoming THIS shallow.

:devilish:
 