I agree. Given that (unless I'm mistaken) the vast majority of console games run at 30 FPS, most gamers, whether or not they're aware of the concept of FPS, will be conditioned to 30 FPS. For most it's not 30 FPS vs. 60 FPS that bothers them, but falling substantially below 30 FPS becoming noticeable.
You can take the same game, make it 30 FPS on one setup and 60 FPS on another, and a lot of gamers may readily be able to tell the difference and feel 60 FPS is better. The problem with this scenario is how often, in the real world, your average gamer is actually going to encounter it. Furthermore, 30 versus 60 is not a game-breaker: a crappy 30 FPS game doesn't suddenly become a good game just by upping the frame rate.
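To put rough numbers on the difference (my own illustration, not from the posts above): the gap between 30 and 60 FPS is a per-frame budget of about 33.3 ms versus 16.7 ms, and dropping below 30 stretches each frame even longer.

```cpp
#include <cstdio>

// Frame-time budget in milliseconds for a target frame rate.
double frame_budget_ms(double fps) { return 1000.0 / fps; }

int main() {
    const double rates[] = {60.0, 30.0, 20.0};
    for (double fps : rates)
        std::printf("%4.0f FPS -> %5.1f ms per frame\n", fps, frame_budget_ms(fps));
    // 60 FPS -> 16.7 ms, 30 FPS -> 33.3 ms, 20 FPS -> 50.0 ms
}
```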
Svensk Viking said:
I also hate that the "professional" reviewers don't seem to care about the framerate... Anyone remember Mass Effect for the 360? It was hailed for its graphics, but had a terrible framerate.
I can't think of a single review that didn't point out the shortcomings of ME's performance actually.
liolio said:
If the renderer runs @30fps it doesn't prevent the physics engine from running way faster (360 updates per second).

Which changes nothing regarding visual updates - the game will still look 30fps, as much as any other 30fps game.
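For anyone wondering how a physics engine can tick at 360 updates per second under a 30fps renderer: the standard approach is a fixed-timestep accumulator loop. A minimal sketch (the function names and loop shape are my own illustration, not anything from a specific engine):

```cpp
#include <chrono>

// Hypothetical engine hooks -- stand-ins for whatever the real engine calls.
static void step_physics(double dt) { (void)dt; /* advance simulation by dt seconds */ }
static void render_frame() { /* draw current state; vsync caps presentation near 30 Hz */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double physics_dt = 1.0 / 360.0;  // 360 physics updates per second
    double accumulator = 0.0;
    auto previous = clock::now();

    while (true) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Drain the elapsed time in fixed steps: at 30fps rendering,
        // that's roughly 12 physics ticks per rendered frame.
        while (accumulator >= physics_dt) {
            step_physics(physics_dt);
            accumulator -= physics_dt;
        }

        render_frame();  // the player still only *sees* ~30 visual updates per second
    }
}
```

The simulation stays accurate and responsive at 360 Hz, but as the post says, none of that changes what reaches the screen.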
Still got 9 in graphics on IGN for example...
I cannot think of a single review that didn't absolutely smash ME for its piss-poor framerate either.
30 fps actually looks more cinematic, too, which is the look that many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use.
I've heard this explanation before, but it's just not true. You see this in low dynamic range indoor scenes as well. Not only do avid videographers love 24p and 30p over 60i/p, but home theatre buffs also find the motion interpolation of 120Hz sets to be distracting when watching film.

The video-vs-film look has much more to do with the piss-poor dynamic range of most electronic image sensors compared to film than with 60 fps vs 24 fps.
... For video, the "too smooth" look of 60i just looks so amateur, because that's been the difference between home video and professional work that is broadcast on TV or shown in the theater. For games, it depends on whether you want to have the arcade look or the watching-a-movie look.
30 fps actually looks more cinematic, too, which is the look that many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use. That's why it's such a hyped up feature on camcorders. Even motion blur is not always there on cinema, as I've seen panning scenes in movies that clearly used a fast shutter to make each frame crisp.
Aside from hardcore gamers, the conditioning to 30fps has always been there.
30 fps actually looks more cinematic, too, which is the look that many devs are going for. On TV, 60 fps is the mark of a handycam, while 30 or 24 fps is what the TV studios use. That's why it's such a hyped up feature on camcorders. Even motion blur is not always there on cinema, as I've seen panning scenes in movies that clearly used a fast shutter to make each frame crisp.

Film has motion blur to go along with the low framerate. When you don't have motion blur in film, like you've mentioned, that's when we get judder, which is something film enthusiasts hate. With 3D rendering at a low framerate you get after-images. And with games you have an input device that benefits from the smoother control of 60fps.
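The shutter point is easy to quantify. Film's classic "180-degree shutter" exposes each frame for half the frame interval, so at 24 fps each frame carries 1/48 s of motion blur; a faster (narrower) shutter shortens the exposure and produces the crisp but juddery pans described above. A quick illustration (the specific shutter angles are generic examples, not taken from the posts):

```cpp
#include <cstdio>

// Exposure (blur) time per frame for a given frame rate and shutter angle.
// 360 degrees = shutter open for the whole frame; 180 = the classic film look.
double exposure_s(double fps, double shutter_deg) {
    return (shutter_deg / 360.0) / fps;
}

int main() {
    std::printf("24 fps, 180 deg: %.4f s of blur per frame\n", exposure_s(24.0, 180.0));
    std::printf("24 fps,  45 deg: %.4f s (fast shutter -> crisp, juddery pans)\n", exposure_s(24.0, 45.0));
    std::printf("60 fps, 180 deg: %.4f s (higher rates need less blur)\n", exposure_s(60.0, 180.0));
}
```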
Aside from hardcore gamers, the conditioning to 30fps has always been there.

Oh man, oh man, I remember when everyone was striving for 60fps. In my noob days I questioned people's ability to notice 60fps on a forum and got my ass owned in that thread. What's happened? Has people's fanboyism for these consoles made them lower their standards?
Most mainstream consumers can easily see the difference between 1080p and 720p in a setting like Best Buy, because that setting lets them compare two images side by side. But let those same consumers go at random to their neighbors' homes with HDTVs, and they would be practically unable to tell which set was 720p and which was 1080p.

Anecdote: I got a PS3 recently and was playing Bioshock. The frame rate was a bit jittery, so I dropped the machine down from 1080p to 720p. On the PS3, this requires stopping and restarting the game. Even with just the one or two minutes needed to switch, I didn't even notice the lower resolution (certainly not the way I noticed switching from 800x600 to 640x480). I did notice the game running a little more smoothly, though.
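The raw numbers back the anecdote up: 1080p pushes 2.25x the pixels of 720p, which is exactly the kind of extra load that can drag a marginal framerate down. A back-of-the-envelope comparison (mine, not from the post):

```cpp
#include <cstdio>

int main() {
    const long p1080 = 1920L * 1080L;  // 2,073,600 pixels per frame
    const long p720  = 1280L * 720L;   //   921,600 pixels per frame
    std::printf("1080p/720p pixel ratio: %.2fx\n", (double)p1080 / (double)p720);
}
```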