It's not always simple for a developer to "turn down a few settings" in order to double the framerate. If a studio targets 30 fps, there's probably a good reason for it.
I laid out the reasons why a studio might decide to target 30 fps: because graphics sell. It's easier to sell your game on the premise of good graphics, because that's what people tend to see first. Games are marketed through magazines and the internet - and even as videos on YouTube and other platforms become bigger factors in game sales, the way footage is captured and compressed makes it easy to mask a game running at a less than optimal framerate. Add to that that framerate is not just something you see, it's something you feel - something you can't relate to at all when just watching a pre-recorded video review of a game.
Graphics sell.
Maybe I can illustrate my point with this little diagram I've drawn up:
What you see here is a timeline that shows not just time but, more importantly, technological progress and graphical perception. I say perception because, technically, what you have once your hardware is finalized is what you have - and everything, right down to game mechanics, framerate and texture resolution, is a compromise limited by what your hardware can do.
What I tried to portray is that graphics and framerate are directly related to each other. Up the graphical fidelity and it comes at the direct expense of framerate, and vice versa. Of course, this is a simplified way to look at it - technically, you'd have to add the other aspects to this diagram, but given the focus of this topic, it's easier to just compare the two in isolation.
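To make the tradeoff a bit more concrete, here is a minimal back-of-the-envelope sketch of the per-frame time budget at different target framerates. The numbers are just arithmetic, not profiling data from any real game:

```python
# Rough frame-budget arithmetic (illustrative only, no real game data).

def frame_budget_ms(target_fps: float) -> float:
    """Time available to produce one frame at a given target framerate."""
    return 1000.0 / target_fps

for fps in (15, 30, 60):
    print(f"{fps:>2} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# Output:
# 15 fps -> 66.67 ms per frame
# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame
#
# Going from 30 fps to 60 fps halves the time the CPU and GPU have for
# every frame, so roughly half of the per-frame work (shading, effects,
# simulation) has to be cut or optimized away. That's why "just turn down
# a few settings" rarely doubles the framerate in practice.
```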
My point is, when the new hardware launches in 2013 or 2014, you'll have a line on that diagram somewhere, with graphics limited by the technological progress of that given time. We don't know yet what kind of graphics that will translate to - we can only make educated guesses based on what is possible on the PC today, because the PC is an evolving platform.
What we get is what we will get used to. If console makers enforced a hypothetical 60 fps minimum, we'd get used to that. We wouldn't know how much more would be possible, just as today, on our current platforms, we can only guess how much better games would look if they shipped with a target framerate of 15 fps (half of what is the norm).
In other words, this argument about wanting better graphics is all relative anyway. We can only judge what we actually get and see. If we stick to what we know is a solid framerate for all genres (60 fps), our graphical perception will be limited to that - and I really don't see how that is bad. If we want better graphics, then by the same argument, why not just wait another half a year to launch the hardware? Or buy a platform that keeps evolving, like the PC?
Setting a mandatory framerate floor of 60 fps instead of 30 fps is, IMO, a small price to pay, relatively speaking. But it's only really possible at the start of a new platform, enforced by the hardware maker, because given how the industry works and how games are sold, it's doubtful that studios will make that choice on their own.
EDIT: Realistically, I would say the lines in my diagram are far too close together. I would guess the difference between, e.g., 480p/30 and 480p/60 is bigger than a few months - probably closer to a year? It's just a minor point, but one I wanted to make before someone else points it out. And I was too lazy to change the diagram.