Ok, if I may play Devil's Advocate for a sec: a quick search of single GPUs running Crysis 2 finds that at 1920x1200 (fairly close to 1080p), on the 'Extreme' preset, only the GTX 580 was able to break 50 fps.
Now it seems to me that people here are of the opinion that Crysis 2 'Extreme' graphics at 1080p/60 fps will be standard in the next gen. I understand it's 2 years out, but that seems a shade optimistic. I'd love to see it.
The tech enthusiast response to your concern would be that the new consoles from Sony and MS will be implemented on lithography one or even two generations beyond the GTX 580's, and that this will allow them to match or even surpass it while being significantly smaller and cooler running.
The less trivial response would be that the console scenario of having a set target resolution and framerate, and a fixed resource budget to accomplish it with, is very different from benchmarking a PC program. Take Crysis 2 as an example: it has a ton of settings and techniques, some of which are quite inefficient for what they accomplish. But since they can be turned off if you don't want them, and since the game will run on everything from mediocre hardware of yesteryear to top-of-the-line cards years in the future, they might as well be made available. Plus, they might make a certain sponsor look good in benchmarks, and reviewers like to run your product at its highest settings, because there it taxes the high-end hardware, and thus we can pretend that these devices have some feeble justification outside e-penis concerns. The carousel of hype, articles, ad revenue, and retail sales goes around another turn. Everybody wins, right?
Those extreme settings that reviewers like to use are typically very far from optimal in terms of visual return on rendering investment. You can experiment with tweaking the settings down: how far down can you go without affecting the visual impression of the game more than minimally? Or go the other way: start from minimum settings and increase them judiciously until you notice responsiveness being affected, or until you hit a certain target framerate on your system (a rough sketch of that kind of sweep is below). Then max the settings and see how much you actually gained visually from that increase in rendering load, and how much it cost you in responsiveness.
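Not that this needs code, but here's a minimal sketch of the "work up from minimum" sweep I mean. The benchmark_fps() helper, the setting names, and the level counts are all made up for illustration; they are not Crysis 2's actual options, and in practice you'd be eyeballing the visual side rather than scripting anything.

```python
# Rough sketch of a "start from minimum, raise settings until the target
# framerate is at risk" sweep. benchmark_fps() stands in for however you
# measure average fps (a repeatable demo run, built-in benchmark, etc.).

TARGET_FPS = 60
MAX_LEVEL = 3  # e.g. 0 = low, 1 = medium, 2 = high, 3 = extreme

# Hypothetical settings, ordered roughly by visual payoff per cost.
settings = {
    "texture_quality": 0,
    "shadow_quality": 0,
    "post_processing": 0,
    "water_detail": 0,
    "motion_blur": 0,
}


def benchmark_fps(cfg):
    """Placeholder: apply cfg, run a repeatable benchmark, return average fps."""
    raise NotImplementedError


def sweep_up(cfg):
    """Raise one setting at a time, keeping each notch only if we stay on target."""
    for name in cfg:
        while cfg[name] < MAX_LEVEL:
            cfg[name] += 1
            if benchmark_fps(cfg) < TARGET_FPS:
                cfg[name] -= 1  # that notch costs more than it's worth; back off
                break
    return cfg
```

The interesting part of the experiment is the comparison afterwards: how different does the result of a sweep like this actually look next to the all-maxed configuration, and what did the difference cost in framerate? That judgment your own eyes have to make.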
These experiments do not fully mimic the console situation of targeting specific hardware and a fixed resolution/framerate, but they give an idea of how much can be gained in visual quality per unit of rendering cost by making the right set of compromises. A lot. And on consoles you can work in closer harmony with the particular hardware, and even have a dialog with the art asset creators about how to maximize the result given the peculiarities of the platform, which will produce even better visual results for your rendering effort.
Console games can look surprisingly good given the hardware, because knowledgeable people sat down and made the right compromises. Which is rather the opposite of reviewers benchmarking PC graphics cards with every setting on Max.