I'm not sure if it's meaningful for me to chime in at this point, but here are the things a lot of you are ignoring:
1. You get higher resolution in console games whenever the developer believes that whatever is gained by the higher resolution is worth what's lost in everything else the fillrate could have been spent on. If, next gen, there's so much fill that devs genuinely can't find anything else to do with it, you'll see lots of 1080p games. Current-gen consoles are already capable of 720p games; developers tend to prefer spending the available fill on other things. It is highly likely that next gen, as new graphical techniques are invented, resolution will drop back down to 720p or even 600p (see the quick pixel-count sketch after this list for why that tradeoff is so tempting).
2. PC games don't "run at 1080p." They run at whatever resolution you set them to run at, at whatever detail settings you choose, at whatever frame rate your graphics card can handle. Different users have different preferences on how to balance those things, and two users with the exact same hardware specs might run the exact same game at different settings. I remember back when I played PC games, I typically preferred 800x600 with higher graphics settings, if they were available, over 1024x768 with lower settings (yeah, that dates me).
3. There's no objective measure of the "best" way to use available resources. There are only preferences. And yes, there are even tradeoffs in PC games. There does not exist PC hardware that can run every game that will come out now and in the next four years at maximum settings, resolution, IQ, and a 120 Hz frame rate. Yeah, it's nice that you can run Call of Duty, a game designed to run at 60 fps on 7-year-old hardware, at super-duper ultra settings, but that's not saying much. The difference is that on consoles, you typically have no control over which tradeoffs you prefer; the developer's preferences are hard-coded.
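To put a rough number on the fillrate tradeoff in point 1, here's a minimal back-of-the-envelope sketch. It assumes the usual 1920x1080 and 1280x720 framebuffers, and treats "600p" as 1024x600 purely for illustration; it just compares how much per-frame pixel work each resolution costs relative to 1080p.

```python
# Back-of-the-envelope pixel budget comparison.
# Assumption: "600p" here means a 1024x600 framebuffer, used only as an example.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "720p": (1280, 720),
    "600p (1024x600)": (1024, 600),
}

base_w, base_h = RESOLUTIONS["1080p"]
base_pixels = base_w * base_h  # 2,073,600 pixels per frame at 1080p

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    # Fraction of the 1080p per-frame pixel work this resolution requires.
    share = pixels / base_pixels
    print(f"{name:18s} {pixels:>9,} pixels  ({share:.0%} of 1080p)")
```

The arithmetic works out to 1080p needing roughly 2.25x the pixels of 720p (and well over 3x the pixels of a 1024x600 buffer), which is exactly the kind of headroom a developer might prefer to spend on lighting, post-processing, or frame rate instead of resolution.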