Since graphics chips have finite fill rate, you can do more operations per pixel at a lower resolution than at a higher one. Modern graphics are heavily dependent on per-pixel operations and will be for the foreseeable future. There are a few games that run at native 1080p this generation, and they don't look very good, because resolution is a fill-rate hog.

I don't see how improvements in fidelity and resolution are mutually exclusive, especially considering the relatively low base we're improving from.
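The fill-rate tradeoff here is just pixel-count arithmetic: with a fixed fill-rate budget, per-pixel shader work scales inversely with the number of pixels rendered. A minimal sketch (the 1024x600 figure is one commonly cited "600p" framebuffer size, used here only as an illustrative example):

```python
# Back-of-the-envelope per-pixel budget at common render resolutions.
# 1024x600 is an assumed example of a sub-720p target.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "720p":  (1280, 720),
    "600p":  (1024, 600),
}

def pixels(name: str) -> int:
    """Total pixels per frame at the named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

# Relative per-pixel headroom compared with native 1080p: fewer
# pixels per frame means more fill-rate budget for each pixel.
base = pixels("1080p")
for name in RESOLUTIONS:
    print(f"{name}: {pixels(name):>9,} px, "
          f"{base / pixels(name):.2f}x per-pixel budget vs 1080p")
```

Dropping from 1080p to 720p frees up 2.25x the per-pixel budget, while 720p to 600p only buys another 1.5x, which is the diminishing return the 600p-vs-720p point below hinges on.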
You mean like IW and Treyarch did? Call of Duty games have all run at 600p or less in order to maintain 60 fps. I doubt you'll see resolutions that low next gen; you should be able to achieve enough asset fidelity at 60 fps and 720p that the gains from dropping to 600p just aren't worth it. Does Rage run at 720p native? They certainly cut the complexity of the lighting compared to other games.

Shifty_Geezer said:
If framerate was ever considered important, they'd reduce the rest of the graphics, even resolution, to hit it, and that'll be true next-gen.
I still expect a lot of 30 fps games, though, because I don't think developers have run out of things to invent yet.