Personally, after looking at things like Crysis (and especially after thinking about what a late-generation 360/PS3 game should do), I ask myself whether even things like physics, animation, and subtle detail will actually see as big a jump as one might initially think. I mean, in some games (Crysis) almost everything is already interactive.
I do believe that next gen there will be a good, noticeable improvement in all the traditional areas, but I think it will only be a "step" further rather than a jump.
This leads me to think that the big improvements will come in non-traditional areas like interfaces or AI.
I think it would be cool if, say, Sony went that route with the PS4: focus on AI, physics, and new controls rather than graphics, maybe with only a slightly clock-bumped RSX but 4+ Cells for tons of physics. And say Microsoft really just went all out on graphical power with the Xbox 720 (10X Xenos). That would give gamers the best of both worlds, so to speak, depending on which console they choose, or both.
I really kind of think Sony has been leaning that way anyway, judging by certain statements. This would leave Xbox as the sole "graphics" platform in the future, with Sony and Nintendo as the "fun + physics" (but not great graphics) systems.
Since, as you say, graphics aren't increasing that much, it shouldn't be a big problem for Sony in that regard. Sure, the Xbox 720 would have more bells and whistles, but would Joe Consumer really be able to tell that the Xbox 720's GPU was 10X more powerful than the PS4's? That's the way I think it may go in the future. What I see would be something like:
PS4: 8 to 12 Cells, RSX clocked at 700 MHz (~300M transistors for RSX)
Xbox 720: 3.5+ billion-transistor ATI chip at 1.5-2.0+ GHz, Intel quad/8-core Kentsfield at 3 GHz+
This would give Sony a huge cost edge, and I'm not sure laypeople would be able to tell the graphical difference immediately because of diminishing returns. But the 8-12 Cells would allow huge gigaflops of physics, AI, etc. Every extra transistor Sony has would be thrown into more Cells rather than a diminishing-returns GPU rat race, and Cell is some amazing technology, so I could see them putting all their eggs in that basket.
I'm saying 3.5 billion for the Xbox 720 chip, which sounds like a ton, but here's my thinking: 90 nm GPUs started around 300M transistors (G71/R580 class) and then doubled to around 600-700M (G80/R600 class) on the same node. So I'm figuring two transistor doublings per node (one at the start, then a second at process maturity): say 65 nm starts at 1.2B, then ends at 2.4B. You quickly get to very high figures, even though 2010 may only see the 45 nm node.
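That two-doublings-per-node math can be sketched as a quick back-of-envelope script (the function name is made up for illustration, and the figures are just the post's own ballpark estimates, not a real roadmap):

```python
# Sketch of the scaling model: each node launches at double the previous
# node's mature transistor count, then doubles once more as the process
# matures. Counts are in millions of transistors.

def project(start_millions: int, nodes: list[str]) -> dict[str, tuple[int, int]]:
    """Map each node name to its (launch, mature) transistor counts."""
    out = {}
    launch = start_millions
    for node in nodes:
        mature = launch * 2        # second doubling at process maturity
        out[node] = (launch, mature)
        launch = mature * 2        # next node launches at 2x the mature count
    return out

print(project(300, ["90nm", "65nm", "45nm"]))
# {'90nm': (300, 600), '65nm': (1200, 2400), '45nm': (4800, 9600)}
```

Starting from 300M at 90 nm, that reproduces the 1.2B/2.4B figures for 65 nm, and puts a hypothetical 3.5B chip somewhere between mature 65 nm and early 45 nm.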
That might lead to something like 600-800 shader ALUs on the Xbox 720 clocked at 1.5 GHz+ (the way GPU clocks are scaling), but even so, would a layman be able to tell the difference between that and RSX? I'm betting Sony thinks not.