If you can quantify "returns" meaningfully, there are diminishing returns with every successive increase in hardware speed, going all the way back to the very first computer games. It's true in every aspect of computing, too, not just graphics. For example, in word processing software, the quantifiable returns gained relative to increases in hardware power are already near-zero compared to what they were in the 80s and 90s.
It seems pretty obvious that quantifiable returns in 2D games are already small, and I suspect we've hit a point of rapidly diminishing returns in certain types of 3D games.
You cannot quantify the scorn of perfectionists, as it is always infinite. You can, however, quantify the error between what you can actually achieve and what you'd achieve in an ideal universe when talking about computer graphics, since graphics are intrinsically quantitative.
The quality gain from 1080p to 4K, measured as image error, is not nearly as large as the gain from 320x240 to 1080p, for example. If you're using nearest-neighbor sampling of whatever image you're trying to draw (i.e. the worst possible sampling), the error converges like a Riemann sum.
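To make the Riemann-sum comparison concrete, here's a toy sketch (my own illustration, not from a real renderer): reconstruct a 1D "image" — a sine curve — from n nearest-neighbor samples and measure the mean absolute error against the true curve. The error shrinks roughly like 1/n, the same convergence rate as a Riemann sum, which is why each successive resolution jump buys less than the last.

```python
import math

def nn_error(n, probes=100_000):
    """Mean absolute error of nearest-neighbor reconstruction of sin(x)
    on [0, pi] from n uniformly spaced samples."""
    total = 0.0
    for i in range(probes):
        x = math.pi * i / probes
        # index of the nearest sample point
        k = min(n - 1, round(x / math.pi * (n - 1)))
        total += abs(math.sin(x) - math.sin(math.pi * k / (n - 1)))
    return total / probes

for n in (240, 1080, 2160):
    print(n, nn_error(n))
```

Going from 240 to 1080 samples cuts the error by roughly 4.5x; going from 1080 to 2160 only halves it, and both errors are already tiny in absolute terms.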
You could also quantify it in economic terms, which is really what matters in the game industry. If the costs associated with creating and delivering higher-quality content exceed the revenue you gain from it, your returns have gone from positive to negative.
Yet I have a 720p phone with a 4.8" screen, and it looks much better than the same phone with an 800x480 screen. Rumors abound that 1080p phones are the next step.
Hmm, so what res do I need to "max out" my 42" TV? Or my 27" PC monitor, a lowly 1080p, that I'm sitting 18 inches from while typing this?
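A back-of-envelope sketch of that question (my own numbers, not from the thread): assuming a 16:9 panel and the common rule of thumb that 20/20 vision resolves about 60 pixels per degree, you can compute how many pixels per degree a display delivers at a given viewing distance.

```python
import math

def pixels_per_degree(diag_in, h_res, view_dist_in):
    """Approximate horizontal pixels per degree of visual angle
    for a 16:9 display viewed head-on (small-angle approximation
    at the screen center)."""
    width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)  # 16:9 width
    pixel_in = width_in / h_res                        # one pixel's width
    # angle subtended by one pixel, in degrees
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * view_dist_in)))
    return 1 / deg_per_pixel

# 27" 1080p monitor from 18 inches (the setup above)
print(pixels_per_degree(27, 1920, 18))
# 42" 1080p TV from a typical ~8 ft couch distance
print(pixels_per_degree(42, 1920, 96))
```

By this rough measure, the 27" monitor at 18 inches lands well under 60 ppd (so higher res would still be visible there), while the 42" 1080p TV at couch distance is already past it.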
How do you measure what the eyes see as "image error"? I'm not entirely following you.
But the rest of my points stand anyway. We don't have enough power for 720p; many games can't even get there. We need a LOT more power. These games cut every corner imaginable in the lustful chase for one ounce more graphics, and hundreds of people push them for it. Around 300 people each made Crysis 3, Assassin's Creed 3, and Halo 4.
It's hard for me to say where the "end" is, just that I'm sure we're not there, and I have a funny suspicion we're really far away. I kind of think that in ~2017 we are going to get ANOTHER huge jump after the next consoles. Say Durango ships with a 1.5-teraflop GPU; I can already see the beginnings of that looking weak.
There's going to be some point where we look at an X360 game and say "that looks like shit", just like we do a PS2 game now. There's likely going to be a point where we look at an Xbox Durango game and say "that looks like shit", too, even if it seems too far away to comprehend now.
I do agree some genres seem more susceptible to diminishing returns. Racing games, for one. FPS is the genre with the most headroom; RTS games are maybe even more demanding.
Even in 2D, though, I think this gen's 2D is the best ever (the move to HD alone could do that). Why won't next gen better it?
I'm also not sure you can entirely "quantify" graphics. It's not about how close you can get to a photo, imo. But if it is, check out any in-game screens: they look like crap next to a photo. Yes, you can get a car model looking really nice, but then you have to draw the track, and a thousand trees, and 20 other cars at varying distances in perfect clarity, and the horizon, and everything else; it gets exponentially more complicated. In essence, I'm not sure graphics in motion can be compared to a photo.