What do you mean "yet"? Of course it does. That doesn't contradict anything I just said.
I went through the steps. You're trying to show diminishing returns in resolution; I'm saying we must be really far from that point, given that my phone's PPI is much, much higher than any TV's and it still shows major improvement with every increase.
I explained the math in my original post. I don't think you followed it. Are you very good at math? Because if you aren't, I doubt I could explain it better. I might give it a try with some pictures, though.
I mean, I get what you're saying, vaguely enough. I think you're trying to say that if we analyze image X, try to render it, and then compare the two mathematically, the error rapidly converges as rendering power increases (I'll sketch what I take that to mean below).
I'm not sure that's all that relevant to our discussion, though. We're rendering moving images, not a static one, and the former is far harder: in a 60 FPS game we have maybe 16 ms to do it, and it has to be interactive. Any current-gen gameplay screenshot looks really bad compared to a photo. Then there's the whole question of whether photorealism is as good as it gets, or even what we're shooting for.
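To be concrete about the kind of convergence I think you mean, here's a toy sketch. A synthetic image stands in for real content and block averaging stands in for "rendering at lower fidelity", so take it as illustration only; the exact curve depends entirely on the scene.

```python
# Toy illustration of "the error converges as fidelity increases":
# approximate a reference image at increasing resolutions and watch the
# mean squared error shrink. Synthetic content only -- real scenes differ.
import numpy as np

def make_reference(size=512):
    """Synthetic 'ground truth': a smooth gradient plus fine detail."""
    y, x = np.mgrid[0:size, 0:size] / size
    return 0.5 * (x + y) + 0.1 * np.sin(40 * np.pi * x) * np.sin(40 * np.pi * y)

def render_at(reference, res):
    """Crude stand-in for rendering at a lower resolution: block-average
    down to res x res, then nearest-neighbour upscale back."""
    size = reference.shape[0]
    block = size // res
    low = reference[:res * block, :res * block].reshape(res, block, res, block).mean(axis=(1, 3))
    return np.repeat(np.repeat(low, block, axis=0), block, axis=1)

ref = make_reference()
for res in (32, 64, 128, 256, 512):
    mse = np.mean((ref - render_at(ref, res)) ** 2)
    print(f"{res:3d} x {res:<3d}  MSE = {mse:.6f}")
```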
"Like shit" is not an objective measure...I've seen it used to refer to everything from 15-year-old 3D games to the very latest games run on a midrange graphics card.
Do you think the jump from, let's say, NES to SNES is greater than the jump from PS2 to X360? By your theory it must be.
I don't know that it clearly is (in fact I'd probably lean toward the latter jump being greater; the SNES could be argued to be a prettier NES rather than a fundamental change). At best they're both equally massive jumps. So where are the diminishing returns? Not proven yet. They haven't appeared in any console generation so far. You state they will next gen. OK, but that's unproven.
Obviously "like shit" must be a comparative term here, not a general one. The 15-year-old 3D game looks much worse than the latest game on a midrange card if we compare them directly.
They're already quantified, yes, entirely. Everything on your screen is a number. If you have sets of numbers, it's not hard to impose metrics on them.
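For concreteness, "imposing a metric" can be as trivial as the sketch below: mean squared error and PSNR between two frames, with random NumPy arrays standing in for real framebuffer captures. Whether a number like this captures what anyone means by "better graphics" is, of course, the real argument.

```python
# Minimal sketch of a metric on two frames: mean squared error and PSNR.
# Random arrays stand in for real screenshots here; load actual captures
# to compare real games.
import numpy as np

def mse(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Mean squared error between two same-sized frames."""
    a = frame_a.astype(np.float64)
    b = frame_b.astype(np.float64)
    return float(np.mean((a - b) ** 2))

def psnr(frame_a: np.ndarray, frame_b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means the frames are closer."""
    err = mse(frame_a, frame_b)
    return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)

rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, size=(720, 1280, 3), dtype=np.uint8)
noise = rng.integers(-8, 9, size=frame_a.shape)
frame_b = np.clip(frame_a.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(f"MSE = {mse(frame_a, frame_b):.2f}, PSNR = {psnr(frame_a, frame_b):.2f} dB")
```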
Ahh, so finally you can settle the debate for me: does Killzone 3 or Crysis 2 have better graphics? It's all objective, after all. Also, can you give me a percentage for how much better Halo 4 looks than the original Killzone (PS2)? Thanks.
I'm also not sure where you get the idea that current-gen consoles can't do 720p or struggle with it. Of course they easily have enough fill rate and memory space to render a 720p image. But this is a perfect example of diminishing returns. Developers and gamers have almost unanimously judged that the graphical gains from using the available fill rate to render at true 720p are less than the gains you get from rendering at a lower resolution with better lighting and other effects. Perhaps next gen, that won't be the case--perhaps the returns from extra effects and whatnot will be smaller than the returns from increasing the screen resolution.
Resolution vs. effects is a tradeoff. Devs often don't have enough power to hit 720P, so they skew to the graphics side.
If we had "enough" power (i.e. if the returns from dropping the resolution weren't big enough to be worth it), this wouldn't be an issue.
Let alone 1080P or 60 FPS, both of which would require massively more power. Or 4K someday, requiring another huge bump (rough numbers below).
It's real proof the graphics side is far from maxed out; otherwise all games would be 1080P 60 FPS today.
It's true we judged resolution to be the bigger improvement this gen, but I fail to see how that speaks to the returns. It speaks to a decision that the games looked better overall at the higher resolution.
Somebody made the decision:
720P + less graphics = better overall visuals than 480P + more graphics.
This doesn't speak to diminishing returns at all, imo. It speaks to "480P is a blurry mess, let's fix that first, albeit only with a jump to lowly 720P for now because that's all we can afford".
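For a rough sense of the "massively more power" part: if cost scaled purely with pixels pushed per second (a big simplification that ignores per-pixel shading cost, so treat these as lower bounds), the jumps look like this:

```python
# Back-of-envelope pixel throughput relative to 720p30, assuming (very
# roughly) that cost scales with pixels drawn per second. Real cost also
# depends on per-pixel shading work, so these are lower bounds on the jump.
modes = {
    "720p @ 30":  (1280, 720, 30),
    "720p @ 60":  (1280, 720, 60),
    "1080p @ 30": (1920, 1080, 30),
    "1080p @ 60": (1920, 1080, 60),
    "4K @ 60":    (3840, 2160, 60),
}
base = 1280 * 720 * 30
for name, (w, h, fps) in modes.items():
    print(f"{name:11s} {w * h * fps / base:5.2f}x the pixels per second of 720p @ 30")
```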
Some of what you're seeing on your phone is resolution improvement, and some is rendering improvement. Try to spot the pixels on an iPhone or a Galaxy SIII. You can't, because at the distance you're viewing them, the pixels are too small to resolve. The same goes for a 1080P TV: on a 50" set viewed from 6 feet or further, you cannot resolve the pixels. Increasing the pixel density won't do anything except require more power to drive an image that won't look any different to your eyes.
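The rule of thumb behind this is roughly one arcminute per pixel for 20/20 vision, i.e. about 60 pixels per degree. It's only a rule of thumb (acuity varies from person to person and with the content), but a quick check under that assumption:

```python
# Rough visual-acuity check, assuming ~1 arcminute per pixel as the limit of
# 20/20 vision (about 60 pixels per degree). Real acuity varies, so treat
# anything near 1.0 as borderline rather than a hard cutoff.
import math

def pixel_arcminutes(diagonal_in, res_w, res_h, distance_in):
    """Angular size of one pixel, in arcminutes, at the given viewing distance."""
    ppi = math.hypot(res_w, res_h) / diagonal_in   # pixels per inch
    pixel_size_in = 1.0 / ppi                      # physical pixel pitch
    return math.degrees(math.atan2(pixel_size_in, distance_in)) * 60.0

cases = [
    ('50" 1080p TV at 6 ft', 50.0, 1920, 1080, 72.0),
    ('50" 1080p TV at 9 ft', 50.0, 1920, 1080, 108.0),
    ('4.8" 720p phone at 12 in', 4.8, 1280, 720, 12.0),
]
for name, diag, w, h, dist in cases:
    print(f"{name}: {pixel_arcminutes(diag, w, h, dist):.2f} arcmin per pixel")
```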
Then why are people talking about 1080P phones, and why the buzz about Toshiba's recent demonstration of an (allegedly glorious) 55" 4K TV at some show or whatever?
To me it just stands to reason. Let's take viewing distance out of it by comparing my PC monitor, which I sit close to, with my phone. If a 5" screen isn't maxed out by 720P, what absurd resolution would be needed to get the same (clearly desirable) PPI on a 27" monitor? I can't count that high.
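Well, roughly speaking, the counting isn't that hard. A quick back-of-envelope, assuming a ~306 PPI phone panel (roughly a Galaxy S III class screen, so an assumption, but in the right ballpark) and a 16:9 27" monitor:

```python
# Back-of-envelope: what resolution a 27" 16:9 monitor would need to match
# a ~306 PPI phone panel. The exact PPI figure is an assumption; the point
# is the order of magnitude.
import math

target_ppi = 306.0          # assumed phone pixel density
diagonal_in = 27.0          # monitor diagonal
aspect_w, aspect_h = 16, 9  # assumed aspect ratio

# Physical width/height from the diagonal and aspect ratio.
diag_units = math.hypot(aspect_w, aspect_h)
width_in = diagonal_in * aspect_w / diag_units
height_in = diagonal_in * aspect_h / diag_units

req_w = round(width_in * target_ppi)
req_h = round(height_in * target_ppi)
print(f"{req_w} x {req_h} (~{req_w * req_h / 1e6:.0f} megapixels)")
```

That works out to roughly 7200 x 4050, around 29 megapixels per frame, which is nearly 8K territory.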