Then I will bitch, moan and be transparent, because anything dealing with the technical aspects of XB1 hardware or its games is off limits to little dickus, crack-function and whatever other company shill that doesn't want to address questions.
You want to have a "technical discussion" about why a console (that you show utter contempt for) can't achieve an arbitrary goalpost you equate with competency, kicking off the "discussion" with loaded terms like "hardware limitations" and "crippled performance." Right. Accusing someone else of being a shill is quite the punchline.
Why 1080p? Because 1080p is a fairly common display resolution? Isn't more = better in your book? There are quite a few 4K displays out there. What is broken in these next-gen consoles that they can't handle high-res output at 60 frames/sec? Is it gimped memory buses? Underpowered CPUs? Underachieving GPUs? What failures did Sony and MS have in producing hardware capable of high-quality graphics, knowing years ahead of time what the target displays were?
Or from an equally absurd perspective, when Crytek releases a new Crysis game that brings top-of-the-line graphics cards to their knees, what is broken in these cards so that they can't handle the game at full HD resolution? Limited bandwidth from their wide, fast GDDR5 memory systems? Should they have not cut corners and gone with a 1024-bit-wide bus?
No less FUD than what you wrote. No less loaded language.
Now, if you want to have a technical discussion on the topic, we can start with the old tried and true, as the rules of physics haven't changed for a new generation. What is human visual acuity in arc seconds, and what resolution does that equate to for a given display size at a given distance? How does that change with AA, i.e., how much does resolution need to increase, as a percentage, without AA to maintain visual equivalence to a lower resolution with 2xAA? 4x? Different sampling methods? Now, how does the processing power compare at visually equivalent combinations? Bandwidth? Does ESRAM or a single fast pool fit some choices better than others? If you have an abundance of compute power, or bandwidth, does that make certain choices more favorable? Are there situations in which a lower-resolution, visually equivalent (or nearly equivalent) choice frees enough resources to make a noticeable impact on pixel quality? What does the hardware look like in such a case, compute heavy or compute constrained? Bandwidth heavy or bandwidth constrained?
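To put rough numbers on the acuity question: here's a sketch, assuming the common ~1 arcminute figure for 20/20 vision and a 16:9 panel (the function name and defaults are mine, just for illustration):

```python
import math

def max_useful_resolution(diagonal_in, distance_m, acuity_arcmin=1.0, aspect=(16, 9)):
    """Horizontal pixel count beyond which a viewer with the given
    acuity (~1 arcminute is a common figure for 20/20 vision) can no
    longer resolve individual pixels at this size and distance."""
    aw, ah = aspect
    # physical screen width in meters from the diagonal and aspect ratio
    width_m = diagonal_in * 0.0254 * aw / math.hypot(aw, ah)
    # smallest feature the eye can separate at this viewing distance
    resolvable_m = distance_m * math.tan(math.radians(acuity_arcmin / 60.0))
    return width_m / resolvable_m

# a 50" 16:9 TV viewed from 3 m (~10 ft)
print(round(max_useful_resolution(50, 3.0)))  # ~1268 horizontal pixels
```

By this crude model, at typical living-room distances a 50" set tops out under 1080p's 1920 horizontal pixels, which is exactly why the AA and distance questions above matter before anyone declares a resolution number the measure of competence.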
Someone else asked if, given a choice, Crytek would prefer to run at 1080p looking just like they want, or 900p looking just like they want. That hypothetical presumes there is enough processing power to enable such a choice... i.e., infinite. Of course if you have infinite power, you pick the higher res. Back in realityland, we have consoles with cost, power, and heat budgets, and devs will have to make choices that give the best on-screen appearance possible within those budgets. This is no different than any other generation. Some games going for a certain look or having a certain style may target lower res, higher quality pixels. We saw plenty of that last gen. For other game styles, more pixels may be achievable or even desirable. Maybe some game will decide to target higher than 1080p and sacrifice where necessary to achieve that. The monitors I use daily are well over twice that resolution, so I would find that intriguing.
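The 900p-vs-1080p tradeoff is easy to quantify at the pixel-count level (this says nothing about bandwidth or vertex cost, just shaded pixels per frame; the helper is mine):

```python
def pixel_budget_ratio(res_a, res_b):
    """Ratio of pixels shaded per frame at resolution res_a vs res_b."""
    (wa, ha), (wb, hb) = res_a, res_b
    return (wa * ha) / (wb * hb)

ratio = pixel_budget_ratio((1600, 900), (1920, 1080))
print(f"900p shades {ratio:.0%} of the pixels of 1080p")   # 69%
print(f"leaving roughly {1 / ratio:.2f}x the budget per pixel")  # ~1.44x
```

That ~44% extra per-pixel budget is the kind of headroom a dev might spend on better lighting or AA instead of raw resolution, which is the whole point of the choice.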