I guess the key point here is how much further the Xbox would have to stretch to hit 1080p.
A lot of people just use TFLOPS as a way of measuring performance, but from a technical perspective I feel that doesn't really capture the whole picture.
I've often wondered how AAA developers truly budget what they can and cannot fit in a game at the start of a project.
And so, after much deliberation, I believe they do it like this (any of you guys are free to correct me).
They look at the target resolution and the amount of shader work involved, and that ultimately determines the number of floating-point operations the game will consume per frame. Call it X amount, or they reverse-engineer a target from their target hardware or something. Either way, they should have an idea of how heavy their code is allowed to be; a rough sketch of that kind of estimate is below.
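Just to make that concrete, here's a super rough back-of-envelope in Python. Every number in it (flops per pixel, overdraw) is completely made up for illustration, not something from a real engine:

```python
# Rough back-of-envelope: estimate per-frame shading cost from resolution.
# The flops-per-pixel and overdraw numbers are made-up placeholders.

def estimate_frame_flops(width, height, flops_per_pixel, overdraw=1.5):
    """Very crude estimate of shading work for one frame."""
    return width * height * flops_per_pixel * overdraw

# e.g. 1080p with a hypothetical 5000-flop-per-pixel shading pipeline
print(estimate_frame_flops(1920, 1080, 5_000))  # ~1.6e10 flops per frame
```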
Say you're a console developer targeting PS4, for example. You know you have a peak of 1.84 TFLOPS. No problem.
That means the PS4 is capable of 1.84 trillion floating-point operations every second. Okay. So for 60 fps, each frame gets 16.7 ms.
So that means it can only do about 0.0307 TFLOPS' worth of work in a 16.7 ms frame, or roughly 30.7 billion flops per frame.
Your game needs to cost less than 30.7 billion flops per frame, or it is theoretically impossible to do.
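That's literally just peak divided by frame rate, but here it is spelled out so the numbers later in the post line up:

```python
# Per-frame FLOP budget = peak throughput / target frame rate.
PEAK_FLOPS_PS4 = 1.84e12  # PS4 peak: 1.84 TFLOPS

def flops_per_frame(peak_flops, fps):
    return peak_flops / fps

print(flops_per_frame(PEAK_FLOPS_PS4, 60))  # ~3.07e10 -> ~30.7 billion per frame
print(flops_per_frame(PEAK_FLOPS_PS4, 30))  # ~6.13e10 -> ~61.3 billion per frame
```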
Now interestingly, once that frame timer starts, the flops tick away whether each clock cycle is used or not. For reference, on PS4 that's 1.84 billion FLOPS per millisecond.
So game developers must be budgeting for how much time the GPU could possibly sit idle doing nothing: cache misses, waiting on memory reads and writes, waiting for instructions, state changes, waiting on sync points, etc.
So games will likely be targeting much less than 30.7 billion in this case; they need to factor in latency and all that wasted time. Eventually they arrive at a budget, something like the sketch below.
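In other words, the "real" budget is the peak per-frame number multiplied by whatever fraction of the time the GPU is actually doing work. The 60% utilization figure below is pure guesswork on my part, just to show the effect:

```python
# Effective per-frame budget once you assume the GPU is only busy part of
# the time (memory stalls, sync points, state changes, and so on).
# The 0.6 utilization figure is an assumption for illustration only.
PEAK_FLOPS_PS4 = 1.84e12

def usable_flops_per_frame(peak_flops, fps, utilization):
    return (peak_flops / fps) * utilization

print(usable_flops_per_frame(PEAK_FLOPS_PS4, 60, 0.6))  # ~1.84e10 usable flops
```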
Now this is why I believe that when Naughty Dog said "hey, we can run UC4 at 1080p/60", it's because the GPU work had to come in under 30.7 billion flops per frame to make it happen. But they were having trouble with the lost cycles, since once the frame timer starts you're losing flops whether you are using them or not. So it runs at 30 fps instead, and that gives them access to roughly 61.3 billion flops per frame; how many are wasted and how many are actually doing work is beyond me. There are only two ways ND will be able to hit this target: either they use fewer FLOPS to do the same task (optimization), or they find a way to use more of the available FLOPS and have less idle time (saturation). A toy comparison of those two levers is below.
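Here's a toy version of those two levers, with completely hypothetical workload and utilization numbers; either shrinking the work or raising utilization gets the frame time under the 16.7 ms line:

```python
# Two ways to hit a frame-time target: shrink the work (optimization) or
# raise utilization (saturation). All workload/utilization numbers below
# are hypothetical.
PEAK = 1.84e12  # PS4 peak FLOPS

def frame_time_ms(work_flops, peak_flops, utilization):
    return work_flops / (peak_flops * utilization) * 1000

# A hypothetical 25-billion-flop frame at 60% vs 85% utilization:
print(frame_time_ms(25e9, PEAK, 0.60))  # ~22.6 ms -> misses 16.7 ms
print(frame_time_ms(25e9, PEAK, 0.85))  # ~16.0 ms -> just fits 60 fps
print(frame_time_ms(18e9, PEAK, 0.60))  # ~16.3 ms -> fits by cutting work
```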
Same thing applies to Xbox One.
That's about 22 billion flops per frame at 60 frames per second. Now, if the game's budget is less than 22 billion flops per frame at 1080p, it could theoretically fit within the Xbox's budget. But once again you're going to have that same uphill climb. Will DirectX 12 raise efficiency so much that there are very few wasted FLOPs? Shrug.
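Same arithmetic for Xbox One (peak is about 1.31 TFLOPS, which is where the ~22 billion per frame comes from); the efficiency figures here are just assumptions to show how much DX12 would need to claw back:

```python
# Xbox One per-frame budget at 60 fps, at a few assumed efficiency levels.
PEAK_XB1 = 1.31e12  # ~1.31 TFLOPS peak

budget = PEAK_XB1 / 60  # ~2.18e10 flops per frame (~22 billion)
for efficiency in (0.6, 0.8, 0.95):
    usable = budget * efficiency
    print(f"{efficiency:.0%}: {usable / 1e9:.1f} billion usable flops per frame")
    # ~13.1 / ~17.5 / ~20.7 billion respectively
```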
If so, then yeah, I guess it can do it, but it still comes down to the ability of the developers and the budget of the game.