Absolutely, I agree. Bad example. Ultra settings are almost always inefficient afterthoughts meant to put an extra cherry on top of a niche market's ice cream. They are definitely not representative of what a game built for a 1080 Ti from the ground up would look like. There were PS360-gen games that ran sluggishly on ultra settings on cards faster than the PS4, and still didn't look anywhere near as good as any PS4 exclusive.
That's down to technology, feature and architectural differences.
Compute shaders didn't exist back in DX9; it's like asking a PS4 to do reflections, refraction, transparencies and all sorts of soft shadows in real time... without RT capabilities...
Do you really think that throwing more power at the solution will solve it?
Would a 12 TF Xbox 360 be able to generate better graphics than an X1X?
The irony of this discussion; but without the sarcasm, this is what the actual debate is.
Are we not at the point of diminishing returns today, such that the only way to break the next graphical threshold is to move to RT?
Would we be satisfied with next gen only being higher resolution and higher frame rates than what we have today? More baked lighting? SVOGI at best?
Unlike the PC market, we're locking the generation in for 6 full years starting in 2020. I can really see this debate from both ends of the argument. Believe me, it's like the spinning ballerina illusion: some people see her twirling right, others left.
Part of where I'm looking is not just the technical piece, which is what we get hung up on a lot here. I'm looking at the business aspect: in particular Nvidia, the partnership with MS, streaming tech, and the PC space. MS still very much wants you to move to Windows 10. They very much want casuals who have low-resolution screens to experience next-gen gaming over streaming.
If I were MS, could I really sell 4K... twice? Many of you have already discussed to death that you felt 4K was a complete waste of power; drop the resolution and increase the quality per pixel, yet here we are...