If you know nothing about the context of the numbers, I’d hardly call that the whole picture.
Not saying it’s not a relevant data point. But over-attributing without knowing how many frames dip that low, for how long, or whether that number can be reproduced reliably is not really knowing the whole story.
Being below 1440p for all of 30 seconds to 2 minutes in a game that plays for 200 hours is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That probably puts it near six deviations out from the average resolution.
Indeed. If the game manages to render between 4K and 1800p most of the time, I simply don't see the point of being a stickler for the statistical oddity of spending maybe a few frames at a much lower pixel count in the grand scheme of things. I'd wager the game actually spends more time rendering between 4K and 1800p than it does at 1800p itself. I'd say that's a massive leap compared to the base X1 version at 900p. As it is, the 1X version doesn't drop frames in the 4K mode in intensive areas.
The presence of the 60fps mode is great for future hardware, as right off the bat you can get 4K/60fps TW3.