"While the additional 10 per cent of rendering resources afforded by the recent Kinect-unbundling XDK update are welcome, it does not offer up enough GPU horsepower to bridge the gap between 900p and 1080p rendering."

Curious quote. If they haven't tested the 1080p version of the game, how do they know the extra 10 per cent is not enough GPU horsepower to "bridge the gap"?

Simple math: (1920 * 1080) / (1600 * 900) = 1.44
44% > 10%
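Spelling that math out, using only the resolutions and the 10% figure quoted above (pixel count is a rough proxy for GPU cost, since not all rendering work scales with resolution):

```python
# Pixel counts for the two resolutions being compared.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p = 1600 * 900     # 1,440,000

ratio = pixels_1080p / pixels_900p
print(f"1080p is {ratio:.2f}x the pixels of 900p ({ratio - 1:.0%} more)")

# The Kinect-unbundling XDK update is said to free up roughly 10% more GPU time.
gpu_uplift = 0.10
print(f"extra pixels needed: {ratio - 1:.0%}  vs  extra GPU available: {gpu_uplift:.0%}")
```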
Nah, it's more plausible that the PS4 version is being downgraded to match the X1.
(read: sarcasm)
"Get away with" in what sense? NeoGAF would explode at 960x1080, especially since that's fewer pixels than 900p. Particularly if, unlike KZSF, they don't use temporal reprojection to construct a 1080p buffer (which they probably wouldn't, as the engine isn't using TAA in any version of the game, and that's not a trivial thing to get right).Not saying Bungie did, but couldn't they get away with saying 1080p for XB1, if the actual image is 960x1080p (ala KZSF:MP)?
simple math: (1920 * 1080) / (1600 * 900) = 1.44
44% > 10%
Sure. If the game ends up running at 30fps at 1080p without any obvious "downgrades", then what's the explanation?
Not saying Bungie did, but couldn't they get away with saying 1080p for XB1, if the actual image is 960x1080p (ala KZSF:MP)?
The reason I bring this up; is something puzzling about DF statement (see below). Either they know that the XB1 native 1080p resolution is slightly lower, or that the frame-rate isn't a solid or locked 30fps at 1080p. Like a preemptive warning of things to come...
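For anyone wondering what "960x1080 plus temporal reprojection" actually means, here is a toy sketch of the column-interleaving idea in the spirit of KZ:SF's multiplayer mode. It is not Bungie's or Guerrilla's code: a real implementation reprojects the previous frame with per-pixel motion vectors and falls back to interpolation where that fails, all of which is omitted here.

```python
import numpy as np

def combine_interlaced(half_res, prev_full, frame_index):
    """Build a 1920x1080 image from a 960x1080 render plus the previous frame.

    Each frame renders only every other column of the full-width target;
    the missing columns are simply reused from the previous full-res result
    (a stand-in for proper motion-vector reprojection).
    """
    height, half_width = half_res.shape[:2]
    assert prev_full.shape[1] == half_width * 2
    full = prev_full.copy()          # start from last frame's 1920x1080 output
    offset = frame_index % 2         # even frames fill even columns, odd frames odd columns
    full[:, offset::2] = half_res    # drop in the freshly rendered columns
    return full

# Hypothetical usage, with random data standing in for rendered frames:
prev = np.zeros((1080, 1920, 3), dtype=np.float32)
for i in range(4):
    new_half = np.random.rand(1080, 960, 3).astype(np.float32)  # this frame's 960x1080 render
    prev = combine_interlaced(new_half, prev, i)
print(prev.shape)  # (1080, 1920, 3)
```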
Sure. If the game ends up running at 30fps at 1080p without any obvious "downgrades", then what's the explanation? It's definitely a bit weird. Maybe there are some late-game levels that are more demanding and would yield different results.
That word is probably one of the greatest ironies in language. It would be a win/win for XB1 owners: better performance, better legibility.
I wish they had an uncapped mode like Infamous or Tomb Raider. I'd like to see what the "natural" frame rate was without the parity purity belt on.
It seems that 1080p/30fps is easy with this engine and assets. So in effect there is wasted horsepower for the sake of parity.
Perhaps the game had some headroom at 900p? But not enough to get to 1080p. The 10% allowed it to reach a steady (or close to steady) 30fps.
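A back-of-the-envelope version of that headroom argument. The 36 ms GPU frame time is invented purely for illustration, and scaling GPU cost linearly with pixel count is a simplification, but it shows how 10% could fix "almost 30fps at 900p" while getting nowhere near 1080p:

```python
frame_budget_ms = 1000 / 30               # 33.3 ms per frame for a 30fps cap

gpu_ms_900p = 36.0                        # hypothetical 900p GPU time before the XDK update
gpu_ms_900p_after = gpu_ms_900p / 1.10    # ~32.7 ms: now inside the 33.3 ms budget
gpu_ms_1080p_after = gpu_ms_900p * 1.44 / 1.10   # ~47.1 ms: far outside it

print(f"budget: {frame_budget_ms:.1f} ms")
print(f"900p with +10% GPU:  {gpu_ms_900p_after:.1f} ms")
print(f"1080p with +10% GPU: {gpu_ms_1080p_after:.1f} ms")
```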
Based on the latest DF PS4/XB1 face-off video of the Destiny beta, it hasn't; quite the contrary.
The dips were small and occasional, and some could have been related to the CPU. There were also parts where the Xbox (and PS4) were going above 30 fps. I wouldn't rule out frame pacing for some of this.
There was even at least one spot where the PS4 dropped under 30 fps (perhaps during heavy motion blur?). I wouldn't say this means there is no 'headroom' in the PS4 either, btw.
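On the frame-pacing point: a 30fps-capped game that mistimes its presents can show instantaneous readings both above and below 30fps even though it averages 30. The intervals below are invented, not measured from Destiny; they just show how a 60Hz vsync'd game alternating between one and three refreshes per frame reads as 20fps dips and 60fps spikes:

```python
# Consecutive frame-present intervals in milliseconds.
even_paced = [33.3] * 6                             # cleanly paced 30fps
badly_paced = [16.7, 50.0, 16.7, 50.0, 16.7, 50.0]  # same average, uneven pacing

for label, intervals in [("even", even_paced), ("uneven", badly_paced)]:
    inst_fps = [1000 / dt for dt in intervals]
    avg_fps = 1000 / (sum(intervals) / len(intervals))
    print(f"{label:6s} avg {avg_fps:.1f} fps, instantaneous {min(inst_fps):.0f} to {max(inst_fps):.0f} fps")
```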
*AHEM* This is the technical discussion thread, not the "I'm pissed off because my ePenis doesn't look as large as the others now." So don't go bringing up the parity bullshit here.