Digital Foundry Article Technical Discussion Archive [2014]

Not saying Bungie did, but couldn't they get away with saying 1080p for XB1 if the actual image is 960x1080 (à la KZSF:MP)?

The reason I bring this up is something puzzling about DF's statement (see below). Either they know that the XB1's native 1080p resolution is slightly lower, or that the frame-rate isn't a solid or locked 30fps at 1080p. It reads like a preemptive warning of things to come...

While the additional 10 per cent of rendering resources afforded by the recent Kinect-unbundling XDK update are welcome, it does not offer up enough GPU horsepower to bridge the gap between 900p and 1080p rendering.


Edit: Scott_Arm beat me to the punch...
 
Nah, it's more plausible that the PS4 version is being downgraded to match the X1.
(read: sarcasm)

Re: KZSF:MP, a long time ago I spotted something odd, like a feathering issue in the MP video capture, but it was brushed off as "compression" at the time. Ironic how that one turned out.
 
Nah, it's more plausible that the PS4 version is being downgraded to match the X1.
(read: sarcasm)

No tinfoil needed there... It's a multigenerational game that seems to scale quite well across platforms. Hell, the PS3 version looks quite good...
 
Not saying Bungie did, but couldn't they get away with saying 1080p for XB1 if the actual image is 960x1080 (à la KZSF:MP)?
"Get away with" in what sense? NeoGAF would explode at 960x1080, especially since that's fewer pixels than 900p. Particularly if, unlike KZSF, they don't use temporal reprojection to construct a 1080p buffer (which they probably wouldn't, as the engine isn't using TAA in any version of the game, and that's not a trivial thing to get right).
 
simple math: (1920 * 1080) / (1600 * 900) = 1.44
44% > 10% :rolleyes:
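
Spelling that out, plus the 960x1080 case from upthread (plain arithmetic, nothing assumed about the engine):

```python
# Pixel-count arithmetic for the resolutions discussed in this thread.
def pixels(w, h):
    return w * h

full_hd = pixels(1920, 1080)   # 2,073,600
p900    = pixels(1600, 900)    # 1,440,000
kz_mp   = pixels(960, 1080)    # 1,036,800 -- fewer pixels than 900p

print(full_hd / p900)   # 1.44 -> a 44% gap, versus the ~10% Kinect reserve
print(kz_mp / p900)     # 0.72 -> why 960x1080 would not pass quietly
```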

Sure. If the game ends up running at 30fps at 1080p without any obvious "downgrades", then what's the explanation? It's definitely a bit weird. Maybe there are some more demanding late-game levels that would yield different results.
 
I wish they had an uncapped mode like Infamous or Tomb Raider. I'd like to see what the "natural" frame rate was without the parity purity belt on.

Sure. If the game ends up running at 30fps at 1080p without any obvious "downgrades", then what's the explanation?

That 1080p/30fps with this engine and assets is easy. So in effect there is wasted horsepower for the sake of parity.
 
Not saying Bungie did, but couldn't they get away with saying 1080p for XB1 if the actual image is 960x1080 (à la KZSF:MP)?

The reason I bring this up is something puzzling about DF's statement (see below). Either they know that the XB1's native 1080p resolution is slightly lower, or that the frame-rate isn't a solid or locked 30fps at 1080p. It reads like a preemptive warning of things to come...

Yes, it could be that too. Anyway, the game at 900p on XB1 already has framerate drops during gunfights: I counted 3 hiccups during shooting in the performance video (1 framerate drop and 2 frame-pacing/judder issues, in a very short video with very few different gunfights...). The PS4 version has no drops during shooting, just one small hiccup during a streamed load, a bug it apparently shares with the PS3 version.

But turning off some useless effects like chromatic aberration on XB1, as they did in the first XB1 1080p build, could really help though. It would be a win/win for XB1 owners: better performance, better legibility.
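
For what it's worth, chromatic aberration as usually done is a full-screen post-process, so how much cutting it actually saves depends entirely on Bungie's implementation. A toy sketch of the effect (illustrative only, not their shader) shows why it hurts legibility: the colour channels get resampled at slightly different scales, fringing fine detail toward the screen edges:

```python
# Toy screen-space chromatic aberration (illustrative, not Bungie's code).
import numpy as np

def chromatic_aberration(img, strength=0.004):
    """img: (H, W, 3) float array. Scales the R and B channels slightly
    differently about the screen centre, so colours fringe at the edges."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = h / 2, w / 2
    out = img.copy()
    for ch, scale in ((0, 1 + strength), (2, 1 - strength)):  # R out, B in
        sy = np.clip((cy + (ys - cy) * scale).astype(int), 0, h - 1)
        sx = np.clip((cx + (xs - cx) * scale).astype(int), 0, w - 1)
        out[..., ch] = img[sy, sx, ch]
    return out
```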
 
Sure. If the game ends up running at 30fps at 1080p without any obvious "downgrades", then what's the explanation? It's definitely a bit weird. Maybe there are some more demanding late-game levels that would yield different results.

Well, if only there were a universal formula that converts power into pixels drawn :idea:

And this is why I think DF should have just waited until the retail versions are out.
 
I wish they had an uncapped mode like Infamous or Tomb Raider. I'd like to see what the "natural" frame rate was without the parity purity belt on.

So now a frame cap is about a "parity purity belt" and not about capping the frame rate.

That 1080p/30fps with this engine and assets is easy. So in effect there is wasted horsepower for the sake of parity.

Making the leap that "allowing headroom = wasting power" is questionable. But saying it's being done "for the sake of parity" is insanely stupid, considering that without a last-minute update, following a freeing-up of resources and technical assistance, there was a pretty big difference. It's fanboy nonsense.

The PS4 hit Bungie's performance targets. The Xbone could only hit them with a drop in resolution. There was not parity. There was not going to be parity. The PS4 was not reduced down to 1600x900. The Xbone was behaving in line with other titles. Until, at the last minute, it no longer was.
 
Perhaps the game had some headroom at 900p? But not enough to get to 1080p. The 10% allowed it to reach a steady (or close to steady) 30fps.

Based on the latest DF PS4/XB1 face-off video of the Destiny beta, it hasn't; quite the contrary.

Similarly to 900p/1080p multiplats like AC4 and Trials Fusion, where both versions were identical except for resolution, the PS4 at 1080p had a slightly better framerate than the XB1 at 900p.
 
Perhaps the game had some headroom at 900p? But not enough to get to 1080p. The 10% allowed it to reach a steady (or close to steady) 30fps.

There's also the possibility that the render targets were re-factored / combined / jiggled to allow them to better fit in the ESRAM while being larger.
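
To put rough numbers on the ESRAM angle: assuming the XB1's 32MB of ESRAM and a typical deferred set-up (Destiny's actual render target layout isn't public, so the formats and counts below are guesses):

```python
# Back-of-envelope ESRAM budgeting with assumed target formats.
ESRAM_MB = 32  # the XB1's ESRAM capacity

def rt_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1600, 900), (1920, 1080)]:
    gbuffer = 4 * rt_mb(w, h, 4)   # e.g. four RGBA8 G-buffer targets
    depth   = rt_mb(w, h, 4)       # 32-bit depth/stencil
    total = gbuffer + depth
    print(f"{w}x{h}: {total:.1f} MB of {ESRAM_MB} MB ESRAM")
# 1600x900:  ~27.5 MB -> fits
# 1920x1080: ~39.6 MB -> doesn't fit; targets must be split or repacked
```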

There's also the possibility a little extra performance made the tradeoff between resolution and frame rate tip from "900p at rock solid 30 fps" to "1080p with some occasional dips" instead of being "... with frequent dips".
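
That tipping-point idea is just frame-budget arithmetic. With assumed frame times (the real ones aren't public), a ~10% GPU uplift only rescues frames that were already close to the 33.3ms budget:

```python
# Illustrative frame-budget arithmetic; the GPU times are assumptions.
budget_ms = 1000 / 30                # 33.3 ms per frame at 30 fps
for gpu_ms in (34.0, 36.0, 40.0):
    after = gpu_ms / 1.10            # ~10% more GPU throughput
    verdict = "makes 30 fps" if after <= budget_ms else "still dips"
    print(f"{gpu_ms:.1f} ms -> {after:.1f} ms: {verdict}")
# 34.0 -> 30.9: makes it; 36.0 -> 32.7: makes it; 40.0 -> 36.4: still dips
```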

I do wonder if, above 1600x900, the perceived increase in quality on a 1080p panel starts to diminish until you hit the sweet spot of 1:1 native res. It seems strange that we haven't yet seen anything go above that without hitting 1080p.
 
Based on the latest DF PS4/XB1 face-off video of the Destiny beta, it hasn't; quite the contrary.

The dips were small and occasional, and some could have been related to the CPU. There were also parts where the Xbox (and PS4) were going above 30 fps. I wouldn't rule out frame pacing for some of this.

There was even at least one spot where the PS4 dropped under 30 fps (perhaps during heavy motion blur?). I wouldn't say this means there is no 'headroom' in the PS4 either, btw.
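
On the frame-pacing point: in a 60Hz capture, a locked 30fps game shows every rendered frame persisting for exactly two 16.7ms capture ticks, so the two failure modes look different in the data. A toy classifier (illustrative, not DF's actual tooling):

```python
# Distinguishing a framerate drop from frame-pacing judder in a 60 Hz
# capture. Each entry is how many capture ticks a rendered frame stayed
# on screen: a locked 30 fps game reads 2, 2, 2, ...
def classify(persistence):
    avg = sum(persistence) / len(persistence)
    uneven = any(p != 2 for p in persistence)
    if avg > 2 and uneven:
        return "framerate drop"          # frames genuinely take > 33 ms
    if uneven:
        return "frame-pacing judder"     # 30 fps average, but 1,3,1,3 cadence
    return "locked 30 fps"

print(classify([2, 2, 2, 2]))    # locked 30 fps
print(classify([1, 3, 1, 3]))    # frame-pacing judder
print(classify([2, 3, 3, 2]))    # framerate drop
```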
 
The dips were small and occasional, and some could have been related to the CPU. There were also parts where the Xbox (and PS4) were going above 30 fps. I wouldn't rule out frame pacing for some of this.

There was even at least one spot where the PS4 dropped under 30 fps (perhaps during heavy motion blur?). I wouldn't say this means there is no 'headroom' in the PS4 either, btw.

Yes, DF noticed it, but it happens during a streamed load when fast-travelling, a hiccup present at the exact same place in the PS3 version, so probably not GPU- or CPU-related.

There are no framerate drops during gunfights in the PS4 version, versus 3 already in the XB1 version in this very short video.

But anyway, that face-off video is already moot, as nobody knows how the 1080p XB1 build will run yet. The only thing we know is that they have already cut some effects: a more rudimentary HUD and no more chromatic aberration. But those 2 missing "features" are already a win/win for XB1 IMO.

And Shifty, I still don't understand the irony; by now you should certainly know that English is not my first language :LOL:
 
*AHEM* This is the technical discussion thread, not the "I'm pissed off because my product doesn't look as great as the others now." So don't go bringing up the parity bullshit here.
 
I would love to see DF include power draw in face-offs;
it's a good indication of how much a game stresses the hardware.
Can anyone contact Richard Leadbetter about this? If you can, you are allowed to say you came up with the idea.

For example, I expect Destiny to draw significantly less power compared to other PS4 games like Infamous.
Or even Infamous versus itself, capped and uncapped; there should be a noticeable difference.
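
If anyone wants to try it at home, a logging wall-power meter that exports CSV would be enough. A hedged sketch of the summary step; the file names and the "timestamp,watts" row format are assumptions:

```python
# Summarising wall-power logs for a face-off (hypothetical CSV format).
import csv

def summarize(path):
    watts = []
    with open(path) as f:
        for row in csv.reader(f):
            try:
                watts.append(float(row[1]))
            except (ValueError, IndexError):
                continue  # skip the header or malformed rows
    return min(watts), sum(watts) / len(watts), max(watts)

for label, path in [("Destiny", "destiny_ps4.csv"),
                    ("Infamous (capped)", "infamous_30cap.csv"),
                    ("Infamous (uncapped)", "infamous_uncapped.csv")]:
    lo, avg, hi = summarize(path)
    print(f"{label}: min {lo:.0f} W / avg {avg:.0f} W / peak {hi:.0f} W")
```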
 