Digital Foundry Article Technical Discussion Archive [2014]

In the high-quality 1080p PlayStation 4 vs Xbox One video downloadable from Eurogamer (172.2MB), at 4:55 even the "COVER" HUD text displayed during gameplay is clearly upscaled on Xbox One (right), while the PS4 "COVER" hint is native:


At this precise moment Lara is running, so it's confirmed gameplay. The textures, geometry and assets are also obviously upscaled; it's easily visible on the board right next to Lara.

Here is another HUD text comparison from a moment when the XB1 image is native; the text on XB1 is identical to the PS4's and looks just as native:


It's just about being exact: the game dynamically changes the resolution in the XB1 version during gameplay too, and Digital Foundry missed it.

Congrats, you found the needle in the haystack.

I'm still not sure whether that's coming out of a cutscene, though.

The actual clip in the video is literally only 2-3 seconds long and starts right there. My guess, given the onscreen "COVER" prompt, is that it's coming out of a cutscene. It could even be a bug.

I'm going to say gameplay is 1080p if DF (and GAF's fine-toothed comb) didn't notice any other instances. DF has a ton of like-for-like comparison shots; does even one gameplay example show a resolution difference?

If it's only that one instance, labelling the XBO version with the broad stroke of "dynamic resolution" during gameplay seems simply untrue.

DF has certainly outed dynamic resolutions in games before, so I think they'd spot it.

but you are getting double the amount of temporal information,

Not double; the PS4 gameplay average was 50.98 FPS, not 60. The XBO average could also be significantly higher if it weren't capped (not that it matters, since that would have caused judder despite the extra "temporal information").

I do wish we still had Lens of Truth or something similar around for a different, even if ham-handed, look at multiplats. It's kinda scary that DF is literally our only source for FPS info, even if they do a great job.
 
Found in a playthrough: the 900p XO TR DE gameplay does appear to be coming out of a cutscene, FWIW.

http://www.youtube.com/watch?v=IgENZRWxH1E&t=10m20s

Also, DF's "enhanced frame-rate analysis" is up (not sure if it's already been posted). It seems to include frame times too, but I can't make much of it.

http://www.youtube.com/watch?v=RWWtm4Wq9QU


It's not exactly that advanced; it pretty much just tells you exactly when a frame was a duplicate of the last one, i.e. there was no frame refresh.

If a frame takes 16ms, the game is running at 60 fps.
If the frame time jumps to 33ms, that frame was a duplicate: a dip for the PS4, but the norm for the XB1.
Another jump (which only happens on the XB1 in this case) takes it to ~60ms.

So the changes you see in the "advanced frame-rate" graph show you where the load is high, in a more precise manner.
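
Here's a minimal sketch of that duplicate-counting idea in Python (my own illustration, not DF's actual tooling; the frame representation and the 60Hz capture rate are assumptions):

```python
# Derive per-frame times from a 60Hz capture by counting how many
# captured frames each unique game frame persists for.
CAPTURE_INTERVAL_MS = 1000.0 / 60.0  # one captured frame every ~16.7ms

def frame_times(frames):
    """For a list of captured frames (any hashable representation),
    return the apparent frame time in ms of each new frame, found by
    counting duplicates of the previous one."""
    times = []
    run = 1
    for prev, cur in zip(frames, frames[1:]):
        if cur == prev:      # duplicate: no frame refresh this tick
            run += 1
        else:                # new frame: close out the previous run
            times.append(run * CAPTURE_INTERVAL_MS)
            run = 1
    times.append(run * CAPTURE_INTERVAL_MS)
    return times

# 'B' persisted for two capture ticks -> a 33.3ms frame
print(frame_times(['A', 'B', 'B', 'C']))  # [16.7, 33.3, 16.7]
```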
 
It's not exactly that advanced; it pretty much just tells you exactly when a frame was a duplicate of the last one, i.e. there was no frame refresh.
Nope. You can check back on Richard Leadbetter's early discussions on this forum about how to measure frame rate. When a game isn't v-locked he has to count partial frames, which means spotting the tear position and getting a precise millisecond count to the new frame.
 
Nope. You can check back on Richard Leadbetter's early discussions on this forum about how to measure frame rate. When a game isn't v-locked he has to count partial frames, which means spotting the tear position and getting a precise millisecond count to the new frame.

Ah, I didn't see that. On closer inspection it makes more sense, regarding the torn frames.
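
To illustrate the tear-position idea, here's a hypothetical sketch (my own guess at the approach, not DF's actual tooling; the row-diff heuristic and all names are assumptions). Compare each captured frame to the previous one scanline by scanline: the first changed row marks the tear, and its height fraction gives the sub-frame timing.

```python
import numpy as np

def tear_row(prev_frame, cur_frame, threshold=8.0):
    """Return the first scanline where cur_frame stops matching
    prev_frame (the tear position), or None if the frame is identical
    or entirely new. Frames are (H, W, 3) uint8 arrays."""
    # mean absolute difference per scanline
    row_diff = np.abs(cur_frame.astype(np.int16) -
                      prev_frame.astype(np.int16)).mean(axis=(1, 2))
    changed = np.nonzero(row_diff > threshold)[0]
    if changed.size == 0 or changed[0] == 0:
        return None  # identical frame, or a clean (untorn) flip
    return int(changed[0])

def partial_frame_ms(row, height, capture_ms=1000.0 / 60.0):
    """Fraction of the ~16.7ms scan-out that still showed the old
    frame above the tear, as a millisecond count to the new frame."""
    return capture_ms * (row / height)
```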


I think you meant 16.7ms x 3 = 50ms.

Sorry my bad, it's 50 ms.
 
Some texture improvements on PS4 in comparison with PC, from madfanboy:
[Six comparison screenshots: paired PC vs PS4 texture shots from Tomb Raider: Definitive Edition]
 
The latest COD patch lowers the XB1 frame rate.

http://www.eurogamer.net/articles/digitalfoundry-2014-performance-drops-on-cod-ghosts-patch

Meanwhile, aiming down the barrel of a sniper scope can see significant drops down as far as the mid-30s - something we never encountered before in either next-gen version of Ghosts during multiplayer. As a result of the fluctuating frame-rate we can feel a clear reduction in the responsiveness of the controls, while on a visual level the introduction of stutter impacts the smooth flow traditionally associated with Call of Duty multiplayer action.

On the handful of other maps we tried - both on the standard levels and in the Onslaught DLC - the drops in performance are far less severe, with momentary dips occurring during more hectic moments of play or when explosions and alpha effects are rendered on screen. Controller response is much more solid, and we don't get the feeling that gameplay has been compromised to anywhere near the same extent as we saw on Stonehaven - the experience remains mostly smooth, even though there's a distinct feeling that things aren't quite as stable as they were before the latest patch was installed.

Overall, it's clear that the experience isn't significantly broken across the multiplayer game, even though the update has impacted on the stability of the game to a noticeable degree. With that said, the fact that a patch designed to improve the game can actually significantly degrade performance and negatively affect the gameplay is clearly disappointing, and we would hope that it will be addressed. For the record, we also tested out the PS4 version of Call of Duty: Ghosts online and found no change to the performance level we saw at launch - not as smooth as Xbox One, but with a much higher rendering resolution.

Quite why the Xbox One version appears to run less smoothly than it used to remains unclear. We reached out to Activision and Infinity Ward earlier this week in the hopes of shedding more light on what might be causing these issues, but as of this writing, we've received no comment.
 
The dynamic res doesn't actually seem to kick in even during the heaviest moments of gameplay, though (frame-rate drops below 30).

Or maybe they apply it selectively (not dynamically), only to the most demanding levels/sub-levels (some of the shanty town areas, for instance) and the most demanding cutscenes?

That would explain everything, including the differing frame-rate gaps between the two versions across different levels and cutscenes in the performance videos.
 
Wonder what res the 360 version runs at? I'm assuming it's still 60fps.
 
Wonder what res the 360 version runs at? I'm assuming it's still 60fps.

Mm... yeah. I was thinking they would just be around MW2's res (to fit the 360's 10MB eDRAM) with 2xMSAA as well, but maybe Bluepoint would go lower if they really had to. :s
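
As a rough check on that eDRAM guess (the 1024x600 MW2-style figure and the 4-byte colour + 4-byte depth layout are my assumptions):

```python
# Framebuffer footprint: colour + depth at 4 bytes each per sample,
# multiplied by the MSAA sample count.
def edram_mb(width, height, msaa=1, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

print(edram_mb(1024, 600, msaa=2))  # ~9.4MB -> fits in 10MB eDRAM
print(edram_mb(1280, 720, msaa=2))  # ~14.1MB -> would need tiling
```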
 
Welp, that's pretty impressive of DF (does Al still do their pixel counting?) to nail Titanfall at 792p exactly and then have Respawn confirm it. That's a pretty granular number to pin down, which is tough, as my recent short foray into pixel counting taught me.

Anyway, that's ~1.1m pixels, about 21% over 720p. Not that it matters with the res in flux.
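
The arithmetic, for anyone checking (1408x792 assumes the usual 16:9 frame at 792 lines):

```python
px_792 = 1408 * 792       # 1,115,136 -> ~1.1m pixels
px_720 = 1280 * 720       #   921,600
print(px_792 / px_720)    # ~1.21 -> about 21% over 720p
```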
 
Thank Tom for the very good shots with super long edges. :)

---

It's kind of a suspicious choice considering we just heard about the 8% GPU reservation being freed up, and the resolution is +10% over 720p in line count. It's entirely possible there's no causal relation; "it's not 720p" works in PR's favour.

Even so, in terms of milliseconds, an 8% time slice should mean about 1.33ms of a 16.67ms frame (I think). Pixel load increases by about 21% (it scales with pixel count, not line count), but that's only part of the rendering load, so maybe it works out (I'd expect the frame-rate dips in this game to be fill-limited anyway, though).
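
That time-slice figure, as a quick back-of-envelope check:

```python
frame_ms = 1000 / 60        # ~16.67ms budget at 60 fps
print(0.08 * frame_ms)      # ~1.33ms freed by the 8% reservation
```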

Anyway, the framebuffer should be ~17MB, assuming a forward renderer with 32bpp targets. I was expecting something like 1360x768 with 2xMSAA, since that's neatly around 16MB, and then they could just fit a 2k x 2k shadowmap (16MB) into the rest of the 32MB ESRAM (all speculation).

The talk of increasing the res (from that blog) is kind of hard to believe given the obvious trade-off between MSAA and a higher res. From a scratchpad point of view, even 1080p with no AA has a slightly smaller memory footprint than 792p with 2xMSAA.
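
Working those ESRAM numbers through (32-bit colour plus 32-bit depth per sample; 1408x792 for "792p" is my assumption):

```python
def fb_mb(w, h, msaa=1):
    return w * h * msaa * (4 + 4) / (1024 ** 2)  # colour + depth

print(fb_mb(1408, 792, msaa=2))    # ~17.0MB
print(fb_mb(1360, 768, msaa=2))    # ~15.9MB, leaving room for...
print(2048 * 2048 * 4 / 1024**2)   # ...a 16MB 2k x 2k shadowmap
print(fb_mb(1920, 1080))           # ~15.8MB: 1080p no-AA is smaller
```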

Woulda-coulda...
 
I'm a bit perplexed by the seemingly arbitrary resolutions being used in next-gen games. It's not like there's a rendering-resolution value on the packaging to give consumers an idea of game quality. I doubt 792p looks discernibly better than 720p, so why not just go with a standard resolution and add more stuff or a smoother frame rate?
 
Indeed. :p 792p scales awkwardly in either direction too: 10/11 down to 720p or 15/11 up to 1080p. It's probably worse for folks who still have 720p sets, but I wonder if plain 720p still scales better to 1080p because the ratio is, well... easy (1.5x).
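
Those ratios as exact fractions:

```python
from fractions import Fraction
print(Fraction(720, 792))    # 10/11: downscale to a 720p set
print(Fraction(1080, 792))   # 15/11: upscale to a 1080p set
print(Fraction(1080, 720))   # 3/2: the "easy" 1.5x from plain 720p
```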
 
Really curious to see what the difference from the X360 version is. The Xbox 360 runs the Source engine pretty well. If it's 720p/60fps, or just slightly sub-HD, it has to be a really cut-down version, and not just in textures.

Makes no sense. Would a PC with GPU power similar to the One's run this at 1080p?
 