Digital Foundry Article Technical Discussion Archive [2014]

Destiny beta comparison between PS3 and PS4:

http://www.eurogamer.net/articles/digitalfoundry-2014-vs-destiny-ps3-beta

Baffled by the comment about water not reflecting bodies and effects, as the game uses screen-space reflections. When the SSR is hitting, it *ought* to reflect bodies just fine, and when there are misses, it looks like it fades to something resembling an extraordinarily simplified planar reflection, which fails to reflect a lot more than just bodies.
 
When the SSR is hitting, it *ought* to reflect bodies just fine

Strange... I'm pretty sure the water reflects the other player in the original demonstrations, particularly when the other player runs ahead in the tunnel with the giant fan.

I've only spent a small amount of time with the PS3 version & solo, so I haven't really checked it out.
 
DF were looking at the relative performance of the hardware.

There is no way to predict what frame rate caps or resolutions developers will target.

Caps and resolution selection immediately mean you are not directly comparing the performance of the hardware.
 
I was reading this article again:
http://www.eurogamer.net/articles/digitalfoundry-can-xbox-one-multi-platform-games-compete-with-ps4

And it's strange how wrong it was in hindsight.
We have had 100% resolution or framerate increases in actual released games; none of the examples in the article predicted such a worst-case scenario. In other games, like Outlast, there is no difference at all, so in that case DF was wrong as well.
There can be more subtle things, harder to measure on a single simple percentage scale: draw distance, LOD, additional or higher-quality effects. The more a game is adapted to both platforms, the harder and more useless it is to measure just how many pixels appear on the screen in a second.

The article was about the relative performance of the same games - down to every bit of code. And it never claimed to be more than that.
It's like pointing out that no American family has 3.14 kids lying around, even though on average they should.
 
Destiny beta comparison between PS3 and PS4:

http://www.eurogamer.net/articles/digitalfoundry-2014-vs-destiny-ps3-beta

Baffled by the comment about water not reflecting bodies and effects, as the game uses screen-space reflections. When the SSR is hitting, it *ought* to reflect bodies just fine, and when there are misses, it looks like it fades to something resembling an extraordinarily simplified planar reflection, which fails to reflect a lot more than just bodies.

Ok, so I've looked at it a bit more. Everything can reflect. It's just that the angles really matter (and basically view cam distance). It's an odd one to see.
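
For anyone wondering what "angles really matter" means here: SSR can only return what is already rendered on screen, so a reflection ray that marches off the edge of the frame before crossing the depth buffer is a guaranteed miss. A tiny sketch of the idea (illustrative C++ only, nothing to do with Bungie's actual implementation; the depth values and ray parameters are made up):

[code]
// Illustrative sketch of a screen-space reflection march (not Bungie's code).
// The reflected ray is stepped through the depth buffer; if it crosses stored
// geometry we have a "hit" and can reflect that pixel, if it walks off screen
// first it's a "miss" and the renderer must fade to a cheaper fallback.
#include <cstdio>

const int W = 16, H = 16;
float depth[H][W]; // hypothetical depth buffer: 0 = near camera, 1 = far

// March from (x, y, z) along (dx, dy, dz) in screen space.
bool traceSSR(float x, float y, float z,
              float dx, float dy, float dz, int* hx, int* hy)
{
    for (int step = 0; step < 64; ++step) {
        x += dx; y += dy; z += dz;
        if (x < 0 || x >= W || y < 0 || y >= H)
            return false;                       // left the screen: miss
        if (z >= depth[(int)y][(int)x]) {       // crossed scene depth: hit
            *hx = (int)x; *hy = (int)y;
            return true;
        }
    }
    return false;                               // ray exhausted: miss
}

int main()
{
    // Flat "water plane" scene at depth 0.5 with a "body" in column 10.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            depth[y][x] = (x == 10) ? 0.2f : 0.5f;

    int hx, hy;
    // A favourable angle: the ray reaches the body while still on screen.
    if (traceSSR(4, 8, 0.10f, 0.5f, 0.0f, 0.01f, &hx, &hy))
        printf("hit at (%d,%d): reflect that pixel\n", hx, hy);

    // A grazing angle near the screen edge: the ray exits before hitting.
    if (!traceSSR(14, 8, 0.10f, 0.5f, 0.0f, 0.01f, &hx, &hy))
        printf("miss: fade to the simplified fallback reflection\n");
    return 0;
}
[/code]

Grazing view angles and distant reflectors make the ray skim across many pixels before it can cross the depth buffer, so it runs off screen more often, which would explain reflections dropping out with camera angle and distance.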

---

Shadows on 360 have a jittered filter too.
 
Beta Beta Beta - Remember just Beta!

Tech Analysis: Destiny beta on Xbox One

While the additional 10 per cent of rendering resources afforded by the recent Kinect-unbundling XDK update are welcome, it does not offer up enough GPU horsepower to bridge the gap between 900p and 1080p rendering.

But even in the Xbox One's improved 1080p build, one nit-pick stands out between the next-gen consoles; a change to the HUD. On PS4 we see a three-dimensional curved display for ammo and health gauges, backed by a chromatic aberration after-effect. The Xbox One, meanwhile, adopts the flat designs also laid across the last-gen versions' screens, delivered without the extra visual twist. This is obviously a minuscule detail in the grand scheme of what the game achieves, and largely comes down to personal preference.

Meanwhile, the unleashing of the Xbox One beta coincides with some good news for PS4 owners. As reported in our earlier hands-on, the PS4 beta ran with a frame-pacing issue that impacted the smoothness of the game in motion. A perfectly smooth 30fps game will deliver each frame every 33ms - but the earlier code could see them arrive at 16ms, 33ms or even 50ms time intervals, resulting in stutter and the feeling that the game was running at a lower, less consistent frame-rate. However, both versions are beneficiaries of a patch - effective as of the beta's return on Wednesday - that corrects the issue. Simply put, Bungie's engineers have forced the engine to output one unique frame followed by one duplicate, producing an even 30fps experience throughout with no stuttering.
 
How does the frame-pacing bug happen in the first place, and will it affect a load more games?

RE: Need for Speed Rivals:

"The explanation for the frame-pacing is a bug in our settings, Even though the gameplay code is configured to run the game at 30Hz, the present interval of the renderer is incorrectly set to 1, which means that the renderer can present a frame at any multiple of 1/60 seconds. This will be fixed in an upcoming patch."
 
Destiny's 1080p upgrade may not be in the beta code, but it does indeed exist, and IGN recently posted a short video showing it in action. By matching select stills from this footage, we see the gap in image quality is mostly bridged when compared with the PS4's output - tree details, in particular, now appearing crisp rather than interpolated. Some far-off, intricate detail isn't quite so well defined, but this may well be down to the fact that the 1080p images are resolved from compressed video. We chose shots that were as static as possible in order to reduce compression artefacts - but only lossless captures will show the full picture.
So because IGN got the exclusive 1080p capture, and without a native capture of their own, DF uses the IGN video screen cap for comparison... just because?
 
So because IGN got the exclusive 1080p capture, and without a native capture of their own, DF uses the IGN video screen cap for comparison... just because?

I'd say it's mostly because if they didn't discuss the 1080p IGN screens they'd get accused of being SonyPonies or what have you. It's damned if you do, damned if you don't, to my mind; they clearly highlighted why it was mostly a useless exercise.
 
What defines the render interval? Is that an AMD 'driver' setting, or something specific to Sony's SDK? I suppose a few high profile cases like these will ensure devs remember to set it correctly. Seems a weird feature though. Asynchronous output. What for? AMD's flexible refresh rate tech? If so, are we looking at hardware support in the consoles for variable refresh on a suitable monitor?
 
What defines the render interval? Is that an AMD 'driver' setting, or something specific to Sony's SDK? I suppose a few high profile cases like these will ensure devs remember to set it correctly. Seems a weird feature though. Asynchronous output. What for? AMD's flexible refresh rate tech? If so, are we looking at hardware support in the consoles for variable refresh on a suitable monitor?

FWIW, AMD Catalyst (CCC) has a specific setting dealing with frame pacing. I'm not 100% sure I remember seeing a frame-pacing setting in Nvidia's control panel. It's weird though; frame-pacing issues seem more common across AMD GPUs. But it could be strictly a driver/software issue more so than hardware.
 
I'd say it's mostly because if they didn't discuss the 1080p IGN screens they'd get accused of being SonyPonies or what have you. It's damned if you do, damned if you don't, to my mind; they clearly highlighted why it was mostly a useless exercise.

If the source material is not good enough, then just don't do it...

Can anyone honestly say the middle image looks more 'crisp' than the other 2?

[image: SGPU0Wp.jpg - three-way comparison shot]
 
What defines the render interval? Is that an AMD 'driver' setting, or something specific to Sony's SDK? I suppose a few high profile cases like these will ensure devs remember to set it correctly. Seems a weird feature though. Asynchronous output. What for? AMD's flexible refresh rate tech? If so, are we looking at hardware support in the consoles for variable refresh on a suitable monitor?

[strike]I thought it had something to do with the tick rate in the engine (for physics/animation etc. vs render rate). Maybe it was something else. Dunno. :oops:[/strike]
 
Can anyone honestly say the middle image looks more 'crisp' than the other 2?
There's often a lot to be seen even in images with compression artifacts. Anything that's sufficiently low-frequency will stand out (the presence of certain details and effects, some aliasing including on poor shadows, some LOD discrepancies, etc.), and you could get a sense for performance if you have a video.

Nobody is arguing that it's a perfect comparison, but as long as there's the disclaimer that it's compressed footage, I'm not seeing that it's a huge issue.
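
That low-frequency point is easy to demonstrate: DCT-based codecs quantise away mostly the high-frequency coefficients, so coarse structure survives compression while fine detail does not. A toy 1D version (illustrative C++ only, a crude stand-in for what a real codec does to a row of pixels):

[code]
// Toy demo (not a real codec): DCT-based compression keeps low frequencies
// and discards high ones, which is why coarse detail survives in compressed
// footage while fine detail (the stuff resolution comparisons need) does not.
#include <cstdio>
#include <cmath>

const int N = 8;
const double PI = 3.14159265358979323846;

int main()
{
    // A "row of pixels": a coarse ramp (low frequency) plus a fine
    // alternating pattern (high frequency).
    double x[N], X[N], y[N];
    for (int n = 0; n < N; ++n)
        x[n] = 10.0 * n + ((n % 2) ? 4.0 : -4.0);

    // Forward DCT-II.
    for (int k = 0; k < N; ++k) {
        X[k] = 0.0;
        for (int n = 0; n < N; ++n)
            X[k] += x[n] * cos(PI / N * (n + 0.5) * k);
    }

    // "Compression": throw away the top half of the frequency coefficients.
    for (int k = N / 2; k < N; ++k) X[k] = 0.0;

    // Inverse DCT (DCT-III with the matching normalisation).
    for (int n = 0; n < N; ++n) {
        y[n] = X[0] / N;
        for (int k = 1; k < N; ++k)
            y[n] += 2.0 / N * X[k] * cos(PI / N * (n + 0.5) * k);
    }

    // The ramp survives; the fine alternation is smoothed away.
    for (int n = 0; n < N; ++n)
        printf("pixel %d: original %6.1f  decoded %6.1f\n", n, x[n], y[n]);
    return 0;
}
[/code]

The output shows the ramp largely intact while the fine alternation is gone, which is why LOD and effects differences can still be judged from compressed footage even though sub-pixel resolution differences can't.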
 
What defines the render interval? Is that an AMD 'driver' setting, or something specific to Sony's SDK? I suppose a few high profile cases like these will ensure devs remember to set it correctly. Seems a weird feature though. Asynchronous output. What for? AMD's flexible refresh rate tech? If so, are we looking at hardware support in the consoles for variable refresh on a suitable monitor?

The quote seems to put this on the renderer, not the system. The renderer seems to be monitoring a timer and waiting to present a frame at the defined interval, but the interval is set too short, allowing the renderer to present the next frame before the 1/30 sec that the rest of the game assumes has passed. That means the frame comes too soon, and because of the mismatch a subsequent frame may not be available in time for the appropriate 1/30 increment.
 
While the additional 10 per cent of rendering resources afforded by the recent Kinect-unbundling XDK update are welcome, it does not offer up enough GPU horsepower to bridge the gap between 900p and 1080p rendering.

Curious quote. If they haven't tested the 1080p version of the game, how do they know there isn't enough GPU horsepower to "bridge the gap"?
 