Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
I never said it would be difficult to implement. The point was that in the article DF doesn't make it clear whether they are guessing that it is used or whether they know. Take the recent announcement about Diablo 3: it comes out only a week earlier, yet the resolution upgrade is implemented in a day-one patch, not on the disc. I question whether it is used mainly because of the short amount of time since the SDK released, the fact that discs are made weeks in advance of launch, and the minimal increase in resolution. Why bother with such a small increase? DF doesn't clarify anything about the code they are using for analysis.


We'll cover like-for-like performance on Xbox One and PlayStation 4 in a forthcoming pre-launch update, but we're going into that testing with the expectation of very close results. Differences kick in at the resolution level: PS4 hits its 60fps target at full 1080p, while Xbox One currently stands at a curious 912p native resolution - that would be something in the region of 1620x912 (assuming square pixels). The original plan for Xbox One was to ship at 900p, but the June XDK update (returning the Kinect GPU resources to developers) has allowed for a tiny resolution boost - our guess here is that 4A opted to bank the additional resource to help lock down that all-important frame-rate rather than really push the pixel-count. If so, that's the right trade.
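For reference, those resolution figures work out to the following pixel counts (a quick sketch; the 1620x912 figure is DF's own square-pixel assumption from the quote):

```python
# Pixel counts for the resolutions discussed (912p width assumes square pixels).
resolutions = {
    "900p (1600x900)": 1600 * 900,
    "912p (1620x912)": 1620 * 912,
    "1080p (1920x1080)": 1920 * 1080,
}

full_hd = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.1%} of 1080p)")
```

So the 900p-to-912p bump is only about 2.6% more pixels, while the gap to full 1080p remains roughly 29%, which is why 4A banking the leftover resource on frame-rate looks like the sensible call.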

Sounds like they know to me... the only Eurogamer assumption being that 4A felt the resources were better spent on locking the frame-rate rather than pushing above 912p.
 
Metro Redux

This game can show us the potential of those consoles compared to old gen:

On PS3 the game ran at 1152x640 at 30fps; on PS4 it's already 1080p at a locked 60fps with better assets and lighting. If we account for the fact that the PS3 game was often hovering at ~25fps with constant screen tearing, the PS4 game is already pushing ~6.75x more pixels per second than the PS3 version (compare that with Destiny on PS4, which pushes only ~3.25x more pixels than on PS3; the difference is staggering, and Destiny on PS4 should really run at 60fps...).
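That ~6.75x figure checks out if you compare pixels rendered per second at the stated resolutions and frame-rates (using the poster's ~25fps estimate for PS3):

```python
# Pixel throughput (pixels rendered per second) at the figures stated above.
ps3 = 1152 * 640 * 25   # Metro: Last Light on PS3, often hovering around 25fps
ps4 = 1920 * 1080 * 60  # Metro Redux on PS4, locked 60fps

print(ps4 / ps3)  # → 6.75
# Against PS3's nominal 30fps target the ratio would be 5.625 instead.
```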

And the game on old gen was already really well optimized and probably pushed both systems to their limits, specifically the PS3's, as the PS3 version even won DF's face-off against the X360 version. Metro: Last Light on PS3 was one of the most technically impressive FPS games I had played on the console, on par with Killzone 3 and Far Cry 3 in some open outdoor levels.

Kudos to 4A Games for the impressive work they have done!
 
Sounds like they know to me... the only Eurogamer assumption being that 4A felt the resources were better spent on locking the frame-rate rather than pushing above 912p.

Either way it is fine with me. I'm hyped for this game as it is.
I'm glad DF has good things to say about it.
 
I'm looking forward to the PC version of Redux and their analysis; if you look at the original Last Light on PC, on a GPU comparable to the PS4's, even the Low preset is not enough for 60fps.

[image: metro_7850_settings.png]


while CPU performance was not a problem
http://pclab.pl/zdjecia/artykuly/chaostheory/2013/05/metro/metro_cpu_vhigh_1280_gtx680.png
 
About the latest DF SSD article on X1, which shows the expected small gains from an SSD: is this due partly to the consoles using an old SATA standard that bottlenecks them, as I recall?

It would seem to me that consoles could see big performance gains if they embraced SSDs, perhaps by being built with them in mind. With SSD speeds and sizes skyrocketing and prices plummeting in recent years, are they missing the boat?

I realize SSDs benefit more from seek times than raw throughput, but even the throughput should be a hefty increase by now.

I've always envisioned a more expensive "super elite" type SKU with a speedy 128GB or 256GB SSD built in (paired with a 500GB+ mechanical drive, I suppose, nowadays).
 
It was a problem once the CPU gets weak enough, though, which console CPUs are. An i3 3220 probably trounces the Jaguars: four threads at 3.3GHz and much higher IPC.

Obviously it wasn't for the remaster though.

The fact remains that it runs at 60fps on both consoles, on PS4 at full 1080p, which is quite a bit higher than on those graphs, for whatever reason. Perhaps they scaled back on some effects to ensure a stable framerate.
 
Original story:

Evidently, this isn't just down to the PS4's GPU struggling to render multiple enemies on-screen or complex scenes. Even with huge bursts of alpha effects and dozens of enemies flying from all angles, the console regularly proves it can hold up at 60fps where it counts - even in co-op multiplayer. This suggests a GPU bottleneck from running at 1080p is, in all likelihood, unlikely to be the cause. As a possibility, these dips may come down to Diablo 3's managing of background processes; CPU-side calculations, or asset streaming beyond the player's field of view.

After Blizzard contacts them:

So it appears as though elements of the renderer operate at a pure 60fps, while the HDMI standard actually incorporates the legacy NTSC drop-frame standard, running at 59.94fps.
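As a rough sketch of why that 60 vs 59.94Hz mismatch matters: a renderer producing exactly 60 frames per second into a 59.94Hz output accumulates a surplus of one whole frame roughly every 16.7 seconds, which shows up as a periodic dropped or repeated frame:

```python
# How often a strict 60fps renderer drifts a full frame against 59.94Hz output.
render_fps = 60.0
display_hz = 60000 / 1001  # NTSC drop-frame refresh rate, ~59.94Hz

surplus_per_second = render_fps - display_hz     # extra frames accumulated each second
seconds_per_dropped_frame = 1 / surplus_per_second

print(round(seconds_per_dropped_frame, 2))  # → 16.68
```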

This turn of events has two morals. Early analysis can help the devs fix something before release (yea!), and when presented with no evidence DF just makes stuff up rather than say "We don't know" (boo!).
 
This turn of events has two morals. Early analysis can help the devs fix something before release (yea!), and when presented with no evidence DF just makes stuff up rather than say "We don't know" (boo!).

Not sure I follow the second part of your statement. What was made up?
 
and when presented with no evidence DF just makes stuff up rather than say "We don't know" (boo!).

From the original story:

these dips may come down to Diablo 3's managing of background processes; CPU-side calculations, or asset streaming beyond the player's field of view
If you make stuff up, like they did, at least consider all the plausible historical reasons. They didn't even think about the possibility of a bug. Why not? They certainly aren't stupid.

It's not like some frame-pacing or frame-rate issue couldn't be caused by a real bug; didn't they recently review the Destiny beta with the pre-patch frame-pacing issues?

So they probably did think about the possibility of a bug, given Destiny's precedent, but the occasion was perfect to put the Jaguars' different clock frequencies in a DF article for the general public, even though they know that the XB1's CPU is still slower, somehow.

It's conceivable that parts of the engine are hitting a bottleneck here, where the boost to the Xbox One's CPU clock speeds (up to 1.75GHz per core, as opposed to the 1.6GHz on PS4) could possibly make an impact

They insinuate (for the general public) that PS4's CPU is weaker when they certainly know that it's not true.
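For context, the clock gap the article highlights is fairly small in relative terms (quick arithmetic on the figures from the quote):

```python
# Relative clock-speed advantage of Xbox One's CPU over PS4's, per the article.
xb1_clock = 1.75  # GHz per Jaguar core
ps4_clock = 1.60  # GHz per Jaguar core

advantage = xb1_clock / ps4_clock - 1
print(f"{advantage:.1%}")  # → 9.4%
```

Whether a ~9% clock advantage on otherwise identical Jaguar cores could plausibly explain visible frame-rate dips is exactly the point being disputed here.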
 
Not sure I follow the second part of your statement. What was made up?

Their original theory. They had no clue, so they just pulled some stuff out of the air and fed the console warriors across the web. Pure click bait. I'm glad they helped Blizzard, but ultimately the story left egg on their face.
 
Digital Foundry sometimes takes stabs in the dark, and I don't know if it's always appropriate. It's not as if they're saying, this is 100% the reason for the problem and turning out to be wrong. It's a fine line, I guess. Sometimes I think it would be better for them to just say, "We don't know what it is. We are trying to contact the developer," and leave it at that.

For me, the bigger question is why you'd bother doing a performance analysis on a game that you know is going to receive significant patches on day one. It's purely a click-bait article. You're not reviewing the product people are going to buy, so there's no point in doing it.
 
Their original theory. They had no clue, so they just pulled some stuff out of the air and fed the console warriors across the web. Pure click bait. I'm glad they helped Blizzard, but ultimately the story left egg on their face.

Your issue is with this statement?

As a possibility, these dips may come down to Diablo 3's managing of background processes; CPU-side calculations, or asset streaming beyond the player's field of view.

That's pretty clearly labeled as a guess: "As a possibility, these dips may come down to..." after which they list three possibilities. Perhaps I'm failing to understand why this would excite the reader (console warriors?) or leave egg on DF's face. I think there's an undertone here that I'm completely overlooking.
 
*ahem* This is not a versus thread so stop pissing in each others' cheerios just to show how steadfast your view of the world is.
 
I’m glad that Digital Foundry does these types of beta/alpha previews… it gives us some early insight into a game’s progress. I never considered their versus articles click-bait for the most part… maybe a little eye-catching with the article titles… but that’s true of most publications trying to get readers’ attention. Anyhow, I’d rather they spot issues early on than later.
 