Digital Foundry Article Technical Discussion Archive [2015]

Status
Not open for further replies.
http://m.neogaf.com/showpost.php?p=158626975

Is the Xbox One fillrate-bound on transparencies? The Xbox One looks GPU-bound in Dark Souls 2 and Borderlands.

Seems fairly indicative, yes. These sorts of transparencies are low on shader complexity, so it's more likely to be ROP/bandwidth limited.

For 32bpp blending though: 16 ROPs × 853 MHz × 4 Bpp × (colour + z) × (read + write) → needs ~213 GiB/s

hm...
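Reproducing the back-of-envelope arithmetic above (my own sketch of the same calculation; a strict recomputation lands in the same ballpark as the quoted ~213 figure, depending on whether you report decimal GB or binary GiB):

```python
# Back-of-envelope bandwidth estimate for 32bpp alpha blending on
# Xbox One: every blended pixel touches colour (read + write for the
# blend) and depth, modelled per the post's (colour+z)*(read+write)
# term.
ROPS = 16             # colour ROPs
CLOCK_HZ = 853e6      # GPU clock
BYTES_PER_ACCESS = 4  # 32bpp colour, 32-bit depth

accesses_per_pixel = 2 * 2  # (colour + z) * (read + write)
bandwidth = ROPS * CLOCK_HZ * BYTES_PER_ACCESS * accesses_per_pixel

print(f"{bandwidth / 1e9:.1f} GB/s")     # prints "218.4 GB/s"
print(f"{bandwidth / 2**30:.1f} GiB/s")  # prints "203.4 GiB/s"
```

Either way the figure is far above what the Xbox One's main DDR3 pool can sustain, which is the point being made about ROP/bandwidth-limited transparencies.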

----

edit: One might wonder whether Dark Souls 2 is using a straight DX11 port from PC. Likewise for Borderlands.

Does MS mandate the latest SDK?
 
Last edited:
I think this is going to be the same issue with The Witcher 3 on XB1. Every XB1 Witcher video I have seen seems to have frame-rate hitching or stuttering issues. I'm surprised DF hasn't done a video analysis yet...
DF did do a frame rate analysis of the E3 footage of The Witcher 3 for the Xbox One.
That was quite a while ago though, so hopefully they have optimised it since.

 
DF did do a frame rate analysis of the E3 footage of The Witcher 3 for the Xbox One.
That was quite a while ago though, so hopefully they have optimised it since.


Great find and memory.

So the lowest recorded frame rate was 18fps, with the majority of drops hitting the mid-20s. Hopefully this year's E3 build will show some vast improvements in performance, without too much sacrifice to IQ.

Edit: I noticed at 1:11 just splashing through the water caused frame drops... while traversing through the forest around 1:23 caused some heavy frame drops.
 
Last edited:
Heh. DF did a video analysis for the patched PS4 Revelations 2.


Mostly much better! Still some issues later on in the vid though... hm... :s The denser forest section seems to be the main problem. Everywhere else seems fine. Kind of bizarre to see the framerate drop at the very end when the explosion happens. If the situation were reversed, you'd think it was just down to fillrate on XO, but that can't be the case here, surely (for PS4).

Maybe they've got something funny going on with the CPU side of particles? Titanfall had some issues with particle rendering too, CPU-related.

GNMX overhead?
 
Last edited:
Digital Foundry: Tech Analysis: Splatoon

We recently had a chance to take a closer look at a near-complete build of Splatoon and, with two months remaining until release, the game is looking remarkably polished. Equipping players with paint guns instead of gaming's typical selection of assault rifles and other high-calibre weaponry, Nintendo has approached the multiplayer arena shooter with fresh eyes, resulting in a new experience that is welcoming to hardcore and casual players alike. Despite its unfamiliarity with the genre, Nintendo has managed to produce a visually striking game that delivers the level of polish one would expect from any of its high-profile releases.
 
Heh. DF did a video analysis for the patched PS4 Revelations 2.


Mostly much better! Still some issues later on in the vid though... hm... :s The denser forest section seems to be the main problem. Everywhere else seems fine. Kind of bizarre to see the framerate drop at the very end when the explosion happens. If the situation were reversed, you'd think it was just down to fillrate on XO, but that can't be the case here, surely (for PS4).

Maybe they've got something funny going on with the CPU side of particles? Titanfall had some issues with particle rendering too, CPU-related.

GNMX overhead?

Watching this: sometimes ~30fps on PS4 versus ~50fps on XB1 in areas where not much is happening (so probably CPU-bound during streamed loading). That's big and very unusual compared to 99% of other multiplatform games, where we can see now that CPU-bound moments perform rather similarly on both consoles (as they should anyway).

I am really thinking that the culprit could be bad (really bad) CPU code; maybe Capcom has big problems with the code generation from the PS4 tools (Clang using LLVM), which are completely different from the XB1 tools and from the good old GCC also used in the previous generation of consoles, PS3 included.

And that's their flagship engine, MT Framework... not good for future Capcom titles on PS4...
 
Maybe. I think this has more to do with XB360/PS3 being in the mix...
That would affect the general gameplay scope and the quality of assets, sure.

I am really thinking that the culprit could be bad (really bad) CPU code; maybe Capcom has big problems with the code generation from the PS4 tools (Clang using LLVM), which are completely different from the XB1 tools and from the good old GCC also used in the previous generation of consoles, PS3 included.

Yikes. :/

And that's their flagship engine, MT Framework... not good for future Capcom titles on PS4...
Well, it's essentially a budget release.

They're probably just reusing whatever they had for RER1, and downporting the PC version of the engine to consoles. One would hope that the team working on RE7 is mostly separate (aside from story continuity folks).
 
Watching this: sometimes ~30fps on PS4 versus ~50fps on XB1 in areas where not much is happening (so probably CPU-bound during streamed loading). That's big and very unusual compared to 99% of other multiplatform games, where we can see now that CPU-bound moments perform rather similarly on both consoles (as they should anyway).

I am really thinking that the culprit could be bad (really bad) CPU code; maybe Capcom has big problems with the code generation from the PS4 tools (Clang using LLVM), which are completely different from the XB1 tools and from the good old GCC also used in the previous generation of consoles, PS3 included.

And that's their flagship engine, MT Framework... not good for future Capcom titles on PS4...

MT Framework is Capcom's last-gen engine. The new engine is the Deep Down one.

Edit: the Panta Rhei engine
 
Last edited:
CPU optimization is such a rarely talked-about subject compared to graphics code that I would really welcome a topic about it, to get an idea of what can generally cause the frame time to go over budget, etc.
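For anyone unfamiliar with "over budget": each target frame rate implies a fixed per-frame time budget, and any frame whose combined work exceeds it misses the target. A trivial sketch (numbers are illustrative, not from any specific game):

```python
# Frame-time budgets implied by common target frame rates: a frame
# that takes longer than its budget misses its deadline and the
# displayed rate drops.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 30, 20):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")

# e.g. a nominal 30fps game spending 40 ms on a frame has gone
# ~6.7 ms over its 33.33 ms budget; with double-buffered vsync on a
# 60 Hz display, that frame is held for three refreshes, i.e. 20 fps.
```

This is why the CPU-bound drops discussed above matter: a few milliseconds of extra CPU work per frame is enough to push a locked 30fps game into visible stutter.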
 
Heh. DF did a video analysis for the patched PS4 Revelations 2.
...
GNMX overhead?

GNMX overhead causing almost 100% slower performance during quite sustained CPU-bound scenes on the PS4 version, compared to the XB1 version which uses DirectX?

Yikes. :/
 
CPU optimization is such a rarely talked about subject compared to graphics code that I would really welcome a topic about it and get an idea of what generally can cause the frame time to go over budget etc.

Forum search: sebbbi, cpu, simd, vector, vmx128, fmad, etc. :cool:

*cough*
 
Lol, that's one aspect. Yes, that was an excellent read. But I would like to see more than just game code: other things like setting up streaming and so on. SoA vs. AoS tends to be the biggest topic.
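Since SoA vs. AoS keeps coming up: a minimal Python sketch of the two layouts (the particle fields are my own illustrative example; real engines do this in C/C++ with SIMD intrinsics, but the layout idea is the same):

```python
from array import array

# AoS (array of structures): one record per particle. Updating
# positions drags every field of every record through the cache,
# which is hostile to SIMD and bandwidth.
particles_aos = [
    {"x": float(i), "y": 0.0, "vx": 1.0, "vy": 2.0} for i in range(4)
]
for p in particles_aos:
    p["x"] += p["vx"]
    p["y"] += p["vy"]

# SoA (structure of arrays): one contiguous array per field.
# Updating positions streams through exactly the data it needs,
# which is the access pattern SIMD units want.
xs = array("f", (float(i) for i in range(4)))
ys = array("f", [0.0] * 4)
vxs = array("f", [1.0] * 4)
vys = array("f", [2.0] * 4)
for i in range(len(xs)):
    xs[i] += vxs[i]
    ys[i] += vys[i]

print([p["x"] for p in particles_aos])  # [1.0, 2.0, 3.0, 4.0]
print(list(xs))                         # [1.0, 2.0, 3.0, 4.0]
```

Both layouts compute the same result; the difference only shows up in cache behaviour and vectorisability, which is exactly the kind of CPU-side detail the posts above are asking to see discussed more.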
 
I thought the latest Naughty Dog presentation was very enlightening ...
 