How should devs handle ports between consoles? *spawn

And yet an industry veteran like the Metro lead programmer finds the impact of GPU memory access on CPU performance to be a real problem. Even for developers who have clearly been through optimisation 101, it's still an issue.

The claims that AC is intentionally leaving zero-effort-to-attain performance on the table because MS and Ubisoft are evil may or may not be true. The claims that GPU load has no impact on CPU performance (and vice versa) are, however, false.

You can't live in the cache forever... :D
 
And yet an industry veteran like the Metro lead programmer finds the impact of GPU memory access on CPU performance to be a real problem. Even for developers who have clearly been through optimisation 101, it's still an issue.

Unless I missed something, the issues are only on the Xbox One, and they impact the 'optimisation 101' principle of cramming everything into cache. Let's go through them:

Oles Shishkovstov said:
Well, you kind of answered your own question - PS4 is just a bit more powerful. You forgot to mention the ROP count, it's important too - and let's not forget that both CPU and GPU share bandwidth to DRAM [on both consoles]. I've seen a lot of cases while profiling Xbox One when the GPU could perform fast enough but only when the CPU is basically idle. Unfortunately I've even seen the other way round, when the CPU does perform as expected but only under idle GPU, even if it (the CPU) is supposed to get prioritised memory access. That is why Microsoft's decision to boost the clocks just before the launch was a sensible thing to do with the design set in stone.

Whatever version of the XDK/OS they used, prioritisation of CPU/GPU access to RAM was broken.

Oles Shishkovstov said:
Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.
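
To picture what 'a few DWORDs written into the command buffer' means, here's a purely illustrative sketch - hypothetical opcode and packet layout, not the real PS4/GNM format - of a thin submission path where a draw call is literally four 32-bit stores:

Code:
/* Illustrative only: a made-up command packet, not any real console
 * format. The point is the cost model: on a thin API a draw call is
 * a handful of DWORD stores into a ring buffer. */
#include <stdint.h>
#include <stdio.h>

enum { CMD_DRAW_INDEXED = 0x10 };        /* hypothetical opcode */

static uint32_t ring[1 << 16];           /* command ring buffer */
static uint32_t wr;                      /* write cursor */

/* A draw call: four 32-bit stores, i.e. a few CPU clock cycles. */
static void draw_indexed(uint32_t index_count, uint32_t first_index,
                         uint32_t base_vertex)
{
    ring[wr++] = CMD_DRAW_INDEXED;
    ring[wr++] = index_count;
    ring[wr++] = first_index;
    ring[wr++] = base_vertex;
}

int main(void)
{
    for (uint32_t i = 0; i < 1000; i++)  /* 1000 draws = 16KB of stores */
        draw_indexed(36, 0, i * 24);
    printf("emitted %u dwords\n", (unsigned)wr);
    return 0;
}

A thick API, by contrast, runs reference counting, state validation, hazard tracking and so on per call, and all of that code and its tables pass through the same caches.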

This issue is particularly problematic because mammoth, complicated DirectX API calls contaminate the cache, and you've only got 2MB of L2 per cluster to start with. When you have so little shared between four cores (and two cores of one cluster are reserved for the OS, which is doing Dog-knows-what to its pool of cache), it's that much more difficult to optimise.
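
You can watch that cache cliff directly with a crude working-set sweep - a minimal sketch, assuming a POSIX clock (build with cc -O2); apparent throughput collapses once the working set no longer fits in cache:

Code:
/* Crude working-set sweep: repeatedly touch one byte per 64-byte
 * cache line over sets of increasing size. Expect a sharp drop in
 * throughput once the set exceeds the L2 (e.g. 2MB). */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    enum { MAX = 32 << 20 };                    /* sweep up to 32MB */
    unsigned char *buf = malloc(MAX);
    for (int i = 0; i < MAX; i++) buf[i] = (unsigned char)i;

    for (size_t size = 64 << 10; size <= MAX; size *= 2) {
        size_t passes = (256u << 20) / size;    /* ~256MB touched per test */
        unsigned long long sum = 0;
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (size_t p = 0; p < passes; p++)
            for (size_t i = 0; i < size; i += 64)
                sum += buf[i];                  /* one read per line */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("%6zu KB: %8.1f MB/s of lines streamed (sum %llu)\n",
               size >> 10, passes * (double)size / secs / 1e6, sum);
    }
    free(buf);
    return 0;
}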

Presumably these issues will get resolved and optimisation 101 will get easier on Xbox One.

The claims that AC is intentionally leaving zero-effort-to-attain performance on the table because MS and Ubisoft are evil may or may not be true. The claims that GPU load has no impact on CPU performance (and vice versa) are, however, false.

I can't imagine a case where increasing resolution would have no impact on the CPU; that basically seems impossible. But the degree to which the CPU is affected (computational workload, available bandwidth, etc.) will remain unknown unless Ubisoft decide to explain the workings of their engine (AnvilNext). It could be a lot; it could be very little.

You can't live in the cache forever... :D
You can't, and you can't live forever either, but it's nice to live in your optimum environment for as long as possible! ;)
 
(Still on damn phone).

Yeah, Xbox One has issues caused by DRAM contention. But it's not the only platform.

Even with the CPU prioritised, a more saturated shared memory bus will lead to longer access times for the CPU, so more stalls, lower utilisation, etc. Also, on PS4, CPU access to DRAM drops the bandwidth available to the GPU disproportionately - probably because of prioritised CPU access.

I expect both PS4 and Bone show this behaviour actually. ESRAM may alleviate this more or less depending on how it's used. Point is, lowering resolution might help with CPU bottlenecks.

Will look for some of the APU tests tomorrow; they showed the same thing, iirc.

Phones are not my optimal environment. :(
 
(Still on damn phone).

Yeah, Xbox One has issues caused by DRAM contention. But it's not the only platform.

I'm not denying that some contention for RAM access will happen on both platforms; my post was intended to address your claim that Metro's developers complained about this. They didn't actually mention contention. They were complaining about a) a bug (no CPU prioritisation) which caused an issue that looks like contention (but isn't), and b) general API overhead making some graphics functions on Xbox One much slower than on PlayStation 4. A byproduct of the latter is that complex APIs contaminate the cache, which relates to my initial point about trying to operate within it as much as possible.
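
For what it's worth, the underlying effect is easy to demonstrate on any shared-bus PC. A crude sketch, assuming Linux and pthreads (cc -O2 contention.c -lpthread): one thread streams writes through a large buffer as a stand-in for a bandwidth-hungry GPU client, while the main thread times dependent loads from DRAM. Expect the ns/load figure to climb once the hog is running:

Code:
/* Crude DRAM contention demo (assumes Linux, pthreads, 64-byte lines).
 * bus_hog() dirties one byte per cache line of a 256MB buffer in a
 * loop, standing in for a busy GPU. chase_ns() times a dependent
 * pointer chase through 32MB, i.e. roughly DRAM latency. */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define CHAIN (1 << 22)                  /* 4M nodes * 8B = 32MB */
static size_t *chain;
static volatile int hog_on;

static void *bus_hog(void *arg)
{
    static volatile char big[256 << 20]; /* volatile: keep the stores */
    (void)arg;
    while (hog_on)
        for (size_t i = 0; i < sizeof big; i += 64)
            big[i]++;                    /* one write per cache line */
    return NULL;
}

static double chase_ns(void)             /* ns per dependent load */
{
    struct timespec t0, t1;
    size_t i = 0, n = CHAIN;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    while (n--) i = chain[i];            /* each load depends on the last */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    if (i == (size_t)-1) puts("?");      /* use i so the loop survives -O2 */
    return ((t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec)) / CHAIN;
}

int main(void)
{
    chain = malloc(CHAIN * sizeof *chain);
    for (size_t i = 0; i < CHAIN; i++) chain[i] = i;
    for (size_t i = CHAIN - 1; i > 0; i--) {   /* Sattolo: one big cycle */
        size_t j = (size_t)rand() % i;
        size_t t = chain[i]; chain[i] = chain[j]; chain[j] = t;
    }
    chase_ns();                                /* warm-up pass */
    printf("idle bus: %5.1f ns/load\n", chase_ns());

    pthread_t hog;
    hog_on = 1;
    pthread_create(&hog, NULL, bus_hog, NULL);
    printf("busy bus: %5.1f ns/load\n", chase_ns());
    hog_on = 0;
    pthread_join(&hog, NULL);
    return 0;
}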

And remember that both APUs have a 256-bit bus to RAM, not the 64-bit (per-channel) bus of a typical PC. A single read/write cycle transfers four times as much data between CPU and RAM. Fewer reads/writes required by a CPU operating outside of cache means fewer opportunities for contention.
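
Rough numbers, using the public figures (DDR3-2133 on Xbox One, GDDR5-5500 on PS4, and a single-channel DDR3-2133 PC DIMM for comparison):

Code:
/* Theoretical peak bandwidth = transfer rate * bus width in bytes.
 * Public figures for both consoles; the PC line is one 64-bit
 * channel (most desktops run two channels, i.e. double this). */
#include <stdio.h>

static double peak_gbs(double mtransfers_per_s, int bus_bits)
{
    return mtransfers_per_s * 1e6 * (bus_bits / 8) / 1e9;
}

int main(void)
{
    printf("Xbox One, DDR3-2133  @ 256-bit: %5.1f GB/s\n", peak_gbs(2133, 256));
    printf("PS4,      GDDR5-5500 @ 256-bit: %5.1f GB/s\n", peak_gbs(5500, 256));
    printf("PC DIMM,  DDR3-2133  @  64-bit: %5.1f GB/s\n", peak_gbs(2133, 64));
    return 0;
}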
 
OK, so game 1 - Dirt 3:

Dirt 3 (Ultra quality, DX11, AMD Radeon HD 7800 Series 2048MB, driver 14.301.1001.0, AMD Athlon 64 X2 6400+; runs taken back to back on 07/10/2014):

Resolution        AA       Min FPS     Avg FPS
1600x900@60Hz     off      38.59094    49.42814
1600x900@60Hz     8xMSAA   40.68745    51.36375
1920x1080@60Hz    off      35.31805    50.58019
1920x1080@60Hz    4xMSAA   39.58699    51.44499
1920x1080@60Hz    8xMSAA   39.06722    50.83886

Game 2 - Dirt Showdown:

<resolution width="1600" height="900" aspect="auto" fullscreen="true" vsync="0" multisampling="off">
<results samples="1855" min_fps="23.124611" av_fps="32.330303" min_fps_ms="43.243969" av_fps_ms="30.930735" />

<resolution width="1600" height="900" aspect="auto" fullscreen="true" vsync="0" multisampling="8xmsaa">
<results samples="1821" min_fps="23.215502" av_fps="32.427235" min_fps_ms="43.074665" av_fps_ms="30.838276" />

<resolution width="1920" height="1080" aspect="auto" fullscreen="true" vsync="0" multisampling="off">
<results samples="1789" min_fps="22.705389" av_fps="32.308670" min_fps_ms="44.042408" av_fps_ms="30.951445" />

<resolution width="1920" height="1080" aspect="auto" fullscreen="true" vsync="0" multisampling="8xmsaa">
<results samples="1803" min_fps="22.494267" av_fps="32.181297" min_fps_ms="44.455772" av_fps_ms="31.073950" />

Game 3 - Civ 5 late game benchmark used:

1600x900 no AA
[84595.765] LateGame Full Render Score , 1559, NumDrawCallsPerFrame ,0
[84595.765] LateGame NoShadow Render Score , 1868, NumDrawCallsPerFrame ,0
[84595.765] LateGame No Render Score , 3058, NumDrawCallsPerFrame ,0

1920x1080 no AA
[85067.031] LateGame Full Render Score , 1561, NumDrawCallsPerFrame ,0
[85067.031] LateGame NoShadow Render Score , 1861, NumDrawCallsPerFrame ,0
[85067.031] LateGame No Render Score , 2178, NumDrawCallsPerFrame ,0

1920x1080 8xMSAA
[85521.468] LateGame Full Render Score , 1539, NumDrawCallsPerFrame ,0
[85521.468] LateGame NoShadow Render Score , 1828, NumDrawCallsPerFrame ,0
[85521.468] LateGame No Render Score , 2255, NumDrawCallsPerFrame ,0

OK, so GTA IV won't complete a benchmark on Windows 10, so these will have to do until Liberty City Stories downloads and I can try that benchmark.

The differences appear negligible, and in Dirt 3 and Dirt Showdown they could be explained by the benchmarks not being identical: they run with AI enabled, so no race is the same.

I tried Batman: Arkham Origins, but it seems even old faithful is enough CPU for that game.

My conclusion: it seems that in CPU-limited games you can indeed ramp up the resolution and AA without much trouble, provided your GPU is powerful enough.
 
I asked this question a few pages back and got no response.
If we go with the CPU being a bottleneck in AC Unity on both consoles, meaning there aren't any free CPU cycles or spare bandwidth, is it even possible to use the extra GPU ALUs in the PS4?
Do the compute units require instructions from the CPU to use pixel shaders?
 
I asked this question a few pages back and got no response.
If we go with the CPU being a bottleneck in AC Unity on both consoles, meaning there aren't any free CPU cycles or spare bandwidth, is it even possible to use the extra GPU ALUs in the PS4?
Do the compute units require instructions from the CPU to use pixel shaders?

GPU workloads do not descend from the heavens.
 
GPU workloads do not descend from the heavens.

But isn't that one of the features of HSA - being able to use the GPU as a front end for some workloads?


[Image: 03 - HSA Features.jpg]
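
For what the one-line answer above is getting at: in the conventional model, even a pure GPU compute job is created, bound and enqueued by CPU-side code. A minimal sketch using OpenCL as a stand-in for the consoles' actual APIs (error checks omitted; HSA's GPU self-enqueue relaxes this, but something still has to prime the pump):

Code:
/* CPU-side host code is what puts work on the GPU: compile the
 * kernel, bind arguments, enqueue the dispatch. Build on Linux:
 * cc demo.c -lOpenCL. Error checking omitted for brevity. */
#include <CL/cl.h>
#include <stdio.h>

static const char *src =
    "__kernel void scale(__global float *buf) {"
    "    buf[get_global_id(0)] *= 2.0f;"
    "}";

int main(void)
{
    float data[1024];
    for (int i = 0; i < 1024; i++) data[i] = (float)i;

    cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx     = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);     /* CPU compiles  */
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof data, data, NULL);
    clSetKernelArg(k, 0, sizeof buf, &buf);              /* CPU binds args */

    size_t global = 1024;
    /* The GPU workload exists because this CPU call submits it. */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

    printf("data[3] = %.1f (expect 6.0)\n", data[3]);
    return 0;
}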
 
They do come from heaven actually. God made developers in his own image, and GPU workloads come from the brains of devs. Ubisoft's can only be seen on Sundays, though, for some reason. Their marketing department, however, is around for the rest of the week.
 
Interesting...

http://www.eurogamer.net/articles/2...nity-graphics-lock-for-parity-on-ps4-xbox-one

Ubisoft has told Eurogamer that Assassin's Creed Unity's final technical specifications for PlayStation 4 and Xbox One are actually still to be locked down.

"Ubisoft's original statement highlights to me something I've noticed when speaking to several developers - that often they aren't quite so invested in the format wars and platform comparison as the players seem to be," Digital Foundry editor Rich Leadbetter commented.

"Their emphasis is on getting a great product together first and foremost, and a good experience for all is more important to them than maxing any particular platform.

"That said, the design of Xbox One and PlayStation 4 makes the decision to lock specs on Assassin's Creed: Unity difficult to fathom. Time and time again, we've seen evidence that the Xbox One graphics hardware operates almost like a subset of the PS4's - perhaps not surprising bearing in mind that Sony's GPU is essentially a larger version of Microsoft's. We often see resolution differentials on cutting edge titles, and even though the difference is frequently less of an issue than the raw numbers suggest, a cleaner presentation overall is obviously welcome.

"We're a bit puzzled by the ACU situation and had a chat about it amongst ourselves. Internal bandwidth is shared between CPU and GPU on both machines which can result in a battle for resources, and if ACU is as CPU-heavy as Ubisoft says it is, that could potentially have ramifications - the only problem with this theory is that there's very little evidence that other titles have experienced the same issue, certainly not judged by Ubisoft's own output.

"I think the main problem here is that the initial story suggests that the publisher has put platform parity politics over getting the best possible game out there, while the updated statement offers no real answers to the questions gamers actually have. If there's a technical reason that prevents PS4 offering a full 1080p presentation - or something closer to it compared to the Xbox One's 900p - I'm really curious to know what it is."

The plot thickens...
 
OK, so game 1 - Dirt 3:

[...]

My conclusion: it seems that in CPU-limited games you can indeed ramp up the resolution and AA without much trouble, provided your GPU is powerful enough.

What CPU are you using? Civ 5 is almost five years old at this point, and the majority of the other games you tested were designed around console CPUs, which are quite lacking by today's standards.
 


So maybe they gimp the Xbox One version (unnecessarily) down to 720p to prevent all the complaining :rolleyes: I could actually see that happening.

The most pressure is clearly for the PS4 version to be above the Xbox One version. So if a developer were bowing to pressure, that is the way they would go. Perhaps Ubi should have rephrased it as "to prevent all the debate, we just downgraded the Xbox resolution", because truthfully that's the debate, not what Ubi said.
 
What CPU are you using? Civ 5 is almost five years old at this point, and the majority of the other games you tested were designed around console CPUs, which are quite lacking by today's standards.

It says in the first benchmark: an Athlon 64 X2 6400+.
 
Feel free to send me a CPU lol. I have a one-year-old, so I would really like to spend my money elsewhere ;)

It's not too bad; I can play most things above 30fps, which is acceptable to me.

And there's the PS4, should I be asked to play by the missus ;)
 
So maybe they gimp the Xbox One version (unnecessarily) down to 720p to prevent all the complaining :rolleyes: I could actually see that happening.

The most pressure is clearly for the PS4 version to be above the Xbox One version. So if a developer were bowing to pressure, that is the way they would go. Perhaps Ubi should have rephrased it as "to prevent all the debate, we just downgraded the Xbox resolution", because truthfully that's the debate, not what Ubi said.

Huh? What are you talking about?
 
It's alright; in his world there's no power difference.

At any rate, with talk of the frame rate being poor, maybe they should reduce the resolution on Xbox One (and even PS4) if it is indeed a CPU problem limiting the frame rate.

I would settle for 720p with 16x anisotropic filtering and 8xMSAA; I'm sure that would give pretty good results.
 
Just get both versions to 1080p; PS4 should be fine as it is, and just crank down settings for Xbone till it locks at 30fps. These days the difference between Low, Medium, High and Very High is not quite as drastic when everything is rendered at the same res, but the performance you save can be fruitful.
 