Digital Foundry Article Technical Discussion Archive [2016 - 2017]

Status
Not open for further replies.
Also, what was the secret sauce between Uncharted 1 and Uncharted 2? Another buzzword, or simply a better understanding of the hardware?
If the devs are making better use of the hardware in the XB1, why aren't they doing the same on the PS4 and maintaining the same lead? Why aren't both platforms showing improvements from general optimisation?
 
You usually spend more time optimising on/for the weaker machine, which may very well mean the weaker machine ends up the faster one...
 
Memory bandwidth has not been mentioned yet; it is similar on both and must at some point become a limiting factor in title performance.

Is the engine now hitting a different buffer which is affecting resolution choice?
 
Memory bandwidth has not been mentioned yet; it is similar on both and must at some point become a limiting factor in title performance.

Is the engine now hitting a different buffer which is affecting resolution choice?
This is what I'm thinking, too. The XB1 S has a little more GPU power, but the bottleneck seems to be memory bandwidth. The higher GPU clock also raised the bandwidth of the ESRAM, so that should be the big difference between the XB1 S and the XB1. The Xbox also has the slightly faster CPU, which could be why the XB1 gets slightly better framerates (at a slightly lower resolution). The single memory pool on the PS4 could also explain the lower framerates. Battlefield is normally a CPU-intensive game, so the more cycles the CPU spends on memory, the less bandwidth the GPU has available. That could be a limiting factor when optimising for the PS4: if your title makes heavy use of the CPU, it steals memory bandwidth from the GPU, which may have to stall for some cycles and can't be maxed out. Conversely, if you max out the GPU you can't max out the CPU. At least in this scenario that seems possible.
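The contention argument above can be put into rough numbers. A back-of-the-envelope sketch, using the commonly quoted peak-bandwidth specs; the 20 GB/s of CPU traffic is a purely hypothetical workload, not a measured figure:

```python
# Illustrative model of CPU/GPU bandwidth contention, using commonly
# quoted peak specs. The 20 GB/s of CPU traffic is an assumed,
# hypothetical workload, not a measured number.

PS4_GDDR5 = 176.0   # GB/s, single unified pool shared by CPU and GPU
XB1_DDR3 = 68.0     # GB/s, shared by CPU and GPU
XB1_ESRAM = 204.0   # GB/s peak, GPU-only scratch, at the original 853 MHz

# ESRAM bandwidth scales with GPU clock, so the XB1 S bump helps it too.
XB1S_ESRAM = XB1_ESRAM * (914.0 / 853.0)   # roughly 218.6 GB/s

cpu_traffic = 20.0  # GB/s, assumed CPU memory traffic

# On PS4, CPU traffic comes straight out of the GPU's budget.
ps4_gpu_bw = PS4_GDDR5 - cpu_traffic

# On XB1, the CPU only contends for DDR3; the ESRAM is untouched.
xb1s_gpu_bw = (XB1_DDR3 - cpu_traffic) + XB1S_ESRAM

print(f"PS4 GPU-available bandwidth:  {ps4_gpu_bw:.1f} GB/s")
print(f"XB1S GPU-available bandwidth: {xb1s_gpu_bw:.1f} GB/s (DDR3 + ESRAM)")
```

Summing peaks flatters the XB1 S here, since the ESRAM is only 32 MB and can't hold every render target; the point is only that on a single unified pool, every byte the CPU moves comes straight out of the GPU's budget.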
 
It could be that the Xbox was the lead dev platform for BF1, but it is still a bit impressive for the Xbox to get this close to the PS4 in terms of performance. Curious to see how Mass Effect Andromeda will compare, as that is using Frostbite 3 as well...
 
If the devs are making better use of the hardware in the XB1, why aren't they doing the same on the PS4 and maintaining the same lead? Why aren't both platforms showing improvements from general optimisation?

I have no real explanation if NXGamer is right.

A CPU bottleneck? Then why did they improve the resolution on both consoles, especially on XB1?

I can understand the better framerate on XB1 given its slightly faster CPU, but not the much smaller resolution gap.

I have the feeling that the PS4 version is less optimised due to the marketing partnership, but that's just speculation.

If the XB1 is really closing the gap, then we should see the same trend in all other multiplatform games.

However, I'm almost certain that BF1 will be an exception and that we will see the usual difference in future multiplatform games (1080p vs 900p, or 900p vs 720p).
 
I have no real explanation if NXGamer is right. I have the feeling that the PS4 version is less optimised due to the marketing partnership, but that's just speculation.
So why do you discount the new DX12 feature optimisations described above, and the fact that devs need time to implement them, as a rational explanation for why we haven't seen their effect earlier, nor necessarily widespread (what if you're still targeting DX11)? To me, the coincidence between the new features becoming available on XB1 and the gap closing in BF1 points to a correlation more readily than dismissing them out of hand.
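For readers unfamiliar with ExecuteIndirect, the feature being debated: its appeal is collapsing many CPU-side draw submissions into a single call whose per-draw arguments the GPU reads from a buffer. A toy cost model of that idea in Python; every constant below is invented for illustration, not measured:

```python
# Toy model of CPU submission cost: per-draw API overhead vs. one
# indirect call driven by a GPU-visible argument buffer. The
# microsecond costs are made up purely to illustrate the scaling.

PER_DRAW_CPU_US = 2.0     # assumed CPU cost to record one draw call
INDIRECT_SETUP_US = 50.0  # assumed one-off cost to set up the indirect call

def cpu_cost_naive(n_draws):
    # CPU cost grows linearly with draw count.
    return n_draws * PER_DRAW_CPU_US

def cpu_cost_indirect(n_draws):
    # One call regardless of draw count; the GPU walks the argument buffer.
    return INDIRECT_SETUP_US

draws = 5000
print(cpu_cost_naive(draws), cpu_cost_indirect(draws))  # -> 10000.0 50.0
```

The point is not the exact numbers but the shape: the naive path scales with draw count while the indirect path is flat, which is why it would plausibly matter most in draw-call-heavy games like Battlefield.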
 
If DX12 features are truly starting to help we will need to see more 3rd party titles where the performance gap is closer than before. Games like Forza Horizon/Gears of War 4 look really nice but we have no way to compare it to how it would run on PS4.

Like I said before Mass Effect will be interesting because it is also using Frostbite.

If it is the case that DX12 features are helping to close the performance gap, it's a shame for Microsoft, because the PS4 Pro is right around the corner and will steal any thunder Microsoft would get from this.
 
So why do you discount the new DX12 feature optimisations described above, and the fact that devs need time to implement them, as a rational explanation for why we haven't seen their effect earlier, nor necessarily widespread (what if you're still targeting DX11)? To me, the coincidence between the new features becoming available on XB1 and the gap closing in BF1 points to a correlation more readily than dismissing them out of hand.

Because most developers tend to think like this: http://gamingbolt.com/dice-dev-xbox...wont-help-reduce-difference-talks-development

Because the main point of the slide was compute optimisation, while ExecuteIndirect was just a small detail in the whole presentation.

And DICE is not the only developer happy with GPGPU capabilities: http://wccftech.com/async-compute-p...tting-performance-target-in-doom-on-consoles/

According to Tiago Sousa: "Plus great profiling tools on PS4/XB1 – PC has much to improve on tools."

This confirms what we already know about the differences between consoles and PC.

And with Doom we already have an example of how such a game would perform on each console.
 
This is what I'm thinking, too. The XB1 S has a little more GPU power, but the bottleneck seems to be memory bandwidth. The higher GPU clock also raised the bandwidth of the ESRAM, so that should be the big difference between the XB1 S and the XB1. The Xbox also has the slightly faster CPU, which could be why the XB1 gets slightly better framerates (at a slightly lower resolution). The single memory pool on the PS4 could also explain the lower framerates. Battlefield is normally a CPU-intensive game, so the more cycles the CPU spends on memory, the less bandwidth the GPU has available. That could be a limiting factor when optimising for the PS4: if your title makes heavy use of the CPU, it steals memory bandwidth from the GPU, which may have to stall for some cycles and can't be maxed out. Conversely, if you max out the GPU you can't max out the CPU. At least in this scenario that seems possible.

That is one of the positives of ESRAM: CPU contention only affects the DDR3 portion, leaving the GPU free to hammer the ESRAM without interference.

ESRAM is not new now, the dev tools have matured, and perhaps it has progressed to actually being a positive, since it offers the potential for a lot of bandwidth in bandwidth-heavy titles by comparison. Xbox is the target platform, so perhaps this is closer to a first-party implementation and the PS4 is having to play catch-up. We will probably not know whether this is the case unless there is a talk at GDC.

However, I do wonder if the answer is that compute-based lossless and lossy techniques are delivering a well-rounded solution, and we are not truly seeing what we believe we are when we rationalise a fixed pixel count against a FLOP count.

I cannot describe this well, but perhaps smoke and mirrors (in a smart-use-of-resources way) is what we are seeing, and so we are comparing results using the wrong metrics, or using those metrics incorrectly. Can we pixel-count the image when perhaps not all of the image is the same? If parts of the image differ and we do not know how, how can we compare Xbox to PS4 when one may have more or fewer areas at different pixel quality?

This is based on the banding in the video and the rise of clever rendering techniques being put into AAA titles.
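For context on the pixel-counting being questioned above: the classic technique boils down to counting stair-steps along a long, nearly-horizontal edge and scaling. A minimal sketch of that arithmetic (the counts below are made-up illustrative values), which is exactly what reconstruction techniques undermine:

```python
def estimate_native_width(output_width, edge_span_px, steps):
    """Classic pixel-counting: a near-horizontal edge that rises one
    native pixel per step shows `steps` stair-steps across
    `edge_span_px` output pixels. Upscaling stretches each step by
    output/native, so native width ~= output_width * steps / edge_span_px."""
    return output_width * steps / edge_span_px

# Hypothetical count: 150 steps across a 320-pixel edge in a 1920-wide frame.
print(estimate_native_width(1920, 320, 150))  # -> 900.0
```

This only works if every part of the frame was rendered at one uniform resolution; once a title mixes per-region or per-buffer resolutions, the counted edge no longer tells you what the rest of the image cost to render, which is the post's concern.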
 
If the devs are making better use of the hardware in the XB1, why aren't they doing the same on the PS4 and maintaining the same lead? Why aren't both platforms showing improvements from general optimisation?
IMHO the X1 was always the machine of this generation with a bit more room for improvement, whether because of API optimisations or tools.

On a different note, DF usually looks for PCs with performance levels similar to the current consoles. Now that the 1050 Ti is about to come out at $139 (compared to $109 for the 1050), I am thinking about building a PC. If you pair the 1050 Ti with an Intel i3 6100 (the best CPU out there value-for-money-wise: http://www.futuremark.com/hardware/cpu), you can have a PC that runs faster than expected, with a lot of punch. You don't need extra power connectors for the GPU and could do well with a 300W power supply for the entire PC!
 
I've asked Johan Andersson and Graham Wihlidal and I've got an answer on the matter: there is still room for improvement on X1, since the features unique to X1 have not been used yet!

[attached image: MmBVwXi.png]
 
I've asked Johan Andersson and Graham Wihlidal and I've got an answer on the matter: there is still room for improvement on X1, since the features unique to X1 have not been used yet!

[attached image: MmBVwXi.png]
Appreciate the effort to go directly to the source.
 
So Battlefield 1 on XO uses DX12? Is there any other game that uses it on XO besides BF1 & Battlefront?
Gears of War 4, Forza and Horizon, Quantum Break, and most of the "big" games that have been released since Battlefront (ReCore is presumably DX11.x, though). Now, are those games based on engines specifically built for DX12? Nope, besides maybe ForzaTech. (UE4 doesn't seem so bad either, given that the DX12 path was actually mainly developed by Microsoft themselves while developing Fable Legends, along with the async compute support and other things.)
 
I've asked Johan Andersson and Graham Wihlidal and I've got an answer on the matter: there is still room for improvement on X1, since the features unique to X1 have not been used yet!

So, the reason for this improvement wasn't ExecuteIndirect...

When we know that the beta ran at 720p on XB1 and 900p on PS4, there is clearly something wrong with the PS4 version if NXGamer is to be believed.

Let's wait for the DF analysis for more information.
 
So, the reason for this improvement wasn't ExecuteIndirect...

When we know that the beta ran at 720p on XB1 and 900p on PS4, there is clearly something wrong with the PS4 version if NXGamer is to be believed.

Let's wait for the DF analysis for more information.

We don't know the reason. ExecuteIndirect and its derivatives can still be in play, as they're available on all three platforms; what wasn't used was the XBO-specific microcode.
 