Digital Foundry Article Technical Discussion Archive [2014]

In the interview with DF, if I recall correctly, they mentioned that particle physics is CPU-driven and that it's a very expensive system for them.

For LightHeaven's reference:
"A lot of the work was making the engine multi-threaded. The original Source engine had a main thread that built up a command buffer for the render thread and there was a frame of delay. We kind of changed that so the render thread buffered commands as soon as they were issued - in the best case, it gets rid of a frame of latency there. And then almost every system has been made multi-threaded at this point. The networking is multi-threaded, the particle simulation is multi-threaded. The animation is multi-threaded - I could go on forever. The culling, the gathering of objects we're going to render is multi-threaded. It's across as many cores as you have."

The arrival of AMD's Mantle API has put a lot of focus on DirectX 11's API and driver overhead, and Respawn's solution to this is simple enough while highlighting that this is an issue developers need to work with.

"Currently we're running it so that we leave one discrete core free so that DX and the driver can let the CPU do its stuff. Originally we were using all the cores that were available - up to eight cores on a typical PC. We were fighting with the driver wanting to use the CPU whenever it wanted so we had to leave a core for the driver to do its stuff," Baker adds.

Going forward, the quest to parallelise the game's systems continues - particle rendering is set to get the multi-threaded treatment, while the physics code will be looked at again in order to get better synchronisation across multiple cores. We'll be seeing an interesting battle ahead in the PC space - the pure per-core throughput of Intel up against the more many-core approach championed by AMD. It's Baker's contention that games developed with Xbox One, PS4 and PC all in mind will see console optimisations result in better PC performance.
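As a simplified picture of what a multi-threaded particle simulation tends to look like, here's a generic sketch that splits a particle update across worker threads, each owning a disjoint slice of the array. Again, this is only an illustration of the pattern, not anything from Respawn's engine.

[code]
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Update one contiguous slice of the particle array.
void updateRange(std::vector<Particle>& ps, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        ps[i].vy -= 9.81f * dt;      // toy gravity
        ps[i].x  += ps[i].vx * dt;
        ps[i].y  += ps[i].vy * dt;
        ps[i].z  += ps[i].vz * dt;
    }
}

// Split the update across worker threads; each thread owns a disjoint slice,
// so no locking is needed for the simulation step itself.
void updateParticles(std::vector<Particle>& ps, float dt, unsigned threads) {
    std::vector<std::thread> workers;
    const std::size_t chunk = (ps.size() + threads - 1) / threads;
    for (unsigned t = 0; t < threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = std::min(ps.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(updateRange, std::ref(ps), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Particle> particles(100000, Particle{0, 10, 0, 1, 0, 0});
    updateParticles(particles, 1.0f / 60.0f, 4);
}
[/code]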
 
A few weeks ago, there was a DF article on The Witcher 3. At the time, I noted the following:

[attached image: b1pMpot.png]


I was irritated by the implication that I could not read or understand the words that I myself had quoted. Checking back on a whim, I see the article has since been updated:

"The stuff we're using that's really cool for next-gen is dynamic IBL," he continues, referring to image-based lighting. "We're using PBR (physically-based rendering), and water simulation's really interesting. It basically reacts to the weather conditions so you'll get choppier waves in wind." [UPDATE 14/6/14 18:13: A previous version referred to 'IDL' middleware management - this was a mistake in the audio transcription and has been corrected.]
 
Using 912p with a stable framerate, given the 10% boost from the June SDK, makes perfect sense. 1080p is ~40% more pixels than 912p.

That is almost exactly the TFLOPS difference between the PS4's GPU and the XB1's GPU (~41%).
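The numbers do line up if you work them through (using the commonly quoted GPU figures of 1.84 TFLOPS for PS4 and 1.31 TFLOPS for XB1):

[code]
#include <cstdio>

int main() {
    const double p1080 = 1920.0 * 1080.0;     // 2,073,600 pixels
    const double p912  = 1620.0 *  912.0;     // 1,477,440 pixels
    const double ps4Tf = 1.84;                // commonly quoted PS4 GPU figure
    const double xb1Tf = 1.31;                // commonly quoted XB1 GPU figure

    std::printf("1080p / 912p pixel ratio: %.3f\n", p1080 / p912);   // ~1.403
    std::printf("PS4 / XB1 TFLOPS ratio  : %.3f\n", ps4Tf / xb1Tf);  // ~1.405
}
[/code]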
 
Wasn't expecting a totally rock-solid fps, but they delivered, and at a static 1080p. I'm more interested in how it compares to the PC release, though. How far behind is my PS4?
 
I don't see the point in going with 1620x912 instead of 1280x1080.

1368x1080 would be equivalent (both work out to 1,477,440 pixels).

It's certainly possible they didn't want to deal with any FOV issues. Maybe they simply didn't take the time to do scaling quality tests, either.
 
I don't like the effect on IQ of scaling in only one direction. Scale in both directions, but don't scale in just one, because from what I've seen it leads to an unwelcome astigmatism-like effect.

It wasn't a problem back on CRTs, and it's not a problem if you do 960x1080 with 2xMSAA applied horizontally, as you can avoid the upscale by not resolving the MSAA, but 1620x912 beats 1368x1080 for me.
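For reference, the 960x1080 + horizontal 2xMSAA trick works because each stored pixel carries two horizontally offset samples; if you read the samples out individually instead of resolving (averaging) them, you effectively get 1920 distinct columns. A rough sketch of that mapping, with a made-up helper rather than any particular engine's code:

[code]
#include <cstdio>

// Hypothetical sketch: reading an unresolved 960x1080 2xMSAA target as if it
// were 1920x1080 by mapping each output column to (stored pixel, sample index)
// instead of averaging the two samples together. Assumes the two samples per
// pixel sit at horizontally offset positions.
void outputColumnToSample(int outX, int* srcPixelX, int* sampleIndex) {
    *srcPixelX   = outX / 2;   // which of the 960 stored pixels
    *sampleIndex = outX % 2;   // which of its two horizontal samples
}

int main() {
    int px = 0, s = 0;
    outputColumnToSample(1283, &px, &s);
    std::printf("output column 1283 -> stored pixel %d, sample %d\n", px, s);  // pixel 641, sample 1
}
[/code]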

Interesting to see the game's resolution mirroring the flop difference now.
 
PS4 has absolutely no dropped frames so I wonder if it could have been pushed past 1080p.
 
I noticed that as well in both videos... it would be nice to know what type of AA sampling is being done and what other neat post-processing effects are being achieved across both systems.

Should there be an option (or options) for adding tessellation and other toned-down PC effects (locked at 30fps), for graphics whores like myself? :p
 
I'm glad there isn't an option. I'd rather people made decisions like that for me. I'm tweaking the settings every 5 minutes in Titanfall on PC. I drive myself crazy.
 
This is exactly why I stopped gaming on PC. I spent more time tweaking than playing.
On consoles you get what you get, with only gamma options and sometimes the ability to lock or unlock the framerate.

After reading the DF article I'm left wondering whether they got confirmation that 4A Games actually used the June SDK, or whether they are just assuming so because the resolution is 912p?
I wish they would clarify this. I just don't see them even bothering to implement the SDK for a jump from 900p to 912p. Could that have been the resolution of the game from the beginning, and they didn't bother saying 912p because it would be pointless?
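For what it's worth, the arithmetic makes the "why bother" question concrete. Taking 1600x900 as the assumed starting point and 1620x912 as the shipped resolution, and the freed-up resources as roughly 10%, this is just a quick sketch on those numbers, nothing official about which SDK was actually used:

[code]
#include <cstdio>

int main() {
    const double p900 = 1600.0 * 900.0;   // 1,440,000 pixels
    const double p912 = 1620.0 * 912.0;   // 1,477,440 pixels

    std::printf("912p over 900p: +%.1f%% pixels\n", (p912 / p900 - 1.0) * 100.0);  // ~+2.6%
    // If the freed-up ~10% of GPU time went purely into resolution, ~10% more
    // pixels would be about 1,584,000 -- roughly 1680x945, noticeably above 912p.
    std::printf("900p + 10%% pixels: %.0f\n", p900 * 1.10);                        // 1,584,000
}
[/code]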
 
Exactly. PC gamers more often enjoy this aspect in its own right, whilst console gamers see it as a disadvantage. Personally, it would bore me silly if I couldn't tweak settings and test my hardware's capabilities.

I've always joked that PC gamers are like audiophiles. Audiophiles don't enjoy listening to music, they enjoy listening to their stereos. PC gamers don't enjoy gaming, they enjoy their computers.

I kid, I kid! (until the 360, I was a PC gamer)

Looks like this Metro remaster package is really well done. I never played those games. I'll have to check out the price. There is a small window of time before Destiny. Not sure if I can fit a game into my schedule or not. The odd Titanfall match seems to be handling that for me.
 
After reading the DF article I'm left wondering whether they got confirmation that 4A Games actually used the June SDK, or whether they are just assuming so because the resolution is 912p?

I wish they would clarify this. I just don't see them even bothering to implement the SDK for a jump from 900p to 912p. Could that have been the resolution of the game from the beginning, and they didn't bother saying 912p because it would be pointless?

Why would it be hard to implement an update? It's just returning the reserved 10% of resources back to the developer. The update has been available since June, and most XB1 developers should have it.

It's not a full-blown DX12 update, which may require some time for developers to adapt their game code to the more refined/improved DX12 libraries. From my understanding, the June update is a simple release of the Kinect resources back to the developer… and the developer choosing the best usage for those resources (i.e. performance stabilization or resolution bump).
 
I don't know how involved it is, but I know you basically opt in to the extra resources. Whether that entails any extra work, I don't know. It seems very likely that any game coming out at this point would be using it.
 
I would think so.

I can't picture MS not being proactive on getting developers up to speed with the latest XB1 SDK updates... unless they like bad PR, which I don't believe they do.
 
Why would it be hard to implement an update? It's just returning the reserved 10% of resources back to the developer. The update has been available since June, and most XB1 developers should have it.

It's not a full-blown DX12 update, which may require some time for developers to adapt their game code to the more refined/improved DX12 libraries. From my understanding, the June update is a simple release of the Kinect resources back to the developer… and the developer choosing the best usage for those resources (i.e. performance stabilization or resolution bump).

I never said it would be difficult to implement. The point was that in the article DF doesn't make it clear whether they are guessing that it is used or whether they actually know. Take the recent announcement about Diablo 3: it comes out only a week earlier, yet the res upgrade is implemented in a day-one patch, not on the disc. I question whether it is used mainly because of the short amount of time since the SDK released, the fact that discs are made weeks in advance of launch, and the minimal increase in res. Why bother with such a small increase? DF doesn't clarify anything about the game code they are using for their analysis.
 