Digital Foundry Article Technical Discussion Archive [2015]

There are two problems with this. The first is that compute shader GPU programming is very fragile with regard to performance. You need to know exactly how the GPU operates in order to write fast code for it. Slow GPU code can easily be 10x+ slower than optimized code (stealing most of the GPU resources from rendering). Gameplay programmers in general do not know low-level GPU details well enough, and do not have experience with the GPU programming languages used (HLSL). Gameplay programming teams tend to have more junior programmers than rendering teams, and gameplay programmers are generally less interested in high-performance multithreaded programming than rendering programmers.

The second problem is of course latency, and this makes things even harder. The GPU runs asynchronously from the CPU, and you should be prepared for at least half a frame of latency (on consoles; 2+ frames on PC) before you get the compute results back. Writing (bug-free) asynchronous code is significantly harder than writing synchronous code. Gameplay programmers prefer fast iteration and prototyping. Writing asynchronous GPGPU code and debugging and optimizing it until it works properly completely ruins your iteration time for gameplay prototyping.
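Just to make the latency point concrete, here is roughly what a non-blocking readback loop looks like on PC with D3D11. This is only a minimal sketch under my own assumptions (buffer names, a two-frame delay, staging buffers already created with CPU read access), not code from any engine: the CPU copies this frame's compute output into a small staging ring and only maps the copy it requested a couple of frames ago, so it never stalls on the GPU.

```cpp
// Hypothetical deferred readback of a compute shader result (D3D11-style sketch).
// The CPU maps the staging buffer it filled kLatencyFrames frames ago, so it
// never waits for the GPU - but it only ever sees slightly stale results.
#include <d3d11.h>
#include <cstring>

static const UINT    kLatencyFrames = 2;                   // assumed CPU<->GPU lag on PC
static ID3D11Buffer* gStaging[kLatencyFrames + 1] = {};    // staging ring (D3D11_USAGE_STAGING)
static UINT          gFrame = 0;

void ReadbackComputeResults(ID3D11DeviceContext* ctx,
                            ID3D11Buffer* computeOutput,   // default-heap buffer written by the CS
                            void* cpuDest, size_t size)
{
    const UINT ringSize  = kLatencyFrames + 1;
    const UINT writeSlot = gFrame % ringSize;

    // Queue an asynchronous GPU->staging copy of this frame's compute output.
    ctx->CopyResource(gStaging[writeSlot], computeOutput);

    // Map the copy requested kLatencyFrames frames ago; the GPU has normally
    // finished it by now, so this Map does not stall.
    if (gFrame >= kLatencyFrames)
    {
        const UINT readSlot = (gFrame - kLatencyFrames) % ringSize;
        D3D11_MAPPED_SUBRESOURCE mapped = {};
        if (SUCCEEDED(ctx->Map(gStaging[readSlot], 0, D3D11_MAP_READ, 0, &mapped)))
        {
            memcpy(cpuDest, mapped.pData, size);           // gameplay sees ~2-frame-old data
            ctx->Unmap(gStaging[readSlot], 0);
        }
    }
    ++gFrame;
}
```

The price is exactly the one described above: whatever gameplay code consumes that data is always reacting to a world state that is a couple of frames old, which is fine for world simulation but awkward for anything tied directly to player input.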

In the Bullet physics PS4 implementation, non-gameplay physics can be done on either the GPU or the CPU, while gameplay physics runs only on the CPU. And they said the reason is synchronisation.
 
I thought the CUs were supposed to be suitable for such tasks.

It adds latency, and that is a problem for gameplay. For world simulation and non-gameplay physics it is not a problem... Guerrilla said that after KZ SF all visibility raycasting will be done on the GPU...
 
I'm sorry to see that some of my posts here have come across as offensive; that wasn't my intention and I'd like to apologize to everyone.

I'd also like to clarify my position, so let's get back to the starting point.

Halo 5 is a shooter that has been designed for 60fps, with fast response to player input, and it has made certain trade-offs in order to achieve this goal.
The original question was whether the developer could add a user-selectable option to run the game at 30fps and use the freed-up resources to deliver better graphics.

My opinions were the following:
- Gameplay systems were designed around the higher refresh rate and the resulting lower input latency, so reducing the framerate would probably have a significant impact on the player experience, even in standalone single player. These effects would escalate a lot further in online co-op and multiplayer, presenting a drastically different user experience and quite possibly leading to significant disadvantages for players who chose 30fps.
- Graphics systems were also designed with certain processing, bandwidth and memory bottlenecks in mind, which would prevent the rendering engine from taking full advantage of the increased per-frame rendering time, leading to an inferior experience compared to games that were aiming for 30fps from the start.
- Proper testing of all of the game's functionality at the two different framerates and resolutions would add significant human and technical/material resource overhead, the cost of which could probably not be justified by additional sales or better user ratings.

Thus, my conclusion was that offering a user-selectable frame rate for Halo 5 would make very little sense for the publisher.

I also acknowledge that, despite the above points, several released games do offer a user choice between 30fps and 60fps. However, these games are very different from the fast-paced, online-enabled shooter that Halo 5 aims to be, and thus they are not a viable argument for 343i to implement user-selectable framerates. This does not mean that a user-selectable framerate is always going to be a bad choice; but my opinion is that it makes no sense for Halo 5, and thus it is not reasonable to get into lengthy discussions about it.

That'd be all. And again, I apologize if I came across as offensive, it was not my intention!
 
Oh, and another point... While we do not have any statistics on how frame rate affects player performance in games like The Last of Us or Halo 5, we do know from practically all other FPS games that players with higher frame rates and lower input latency have an advantage - after all, everyone played competitive Quake 1 through 3 at the lowest possible detail settings (some of which also allowed for other advantages, like making enemies easier to see). So I believe we can safely extrapolate that players at 30fps in both co-op and competitive multiplayer would be at a disadvantage, particularly in a game designed for higher refresh rates.
 

In UC4 the single player is 30fps but the multiplayer will be 60fps. But that means two completely different designs. I agree with you about Halo 5.
 
http://www.eurogamer.net/articles/d...ars-battlefront-ps4-beta-performance-analysis

"Overall, first impressions suggest a solid turnout for the PS4 beta build. Outside of issues with matchmaking when using the partner system (as noted during a live-stream, where Eurogamer colleague Ian Higton faced an unending loading screen), the state of its visuals and frame-rate are promising. How it compares to the unseen Xbox One version will be interesting as well - something we intend to pursue once the beta launches publicly."
 
What are the chances of X1 being 720p again?
 
We all have to wait for the final product to know 100%. The Xbox One resolution will be announced on DF soon, I imagine. But it's still a beta.
 
Face-Off: Uncharted 2: Among Thieves on PS4

To begin with, we see the expected improvements right up front. That means a full 1080p resolution coupled with an excellent post-process anti-aliasing solution that manages to dodge in-surface aliasing while minimising flicker and blur. In addition, anisotropic filtering is utilised across the game with a variable level of quality. We see some surfaces operating with what looks similar to 16x AF while other, less important details seem to go as low as 4x. Even at its lowest level, it's still a substantial improvement over the trilinear filtering used on PlayStation 3. Image quality is simply excellent all around here.
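The article doesn't say how the variable anisotropy is set up on PS4, but for anyone curious what "a variable level of quality" means in practice: on PC-style APIs anisotropic filtering is just a per-sampler state, so an engine can give important surfaces a 16x sampler and minor detail textures a cheaper 4x one. A minimal D3D11-style sketch (the function name and the 16x/4x split are my own illustrative assumptions, and the PS4's own API differs in the details):

```cpp
// Hypothetical per-surface anisotropic filtering setup (D3D11-style sketch).
// Hero surfaces get a 16x sampler, minor detail textures a cheaper 4x one.
#include <d3d11.h>

ID3D11SamplerState* CreateAnisoSampler(ID3D11Device* device, UINT maxAniso)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;      // anisotropic min/mag/mip filtering
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = maxAniso;                       // 1..16; higher = sharper at oblique angles
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);
    return sampler;
}

// Usage (illustrative): important materials vs. low-priority detail maps.
// gHeroSampler  = CreateAnisoSampler(device, 16);
// gCheapSampler = CreateAnisoSampler(device, 4);
```

Trilinear filtering, as used on the PS3 original, is essentially what you get with maxAniso = 1, which is why even the 4x surfaces look noticeably sharper at oblique angles.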
 