Digital Foundry Article Technical Discussion [2020]

It depends a lot on the gamer and what they expect out of their experience. I play at a locked 60 in pretty much everything, so a 5 FPS drop is massively noticeable to me, especially if it happens erratically and somewhat often. That's the equivalent of a 10 FPS drop at 120 Hz.

It's basically enough to throw off your aim if your reflexes expect a consistent input-feedback loop. The same would go for me with a 30 FPS title dropping 2-3 FPS from time to time.

It's also why I dislike VRR so much: the input-feedback loop is constantly being thrown off by the constantly varying framerate. Hence why I consider variable resolution infinitely better than VRR. And I'll almost always prefer screen tearing with an unlocked framerate and no vsync over VRR.
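To put rough numbers on that (plain frame-time arithmetic): a 5 FPS drop from 60 and a 10 FPS drop from 120 are the same relative hit, roughly 8%, even though the raw millisecond delta is smaller at the higher rate:

```cpp
#include <cstdio>

// Rough frame-time arithmetic behind "a 5 FPS drop at 60 is like a 10 FPS drop at 120".
int main() {
    auto ms = [](double fps) { return 1000.0 / fps; };

    std::printf("60 -> 55 FPS:   %.2f ms -> %.2f ms  (+%.2f ms, %.1f%% drop)\n",
                ms(60), ms(55), ms(55) - ms(60), 100.0 * 5 / 60);
    std::printf("120 -> 110 FPS: %.2f ms -> %.2f ms  (+%.2f ms, %.1f%% drop)\n",
                ms(120), ms(110), ms(110) - ms(120), 100.0 * 10 / 120);
}
```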

But, as noted, most gamers, especially console gamers, are unlikely to notice, as they don't have the option to adjust a game's settings to ensure a locked framerate, so variable drops in framerate mostly go unnoticed as long as they aren't too big. Similar to how most console gamers dismissed the benefits of 60 FPS rendering with regard to gameplay and presentation ... until they've spent a good amount of time playing at 60 in a larger variety of genres than just racing and fighting games.

But, because of the inability to tweak settings, console gamers will always have to live with a variable framerate in current games if the developer doesn't prioritize a locked framerate.

Of course, if they do that, the screenshot console warriors will jump on it in a heartbeat and claim that performance is being left on the table that could have gone into graphics. For them, screenshots are far more important than good gameplay. :p (I kid... mostly).

Regards,
SB
I'm not sure I have felt that issue playing at 120+ with G-Sync. I only feel it if the dips are 30 fps or more. But generally speaking, once you break 100+, consistency is more important than the number. I'd rather have a consistent 100-105 with G-Sync than 144 Hz that can dip as low as 90 Hz with G-Sync. You can feel the latter; you're unlikely to feel the former.

Both consoles seem fairly okay with holding around their numbers aside from the bugged drops.
 
Of course it would enhance things a lot, but the performance cost would probably be too high.

They said the game's development began before the devkits had raytracing, and it's something they will explore in the future.

Shadows cost less than reflections in RT.
 
It depends a lot on the gamer and what they expect out of their experience. I play at a locked 60 in pretty much everything, so a 5 FPS drop is massively noticeable to me, especially if it happens erratically and somewhat often. That's the equivalent of a 10 FPS drop at 120 Hz.

It's basically enough to throw off your aim if your reflexes expect a consistent input-feedback loop. The same would go for me with a 30 FPS title dropping 2-3 FPS from time to time.

It's also why I dislike VRR so much: the input-feedback loop is constantly being thrown off by the constantly varying framerate. Hence why I consider variable resolution infinitely better than VRR. And I'll almost always prefer screen tearing with an unlocked framerate and no vsync over VRR.

Ya. Personally I am super sensitive to input latency, which is why only the PC will do as my main platform. The ability to control how frames get delivered to the display, regardless of the developer/hardware intent, is very valuable to me.

Also, I highly recommend SpecialK to anyone looking to overcome game engine bugs to achieve performance consistency in games. It's an amazing tool.
 
Ya. Personally I am super sensitive to input latency, which is why only the PC will do as my main platform. The ability to control how frames get delivered to the display, regardless of the developer/hardware intent, is very valuable to me.

It's somewhat encumbered by DX12/Vulkan not exposing much of that control to the end user, however.
 
This is a symptom of the ancient approach of the master game loop being bound to the frame rate. There's no reason that user input needs to be tied to the frequency of the display, nor should it be. Games can internally run the core logic at a rate different to the rendering system; racing sims have done this for a while, and most games will have other constant processes (like audio) which will not change because of frame rate changes, whether it's VRR, dropped frames, etc.
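Roughly what that decoupling looks like in practice - a generic fixed-timestep sketch, not any particular engine's code, with pollInput/updateSimulation/render as placeholder hooks:

```cpp
#include <chrono>

// Placeholder hooks standing in for real engine work - assumptions, not a real API.
void pollInput() {}
void updateSimulation(double /*dt*/) {}
void render(double /*interpolation*/) {}

// Minimal sketch of a loop where the simulation ("ticks") runs at a fixed
// rate, independent of however fast the renderer happens to be going.
void runGameLoop() {
    using clock = std::chrono::steady_clock;
    constexpr std::chrono::duration<double> tickRate{1.0 / 120.0};  // 120 Hz game logic

    auto previous = clock::now();
    std::chrono::duration<double> accumulator{0.0};

    while (true) {
        auto now = clock::now();
        accumulator += now - previous;      // wall-clock time since the last iteration
        previous = now;

        pollInput();                        // sampled every loop, not every display refresh

        while (accumulator >= tickRate) {   // run however many fixed ticks we owe
            updateSimulation(tickRate.count());
            accumulator -= tickRate;
        }

        // Render whenever we get here: 30, 60, 144 fps or VRR - the logic above doesn't care.
        render(accumulator / tickRate);     // leftover fraction, usable for interpolation
    }
}
```

Speed the rendering up or slow it down and the tick rate of the simulation stays exactly where it was.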

This is only the case in some games. You can see this on PC by uncapping the framerate and disabling vsync. While you still only see, say, 60 FPS, you gain benefits from the game running at a higher framerate. The Linus Tech Tips video I linked a while back explains this, as well as showing it via blind tests with both professional and casual gamers. Both groups of people benefitted greatly from a 60 Hz display cap (the monitor refresh) combined with the game running at greater than 200 FPS.

This is due to your input being displayed at the closest possible frame to when it happened, rather than there being up to half a frame of divergence between input and output. Basically, while the human guinea pigs don't get all of the benefits of 240 FPS, they did get the benefit of more accurate input-output feedback. This tighter correlation between input and the display of that input means there is less overshooting or undershooting when getting to your target.

This is more relevant to mouse and keyboard controls, as there is a 1:1 relationship between aiming and mouse movements. It's less relevant when using a console controller, but still perceptible.

The input-output feedback loop isn't about the logic of the game, but how quickly the game can process and display input from the player. In cross-platform games that also have a PC port, this is rarely limited on the PC side.
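To put a rough number on that "up to half a frame of divergence": if input is only sampled once per internal frame, the sampling-granularity part of the delay averages about half an internal frame time, whatever the display is doing. A back-of-envelope sketch (my simplification, not LTT's methodology; fixed pipeline latency is ignored):

```cpp
#include <cstdio>

// If the game samples input once per internal frame, the most recent sample is,
// on average, about half an internal frame old when a frame goes out - so running
// the loop faster shrinks that component even on a 60 Hz display.
int main() {
    const double simRates[] = {60.0, 120.0, 240.0, 360.0};
    for (double fps : simRates) {
        double frameTimeMs = 1000.0 / fps;
        std::printf("internal %3.0f fps: sampling delay averages ~%.1f ms\n",
                    fps, frameTimeMs / 2.0);
    }
}
```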

How does VRR feel to you? Can you explain the frame time feel (assuming a minimum 60fps game target)? I don't think I have seen much critical discussion on it, so it would be good to hear from a non-fan.

Take what I just explained to DSoup and imagine how this changes with a variable versus a fixed framerate. At a max of 60 FPS, this means that the display-input variance of potentially half a frame will be greater than with a locked framerate.

This means it's easier to overshoot or undershoot the target. While the presentation of the game appears smooth, the feedback from precise controls feels far more variable and it's much harder to consistently hit your target.

The effect is obviously more pronounced with a lower framerate, but if the magnitude of the variance is similar (i.e. 5 FPS at 60 Hz and 10 FPS at 120 Hz), the feel will be similar, as will the inaccuracies that result.

Regards,
SB
 
The input-output feedback loop isn't about the logic of the game, but how quickly the game can process and display input from the player.

And the game logic is part of that. If you pull left on the stick, the game logic has to work out all the repercussions of this. Can you actually move left or are you obstructed? If you're obstructed is it a wall that prevents you moving or is it an NPC you can shove a bit, what's the impact of shoving them? Lots of small movements and the appropriate animations and sounds get initiated.

The rendering system should be completely independent and it wouldn't surprise me if console game code is doing this better than PC code for the simple reason that to extract any performance out of PS4 and XBO developers have had to aggressively spin processes across the 7 low-clocked cores.
 
I've been saying this for over a year, but the Xbox Series S is just a waste of resources. If the Xbox Series S has to be supported, developers have to make massive cutbacks. Unlike in the past, when the PC version was completely different, today there is no extra handling when creating content, assets, etc. Just because of this console a lot of games will suffer. MS should scrap the Xbox Series S right away.

In this generation the increase in RAM was already ridiculously small, and because of the Xbox Series S it is almost non-existent.

I'm not sure I have felt that issue playing at 120+ with G-Sync. I only feel it if the dips are 30 fps or more. But generally speaking, once you break 100+, consistency is more important than the number. I'd rather have a consistent 100-105 with G-Sync than 144 Hz that can dip as low as 90 Hz with G-Sync. You can feel the latter; you're unlikely to feel the former.

Both consoles seem fairly okay with holding around their numbers aside from the bugged drops.

Unfortunately VRR does not work together with BFI (black frame insertion), and there are also many players who think that BFI is even more important.

This is why many games have higher tessellation settings to help prevent this. The 60 fps mode in Demon's Souls has this exact same swimming effect, which is not as readily visible in its 30 fps mode.

Exactly. In Ghost Recon Breakpoint I do not see swimming, but tessellation costs a lot of performance there.
 
Well, MS is banking on the XSS being their success story. But I do agree, it is a bit too weak for my liking. 5 TF + bandwidth matching or a little bit below the XBX would be pretty sweet.
 
And the game logic is part of that. If you pull left on the stick, the game logic has to work out all the repercussions of this. Can you actually move left or are you obstructed? If you're obstructed is it a wall that prevents you moving or is it an NPC you can shove a bit, what's the impact of shoving them? Lots of small movements and the appropriate animations and sounds get initiated.

The rendering system should be completely independent and it wouldn't surprise me if console game code is doing this better than PC code for the simple reason that to extract any performance out of PS4 and XBO developers have had to aggressively spin processes across the 7 low-clocked cores.

There are lots of different systems running in a game; a blanket statement of 'game logic' doesn't tell us very much.

There's logic that deals with hit detection, logic that deals with NPCs (if the game has any), logic that deals with sound generation and propagation, logic that deals with physics simulation, logic that deals with controller input and how it affects what is rendered.

While much of that is connected and some pieces rely on other pieces, they are not all 1:1 relationships.

If a game runs at 60 FPS and is never expected to run at higher than 60 FPS, sure, almost all game logic can operate at 60 FPS.

If a game can run at 60, 120, 240, or 360 FPS, then some aspects of the game's logic must run at higher frequencies. On PC, this means that most of the game logic dealing with controller input and the feedback from that input (rendering out to screen) runs more than 60 times per second. Hence, even when the display (monitor) is capped at 60 Hz, game logic can run at 200+ Hz, and as I've shown, this is still beneficial even when the display cannot show more than 60 images per second.

That's because, as I've stated, and as explained in the LTT video, control input can more closely match display output, and display output can more closely reflect what is happening.

I don't have the time at the moment to go through and timestamp it for you (I have an appointment to go put snow tires on, oh joy :p), but here's the video if you need a refresher.


It's not just theory as they do a blind test to see if there actually are real benefits or if it is only a placebo effect of perceived improvements.

The result is that not only do professional gamers get better results in game from a 60 Hz display combined with games running at 120-240 FPS (uncapped framerates with vsync off), but casual gamers also notice the difference and have improved results in games.

Regards,
SB
 
Take what I just explained to DSoup and imagine how this changes with a variable versus a fixed framerate. At a max of 60 FPS, this means that the display-input variance of potentially half a frame will be greater than with a locked framerate.

This means it's easier to overshoot or undershoot the target. While the presentation of the game appears smooth, the feedback from precise controls feels far more variable and it's much harder to consistently hit your target.

The effect is obviously more pronounced with a lower framerate, but if the magnitude of the variance is similar (i.e. 5 FPS at 60 Hz and 10 FPS at 120 Hz), the feel will be similar, as will the inaccuracies that result.

Regards,
SB

Does it work out that way, with big lurches back and forward? Or, because each frame only lasts until the next frame is ready, do you get streaks of consistent frame times even if they are not the target?

So when the engine is stressed down to, say, 55 fps, do you get a fairly stable frame time of 18.1 ms for that duration?

I do hope DF can work out a good way to analyse and then visualise this sort of thing.
 
Does it work out that way, with big lurches back and forward? Or, because each frame only lasts until the next frame is ready, do you get streaks of consistent frame times even if they are not the target?

So when the engine is stressed down to, say, 55 fps, do you get a fairly stable frame time of 18.1 ms for that duration?

I do hope DF can work out a good way to analyse and then visualise this sort of thing.

I'd imagine that if you could lock to 55 then it would be the same. Consistency matters.

Both consistency and higher framerates combine to give greater control, responsiveness and most importantly precision of control.

So, while relative drops in framerate can give similarly inconsistent behavior at various framerates, lower framerates are inherently less precise/responsive, so the cumulative effect is generally worse at lower framerates.

For each person there is a point of diminishing returns, however. Once you get to around 240-360 FPS, as long as the variations aren't too bad, the overall responsiveness at that level tends to hide variances better than 60 FPS does. I.e. a blip here and there is still going to give you more consistent feedback per tenth of a second than a locked 60 FPS. Again, assuming the drops aren't for an extended duration (60 consecutive frames of 10-30 FPS drops at 360 FPS will certainly be noticeable, but 2 frames dropping 10-30 FPS at 360 would mostly not be noticeable).
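Rough duration arithmetic on that last parenthetical, assuming the slow frames land around 330 FPS (a number picked purely for illustration):

```cpp
#include <cstdio>

// How long the "disturbed" stretch actually lasts in wall-clock time.
int main() {
    const double slowFrameMs = 1000.0 / 330.0;   // ~3.0 ms per affected frame

    std::printf("60 slow frames: ~%.0f ms of degraded feedback\n", 60 * slowFrameMs);
    std::printf(" 2 slow frames: ~%.0f ms of degraded feedback\n", 2 * slowFrameMs);

    // For comparison, a single frame at a locked 60 FPS already lasts ~16.7 ms.
    std::printf("one 60 FPS frame: ~%.1f ms\n", 1000.0 / 60.0);
}
```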

At that point VRR starts to become more compelling.

For myself, when I was using VRR, I would use it only to prevent tearing when my framerate dropped from 60 FPS to 59 FPS. Anything more than that and it started to impact my enjoyment of the game.

Regards,
SB
 
I have no major problem with DF comparisons, but I do think an issue is WHEN you do the comparison. Do you do it with the first game code at launch? Or do you wait a few months, when patches may have produced significant differences (improvements)?

Perhaps they could label comparisons "Initial code test", then follow up with "Patch code test"... or something of the sort.

speaking of which...

 
From a totally selfish point of view, I like them to do multiple assessments so I can see what has changed between versions: progress, decisions made, and so on.
Were settings reduced or improved, was the fps evened out by more aggressive DR, etc.?

Here are some of the patch notes:

"Graphics/Performance Mode Introduction

Added an option to the game that allows players to choose between Performance or Visual Quality.

Feature breakdown:

  • This option is available for Xbox Series X|S & PlayStation 5.
  • Choosing Performance allows the game to adapt the resolution and graphic settings to maintain 60 FPS.
  • Choosing Quality enables the game to run maximum resolution and graphic settings while maintaining 30 FPS.
  • Default values since the launch of the game are as follows:
    • Xbox Series X / PlayStation 5: Performance
    • Xbox Series S: Quality
Performance and Stability

  • Improved stability and performance.
  • (Xbox Series) Improved experience on Xbox Series S | X consoles including screen tearing
  • (PC) Addressed a VRAM/RAM leakage issue when alt tabbing to desktop.
 
@Silent_Buddha The best-case setup with a gsync/freesync/adaptive sync monitor is to cap your frame rate about 3-5 frames below the refresh rate, so you never get tearing or the vsync penalty to input latency. Then make sure your GPU never exceeds 95-98% usage. If you're not using a variable refresh monitor, then disable vsync and cap your frames so that GPU usage stays somewhere in the 95-98% range. I'll play competitive games with an overlay up showing my GPU usage, and if I see it hit 100% I'll lower my cap and repeat until I rarely see it hit 100%.
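For anyone curious what a frame cap is actually doing, it boils down to "do the frame, then sleep until the next slot". A minimal sketch (the 141-on-144 Hz numbers and function name are placeholders of mine; in practice use RTSS or a driver-level limiter):

```cpp
#include <chrono>
#include <thread>

// Placeholder for the real per-frame work.
void simulateAndRender() {}

// Minimal frame limiter: target a few fps under the panel's refresh so the GPU
// never sits pegged at 100%. Real limiters do this more precisely with hybrid busy-waits.
void frameCapLoop() {
    using clock = std::chrono::steady_clock;
    const double targetFps = 141.0;                          // e.g. 144 Hz panel, ~3 fps headroom
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));

    auto nextDeadline = clock::now() + frameBudget;
    while (true) {
        simulateAndRender();

        std::this_thread::sleep_until(nextDeadline);         // hand the leftover time back
        nextDeadline += frameBudget;
    }
}
```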

Edit:

Shows different configurations for best latency that will work on all GPUs, and then scenarios using Nvidia Reflex

Shows the impact refresh rate has on input latency, even when playing with a frame rate below the max refresh rate

Older video (pre Nvidia Reflex) showing how capping your framerate for GPU load below 98%, with low latency modes OFF, provided the lowest input lag (now you can use Nvidia Reflex in games that support it with uncapped frames and still have low latency)
 
There are lots of different systems running in a game; a blanket statement of 'game logic' doesn't tell us very much.

There's logic that deals with hit detection, logic that deals with NPCs (if the game has any), logic that deals with sound generation and propagation, logic that deals with physics simulation, logic that deals with controller input and how it affects what is rendered.

The game logic is all of the code that deals with the world's systems: AI behaviours, animation, physics simulation, time of day. Imagine GTA V, Fallout or Skyrim and not touching the controller: the world rumbles on regardless. If you disable the rendering, the world rumbles on regardless. On top of that you have the visual renderer, the audio system and any I/O - user input, network activity.

That's because, as I've stated, and as explained in the LTT video, control input can more closely match display output, and display output can more closely reflect what is happening.

Neither input nor game logic (a timing metric most games refer to as 'ticks') needs to match the rate at which the renderer operates. The reason people perceive better response times at higher frame rates is that the higher the framerate, the more opportunities there are for the game logic to do its work and for the renderer to display it on screen.

Because the framerate can vary, some things need to run faster or slower. E.g. if you're running a 30fps (33ms) game, you need the audio to be updating at least twice as fast (16ms), because laggy audio is incredibly noticeable. Likewise, if you're running the framerate at 60fps or anything higher, you don't need GTA's pedestrian AI to run at that speed, because NPCs would be re-evaluating their place in the world faster than they can take a step forward, but you would run vehicle AI faster - oh no, the player has swerved in front of me, brake! In games where physics simulation is important (not GTA), you want to run this faster.
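In loop terms that's just a handful of accumulators with different periods - a generic sketch, with made-up rates and system names purely for illustration:

```cpp
#include <chrono>

// Placeholder per-system update hooks - not a real engine API.
void updateAudio(double /*dt*/) {}
void updateVehicleAI(double /*dt*/) {}
void updatePedestrianAI(double /*dt*/) {}

// Each subsystem keeps its own accumulator and fires at its own rate,
// regardless of what the render frame rate is doing.
struct TimedSystem {
    std::chrono::duration<double> period;            // how often this system wants to run
    std::chrono::duration<double> accumulated{0.0};
    void (*update)(double dt);

    void advance(std::chrono::duration<double> elapsed) {
        accumulated += elapsed;
        while (accumulated >= period) {
            update(period.count());
            accumulated -= period;
        }
    }
};

void runSystems() {
    using clock = std::chrono::steady_clock;
    TimedSystem systems[] = {
        { std::chrono::duration<double>(1.0 / 120.0), {}, updateAudio },        // audio: fast
        { std::chrono::duration<double>(1.0 / 60.0),  {}, updateVehicleAI },    // vehicles: medium
        { std::chrono::duration<double>(1.0 / 10.0),  {}, updatePedestrianAI }, // pedestrians: slow
    };

    auto previous = clock::now();
    while (true) {
        auto now = clock::now();
        auto elapsed = now - previous;
        previous = now;
        for (auto& s : systems) s.advance(elapsed);
        // renderFrame() would sit here, running at whatever rate the display allows.
    }
}
```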
 
It looks like every 3rd-party game utilizes dynamic resolution if it can't lock at native 4K.

Is it possible that 3rd-party developers are preparing for mid-gen consoles in the future? They wouldn't need to patch their games if mid-gen consoles come; the stronger hardware would just push the dynamic resolution to its maximum.
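Dynamic resolution is usually driven just by measured GPU frame time against the frame budget, which is why faster hardware would settle at a higher resolution without a patch. A generic sketch of the idea (thresholds and step sizes are invented, not taken from any shipping engine):

```cpp
#include <algorithm>

// Generic dynamic-resolution heuristic: nudge the render scale so GPU frame
// time converges on the 60 FPS budget. On stronger hardware the measured times
// come in lower, so the same code simply settles nearer the native-resolution cap.
struct DynamicResolution {
    double scale = 1.0;                               // 1.0 = full native resolution
    static constexpr double budgetMs = 1000.0 / 60.0;

    void update(double gpuFrameTimeMs) {
        if (gpuFrameTimeMs > budgetMs * 1.05)         // over budget: drop resolution
            scale -= 0.05;
        else if (gpuFrameTimeMs < budgetMs * 0.85)    // plenty of headroom: raise it
            scale += 0.05;
        scale = std::clamp(scale, 0.5, 1.0);          // clamp between 50% and native
    }
};
```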
 
Youtube Disclaimer:

Once again, this video was mastered at 4K quality - but we have no idea when YouTube's encoders will deliver a full 4K encode, or indeed the higher quality 1080p version that usually comes with it. We logged support requests with YouTube last week and have still received no positive feedback - and can only apologise for the problems with the platform.
 