Very interesting video, and it complicates minimizing input lag.
Essentially you can get much less input lag by capping your frame rate to keep GPU utilization under approximately 99%. The new anti-lag and ultra-low-latency modes from AMD/Nvidia will actually make input lag worse when your GPU is not near 99% utilization. Staying below ~95% utilization with those options off is actually preferable to running at 99% utilization with anti-lag/ultra-low-latency on. This was found to be the case in Overwatch, Battlefield V and PUBG. These were tested with VSync off, so there is no VSync penalty.
e.g.
Overwatch, 99% utilization, 81 fps (no cap), low latency OFF = 72 ms avg
Overwatch, 99% utilization, 81 fps (no cap), low latency ULTRA = 63 ms avg
Overwatch, 77% utilization, 60 fps (60 fps cap), low latency ULTRA = 46 ms avg (counterintuitive, since lower fps would normally mean higher latency)
Overwatch, 77% utilization, 60 fps (60 fps cap), low latency OFF = 41 ms avg (so ULTRA low latency actually hurts when you're not near 100% utilization)
Further examples showed the best case is staying under 95% utilization with the low latency modes off, for both AMD and Nvidia.
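For what it's worth, here's a minimal sketch (C++, with hypothetical render_frame()/present() placeholders, and a 60 fps cap just as an example) of the kind of CPU-side frame limiter this implies: the CPU waits out the rest of a fixed frame budget before submitting the next frame, so the GPU never hits full utilization and the driver never gets a chance to queue frames ahead.

```cpp
#include <chrono>
#include <thread>

void run_loop() {
    using clock = std::chrono::steady_clock;
    // ~16.67 ms per frame for a hypothetical 60 fps cap.
    const auto frame_budget = std::chrono::microseconds(16667);

    auto next_frame = clock::now();
    for (;;) {
        next_frame += frame_budget;

        // render_frame();  // hypothetical: sample input, simulate, issue draw calls
        // present();       // hypothetical: hand the finished frame to the driver/GPU

        // Sleep off whatever remains of the frame budget. Because the CPU never
        // submits a new frame until the budget has elapsed, the GPU finishes each
        // frame with time to spare, utilization stays well below 99%, and the
        // driver's pre-render queue never fills up.
        std::this_thread::sleep_until(next_frame);
    }
}

int main() { run_loop(); }
```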
Does anyone understand architecturally why this would be the case? The GPUs must be buffering/queuing frames internally, which adds this latency, and it's very significant.
Edit: Or is it possible that it's queuing in the driver that adds the latency? I thought that's what the low latency setting was supposed to turn off, so I'm not sure why enabling it isn't as effective as simply capping to a lower frame rate ... confused.
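To make my mental model concrete, here's a toy calculation (C++, assuming a queue depth of 3 frames and a 12.3 ms GPU frame time roughly matching the 81 fps case above; the real driver queue depth and timings will differ) of how a pre-render queue could add latency: if the GPU is the bottleneck, the queue stays full and a fresh input sample waits behind every queued frame, whereas a frame cap keeps the queue empty.

```cpp
#include <cstdio>

int main() {
    // Assumed numbers for illustration only, not measurements.
    const double gpu_frame_ms    = 12.3; // ~81 fps worth of GPU render time
    const int    max_queue_depth = 3;    // frames the driver is assumed to buffer ahead

    // GPU-bound (~99% utilization): the CPU always has another frame ready, so the
    // queue sits at its maximum depth. A newly sampled input waits behind every
    // queued frame, plus its own render time, before it can be displayed.
    const double gpu_bound_ms = (max_queue_depth + 1) * gpu_frame_ms;

    // Frame-capped: the CPU submits a frame only after the previous one has drained,
    // so the queue is empty and the input waits only on its own render time.
    const double capped_ms = gpu_frame_ms;

    std::printf("GPU-bound, queue full    : ~%.1f ms of render-queue latency\n", gpu_bound_ms);
    std::printf("Frame-capped, queue empty: ~%.1f ms of render-queue latency\n", capped_ms);
    return 0;
}
```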
Edit: I'd also be curious to see this tested with CSGO. People typically try to push 300+ fps assuming it lowers input lag, but capping at 300 or less might actually improve input lag.