Input lag increases when GPU utilization reaches ~99% - Battlenonsense findings

Scott_Arm

Legend

Very interesting video, and it complicates minimizing input lag.

Essentially you can get much less input lag by capping your frame rate to keep GPU utilization under approximately 99%. The new anti-lag and ultra-low latency modes from AMD/Nvidia will actually make input lag worse when your GPU is not near 99% utilization. Staying below 95% utilization with those modes off is actually preferable to sitting at 99% with anti-lag/ultra-low latency on. This was found to be the case in Overwatch, Battlefield V and PUBG, all tested with vsync off so there is no vsync penalty.

e.g.
Overwatch, 99% utilization, 81fps (no cap), low latency OFF = 72ms avg
Overwatch, 99% utilization, 81fps (no cap), low latency ULTRA = 63ms avg
Overwatch, 77% utilization, 60fps (60fps cap), low latency ULTRA = 46ms avg (you'd normally expect the lower fps to mean higher latency)
Overwatch, 77% utilization, 60fps (60fps cap), low latency OFF = 41ms avg (so ULTRA low latency hurts when you're not near 100% utilization)

Further examples showed the best case is less than 95% utilization with the low latency modes off, for both AMD and Nvidia.

Does anyone understand architecturally why this would be true? The GPUs must be buffering/queuing internally which adds this latency, and it is very significant.

Edit: Or is it possible that it's queuing in the driver that adds the latency? I thought that's what the low latency setting was supposed to turn off, so I'm not sure why enabling it isn't as effective as just capping to a lower frame rate ... confused.
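
Rough mental model of what I think is going on (purely my own sketch, not anything from the video and not actual driver code; the queue depths and timings are made-up illustration values):

```cpp
// Toy model of why a pinned GPU adds lag: if the CPU can prepare frames faster
// than the GPU can draw them, finished frames pile up in a queue. Input is
// sampled when the CPU builds a frame, so every frame already waiting in the
// queue adds roughly one GPU frame time before that input reaches the screen.
#include <cstdio>

int main() {
    const double gpu_frame_ms = 1000.0 / 81.0;  // GPU pinned at 99% -> ~81 fps

    for (int queued = 0; queued <= 3; ++queued) {
        // The new frame waits behind `queued` earlier frames, then takes one
        // GPU frame time itself (mouse, game sim and display latency are not
        // included here, which is why the measured totals are higher).
        double render_lag_ms = (queued + 1) * gpu_frame_ms;
        std::printf("%d frame(s) already queued -> ~%.1f ms of render queue lag\n",
                    queued, render_lag_ms);
    }
    // Capping fps below what the GPU can sustain keeps the queue empty
    // (queued = 0), which would line up with the capped 60fps runs being faster.
    return 0;
}
```

The point of the toy numbers: every extra frame sitting in that queue costs roughly one more GPU frame time, which is the kind of difference the measurements show.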

Edit: I'd also be curious to see this tested with CSGO. People typically try to push max frames 300+ assuming it lowers input lag, but capping at 300 or less might actually improve input lag.
 
nooooo.
LOL!
Hopefully more studies on this one.

Yah, thinking about this is very complicated. If you're making a competitive game and want to ensure users have the best experience, you pretty much have to frame cap internally to ensure they never fully utilize their GPU. Probably not as hard on a console, because you could frame limit to 60 or 120 and adjust all of your rendering quality to leave yourself some headroom. On PC ... this just gets weird. Maybe you could ship a benchmark that runs a test, finds the worst case and then caps your fps just below that worst case? Otherwise doing weird shit with performance counters? I'm not sure if it's viable to adjust things in real time based on performance counters.
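
Not sure how viable it is, but the "adjust based on counters" idea could look something like this rough sketch. Everything in it is my own guess: the GPU-timer and render functions are fake stand-ins, and the 95% threshold / 5% back-off are arbitrary numbers, not anything tested in the video.

```cpp
// Rough sketch of "cap yourself based on what the hardware is doing": hold a
// target frame time with a sleep, and if the GPU keeps taking almost the whole
// budget (i.e. utilization creeping toward 100%), back the cap off.
// measure_gpu_frame_ms() and simulate_and_render() are fake stand-ins so this
// compiles; a real engine would use GPU timestamp queries and its own loop.
#include <chrono>
#include <thread>

double measure_gpu_frame_ms() { return 11.0; }  // pretend the GPU took 11 ms
void   simulate_and_render()  {}                // pretend we did a frame of work

int main() {
    using clk = std::chrono::steady_clock;
    double target_ms = 1000.0 / 141.0;          // start just under a 144 Hz panel

    for (int frame = 0; frame < 1000; ++frame) {
        const auto frame_start = clk::now();
        simulate_and_render();

        // GPU is eating more than ~95% of the budget -> lower the fps cap so
        // utilization never pins at 100%, where the queue/lag builds up.
        if (measure_gpu_frame_ms() > 0.95 * target_ms)
            target_ms *= 1.05;

        // Crude limiter: sleep away whatever is left of the frame budget.
        // Real limiters busy-wait the last bit for accuracy.
        const std::chrono::duration<double, std::milli> elapsed =
            clk::now() - frame_start;
        if (elapsed.count() < target_ms)
            std::this_thread::sleep_for(std::chrono::duration<double, std::milli>(
                target_ms - elapsed.count()));
    }
    return 0;
}
```

Backing the cap off is the easy half; knowing when it's safe to raise it again without oscillating is where I'd expect it to get fiddly.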
 
I bought a 144Hz gsync monitor and built my setup to get the highest fps count possible.
This would be ... a slap in my face lol, getting better latency by reducing the fps and bringing it down
 

If you want to be even more confused: https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/3/

Best case with gsync is to cap your fps around 3 fps lower than your display's max refresh rate and enable vsync in the Nvidia drivers. So if it's a 144Hz display, cap at 141 and turn vsync on. That gives you the best tear-free image with some frame pacing control. Otherwise, if you have vsync on and hit your refresh rate, you're basically just vsynced like on a normal monitor and take an input lag penalty. Or if you have vsync off, you'll get tearing above 144.

So with gsync/freesync, cap below your refresh rate or below 95% utilization (whichever gives the lower cap) and turn vsync on for gsync (not sure whether the vsync-on part applies to freesync).
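
If it helps, that rule boils down to something like this (the fps you get at ~95% utilization is something you'd have to benchmark on your own system; the 120 below is just a placeholder):

```cpp
// Tiny helper expressing the rule above: cap a bit under the refresh rate, but
// no higher than whatever fps keeps the GPU under ~95% load on your system.
#include <algorithm>
#include <cstdio>

int main() {
    const int refresh_hz = 144;
    const int fps_at_95_percent_gpu = 120;  // pretend this came from benchmarking

    const int cap = std::min(refresh_hz - 3, fps_at_95_percent_gpu);
    std::printf("gsync: cap at %d fps, vsync ON in the driver\n", cap);
    return 0;
}
```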

It's almost to the point where you have to buy a monster GPU so you can cap to your worst case and still have decent frame times. If I cap to my worst case on a GTX 1060, I'll be playing everything capped at like 70-80fps.
 
I bought a 144Hz gsync monitor and built my setup to get the highest fps count possible.
This would be ... a slap in my face lol, getting better latency by reducing the fps and bringing it down
Well in your case wouldn't you be adjusting in-game settings to reduce the GPU usage whilst maintaining 144Hz?
 

I guess if you set a cap of 141 or whatever he has, he can just keep lowering settings until the game never drops below 141, and then he can probably assume it's never hitting 100% utilization.
 
Well in your case wouldn't you be adjusting in-game settings to reduce the GPU usage whilst maintaining 144Hz?
I never make it to 144 usually. If I maxed out at 144 fps in all scenarios this would work, but I fluctuate between 90-144. Which means I'd have to cap around 85?
 

Yah, probably. It just seems really dumb. You'll end up with your GPU being utilized at 60% most of the time, just to stay under 95% in the worst case ... I don't know if it's worth it unless you're playing something competitive like ranked modes.
 
Wow, thank you! Y'all made me glad I've gotten old and my reflexes have gotten worse, I can't tell that much of a difference in that small a variance in input lag. LOL
 
Play on a TV, disable game mode, and leave all the fancy TV features at their defaults.

You'll be amazed that people can play a video game as laggy as that and think everything's fine.

My brother got a new tv. He was like, "I'm not very good at video games anymore." Then I turned all the bullshit features off and he was like, "Oh, I'm fine ... I've just had my tv misconfigured for years."
 

That's why I wish that, rather than manufacturers only adding a FILM MODE button, they would also add a GAME/PC MODE button on the remote. All the processing crap hurts your enjoyment much more in games than in a movie. In a movie you'll only notice the soap opera effect and inter-frame artifacts, but in a video game you'll see all of those PLUS horrible lag hahaha.

btw why don't TVs simply PRIORITIZE MINIMUM LATENCY when they detect a game console? The PS4, PSVR PU box, etc. all already send device info through HDMI.
 