Input lag increases when GPU utilization reaches ~99% - Battlenonsense findings

Discussion in 'Architecture and Products' started by Scott_Arm, Sep 19, 2019.

  1. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,332
    Likes Received:
    3,824


    Very interesting video, and complicates minimizing input lag.

    Essentially, you can get much less input lag by capping your frame rate to keep GPU utilization under roughly 99%. The new anti-lag and ultra-low-latency modes from AMD/Nvidia will actually make input lag worse when your GPU is not near 99% utilization. Staying below ~95% utilization is actually preferable to running at 99% with the anti-lag/ultra-low-latency options on. This was found to be the case in Overwatch, Battlefield V and PUBG, all tested with vsync off, so there is no vsync penalty.

    eg.
    Overwatch, 99% utilization, 81fps (no cap), low latency OFF = 72ms avg
    Overwatch, 99% utilization, 81fps (no cap), low latency ULTRA = 63ms avg
    Overwatch, 77% utilization, 60fps (60fps cap), low latency ULTRA = 46ms avg (intuitively, lower fps should mean higher latency)
    Overwatch, 77% utilization, 60fps (60fps cap), low latency OFF = 41ms avg (so ULTRA low latency actually hurts when you're not near 100%)

    Further examples showed best case is less than 95% utilization, low latency modes off for both AMD and Nvidia.

    Does anyone understand architecturally why this would be true? The GPUs must be buffering/queuing frames internally, which adds this latency, and it is very significant.

    Edit: Or is it queuing in the driver that adds the latency? But I thought that's exactly what the low-latency setting was supposed to disable, so I'm not sure why enabling it isn't as effective as simply capping to a lower frame rate ... confused.

    Edit: I'd also be curious to see this tested with CS:GO. People typically try to push 300+ fps assuming it lowers input lag, but capping at 300 or less might actually improve it.
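
    A rough back-of-envelope model of why this might happen (my own sketch, not Battlenonsense's methodology, and the queue depths are assumed, not measured): when the GPU is saturated, the CPU runs ahead and pre-rendered frames pile up in a queue, and each queued frame adds a full frame time of lag before your input reaches the screen.

    ```python
    # Toy model of render-queue latency. The queued-frame counts below
    # are illustrative assumptions, not values from the video.

    def queue_latency_ms(fps: float, queued_frames: int) -> float:
        """Extra input lag contributed by frames waiting in the render queue."""
        frame_time_ms = 1000.0 / fps
        return queued_frames * frame_time_ms

    # GPU-bound at 81fps with ~2 frames queued (a common driver default):
    print(round(queue_latency_ms(81, 2), 1))   # 24.7 ms of pure queue lag

    # Capped at 60fps with the GPU at 77% utilization, the queue stays
    # empty; the only cost is the longer ~16.7ms frame time itself:
    print(round(queue_latency_ms(60, 0), 1))   # 0.0 ms
    ```

    That ~25ms of queue lag is in the same ballpark as the gap between the uncapped 72ms and capped 41ms Overwatch numbers above, so a deep render queue could plausibly account for most of the difference.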
     
    #1 Scott_Arm, Sep 19, 2019
    Last edited: Sep 20, 2019
  2. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    8,141
    Likes Received:
    6,405
    nooooo.
    LOL!
    Hopefully more studies on this one.
     
    orangpelupa likes this.
  3. Scott_Arm

    Yah, thinking about this gets very complicated. If you're making a competitive game and want to ensure users have the best experience, you pretty much have to frame-cap internally so they never fully saturate their GPU. Probably not as hard on a console, because you could frame-limit to 60 or 120 and tune all of your rendering quality to leave yourself some headroom. On PC ... this just gets weird. Maybe you could ship a benchmark that finds the worst case and then caps your fps just below it? Otherwise do weird shit with performance counters? I'm not sure it's viable to adjust things in real time based on performance counters.
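
    The benchmark idea could be as simple as this sketch (a hypothetical helper; the headroom factor and frame times are just assumptions): run a worst-case stress scene, take the slowest frame, and cap a bit below the fps it implies.

    ```python
    def suggested_cap(frame_times_ms: list[float], headroom: float = 0.95) -> int:
        """Cap below the benchmarked worst case so GPU utilization
        stays under ~95% even in the heaviest scenes."""
        worst_frame_ms = max(frame_times_ms)       # slowest frame observed
        worst_case_fps = 1000.0 / worst_frame_ms   # fps at that moment
        return int(worst_case_fps * headroom)

    # e.g. a stress scene whose worst frame took 11.1ms (~90fps):
    print(suggested_cap([6.9, 8.3, 11.1]))   # 85
    ```

    In practice you'd want the worst *sustained* fps rather than a single spike, but the idea is the same: the cap chases the heaviest scene, not the average.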
     
  4. iroboto

    I bought a 144Hz G-Sync monitor and tuned my setup for the highest fps count possible.
    This would be ... a slap in my face lol if getting better latency means reducing the fps and bringing it down
     
    orangpelupa likes this.
  5. Scott_Arm

    If you want to be even more confused: https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/3/

    Best case with G-Sync is to cap your fps around 3fps lower than your display's max refresh rate and enable vsync in the Nvidia drivers. So if it's a 144Hz display, cap at 141 and turn vsync on. That gives you the best tear-free image with some frame-pacing control. Otherwise, if you have vsync on and hit your refresh rate, you're basically just vsynced like on a normal monitor and take an input lag penalty. Or with vsync off you'll get tearing above 144fps.

    So with G-Sync/FreeSync, cap below your refresh rate or at 95% utilization (whichever gives the lower fps), and turn vsync on for G-Sync (not sure about vsync on with FreeSync).
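
    The two rules combine into one "whichever is lower" cap. A sketch (function name and the 0.95 factor are my assumptions, following the Blur Busters refresh−3 rule and the ~95% utilization ceiling discussed above):

    ```python
    def gsync_fps_cap(refresh_hz: int, worst_case_fps: float) -> int:
        """Pick the lower of: ~3fps under the display's max refresh
        (the Blur Busters G-Sync rule) and ~95% of the GPU's worst-case
        fps (to stay off the latency-adding utilization ceiling)."""
        return min(refresh_hz - 3, int(worst_case_fps * 0.95))

    # 144Hz display, GPU comfortably fast: the display is the limit.
    print(gsync_fps_cap(144, 300))   # 141

    # 144Hz display, GPU worst case ~90fps: the GPU is the limit.
    print(gsync_fps_cap(144, 90))    # 85
    ```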

    It's almost to the point where you have to buy a monster GPU so you can cap to your worst case and still have decent frame times. If I cap to my worst case on a GTX 1060, I'll be playing everything capped at like 70-80fps.
     
    Lightman and orangpelupa like this.
  6. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,129
    Likes Received:
    3,185
    Location:
    Pennsylvania
    Well, in your case wouldn't you be adjusting in-game settings to reduce GPU usage whilst maintaining 144Hz?
     
    Scott_Arm likes this.
  7. Scott_Arm

    I guess if he sets a cap of 141 or whatever, he can just keep lowering settings until the game never drops below 141, and then you can probably assume it's never hitting 100% utilization.
     
  8. iroboto

    I usually never make it to 144. If I maxed out at 144fps in all scenarios this would work, but I fluctuate between 90 and 144fps. Which means I'd have to cap around 85?
     
  9. Scott_Arm

    Yah, probably. It just seems really dumb. You'll end up with your GPU utilized at 60% most of the time, just to stay under 95% in the worst case ... I don't know if it's worth it unless you're playing something competitive like ranked modes.
     
    orangpelupa likes this.
  10. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,330
    Likes Received:
    1,822
    Location:
    Winfield, IN USA
    Wow, thank you! Y'all made me glad I've gotten old and my reflexes have gotten worse; I can't tell the difference with that small a variance in input lag. LOL
     
  11. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    7,306
    Likes Received:
    1,385
    Play on a TV, disable game mode, and leave all the fancy TV features enabled at their defaults.

    You'll be amazed that people can play a video game as laggy as that and think everything's fine
     
  12. Scott_Arm

    My brother got a new tv. He was like, "I'm not very good at video games anymore." Then I turned all the bullshit features off and he was like, "Oh, I'm fine ... I've just had my tv misconfigured for years."
     
  13. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,057
    Likes Received:
    2,638
    God is this depressing.
     
  14. Scott_Arm

    The auto low-latency mode on new TVs will save us ... if we all buy new TVs again.
     
    digitalwanderer and BRiT like this.
  15. orangpelupa

    That's why I wish that rather than only adding a FILM MODE button, manufacturers would also add a GAME/PC MODE button on the remote. All the processing crap hurts your enjoyment much more in games than in a movie. In a movie you'll only notice the soap-opera effect and inter-frame artifacts, but in a video game you'll see all of those PLUS horrible lag hahaha.

    Btw, why don't TVs simply PRIORITIZE MINIMUM LATENCY when they detect a game console? The PS4, PSVR PU box, etc. already send device info through HDMI.
     
    digitalwanderer likes this.
  16. Scott_Arm

    That's what HDMI 2.1's auto low latency mode is going to do.
     
    Kej, Globalisateur, milk and 3 others like this.
  17. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,350
    Likes Received:
    5,327
    This is one reason I always use a PC to test input and display latency on TVs (basically to ensure game mode is on AND working). Just moving the mouse around will instantly show you whether game mode is on and working.

    Regards,
    SB
     
  18. Globalisateur

    Globalisateur Globby
    Veteran Regular

    Joined:
    Nov 6, 2013
    Messages:
    3,015
    Likes Received:
    1,729
    Location:
    France
    I want to believe. But wait until they add fancy features (on by default) in order to differentiate from the competition (and hopefully sell more).

    Fancy features that will actually increase input lag, like ... low-latency modes or super-high framerates ... :shock::runaway:
     