Shouldn't Vertical Synchronization Reduce Power Usage? Intel Disagrees.

Discussion in 'Tools and Software' started by Alexko, Jan 18, 2021.

  1. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,532
    Likes Received:
    957
    I wanted to reduce the power usage of a game, so I set Intel's Graphics Command Center to always enable VSync for it. Yet the tool tells me that this actually increases power usage versus letting the application control it. I thought maybe this had something to do with driver overhead, but the tool also says that forcibly disabling VSync decreases power usage, which is just as counterintuitive: both disabling VSync and adding driver overhead should increase power usage.

    Shouldn't VSync reduce the framerate and decrease power usage? After all, AMD does something similar to save power: https://www.amd.com/en/technologies/frtc
    I think NVIDIA has a similar feature.

    Am I missing something? Or did the Intel devs get things mixed up?

    Sorry to make a whole thread for something so small, but I couldn't think of another place to post it.
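    For what it's worth, my mental model of a frame cap is something like the sketch below. The 60 Hz target and the sleep-based pacing are purely illustrative (not how FRTC actually works); the point is that the GPU idles for the rest of each frame slot instead of rendering frames the display would never show.

        // Minimal sketch of the frame-limiter idea (illustrative only,
        // not FRTC's or Intel's actual implementation). Capping at ~60 fps
        // lets the GPU idle for the rest of each 16.7 ms slot instead of
        // rendering frames the display will never show.
        #include <chrono>
        #include <thread>

        void render_frame() { /* stand-in for the game's real rendering */ }

        int main() {
            using clock = std::chrono::steady_clock;
            const auto frame_budget = std::chrono::microseconds(16667); // ~60 Hz

            auto next_deadline = clock::now() + frame_budget;
            for (int frame = 0; frame < 600; ++frame) {
                render_frame();
                // Sleep until the deadline: during this idle time the GPU can
                // drop to a lower power state, which is where the savings come from.
                std::this_thread::sleep_until(next_deadline);
                next_deadline += frame_budget;
            }
        }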
     
    digitalwanderer and BRiT like this.
  2. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    549
    Likes Received:
    192
    Yes, frame rate limiters should decrease consumption.
     
    digitalwanderer likes this.
  3. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,257
    Likes Received:
    3,149
    Location:
    France
    I don't know about plain vsync, but adaptive sync (like FreeSync) was actually designed to reduce the power consumption of laptops.
     
    digitalwanderer likes this.
  4. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,532
    Likes Received:
    957
    Thanks, happy to confirm I'm not crazy! :)

    I guess either Intel's iGPUs behave very strangely, or there's a mistake in the Command Center.
     
  5. glow

    Newcomer

    Joined:
    May 6, 2019
    Messages:
    38
    Likes Received:
    28
    Well, sort of. The original tech that was leveraged into it was designed for slower refreshes, mostly for static images over eDP (e.g., ~1 Hz or less). It would power down the PHYs (saving power) while spinning up a buffer and some logic (consuming power) to "replay" the last received buffer to the display. Adaptive Sync doesn't really have the power-saving advantage of gating the high-speed PHYs.
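    Very roughly, the entry/exit decision looks like this sketch. The names and the idle-frame threshold are invented for illustration; they are not actual eDP spec fields.

        // Illustrative sketch of the Panel Self Refresh trade-off described
        // above. Names and thresholds are invented; not eDP spec fields.
        #include <cstdint>

        struct DisplayState {
            bool     phy_powered;  // are the high-speed eDP link PHYs active?
            uint32_t idle_frames;  // consecutive frames with no screen update
        };

        void on_vblank(DisplayState& s, bool frame_changed) {
            if (frame_changed) {
                // New content: wake the link (costs power), reset the idle count.
                s.phy_powered = true;
                s.idle_frames = 0;
                return;
            }
            // Static image: after a few idle frames, gate the PHYs and let the
            // panel's local buffer "replay" the last frame. The buffer and replay
            // logic consume some power, but much less than the link PHYs.
            if (++s.idle_frames >= 3) {
                s.phy_powered = false;
            }
        }

        int main() {
            DisplayState s{true, 0};
            for (int i = 0; i < 10; ++i)
                on_vblank(s, /*frame_changed=*/ i < 2);  // 2 updates, then static
            return s.phy_powered ? 1 : 0;  // PHYs should be gated by the end
        }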
     
    Alexko likes this.
  6. Infinisearch

    Veteran Regular

    Joined:
    Jul 22, 2004
    Messages:
    779
    Likes Received:
    146
    Location:
    USA
    My take on the possible causes:
    A CPU frame rate limiter or CPU-side limitations.
    GPU frame rate limitations.
    Non-exactness of frame times on the CPU and/or GPU.

    If the CPU frame time hovers around a fixed value close to the monitor's refresh interval (with at least double buffering) and the GPU can basically keep pace, then wouldn't enabling vsync make a modern GPU boost harder to maintain that frame rate? That would use more power than leaving vsync off, in this and similar cases.
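    To put rough numbers on the boost idea: dynamic power scales roughly with f * V^2, and voltage rises with clock, so doing the same per-frame work at a boosted clock costs more energy even though the frame finishes sooner. A toy calculation, with both operating points made up for illustration:

        // Toy calculation for the boost hypothesis above. Dynamic power
        // scales roughly with f * V^2, so the same work done at a higher
        // clock (and thus higher voltage) costs more energy even though
        // it finishes sooner. All operating points are made up.
        #include <cstdio>

        int main() {
            const double work = 1.0;  // normalized GPU work per frame

            // Hypothetical operating points: relative clock and relative voltage.
            const double f_base = 1.0,  v_base = 1.00;
            const double f_boost = 1.3, v_boost = 1.15;

            const double p_base  = f_base  * v_base  * v_base;  // relative power
            const double p_boost = f_boost * v_boost * v_boost;
            const double t_base  = work / f_base;               // frame time
            const double t_boost = work / f_boost;

            const double e_base  = p_base  * t_base;   // energy = power * time
            const double e_boost = p_boost * t_boost;  // reduces to work * V^2

            std::printf("energy per frame: base %.3f, boost %.3f (+%.0f%%)\n",
                        e_base, e_boost, 100.0 * (e_boost / e_base - 1.0));
            return 0;
        }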
     