FreeSync working on a GeForce

Discussion in 'PC Hardware, Software and Displays' started by Kaotik, Aug 28, 2018.

  1. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,616
    Likes Received:
    1,301
    Location:
    Finland


    Long story short: you need to have an AMD APU as the primary graphics, enable FreeSync on it, and run the cable from the display to your mobo instead of the discrete GeForce. Once you set the GeForce to be the preferred GPU in the NVIDIA control panel (the Win10 1803 GPU selector should also work), the GeForce handles the rendering but uses the AMD APU as the display controller (like Optimus in laptops), which then syncs the display refresh rate to the FPS to enable FreeSync.
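
    If you want to sanity-check which GPU actually owns the monitor after moving the cable, a quick DXGI enumeration will show it. A minimal sketch (my own illustration, not from the thread, assuming Windows and the DXGI headers): on a correctly configured system the APU should list the FreeSync display as its output and the GeForce should list none.

    Code:
    // Enumerate GPUs and the displays physically attached to each one.
    // Build (MSVC): cl /EHsc enumgpus.cpp
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Adapter %u: %s\n", i, desc.Description);

            // Each IDXGIOutput is a monitor driven by this adapter.
            IDXGIOutput* output = nullptr;
            for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j)
            {
                DXGI_OUTPUT_DESC odesc;
                output->GetDesc(&odesc);
                wprintf(L"  Output %u: %s\n", j, odesc.DeviceName);
                output->Release();
            }
            adapter->Release();
        }
        factory->Release();
        return 0;
    }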
     
  2. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,352
    Likes Received:
    1,834
    Wow, that reminds me of the hassle of using an ATI card for rendering and an NV card for PhysX.
     
    BRiT and Lightman like this.
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,616
    Likes Received:
    1,301
    Location:
    Finland
    It also works with a discrete GeForce and a discrete Radeon, tested at least with a GTX 1080 Ti + Radeon Pro WX4100.
     
    ToTTenTranz, BRiT and Lightman like this.
  4. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,223
    Likes Received:
    4,098
    Oh wow, thank you for this. Damn, now I wish the Ryzen 1600X had integrated graphics so I could try this out. It's still not great that adaptive sync isn't natively supported on NV GPUs, and it would mean choosing a CPU with integrated graphics, but at least it is something.

    Now, I wonder how long it'll take NV to be spiteful and attempt to shut this down.

    Regards,
    SB
     
    BRiT likes this.
  5. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    8,960
    Likes Received:
    3,597
    Can nvidia or AMD disable this in future driver revisions?
     
  6. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,352
    Likes Received:
    1,834
    You could get a bottom-of-the-range AMD card.

    edit:
    Maybe not: the cheapest AMD card I can find is a Radeon R7 240 for £70.

    edit 2:
    R5 230 1GB, £35
     
    #6 Davros, Aug 29, 2018
    Last edited: Aug 29, 2018
    Silent_Buddha likes this.
  7. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,018
    Likes Received:
    805
    Location:
    still camping with a mauler
    If you want VRR on NVIDIA it'd be cheaper to buy a FreeSync monitor and a little AMD card for output, lol.
     
    green.pixel, Silent_Buddha and BRiT like this.
  8. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    8,960
    Likes Received:
    3,597
    Are there benchmarks being done with these setups?

    There should be some additional latency from sending the framebuffer back from the GeForce over the PCIe bus and again through the Radeon.
    I know this is already what happens in laptops, but in that case it's only dGPU -> APU -> display. Here it's GeForce dGPU -> CPU/APU -> Radeon -> display.
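
    Rough numbers for scale, with assumed link speeds (illustrative only, not measurements): a 1440p RGBA8 frame is about 15 MB, so the raw copy cost per PCIe hop is on the order of a millisecond, and a few ms end to end wouldn't surprise me.

    Code:
    // Back-of-the-envelope cost of shipping a finished frame over PCIe.
    // Bandwidth figures are assumptions, not measurements.
    #include <cstdio>

    int main()
    {
        const double frame_bytes = 2560.0 * 1440.0 * 4.0;  // 1440p RGBA8, ~14.7 MB

        const double pcie3_x16 = 15.75e9;  // bytes/s, theoretical PCIe 3.0 x16
        const double pcie3_x4  = 3.94e9;   // e.g. a cheap output card in an x4 slot

        printf("Frame size:        %.1f MB\n", frame_bytes / 1e6);
        printf("One hop, x16 link: %.2f ms\n", 1e3 * frame_bytes / pcie3_x16);
        printf("One hop, x4 link:  %.2f ms\n", 1e3 * frame_bytes / pcie3_x4);
        // The dGPU -> CPU/APU -> Radeon path adds a hop on top of this,
        // plus driver overhead.
        return 0;
    }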
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,616
    Likes Received:
    1,301
    Location:
    Finland
    PCPer tested this; of course a perfect apples-to-apples comparison is impossible:
    https://www.pcper.com/reviews/Graphics-Cards/AMD-FreeSync-Working-NVIDIA-GPUs-Some-Strings-Attached

    In their results, dGPU > APU > FreeSync was 2.9 ms slower than APU > FreeSync.
    They include a dGPU > G-Sync result too, but with a different display etc., so it's not apples to apples.
     
    ToTTenTranz and pharma like this.
  10. HMBR

    Regular

    Joined:
    Mar 24, 2009
    Messages:
    408
    Likes Received:
    93
    Location:
    Brazil
    Hardware Unboxed also tested average FPS while using this solution; there is a drop in performance, but it's not all that large.


    In any case, this trick of using a weak GPU with FreeSync support to output the game rendered by the GeForce would be VERY interesting if Intel IGPs supported adaptive sync, but while Intel plans to support it, it's not coming anytime soon.
     
    Lightman likes this.
  11. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,223
    Likes Received:
    4,098
    Well, for owners of NV cards, it's still cheaper to buy a cheap AMD card to pair with any of a plethora of Adaptive Sync (FreeSync) monitors AND televisions than to buy a G-Sync monitor of similar specifications. :p

    If I weren't too busy to do much gaming, much less tinkering with my PC, I'd look into buying a cheap AMD card just to try this out, as I already own an Adaptive Sync monitor.

    In a similar way, I don't see why you couldn't use an NV card to pass through the video signal from an AMD card to a G-Sync monitor, thus avoiding the vendor lock-in if you didn't want to upgrade to another NV card.

    Regards,
    SB
     
  12. HMBR

    Regular

    Joined:
    Mar 24, 2009
    Messages:
    408
    Likes Received:
    93
    Location:
    Brazil
    This solution only works well with an AMD 2200G/2400G because Windows natively supports the IGP as a low-power GPU and can route the dGPU's output through it.

    With a discrete AMD card you need the game to specifically support it, and most games don't.
    Maybe some driver/Windows hack can solve it, but for now I don't see it as viable outside of the 2200G/2400G scenario.
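
    For reference on the plumbing: since Win10 1803 a game can explicitly ask DXGI for the high-performance adapter instead of just taking whichever GPU owns the display, and that is the support most titles lack. A minimal sketch (my own illustration, assuming the 1803-era SDK headers):

    Code:
    // Pick the high-performance GPU for rendering even when a weaker
    // GPU owns the display (IDXGIFactory6, Windows 10 1803+).
    #include <dxgi1_6.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        IDXGIFactory6* factory = nullptr;
        if (FAILED(CreateDXGIFactory2(0, __uuidof(IDXGIFactory6), (void**)&factory)))
            return 1;  // needs Windows 10 1803 or newer

        // Adapters come back ordered by performance, not display ownership.
        IDXGIAdapter1* adapter = nullptr;
        if (SUCCEEDED(factory->EnumAdapterByGpuPreference(
                0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                __uuidof(IDXGIAdapter1), (void**)&adapter)))
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Would render on: %s\n", desc.Description);
            adapter->Release();
        }
        factory->Release();
        return 0;
    }

    As I understand it, games that don't do this just render on whatever adapter Windows enumerates first, which is what the Win10 graphics settings override changes per app; without either, the discrete-Radeon-as-output case is hit and miss.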
     
    ToTTenTranz likes this.
  13. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,616
    Likes Received:
    1,301
    Location:
    Finland
    Do we know it's not coming anytime soon? Chris Hook just went on record last week saying they're still planning to support it, and I'd put my money on 10nm being the culprit here.
    Intel is using clearly outdated IGPs in terms of display outputs on every 14nm product, and the only explanation I can come up with is that they planned a major update for 10nm, but with the delays it's been pushed back further time and time again and not backported to new 14nm models.
     
  14. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    8,960
    Likes Received:
    3,597
    Yes, Intel announced they intended to adopt Adaptive Sync back in 2015. That was shortly after they had introduced Gen9 with Skylake, so they were probably talking about Gen10, which was supposed to appear the next year, in 2016, with Cannon Lake.


    All their consumer releases have been tiny incremental updates to Skylake, architecture-wise. Since 2015 they have mostly decreased their profit per wafer/mm^2 by increasing core counts and L3.


    I honestly think the original plans for Cannon Lake might be mostly scrapped at this point, at least the ones from 2016:
    [2016 Intel roadmap slide]

    They can't launch the U-series 15W 2C+GT2 because those would be easily defeated by Raven Ridge and Picasso. Maybe they can use that silicon for the Y-series 4.5W parts, but Picasso might cover that TDP range with 4 cores and 8 CUs.

    Curiously, this last roadmap showed all Coffee Lake models coming with GT3e, and that obviously didn't happen either. There are only a couple of SKUs with GT3e, and those are 20-28W U-series 4-cores, not 2C+GT3e. The rest are all 2/4/6/8-cores+GT2.
    I guess Ryzen really did make Intel scramble their plans a lot, not just the problems with 10nm.



    Regardless, in 2019 we will probably see Gen10 with Adaptive Sync, and Intel's discrete graphics coming in 2020 will definitely have that feature too.
     
  15. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,616
    Likes Received:
    1,301
    Location:
    Finland
    Actually, they do have a U-series part out already; it consists of one model that has the IGP disabled, but the rumor mill says it's either disabled to improve terrible yields or doesn't work at all. They're only selling it in laptops and NUCs bundled with a discrete Radeon.
     
  16. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    2,977
    Likes Received:
    93
    I'm not sure Gen 10 supports Adaptive Sync. But I don't think it matters, because my belief (and it's of course all speculation) is that Gen 10 is effectively dead anyway, since Cannon Lake is Gen 10, manufactured on an apparently unfixable 10nm process. We'll probably never see any real products from that process, outside a couple of those Cannon Lake chips, half disabled and sold at a loss so as not to freak out investors. (Should any such chips with an enabled IGP exist, they should have been supported for quite a while already by the open-source Linux drivers, funnily enough...)
    Ice Lake, slated for next year, already has Gen 11 graphics (I'm quite certain it will be manufactured on a different 10nm process, although Intel might well still call it just 10nm).
     
  17. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,775
    Likes Received:
    264
    Location:
    PA USA
    This is pretty cool. I would do it, I think. I don't game much now either, but my monitor is 12 years old and I have a 2500K and a 950, so an upgrade is certainly something I have been thinking about.
     
