NVIDIA adopts Adaptive-sync (was: FreeSync working on a GeForce)


Kaotik

NVIDIA has officially adopted Adaptive-Sync.
They've tested 400 displays so far, of which 12 earned the official "G-Sync Compatible" stamp, meaning it works automatically; on the rest, the user can enable support manually and see how it goes. NVIDIA claims they're going to test every single Adaptive-Sync (FreeSync) display out there and grant more stamps where applicable.

https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/


--
Original:


Long story short: you need an AMD APU as the primary graphics, FreeSync enabled on it, and the display cable run to your motherboard instead of the discrete GeForce. Once you set the GeForce as the preferred GPU in the NVIDIA control panel (the Windows 10 1803 GPU selector should also work), the GeForce handles the rendering but uses the AMD APU as the display controller (like Optimus in laptops), which then syncs the display refresh rate to the FPS to enable FreeSync.
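For anyone who wants to script the second half of that, here's a minimal sketch of setting the per-app GPU preference from code. It assumes the HKCU\Software\Microsoft\DirectX\UserGpuPreferences registry key that the Windows 10 1803 GPU selector writes to; the game path is a hypothetical placeholder.

```cpp
// Sketch: set the Windows 10 1803+ per-app GPU preference.
// Assumption: the Settings GPU selector stores choices under
// HKCU\Software\Microsoft\DirectX\UserGpuPreferences as REG_SZ values
// keyed by the executable's full path. "GpuPreference=2;" requests the
// high-performance GPU (the GeForce); "1" would request the power-saving
// one (the APU); "0" is auto.
#include <windows.h>
#include <cwchar>

int main() {
    const wchar_t* exePath = L"C:\\Games\\MyGame\\game.exe"; // hypothetical
    const wchar_t* pref    = L"GpuPreference=2;";

    HKEY key = nullptr;
    if (RegCreateKeyExW(HKEY_CURRENT_USER,
                        L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                        0, nullptr, 0, KEY_SET_VALUE, nullptr,
                        &key, nullptr) != ERROR_SUCCESS)
        return 1;

    LONG rc = RegSetValueExW(key, exePath, 0, REG_SZ,
                             reinterpret_cast<const BYTE*>(pref),
                             static_cast<DWORD>((wcslen(pref) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);
    return rc == ERROR_SUCCESS ? 0 : 1;
}
```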
 
Oh wow, thank you for this. Damn, now I wish the Ryzen 1600X had integrated graphics so I could try this out. It's still not great that adaptive sync isn't natively supported on NV GPUs, and it would mean choosing a CPU with integrated graphics, but at least it is something.

Now, I wonder how long it'll take NV to be spiteful and attempt to shut this down.

Regards,
SB
 
Can NVIDIA or AMD disable this in future driver revisions?
 
Are there benchmarks being done with these setups?

There should be some additional latency from sending the framebuffer back from the GeForce over the PCIe bus and again through the Radeon.
I know this is already what happens in laptops, but in that case it's only dGPU -> APU -> display. Here it's GeForce dGPU -> CPU/APU -> Radeon -> display.
 
Are there benchmarks being done with these setups?

There should be some additional latency from sending the framebuffer back from the GeForce over the PCIe bus and again through the Radeon.
I know this is already what happens in laptops, but in that case it's only dGPU -> APU -> display. Here it's GeForce dGPU -> CPU/APU -> Radeon -> display.
PCPer tested this; of course, a perfect apples-to-apples comparison is impossible:
https://www.pcper.com/reviews/Graphics-Cards/AMD-FreeSync-Working-NVIDIA-GPUs-Some-Strings-Attached

In their results, dGPU > APU > FreeSync was 2.9 ms slower than APU > FreeSync.
They include a dGPU > G-Sync result too, but on a different display etc., so it's not apples to apples.
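For a rough sense of scale on that 2.9 ms, here's a back-of-the-envelope sketch; the inputs are assumptions (1080p RGBA frame, ~7.9 GB/s effective PCIe 3.0 x8 bandwidth), not measurements:

```cpp
#include <cstdio>

int main() {
    // Assumed: one extra framebuffer copy over PCIe per presented frame.
    const double bytesPerFrame   = 1920.0 * 1080.0 * 4.0; // 1080p RGBA, ~8.3 MB
    const double pcieBytesPerSec = 7.9e9;                 // ~PCIe 3.0 x8 effective

    const double copyMs = bytesPerFrame / pcieBytesPerSec * 1000.0;
    printf("extra PCIe copy: %.2f ms per frame\n", copyMs);      // ~1.05 ms

    // Frame budget at 144 Hz for comparison; the rest of PCPer's measured
    // ~2.9 ms presumably goes to the second GPU's composition/flip work.
    printf("frame budget at 144 Hz: %.2f ms\n", 1000.0 / 144.0); // ~6.94 ms
    return 0;
}
```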
 
Hardware Unboxed also tested average FPS while using this solution; there is a drop in performance, but it's not all that large.

In any case, this trick of using a weak GPU with FreeSync support to output the game rendered by the GeForce would be VERY interesting if Intel IGPs supported adaptive sync; but while Intel plans to support it, it's not coming anytime soon.
 
Well, for owners of NV cards, it's still cheaper to buy a cheap AMD card to pair with any of a plethora of AdaptiveSync (FreeSync) monitors AND Televisions than to buy a G-sync monitor of similar specifications. :p

If I weren't too busy to do much gaming, much less tinkering with my PC, I'd look into buying a cheap AMD card just to try this out as I already own an AdaptiveSync monitor.

In a similar way, I don't see why you couldn't use an NV card to pass through the video signal from an AMD card to a G-sync monitor, thus avoiding the vendor lock-in if you didn't want to upgrade to another NV card.

Regards,
SB
 
This solution only works well with an AMD 2200G/2400G, because Windows natively supports the IGP as a low-power GPU and routing the dGPU's output through it.

With a discrete AMD card you need the game to specifically support it, and most games don't.
Maybe some driver/Windows hack can solve it, but for now I don't see it as viable outside of the 2200G/2400G scenario.
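To illustrate what "the game has to specifically support it" means: under D3D12 explicit multi-adapter, the title itself creates a device per GPU and moves frames between them (e.g. via cross-adapter shared heaps), which is why most games don't bother. A minimal sketch that just probes the relevant caps bit on each adapter, with error handling trimmed:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);

        ID3D12Device* device = nullptr;
        if (SUCCEEDED(D3D12CreateDevice(adapter, D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // Cross-adapter row-major texture support makes sharing frames
            // between two devices cheaper; either way, the copy between
            // GPUs is the application's job under explicit multi-adapter.
            D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
            device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                        &opts, sizeof(opts));
            wprintf(L"%s: cross-adapter row-major textures: %s\n",
                    desc.Description,
                    opts.CrossAdapterRowMajorTextureSupported ? L"yes" : L"no");
            device->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```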
 
In any case, this trick of using a weak GPU with FreeSync support to output the game rendered by the GeForce would be VERY interesting if Intel IGPs supported adaptive sync; but while Intel plans to support it, it's not coming anytime soon.
Do we know it's not coming anytime soon? Chris Hook went on record just last week saying they're still planning to support it, and I'd put my money on 10nm being the culprit here.
Intel is using clearly outdated IGPs in terms of display outputs on every 14nm product, and the only explanation I can come up with is that they planned a major update for 10nm, but with the delays it's been pushed back time and time again and not backported to the newer 14nm models.
 
Do we know it's not coming anytime soon? Chris Hook went on record just last week saying they're still planning to support it, and I'd put my money on 10nm being the culprit here.
Intel is using clearly outdated IGPs in terms of display outputs on every 14nm product, and the only explanation I can come up with is that they planned a major update for 10nm, but with the delays it's been pushed back time and time again and not backported to the newer 14nm models.

Yes, Intel announced they intended to adopt Adaptive Sync back in 2015. That was shortly after they had introduced Gen9 with Skylake, so they were probably talking about Gen10, which was supposed to appear the following year, 2016, with Cannon Lake.

[image: Intel's 2015 announcement of planned Adaptive Sync support]


All their consumer releases have been tiny incremental updates to Skylake, architecture-wise. Since 2015 they've mostly decreased their profit per wafer/mm^2 by increasing core counts and L3.


I honestly think the original plans for Cannon Lake might be mostly scrapped at this point, at least the ones from 2016:
[image: 2016 Intel roadmap showing the planned Cannon Lake and Coffee Lake SKUs]


They can't launch the U-series 15W 2C+GT2 because those would be easily defeated by Raven Ridge and Picasso. Maybe they can use that silicon for Y-series 4.5W but Picasso might cover that TDP range with 4-cores and 8 CUs.

Curiously, this last roadmap showed all Coffee Lake models coming with GT3e and that obviously didn't happen either. There's only a couple of SKUs with GT3e and those are 20-28W U-series 4-cores, not 2C+GT3e. The rest are all 2/4/6/8-cores+GT2.
I guess Ryzen really did make Intel scramble their plans a lot, not just the problems with 10nm.



Regardless, in 2019 we will probably see Gen10 with Adaptive Sync, and Intel's discrete graphics coming 2020 will definitely have that feature too.
 
They can't launch the U-series 15W 2C+GT2 because those would be easily defeated by Raven Ridge and Picasso. Maybe they can use that silicon for Y-series 4.5W but Picasso might cover that TDP range with 4-cores and 8 CUs.
Actually, they do have a U-series part out already; it consists of one model with the IGP disabled, and the rumor mill says it's disabled either to improve terrible yields or because it doesn't work at all. They're only selling it in laptops and NUCs bundled with a discrete Radeon.
 
Regardless, in 2019 we will probably see Gen10 with Adaptive Sync, and Intel's discrete graphics coming 2020 will definitely have that feature too.
I'm not sure Gen10 supports Adaptive Sync. But I don't think it matters, because my belief (and it's of course all speculation) is that Gen10 is effectively dead anyway: Cannon Lake is Gen10, manufactured on an apparently unfixable 10nm process. We'll probably never see any real products from that process, outside a couple of those Cannon Lake chips, half disabled and sold at a loss so as not to freak out investors. (Should any such chips with an enabled IGP exist, they should already have been supported for quite a while by the open-source Linux drivers, funnily enough...)
Ice Lake, slated for next year, already has Gen11 graphics (I'm quite certain it will be manufactured on a different 10nm process, although Intel may well still call it just 10nm).
 
This is pretty cool. I would do it, I think. I don't game much now either, but my monitor is 12 years old and I have a 2500K and a 950, so an upgrade is certainly something I have been thinking about.
 
LinusTechTips tested this but went a little bit further: they also tested G-Sync on a Radeon, and it works the exact same way. Just pick the right GPU for rendering and use the other one for output.

 
What was the software that Intel bundled with Z77 (and maybe other) motherboards which allowed users to connect monitors to the IGP and still game using discrete cards? It allowed the use of Quick Sync while using an add-in card, but I think it started life as the software that could split rendering across NVIDIA and AMD GPUs, initially with the company's PCIe bridge chip that was included on some MSI boards. If that could copy video output between GPUs, surely someone could knock up a utility to enable this in Windows with two discrete cards?
 
What was the software that Intel bundled with Z77 (and maybe other) motherboards which allowed users to connect monitors to the IGP and still game using discrete cards? It allowed the use of Quick Sync while using an add-in card, but I think it started life as the software that could split rendering across NVIDIA and AMD GPUs, initially with the company's PCIe bridge chip that was included on some MSI boards. If that could copy video output between GPUs, surely someone could knock up a utility to enable this in Windows with two discrete cards?
Lucid Virtu, I think, for the Z77 boards; Hydra was their earlier, similar product that split rendering across GPUs.

But like Linus said in the video, this is based on DXGI, part of DirectX; you don't need any extra hardware or software involved. Also, I'm pretty sure someone has already tested this working with discrete cards too?
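For reference, a minimal sketch of the DXGI 1.6 entry point that shipped around 1803: IDXGIFactory6::EnumAdapterByGpuPreference enumerates adapters ordered by performance or power, presumably the same machinery the Windows GPU selector leans on.

```cpp
#include <windows.h>
#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory6* factory = nullptr;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;

    // Ask for adapters best-first: the GeForce should come back before the
    // APU here; DXGI_GPU_PREFERENCE_MINIMUM_POWER would flip the order.
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapterByGpuPreference(i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                                             IID_PPV_ARGS(&adapter)) == S_OK;
         ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        wprintf(L"%u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```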
 