NVIDIA opens up G-Sync

More details here:

G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming. They also validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offer the gamer a seamless experience by enabling VRR by default.

Yeah, marketing speak. Not interesting. (Edit: the compatible monitors may have stellar AdaptiveSync support, which btw is not always the case with AMD. But if you can count the supported monitors on your thumbs, it's still a failure.)

I'd have no issue with that IF nVidia gradually expands the list of compatible monitors on their own and on merit, without requiring money or other incentives. I personally won't be giving them the benefit of the doubt on this one.

https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

So most FreeSync monitors fail the validation on account of not supporting a good range of VRR.

AMD requires the same thing with FreeSync 2.

FreeSync2 is a slightly different thing. I don't understand the tech enough to assess if your analogy is good. In any case, the number of monitors impacted by that is significantly smaller. And my point is about hurting the adoption of the baseline AdaptiveSync technology.

Check my edit. I'm also wondering whether AMD behaves similarly with monitors that don't officially "support" FreeSync but have Adaptive Sync working. If they do, of course, it's just as bad.
 
I'd have no issue with that IF nVidia gradually expands the list of compatible monitors on their own and on merit
They said they will expand the list as they continue testing.
FreeSync2 is a slightly different thing. I don't understand the tech enough to assess if your analogy is good.
AMD required monitor makers to apply for a FreeSync 2 certification program, because AMD required LFC and HDR600 as baseline features. Without this certification, makers were putting out a chaotic mix of FreeSync 2 monitors which didn't meet AMD's standards.
 
They said they will expand the list as they continue testing.

AMD required monitor makers to apply for a FreeSync 2 certification program, because AMD required LFC and HDR600 as baseline features. Without this certification, makers were putting out a chaotic mix of FreeSync 2 monitors which didn't meet AMD's standards.
Actually it doesn't require HDR600; its requirements fall somewhere between HDR400 and HDR600 on those portions of the spec (aka there are displays which qualify for FreeSync 2 HDR but not for HDR600).
The bigger difference is that FreeSync 2 HDR also implements a proprietary(?) API which game developers can use to implement tonemapping all at once on the GPU, instead of tonemapping once on the GPU and then adjusting again on the display.
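
To illustrate the idea (purely my own toy sketch, not AMD's actual pipeline or API): the single-pass approach maps scene luminance straight to the panel's reported min/max range on the GPU, so the monitor has nothing left to re-map afterwards.

```cpp
#include <cstdio>

// Toy example only: map scene luminance directly to a display's reported
// luminance range in one step, the way a FreeSync 2 HDR title would after
// querying the panel's real capabilities. The curve here is a simple
// Reinhard-style compression, chosen just for illustration.
float tonemapToDisplay(float sceneNits, float displayMinNits, float displayMaxNits) {
    float x = sceneNits / displayMaxNits;
    float compressed = x / (1.0f + x); // compress highlights into [0,1)
    return displayMinNits + compressed * (displayMaxNits - displayMinNits);
}

int main() {
    // e.g. a 2000-nit scene highlight shown on a panel reporting 0.1-600 nits
    printf("%.1f nits\n", tonemapToDisplay(2000.0f, 0.1f, 600.0f));
    return 0;
}
```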
 
But if you can count the supported monitors on your thumbs, it's still a failure
All FreeSync monitors can work through the NVIDIA driver, did you get that part?
Now with an NVIDIA GPU you can run any FreeSync or GSync monitor. With an AMD GPU you are locked to FreeSync ones only. Hardly a failure in my book.
 
Great news. I'm wondering if the HDMI specs soon requiring VRR had something to do with Nvidia finally opening up support for Adaptive Sync.

I think the certification is still a good idea. There was obviously a huge disparity in quality for Adaptive Sync panels and Nvidia's GSync was focused on minimum requirements for meeting the quality levels, so I can see why so many "failed". I'm glad they still allow manual enabling of the feature irrespective of certification.

Very positive move for the PC gaming industry.
 
The bigger difference is that FreeSync 2 HDR also implements a proprietary(?) API which game developers can use to implement tonemapping all at once on the GPU, instead of tonemapping once on the GPU and then adjusting again on the display.
AMD's approach to this is to relieve the monitor makers of the burden of doing any HDR work; as they've shown time and time again, they are willing to do nothing, especially at lower price tiers. So AMD chose to put the burden on the GPU instead of the monitor, through that proprietary API, which requires support from the game as well.

NVIDIA on the other hand is doing all the work on the monitor through the FPGA module. That was a necessity as well, as GSync HDR requires very demanding features: HDR1000, 120Hz+ (with a range from 1Hz to max Hz), frequency-dependent variable overdrive, and FALD. These requirements absolutely demand a great deal of work from the monitor's controller, and thus the GSync module is truly necessary here. For comparison, there are no FreeSync 2 displays with these capabilities whatsoever.


The big question is: do FreeSync 2 displays work too? Samsung for example lists a couple of their displays as supporting FreeSync 2 and specifically mentions that FreeSync isn't on the list. While FreeSync 2 includes all the functionality of FreeSync, it's also more than just that.
There are two possibilities here:
-The NVIDIA GPU will expect the monitor to do the real HDR work, and since the monitor lacks any kind of module, it will not be able to, so the NVIDIA GPU will only drive the VRR portion of the display.

-Since there are no FreeSync 2 displays with such demanding capabilities, the NVIDIA GPU will manage to drive both the HDR and VRR work, but not the advanced tonemapping features (which require the proprietary API).

I am leaning toward the second possibility, but I am open to corrections or suggestions.
 
AMD's approach to this is to relieve the monitor makers of the burden of doing any HDR work; as they've shown time and time again, they are willing to do nothing, especially at lower price tiers. So AMD chose to put the burden on the GPU instead of the monitor, through that proprietary API, which requires support from the game as well.

NVIDIA on the other hand is doing all the work on the monitor through the FPGA module. That was a necessity as well, as GSync HDR requires very demanding features: HDR1000, 120Hz+ (with a range from 1Hz to max Hz), frequency-dependent variable overdrive, and FALD. These requirements absolutely demand a great deal of work from the monitor's controller, and thus the GSync module is truly necessary here. For comparison, there are no FreeSync 2 displays with these capabilities whatsoever.
Source for the refresh rate range? Can't find literally anything suggesting such and previously G-Sync's minimum has been IIRC 40 Hz
HDR1000 doesn't require G-Sync or an FPGA module; the first HDR1000 display in fact has neither and supports Adaptive-sync (not sure if it supports FreeSync 2 HDR though)
 
Source for the refresh rate range? Can't find literally anything suggesting such and previously G-Sync's minimum has been IIRC 40 Hz
Nope, GSync has always maintained a 1Hz to max Hz range. See here:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
HDR1000 doesn't require G-Sync or an FPGA module; the first HDR1000 display in fact has neither and supports Adaptive-sync (not sure if it supports FreeSync 2 HDR though)
HDR1000 doesn't. But GSync HDR requires a module to operate HDR1000 + 120Hz+ VRR (1-120Hz range) + FALD + overdrive all at the same time. So far no monitor other than GSync HDR certified ones has ever managed to cram all of these features together.
 
AMD's approach to this is to relieve the monitor makers of the burden of doing any HDR work; as they've shown time and time again, they are willing to do nothing, especially at lower price tiers. So AMD chose to put the burden on the GPU instead of the monitor, through that proprietary API, which requires support from the game as well.

NVIDIA on the other hand is doing all the work on the monitor through the FPGA module. That was a necessity as well [...]

So does anyone know then why nVidia GPUs suffer a performance penalty (~5% was it? definitely non-negligible) with HDR on while AMD's don't?
Or was that a glitch that has since been fixed? If not, it seems counter-intuitive to me that AMD use their GPU for HDR work and don't lose any performance, while nVidia doesn't use the GPU yet somehow they lose performance.
 
Nope, GSync has always maintained a 1Hz to max Hz range. See here:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
It definitely hasn't always been; just re-checked and at least when FreeSync was introduced it was 30 Hz minimum (and 144 Hz max, but that had already been changed earlier), while FreeSync/Adaptive-sync was 9-240Hz (and the rest is up to the display and panel manufacturers' capabilities)

HDR1000 doesn't. But GSync HDR requires a module to operate HDR1000 + 120Hz+ VRR (1-120Hz range) + FALD + overdrive all at the same time. So far no monitor other than GSync HDR certified ones has ever managed to cram all of these features together.
It needs a module, but does it really need it to do those? I'm willing to bet we're going to see quite a few displays doing all that without fancy modules.
 
Great news. I'm wondering if the HDMI specs soon requiring VRR had something to do with Nvidia finally opening up support for Adaptive Sync.

I think the certification is still a good idea. There was obviously a huge disparity in quality for Adaptive Sync panels and Nvidia's GSync was focused on minimum requirements for meeting the quality levels, so I can see why so many "failed". I'm glad they still allow manual enabling of the feature irrespective of certification.

Very positive move for the PC gaming industry.

Yup, as soon as I saw LG's announcement of HDMI 2.1 TVs with support for Adaptive Sync VRR and 4k@120 Hz (this resolution and refresh rate is what makes VRR interesting), it was good-bye to any possibility of me ever buying another NV card if NV didn't pull that stick out of their butt and support Adaptive Sync.

While I have little to no use for VRR at 60 Hz and below, I'm much more interested in it in the 60-120 Hz range. And now that I'm used to using a 49" 4k display as my main display, I'm not going back to a smaller one if I can help it. Those LG TVs are looking mighty attractive as a replacement for my current display.

Regardless of the reason, it's good to see NV finally support it.

Regards,
SB
 
The bigger difference is that FreeSync 2 HDR also implements a proprietary(?) API which game developers can use to implement tonemapping all at once on the GPU, instead of tonemapping once on the GPU and then adjusting again on the display.
AMD's approach to this is to relieve the monitor makers of the burden of doing any HDR work; as they've shown time and time again, they are willing to do nothing, especially at lower price tiers. So AMD chose to put the burden on the GPU instead of the monitor, through that proprietary API, which requires support from the game as well.

AMD does have a proprietary API called AGS (AMD GPU Services), but AFAIK all it offers for FreeSync2 HDR support is:

1) AGSDisplayInfo struct, which returns the FreeSync2 support flag and display chromaticity information (which is coded in DisplayID blocks 0x21 Display Parameters or 0x02 Color characteristics, the EDID base block section Chromaticity coordinates and extension block 0xFB Additional white point descriptor, and their CTA-861G InfoFrame equivalents for HDMI displays); and

2) AGSDisplaySettings struct to enable Freesync2_scRGB display mode.

https://github.com/GPUOpen-LibrariesAndSDKs/AGS_SDK
https://gpuopen-librariesandsdks.github.io/ags/amd__ags_8h.htm

Everything else, including the actual tone-mapping step with color space conversion, is up to the application to implement.


BTW you can also get display chromaticity information and set display color space by using standard Windows APIs, Windows.Graphics.Display and IDXGIOutput6::GetDesc1 / IDXGISwapChain3::SetColorSpace1 - so I'm not really sure if you need to use a proprietary API to enable this in your application... AGS SDK is maintained by Rys, so it's probably a good idea to ask him for details:
I'm now almost 2 years in to that new job, happy to answer any questions about it
:cool:
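
For what it's worth, here's a minimal, untested sketch of that standard DXGI route (first adapter and output assumed; the swap-chain call is only indicated in a comment, since no swap chain is created here):

```cpp
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
// link with dxgi.lib

using Microsoft::WRL::ComPtr;

// Query the first output's HDR capabilities via IDXGIOutput6::GetDesc1.
int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    ComPtr<IDXGIOutput> output;
    if (FAILED(factory->EnumAdapters1(0, &adapter)) ||
        FAILED(adapter->EnumOutputs(0, &output))) return 1;

    ComPtr<IDXGIOutput6> output6;
    if (FAILED(output.As(&output6))) return 1;

    DXGI_OUTPUT_DESC1 desc{};
    output6->GetDesc1(&desc);
    printf("Luminance: %.3f - %.1f nits (max full-frame %.1f)\n",
           desc.MinLuminance, desc.MaxLuminance, desc.MaxFullFrameLuminance);
    printf("Red primary: (%.3f, %.3f), white point: (%.3f, %.3f)\n",
           desc.RedPrimary[0], desc.RedPrimary[1],
           desc.WhitePoint[0], desc.WhitePoint[1]);

    // With an actual FP16 swap chain, linear scRGB output would then be selected with:
    // swapChain3->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);
    return 0;
}
```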
 
What do you think the chances are of Nvidia releasing compatibility results for all the monitors they test? It would be very nice to see how monitors fail to meet G-Sync standards and by how far when looking to buy a new one.
 
It definitely hasn't always been; just re-checked and at least when FreeSync was introduced it was 30 Hz minimum (and 144 Hz max, but that had already been changed earlier), while FreeSync/Adaptive-sync was 9-240Hz (and the rest is up to the display and panel manufacturers' capabilities)
The 30Hz minimum for NVIDIA was just nominal; in effect it was 1Hz. When the refresh rate falls below 30Hz, NVIDIA repeats frames to keep it above 30Hz, and this process occurs even if the frame rate falls to 1Hz.

As for FreeSync, it doesn't determine the range of variable refresh rates; AdaptiveSync does. Also, AMD is using the maximum and minimum refresh range published by the VESA standards body, not an actual implementation or even a planned implementation from any monitor or display, in contrast with GSync's 1Hz-144Hz range, which is actually seen in shipping displays.

AMD had to implement LFC to be on an equal footing with NVIDIA on this one; still, their range is much worse than NVIDIA's in most displays even with LFC, so where a GSync monitor will repeat frames under 30Hz, most FreeSync and FreeSync 2 monitors will do it under 40Hz or more.
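
Roughly, that frame-repeating trick (LFC on AMD's side, the module's behaviour on NVIDIA's) just multiplies the frame rate back into the panel's VRR window. A toy sketch of the idea, not anyone's actual driver logic:

```cpp
#include <cstdio>

// Toy illustration of low-framerate compensation: when the game's frame rate
// drops below the panel's minimum VRR rate, show each frame multiple times so
// the effective refresh rate lands back inside the supported window.
int lfcMultiplier(double gameFps, double panelMinHz, double panelMaxHz) {
    if (gameFps >= panelMinHz) return 1;          // already inside the VRR window
    int m = 2;
    while (gameFps * m < panelMinHz) ++m;         // repeat frames until back in range
    return (gameFps * m <= panelMaxHz) ? m : 0;   // 0 = window too narrow to compensate
}

int main() {
    // 25 fps on a 40-144 Hz FreeSync panel: each frame shown twice -> 50 Hz effective.
    printf("x%d\n", lfcMultiplier(25, 40, 144));
    // 35 fps on a narrow 48-60 Hz panel: 2 x 35 = 70 Hz overshoots the max, so no LFC.
    printf("x%d\n", lfcMultiplier(35, 48, 60));
    return 0;
}
```
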
It needs a module, but does it really need it to do those?
Yes, those features require a complicated display controller.
I'm willing to bet we're going to see quite a few displays doing all that without fancy modules.
I highly doubt that. NVIDIA had to scale their GSync HDR module to insane complexity in order to support those features, something they wouldn't have done if it wasn't absolutely needed. Display manufacturers will face the same dilemma, and some of them will have to scale up their controllers accordingly.
 
NVIDIA on the other hand is doing all the work on the monitor through the FPGA module. That was a necessity as well, as GSync HDR requires very demanding features: HDR1000, 120Hz+ (with a range from 1Hz to max Hz), frequency-dependent variable overdrive, and FALD. These requirements absolutely demand a great deal of work from the monitor's controller

Relative to the power of even budget GPUs, it's not a lot of work. Relative to the power of GPUs someone who cares about GSync would buy, it's nothing.
 
The 30Hz minimum for NVIDIA was just nominal; in effect it was 1Hz. When the refresh rate falls below 30Hz, NVIDIA repeats frames to keep it above 30Hz, and this process occurs even if the frame rate falls to 1Hz.

As for FreeSync, it doesn't determine the range of variable refresh rates; AdaptiveSync does. Also, AMD is using the maximum and minimum refresh range published by the VESA standards body, not an actual implementation or even a planned implementation from any monitor or display, in contrast with GSync's 1Hz-144Hz range, which is actually seen in shipping displays.

AMD had to implement LFC to be on an equal footing with NVIDIA on this one; still, their range is much worse than NVIDIA's in most displays even with LFC, so where a GSync monitor will repeat frames under 30Hz, most FreeSync and FreeSync 2 monitors will do it under 40Hz or more.

Yes, those features require a complicated display controller.

I highly doubt that. NVIDIA had to scale their GSync HDR module to insane complexity in order to support those features, something they wouldn't have done if it wasn't absolutely needed. Display manufacturers will face the same dilemma, and some of them will have to scale up their controllers accordingly.
If they went to the trouble of setting a standard that accepts variability from 30Hz all the way up to 100Hz+, why did they not go just the tiny extra step of making the lower bound 24Hz so it could neatly encompass film frame rates within the standard itself?
<cheesy rant>
Why do I sometimes feel like consumers care more about this shit than the actual damn engineers and manufacturers who make their living out of it?
</cheesy rant>
 