NVIDIA opens GSync

Discussion in 'Graphics and Semiconductor Industry' started by DavidGraham, Jan 7, 2019.

  1. entity279

    Veteran Regular Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,201
    Likes Received:
    402
    Location:
    Romania
    Yeah, marketing speak. Not interesting. (Edit: the compatible monitors may have stellar AdaptiveSync support, which btw is not always the case with AMD. But if you can count the supported monitors on your thumbs, it's still a failure.)

    I'd have no issue with that IF nVidia gradually expands the list of compatible monitors on their own and on merit, without requiring money or other incentives. I personally won't be giving them the benefit of the doubt on this one.

    FreeSync 2 is a slightly different thing. I don't understand the tech enough to assess whether your analogy is good. In any case, the number of monitors impacted by that is significantly smaller, and my point is about hurting the adoption of the baseline AdaptiveSync technology.

    Check my edit. I'm also wondering whether AMD behaves similarly with monitors that don't "support" FreeSync but have Adaptive Sync working. If they do, of course, it's just as bad.
     
  2. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,352
    Likes Received:
    582
    Location:
    Japan
    Isn't this just VESA's Adaptive Sync relabeled as G-Sync by Nvidia so they don't look completely daft after their overpriced monitors didn't take off?

    The way I understand it, it should work fine with every monitor that supports Adaptive Sync. But no doubt Nvidia will try to come up with some scheme to get a couple of bucks out of it.
     
  3. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,575
    Likes Received:
    2,130
    They said they will expand the list as they continue testing.
    AMD required monitor makers to apply for a FreeSync 2 certification program, because AMD required LFC and HDR600 as baseline features. Without this certification, makers were creating a chaotic mix of FreeSync 2 monitors which didn't meet AMD's standards.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  4. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,949
    Likes Received:
    1,655
    Location:
    Finland
    Actually, it doesn't require HDR600; its requirements fall somewhere between HDR400 and HDR600 on those portions of the spec (i.e. there are displays which qualify for FreeSync 2 HDR but not for HDR600).
    The bigger difference is that FreeSync 2 HDR also implements a proprietary(?) API which game developers can use to do the tonemapping all at once on the GPU, instead of tonemapping once on the GPU and then adjusting again on the display.
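    To make that concrete, here's a toy sketch (my own illustration, not AMD's actual API or shader code) of the idea: tone-map once, directly to the panel's real peak luminance, rather than to a generic range that the monitor then remaps a second time.

```cpp
// Toy, CPU-side illustration of single-pass tone-mapping to the display's
// actual luminance range. The names and numbers are mine, for illustration only.
#include <cstdio>

// Simple Reinhard-style curve mapped straight to the panel's peak luminance.
float TonemapToDisplay(float sceneNits, float displayMaxNits)
{
    float x = sceneNits / displayMaxNits;   // normalize to the panel's peak
    float mapped = x / (1.0f + x);          // Reinhard: compress highlights
    return mapped * displayMaxNits;         // back to absolute nits for scanout
}

int main()
{
    // A 4000-nit scene highlight on a ~600-nit FreeSync 2 HDR class panel.
    std::printf("%.0f nits -> %.0f nits\n", 4000.0f, TonemapToDisplay(4000.0f, 600.0f));
    return 0;
}
```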
     
  5. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,575
    Likes Received:
    2,130
    All FreeSync monitors can work through the NVIDIA driver, did you get that part?
    Now with an NVIDIA GPU you can run any FreeSync or GSync monitor. With an AMD GPU you are locked to only FreeSync ones. Hardly a failure in my book.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  6. entity279

    Veteran Regular Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,201
    Likes Received:
    402
    Location:
    Romania
    Of course. You may have missed the context. I was talking about how enabling AdaptiveSync by default on only a few select monitors hurts the adoption of the technology. The 'failure' qualification applied to that context.
     
    ToTTenTranz and DavidGraham like this.
  7. SimBy

    Regular Newcomer

    Joined:
    Jun 21, 2008
    Messages:
    496
    Likes Received:
    131
    Bye GSync. It was nice knowing you.
     
  8. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,677
    Likes Received:
    2,718
    Location:
    Pennsylvania
    Great news. I'm wondering if the HDMI specs soon requiring VRR had something to do with Nvidia finally opening up support for Adaptive Sync.

    I think the certification is still a good idea. There was obviously a huge disparity in quality for Adaptive Sync panels and Nvidia's GSync was focused on minimum requirements for meeting the quality levels, so I can see why so many "failed". I'm glad they still allow manual enabling of the feature irrespective of certification.

    Very positive move for the PC gaming industry.
     
    Dr Evil and BRiT like this.
  9. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,575
    Likes Received:
    2,130
    AMD's approach to this is to relieve the monitor makers from the burden of doing any HDR work, as they've shown time and time again, they are willing to do nothing, especially at lower tier prices. So AMD made the choice to put the burden on the GPU, instead of the monitor, through that proprietary API, which requires the support of the game as well.

    NVIDIA on the other hand is doing all the work on the monitor through the FPGA module. It was a necessity as well, as GSync HDR requires very demanding features: HDR1000, 120Hz+ (with a range from 1Hz to max Hz), frequency-dependent variable overdrive, and FALD. These requirements absolutely demand a great deal of work from the monitor's controller, and thus the GSync module is truly necessary here. For comparison, there are no FreeSync 2 displays with these capabilities whatsoever.


    There are two possibilities here:
    -The NVIDIA GPU will require the monitor to do the real HDR work, and since the monitor lacks any kind of module, it will not be able to, so the NVIDIA GPU will only drive the VRR portion of the display.

    -Since there are no FreeSync 2 displays with such demanding capabilities, the NVIDIA GPU will manage to drive both the HDR and VRR work, but not the advanced tonemapping features (which require the proprietary API).

    I am leaning toward the second possibility, but I am open to corrections or suggestions.
     
    #29 DavidGraham, Jan 7, 2019
    Last edited: Jan 7, 2019
    pharma likes this.
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,949
    Likes Received:
    1,655
    Location:
    Finland
    Source for the refresh rate range? Can't find literally anything suggesting such, and previously G-Sync's minimum has been IIRC 40 Hz.
    HDR1000 doesn't require G-Sync nor an FPGA module; the first HDR1000 display in fact has neither and supports Adaptive-Sync (not sure if it supports FreeSync 2 HDR, though).
     
    pharma likes this.
  11. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,575
    Likes Received:
    2,130
    Nope, GSync has always maintained a 1Hz to max Hz range. See here:
    https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
    HDR1000 doesn't. But GSync HDR requires a module to operate HDR1000 + 120Hz+ VRR (1-120Hz range) + FALD + overdrive all at the same time. So far no monitor but GSync HDR certified ones ever managed to cram all of these features together.
     
    #31 DavidGraham, Jan 7, 2019
    Last edited: Jan 7, 2019
    pharma likes this.
  12. entity279

    Veteran Regular Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,201
    Likes Received:
    402
    Location:
    Romania
    So does anyone know why nVidia GPUs suffer a performance penalty (~5% was it? definitely non-negligible) with HDR on while AMD's don't?
    Or was that a glitch that has since been fixed? If not, it seems counter-intuitive to me that AMD uses their GPU for the HDR work and doesn't lose any performance, while nVidia doesn't use the GPU yet somehow loses performance.
     
  13. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,949
    Likes Received:
    1,655
    Location:
    Finland
    It definitely hasn't always been; I just re-checked, and at least when FreeSync was introduced, G-Sync's minimum was 30 Hz (and 144 Hz max, but that was changed earlier already), while FreeSync/Adaptive-Sync was 9-240 Hz (the rest is up to the display and panel manufacturers' capabilities).

    It needs a module, but does it really need it to do those? I'm willing to bet we're going to see quite a few displays doing all that without fancy modules.
     
  14. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,492
    Likes Received:
    4,408
    Yup, as soon as I saw LG's announcement of HDMI 2.1 TVs with support for Adaptive Sync VRR and 4K@120 Hz (this resolution and refresh rate is what makes VRR interesting), it was goodbye to any possibility of me ever buying another NV card if NV didn't pull that stick out of their butt and support Adaptive Sync.

    While I have little to no use for VRR at 60 Hz and below, I'm much more interested in it in the 60-120 Hz range. And now that I'm used to using a 49" 4k display as my main display, I'm not going back to a smaller one if I can help it. Those LG TVs are looking mighty attractive as a replacement for my current display.

    Regardless of the reason, it's good to see NV finally support it.

    Regards,
    SB
     
    Dr Evil likes this.
  15. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    607
    Likes Received:
    411
    Location:
    55°38′33″ N, 37°28′37″ E
    AMD does have a proprietary API called AGS (AMD GPU Services), but AFAIK all it offers for FreeSync2 HDR support is:

    1) AGSDisplayInfo struct, which returns the FreeSync2 support flag and display chromaticity information (which is coded in DisplayID blocks 0x21 Display Parameters or 0x02 Color characteristics, the EDID base block section Chromaticity coordinates and extension block 0xFB Additional white point descriptor, and their CTA-861G InfoFrame equivalents for HDMI displays); and

    2) AGSDisplaySettings struct to enable Freesync2_scRGB display mode.

    https://github.com/GPUOpen-LibrariesAndSDKs/AGS_SDK
    https://gpuopen-librariesandsdks.github.io/ags/amd__ags_8h.htm

    Everything else, including the actual tone-mapping step with color space conversion, is up to the application to implement.


    BTW you can also get display chromaticity information and set display color space by using standard Windows APIs, Windows.Graphics.Display and IDXGIOutput6::GetDesc1 / IDXGISwapChain3::SetColorSpace1 - so I'm not really sure if you need to use a proprietary API to enable this in your application... AGS SDK is maintained by Rys, so it's probably a good idea to ask him for details:
    :cool2:
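    For anyone curious, here's a minimal sketch of that standard Windows path - my own example, not code from the AGS SDK: read the display's chromaticity and luminance data via IDXGIOutput6::GetDesc1, then switch the swap chain to scRGB with IDXGISwapChain3::SetColorSpace1. Error handling is omitted for brevity.

```cpp
// Minimal sketch: query HDR display data and request scRGB output via DXGI.
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void QueryHdrAndEnableScRGB(IDXGIAdapter* adapter, IDXGISwapChain3* swapChain)
{
    // Walk the adapter's outputs and read the HDR metadata Windows exposes.
    ComPtr<IDXGIOutput> output;
    for (UINT i = 0; adapter->EnumOutputs(i, &output) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<IDXGIOutput6> output6;
        if (SUCCEEDED(output.As(&output6)))
        {
            DXGI_OUTPUT_DESC1 desc = {};
            output6->GetDesc1(&desc);
            // desc.RedPrimary / GreenPrimary / BluePrimary / WhitePoint hold the
            // chromaticity coordinates; desc.MinLuminance / MaxLuminance /
            // MaxFullFrameLuminance describe the luminance range; desc.ColorSpace
            // tells you whether the output is currently in an HDR mode.
        }
        output.Reset();
    }

    // Request the scRGB color space (FP16, linear, Rec.709 primaries) - roughly
    // the swap-chain-side equivalent of the Freesync2_scRGB display mode.
    const DXGI_COLOR_SPACE_TYPE scRGB = DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709;
    UINT support = 0;
    if (SUCCEEDED(swapChain->CheckColorSpaceSupport(scRGB, &support)) &&
        (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
    {
        swapChain->SetColorSpace1(scRGB);
        // The application then does the tone-mapping into scRGB itself, as noted above.
    }
}
```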
     
    DavidGraham likes this.
  16. sir doris

    Regular

    Joined:
    May 9, 2002
    Messages:
    648
    Likes Received:
    108
    What do you think the chances are of Nvidia releasing compatibility results for all the monitors they test? It would be very nice to see how monitors fail to meet G-Sync standards, and by how much, when looking to buy a new one.
     
  17. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,104
    Likes Received:
    3,404
    gsync should die forever. The industry basically forced them to support an open standard because of the next gen of televisions.
     
    ToTTenTranz and Sxotty like this.
  18. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,575
    Likes Received:
    2,130
    The 30Hz minimum for NVIDIA was only nominal; in effect it was 1Hz. When the refresh rate falls below 30Hz, NVIDIA repeats frames to keep the panel above 30Hz, and this process works even if the frame rate falls to 1Hz.

    As for FreeSync, it doesn't determine the range of variable refresh rates; AdaptiveSync does. Also, AMD is using the maximum and minimum refresh range published by the VESA standards body, not an actual implementation or even a planned implementation from any monitor or display, in contrast with GSync's 1Hz-144Hz range, which is actually seen in shipping displays.

    AMD had to implement LFC to be on equal footing with NVIDIA on this one; still, their range is much worse than NVIDIA's in most displays even with LFC, so where a GSync monitor will repeat frames under 30Hz, most FreeSync and FreeSync 2 monitors will do it under 40Hz or more.
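    For illustration, here's a toy sketch of how that frame-repetition (LFC-style) scheme works in principle - the function name and numbers are mine, not NVIDIA's or AMD's actual implementation: if the game's frame rate drops below the panel's minimum, each frame is scanned out an integer number of times so the effective refresh rate stays inside the panel's VRR window.

```cpp
// Toy low-framerate-compensation multiplier, for illustration only.
#include <cmath>
#include <cstdio>

// Returns how many times a frame must be repeated for a given game frame rate.
int RepeatCount(double gameFps, double panelMinHz, double panelMaxHz)
{
    if (gameFps >= panelMinHz)
        return 1;                                   // in range, no repetition needed
    int repeats = static_cast<int>(std::ceil(panelMinHz / gameFps));
    // Don't let the multiplied rate exceed the panel's maximum.
    while (repeats > 1 && gameFps * repeats > panelMaxHz)
        --repeats;
    return repeats;
}

int main()
{
    // A 30-144 Hz panel driving a 20 fps game: each frame shown twice -> 40 Hz.
    double fps = 20.0, minHz = 30.0, maxHz = 144.0;
    int n = RepeatCount(fps, minHz, maxHz);
    std::printf("%.0f fps -> repeat x%d -> %.0f Hz effective\n", fps, n, fps * n);
    return 0;
}
```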
    Yes, those features require a complicated display controller.
    I highly doubt that. NVIDIA had to scale their GSync HDR module to insane complexity in order to support those features, something they wouldn't have done if it weren't absolutely needed. Display manufacturers will face the same dilemma, and some of them will have to scale up their controllers accordingly.
     
    Dr Evil likes this.
  19. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    6,552
    Likes Received:
    396
    Relative to the power of even budget GPUs, it's not a lot of work. Relative to the power of GPUs someone who cares about GSync would buy, it's nothing.
     
  20. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    2,654
    Likes Received:
    2,125
    If they went to the trouble of setting a standard that accepts variability from 30Hz all the way up to 100Hz+, why didn't they go the tiny extra step of making the lower bound 24Hz so it could neatly encompass film framerates within the standard itself?
    <cheesy rant>
    Why do I sometimes feel like consumers care more about this shit than the actual damn engineers and manufacturers who make their living out of it?
    </cheesy rant>
     
    pharma, Kej, DavidGraham and 2 others like this.