Variable Refresh Rate topic *spawn*

Discussion in 'Architecture and Products' started by dskneo, Jun 5, 2021.

Thread Status:
Not open for further replies.
  1. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    816
    Likes Received:
    298
It's a big if, but it has happened before in the GPU space. I am referring to GSync vs. the VESA standard (FreeSync). Ultimately, whatever the majority of the industry adopts (based on consumer choices) ends up having support from all sides. We are very early in this "upsampling" war, and the sooner the industry converges on one type of solution, the better for us.
     
  2. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
It's the other way around actually: GSync absorbed FreeSync in the end. NVIDIA sorted through all the FreeSync monitors and put its badge on the decent ones while discarding all the trash. Monitor makers happily traded FreeSync badges for GSync Compatible badges. Why? Because GSync was associated with quality, while FreeSync was not.

Premium monitors (HDR1000/4K/120Hz/Strong Local Dimming/Variable Overdrive) remained GSync Ultimate exclusive, all Mini-LED monitors are still GSync exclusive as well, and so are OLED TVs.
     
    Kyyla, xpea, HLJ and 2 others like this.
  3. Rootax

    Veteran

    Joined:
    Jan 2, 2006
    Messages:
    2,400
    Likes Received:
    1,845
    Location:
    France

    I agree with that on PC.

On the TV side, it seems that "generic" VRR won.
     
    DavidGraham likes this.
  4. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,235
    Likes Received:
    4,259
    Location:
    Guess...
Indeed, I'm looking at my "FreeSync Premium Pro" monitor right now with its nice shiny G-SYNC logo proudly displayed.
     
    HLJ likes this.
  5. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
It's important to understand the underlying history behind these events, away from the ridiculous noise of "open FreeSync won over proprietary GSync".

GSync had worked as a driver-level solution since day one, but only in the laptop space. On desktops it needed a module because the usual run-of-the-mill display controllers were absolute trash; they couldn't sustain the specifications that NVIDIA deemed necessary for quality: Full Variable Refresh Rate (1Hz to 120Hz+), Variable Pixel Overdrive, and Ultra Low Motion Blur (ULMB).

And since laptop displays can't hope to keep pace with desktop monitors quality-wise, a driver-level solution was deemed enough. It also had the advantage of saving some display power (this is inherent to any driver-level variable refresh rate solution).

On the desktop side, the GSync module was a way to force monitor makers to up their standards, and as time went by their standards did go up. Several monitors shipped with quality display controllers that could already be considered equivalent to the GSync module, so the driver-level solution was eventually adapted to work with them, and we got GSync Compatible displays.

With the advent of HDR, however, the situation changed again. Controllers needed to become more advanced to handle HDR colors plus variable refresh rate, so a new module was necessary: the GSync Ultimate module, to handle HDR1000+, Full Variable Refresh Rate Range (1Hz to 120Hz), Full Local Dimming, and Variable Pixel Overdrive. The displays that can handle all of that are all GSync exclusive, because they rely solely on the module to do it. You can be sure that once regular display controllers are up to the task, the need for a separate GSync Ultimate module will disappear.

An NVIDIA GPU remains the only solution that can handle any PC monitor: it works on all FreeSync monitors, the regular GSync monitors, the advanced GSync Ultimate monitors, and OLED TVs too. No other GPU can do that. The GSync badge is blasted over the decent monitors. Hence GSync truly absorbed FreeSync.
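[Editor's note: the "full range down to 1Hz" described above is commonly achieved with low framerate compensation (LFC): when the game's frame rate drops below the panel's physical minimum, the driver repeats each frame so the panel's actual refresh stays in range. A minimal sketch of that idea follows; the function name and logic are illustrative, not NVIDIA's or AMD's actual implementation.]

```python
def refresh_for_frame(fps, panel_min, panel_max):
    """Sketch of low framerate compensation (LFC).

    Given a frame rate (fps > 0) and the panel's physical VRR range,
    return (multiplier, refresh_hz): how many times to repeat each frame
    so the panel refresh lands inside [panel_min, panel_max].
    """
    if fps >= panel_min:
        # Frame rate is inside the panel's native range: refresh 1:1,
        # capped at the panel maximum.
        return 1, min(fps, panel_max)
    # Below the panel minimum: repeat each frame n times so that
    # n * fps reaches the supported range. Note this only stays within
    # panel_max when panel_max >= 2 * panel_min.
    n = 2
    while n * fps < panel_min:
        n += 1
    return n, n * fps
```

For a hypothetical 48–120Hz panel, a game running at 20fps would have each frame shown three times (panel refreshing at 60Hz), and even 1fps maps to a valid 48Hz refresh, giving the "effective" 1Hz-to-max range.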
     
    #5 DavidGraham, Jun 5, 2021
    Last edited: Jun 5, 2021
    pharma and OlegSH like this.
  6. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    816
    Likes Received:
    298
OLED TVs are HDMI 2.1 VRR, not GSync.

GSync also didn't absorb FreeSync. Saying it like that is like absolving a criminal. The tech is FreeSync, we all know it. And the example was given as a precedent of free standards winning over closed boxes.

What NVIDIA did was try to kill the competition's brand recognition by stealing brands and associating them with GeForce only. It's GPP. It's deeply anti-consumer, and people are getting desensitized towards it. The fact that GeForce is using these practices and managed to position itself as the only GPU that runs on every monitor should not be celebrated.

Anyway, it's outside the scope.
     
  7. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,394
Both GSync and FreeSync were "absolved" in a similar fashion by the respective VESA and HDMI hardware-level protocols. GSync switched from its own signaling to VESA Adaptive-Sync on DP. FreeSync switched from its own signaling to HDMI VRR on HDMI. Neither of them was "open" in any way or form.
     
    Picao84, HLJ, pharma and 1 other person like this.
  8. troyan

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    603
    Likes Received:
    1,122
I replayed Anthem and I forgot that NVIDIA and BioWare increased the sharpness level of DLSS 1.0 to the moon, which is the opposite of Battlefield 5. And DLSS 1.0 still looks good.

    Actually it was AMD who branded an open standard with Freesync.
     
    pharma, Picao84 and HLJ like this.
  9. HLJ

    HLJ
    Regular

    Joined:
    Aug 26, 2020
    Messages:
    529
    Likes Received:
    869
NVIDIA has been playing AMD well.
NVIDIA has reached the point where they can leverage their software "eco-system" and their hardware to full effect against AMD.

Tessellation... made AMD go from "the beast called Tessellation" to "BaaaWaaah... too much tessellation!!!"
GSync... made AMD scramble and put their label on a VESA standard.
DXR... AMD is still playing catch-up and hoping games only use "DXR light".
DLSS... caught AMD (and me) totally off guard... their response is really no competitor to DLSS 2.1.
     
    PSman1700 and Rootax like this.
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
You forgot the part where AMD created that VESA standard.
     
    Silent_Buddha likes this.
  11. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
I said absorb, not absolve; absorb means to swallow inside.
Only GSync supports them for now.
There is no situation where FreeSync came out on top of GSync, nil, zero. It's the opposite actually: FreeSync was so massively backed into a corner that AMD was forced to rename the brand thrice, and still that wasn't enough. The only brand that got recognition here is GSync. And they still hold the quality and exclusivity card unscathed.
     
    PSman1700 and xpea like this.
  12. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
I wouldn't call throwing the G-Sync Ultimate requirements down the toilet "unscathed", regardless of what caused it.
     
    Wesker likes this.
  13. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
    That NEVER happened.
     
  14. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
Sure it didn't. NVIDIA just listed DisplayHDR 1000 as a requirement for funsies instead of actually requiring it, and DisplayHDR 600 "G-Sync Ultimate" displays starting to appear after mentions of DisplayHDR 1000 disappeared from the "Lifelike HDR" description were just a coincidence :rolleyes:
     
    dskneo and Wesker like this.
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
Those displays still retain the GSync Ultimate module, still retain the same features (Full Variable Refresh Rate Range, Local Dimming, Variable Pixel Overdrive), their HDR rating is DisplayHDR 600 and above, and they are still exclusive to NVIDIA GPUs, so I don't really understand where "throwing the requirements down the toilet" is coming from?!
     
    PSman1700 likes this.
  16. WhiningKhan

    Regular

    Joined:
    Sep 4, 2002
    Messages:
    537
    Likes Received:
    522
    ?
I'm looking right now at my LG OLED CX 55 with "AMD FreeSync Premium" shown in the settings menu.
     
  17. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
I think the optimized game settings are to use the actual industry-standard VRR and not the vendor-specific functions. These are the console-specific settings, but I'd think they would carry over to PC. Right?

    PS5 Settings


    Series X Settings


    Mitigate VRR Gamma Issues
     
  18. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend

    Joined:
    Oct 14, 2008
    Messages:
    10,466
    Likes Received:
    3,186
BTW, it turns out that for some displays the VRR range can be widened by editing it with CRU (Custom Resolution Utility).
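[Editor's note: widening the reported range matters because frame-repeating below the panel minimum (low framerate compensation) only works when the maximum refresh is at least twice the minimum. A small illustrative check; CRU itself just edits the display's reported range, and this function is hypothetical, not part of any real tool.]

```python
def lfc_capable(vrr_min_hz, vrr_max_hz):
    """True if frame rates below vrr_min_hz can always be mapped into the
    range by repeating frames: the worst case, just under the minimum,
    needs doubling, so the range must satisfy max >= 2 * min."""
    return vrr_max_hz >= 2 * vrr_min_hz
```

So a 48–60Hz range gains nothing from frame repetition, while lowering the minimum to 40Hz on a 120Hz panel (or raising the maximum) keeps the whole sub-minimum region covered.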
     
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
I stand corrected: support for FreeSync was added via a firmware update in late 2020, and it works alongside GSync Compatible now.
     
    PSman1700 likes this.
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
I was just pointing out that they had to lower their requirements to get any traction for G-Sync Ultimate. That's not unscathed.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.