Variable Refresh Rate topic *spawn*


dskneo

Regular
That's a biiig condition. In a lot of spaces, proprietary is still winning because of this; competitors can't come close with open tech...

It's a big if, but it has happened before in the GPU space. I am referring to G-Sync vs. the VESA standard (FreeSync). Ultimately, what the majority of the industry adopts (based on consumer choices) ends up having support from all sides. We are very early in this "upsampling" war, and the sooner the industry converges on one type of solution, the better for us.
 
It's a big if, but it has happened before in the GPU space. I am referring to G-Sync vs. the VESA standard (FreeSync).
It's the other way around actually: GSync absorbed FreeSync in the end. NVIDIA sorted through all the FreeSync monitors and put their badge on the decent ones while discarding all the trash. Monitor makers happily traded FreeSync badges for GSync Compatible badges. Why? Because GSync was associated with quality, while FreeSync was not.

Premium monitors (HDR1000/4K/120Hz/Strong Local Dimming/Variable Overdrive) remained GSync Ultimate exclusive, all Mini-LED monitors are still GSync exclusive as well, and so are OLED TVs.
 
It's the other way around actually: GSync absorbed FreeSync in the end. NVIDIA sorted through all the FreeSync monitors and put their badge on the decent ones while discarding all the trash. Monitor makers happily traded FreeSync badges for GSync Compatible badges. Why? Because GSync was associated with quality, while FreeSync was not.

Premium monitors (HDR1000/4K/120Hz/Strong Local Dimming) remained GSync Ultimate exclusive, all Mini-LED monitors are still GSync exclusive as well, and so are OLEDs.


I agree with that on PC.

On the TV side, it seems that "generic" VRR won.
 
It's the other way around actually: GSync absorbed FreeSync in the end. NVIDIA sorted through all the FreeSync monitors and put their badge on the decent ones while discarding all the trash. Monitor makers happily traded FreeSync badges for GSync Compatible badges. Why? Because GSync was associated with quality, while FreeSync was not.

Premium monitors (HDR1000/4K/120Hz/Strong Local Dimming) remained GSync Ultimate exclusive, all Mini-LED monitors are still GSync exclusive as well, and so are OLEDs.

Indeed, I'm looking at my "FreeSync Premium Pro" monitor right now with its nice shiny G-SYNC logo proudly displayed.
 
Indeed, I'm looking at my "FreeSync Premium Pro" monitor right now with its nice shiny G-SYNC logo proudly displayed.
It's important to understand the underlying history behind the events that transpired, away from the ridiculous noise about open FreeSync winning over proprietary GSync.

GSync worked as a driver-level solution since day one, but only in the laptop space. On desktops it needed a module, because the usual run-of-the-mill display controllers were absolute trash; they couldn't sustain the specifications that NVIDIA deemed necessary for quality: Full Variable Refresh Rate (1Hz to 120Hz+), Variable Pixel Overdrive and Ultra Low Motion Blur (ULMB).
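
Just to make the "1Hz" part concrete: no panel physically scans out at 1 Hz, so the module (or a driver-level solution) repeats frames to keep the panel inside its real range while the game's frame rate can drop arbitrarily low. A rough sketch of that frame-multiplication idea, with made-up panel numbers (not NVIDIA's actual logic, just the general technique):

```python
def effective_refresh(game_fps: float, panel_min_hz: float, panel_max_hz: float) -> float:
    """Pick a panel refresh rate for the current frame rate.

    If the game runs slower than the panel's physical minimum, each frame
    is scanned out multiple times so the panel stays inside its real range
    while the *effective* variable range extends down toward ~1 fps.
    """
    if game_fps >= panel_max_hz:
        return panel_max_hz          # cap at the panel maximum
    if game_fps >= panel_min_hz:
        return game_fps              # panel can track the game directly
    # find the smallest whole-number repeat count that lands inside the panel range
    repeats = 1
    while game_fps * repeats < panel_min_hz:
        repeats += 1
    return min(game_fps * repeats, panel_max_hz)

# hypothetical 48-120 Hz panel: 30 fps -> 60 Hz (x2), 10 fps -> 50 Hz (x5), 1 fps -> 48 Hz (x48)
for fps in (144, 90, 30, 10, 1):
    print(fps, "->", effective_refresh(fps, 48, 120))
```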

And since laptop monitors can't hope to keep pace with desktops quality-wise, a driver-level solution was deemed enough there; it also had the advantage of saving some display power (this is inherent to any driver-level variable refresh rate solution).

On the desktop side, the G-Sync module was a way to force monitor makers to up their standards, and as time went by their standards indeed went up. Several monitors shipped with quality display controllers that could already be considered equivalent to the GSync module, so the driver-level solution was eventually adapted to work with them, and we got GSync Compatible displays.

With the advent of HDR, however, the situation changed again: controllers needed to become more advanced to handle HDR colors plus variable refresh rate, so a new module was necessary, the GSync Ultimate module, to handle HDR1000+, Full Variable Refresh Rate Range (1Hz to 120Hz), Full Local Dimming and Variable Pixel Overdrive. The displays that can handle all of that are all GSync exclusive, because they rely solely on the module to do it. You can be sure that once regular display controllers are up to the task, the need for a separate GSync Ultimate module will disappear.
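
For anyone wondering why Variable Pixel Overdrive keeps coming up as a module feature: overdrive is normally tuned for a single fixed refresh rate, so under VRR the overdrive strength has to be re-picked every frame based on the current refresh interval. A purely conceptual sketch of that per-frame lookup (the table values are invented, not any vendor's real tuning):

```python
# Hypothetical overdrive strengths tuned per refresh rate (higher Hz -> shorter
# pixel transition window -> stronger overdrive needed). Purely made-up values.
OVERDRIVE_TABLE = [  # (refresh_hz, overdrive_strength 0..100)
    (48, 20),
    (60, 35),
    (90, 55),
    (120, 75),
]

def overdrive_for(current_hz: float) -> float:
    """Linearly interpolate an overdrive strength for the instantaneous refresh rate.

    A fixed-overdrive controller keeps one value, which overshoots or ghosts
    whenever VRR moves the refresh rate away from the rate it was tuned for;
    variable overdrive re-evaluates this every frame.
    """
    if current_hz <= OVERDRIVE_TABLE[0][0]:
        return OVERDRIVE_TABLE[0][1]
    if current_hz >= OVERDRIVE_TABLE[-1][0]:
        return OVERDRIVE_TABLE[-1][1]
    for (lo_hz, lo_od), (hi_hz, hi_od) in zip(OVERDRIVE_TABLE, OVERDRIVE_TABLE[1:]):
        if lo_hz <= current_hz <= hi_hz:
            t = (current_hz - lo_hz) / (hi_hz - lo_hz)
            return lo_od + t * (hi_od - lo_od)

print(overdrive_for(75))   # somewhere between the 60 Hz and 90 Hz tunings
```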

An NVIDIA GPU remains the only solution that handles any PC monitor: it works on all FreeSync monitors, the regular GSync monitors, the advanced GSync Ultimate monitors, and OLED TVs too. No other GPU can do that. The GSync badge is blasted over the decent monitors. Hence why GSync truly absorbed FreeSync.
 
It's the other way around actually: GSync absorbed FreeSync in the end. NVIDIA sorted through all the FreeSync monitors and put their badge on the decent ones while discarding all the trash. Monitor makers happily traded FreeSync badges for GSync Compatible badges. Why? Because GSync was associated with quality, while FreeSync was not.

Premium monitors (HDR1000/4K/120Hz/Strong Local Dimming/Variable Overdrive) remained GSync Ultimate exclusive, all Mini-LED monitors are still GSync exclusive as well, and so are OLED TVs.

OLED TVs are HDMI 2.1 VRR, not G-Sync.

G-Sync also didn't absorb FreeSync. Saying it like that is like absolving a criminal. The tech is FreeSync, we all know it. And the example was given as a precedent of free standards winning over closed boxes.

What NVIDIA did was try to kill the brand recognition of the competition by stealing brands and associating them with GeForce only. It's GPP. It's deeply anti-consumer, and people are getting desensitized towards it. The fact that GeForce is using these practices and managed to position itself as the only GPU that runs on every monitor should not be celebrated.

Anyway, it's outside of the scope.
 
Both G-Sync and FreeSync were "absolved" in a similar fashion by the respective VESA and HDMI h/w level protocols. G-Sync switched from its own signaling to VESA Adaptive-Sync on DP. FreeSync switched from its own signaling to HDMI VRR on HDMI. Neither of them was "open" in any way or form.
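
For what it's worth, the monitor-side part of that signaling is fairly mundane: on DP, an adaptive-sync display typically just advertises its supported refresh range in the EDID's Display Range Limits descriptor, and the driver works inside that window. A quick sketch of pulling that range out of a raw base EDID block (simplified; it ignores the EDID 1.4 offset bits that can push rates above 255 Hz):

```python
def vrr_range_from_edid(edid: bytes) -> "tuple[int, int] | None":
    """Return (min_hz, max_hz) from the Display Range Limits descriptor, if present.

    The 128-byte base EDID block carries four 18-byte descriptors at offsets
    54/72/90/108. A display descriptor starts with 0x00 0x00 0x00 and a tag
    byte; tag 0xFD is "Display Range Limits", whose first two payload bytes
    are the minimum and maximum vertical refresh rate in Hz.
    """
    if len(edid) < 128:
        return None
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return (d[5], d[6])
    return None

# usage with a raw EDID dump, e.g. from /sys/class/drm/card0-DP-1/edid on Linux:
# with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
#     print(vrr_range_from_edid(f.read(128)))
```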
 
I replayed Anthem and I forgot that NVIDIA and BioWare increased the sharpness level of DLSS 1.0 to the moon, which is the opposite of Battlefield 5. And DLSS 1.0 still looks good.

What NVIDIA did was try to kill the brand recognition of the competition by stealing brands and associating them with GeForce only. It's GPP. It's deeply anti-consumer, and people are getting desensitized towards it. The fact that GeForce is using these practices and managed to position itself as the only GPU that runs on every monitor should not be celebrated.

Anyway, it's outside of the scope.

Actually, it was AMD who branded an open standard as FreeSync.
 

Actually, it was AMD who branded an open standard as FreeSync.

NVIDIA has been playing AMD well.
NVIDIA has reached the point where they can use their software "eco-system" to leverage their hardware to full effect against AMD.

Tessellation...made AMD go from "the beast called Tessellation" to "BaaaWaaah...too much tessellation!!!"
GSync...made AMD scramble and put their label on a VESA standard.
DXR...AMD is still playing catch-up and hoping games only use "DXR light".
DLSS...caught AMD (and me) totally off guard...their response is really no competitor to DLSS 2.1.
 
G-Sync also didn't absorb FreeSync. Saying it like that is like absolving a criminal.
I said absorb, not absolve; absorb means to swallow inside.
OLED TVs are HDMI 2.1 VRR, not G-Sync.
Only GSync supports them for now.
And the example was given as a precedent of free standards winning over closed boxes.
There is no situation where FreeSync came out on top of GSync. Nil, zero. It's the opposite actually: FreeSync was so massively backed into a corner that it forced AMD to rename the brand thrice, and still that wasn't enough. The only brand that got the recognition here is GSync. And they still hold the quality and exclusivity card unscathed.
 
That NEVER happened.
Sure it didn't. NVIDIA just listed DisplayHDR 1000 as a requirement for funsies instead of actually requiring it, and the DisplayHDR 600 "G-Sync Ultimate" displays that started to appear after the DisplayHDR 1000 mention was replaced by "Lifelike HDR" were just a coincidence :rolleyes:
 
Sure it didn't. NVIDIA just listed DisplayHDR 1000 as a requirement for funsies instead of actually requiring it, and the DisplayHDR 600 "G-Sync Ultimate" displays that started to appear after the DisplayHDR 1000 mention was replaced by "Lifelike HDR" were just a coincidence :rolleyes:
Those displays still retain the GSync Ultimate module, still retain the same features (Full Variable Refresh Rate Range, Local Dimming, Variable Pixel Overdrive), their HDR quality is 600 and above, and they are still exclusive to NVIDIA GPUs, so I don't really understand where throwing the requirements down the toilet is coming from?!
 
?
I'm right now looking at my LG OLED CX 55 with "AMD FreeSync Premium" shown in the settings menu.

I think the optimized game setting is to use the actual industry-standard VRR and not the vendor-specific functions. These are the console-specific settings, but I'd think they would carry over to PC. Right?

PS5 Settings

Series X Settings

Mitigate VRR Gamma Issues
 
Those displays still retain the GSync Ultimate module, still retain the same features (Full Variable Refresh Rate Range, Local Dimming, Variable Pixel Overdrive), their HDR quality is 600 and above, and they are still exclusive to NVIDIA GPUs, so I don't really understand where throwing the requirements down the toilet is coming from?!
I was just pointing out that they had to lower their requirements to get any traction for G-Sync Ultimate. That's not unscathed.
 