Variable Refresh Rate topic *spawn*

Status
Not open for further replies.
You forgot the part where AMD created that VESA standard
That VESA standard was created by VESA for eDP applications. What AMD did was push VESA (which is a standardization organization consisting of a couple of dozen companies, Nvidia included) to make this same standard an optional part of the DP 1.2a specification. Hardly qualifies as "created the standard", don't you think?

Only GSync supports them for now.
Not really, "Freesync" is supported on X/1 series OLEDs and Samsung TVs. I have no idea, though, whether that's actually Freesync (as in the same HDMI Freesync which AMD made years ago) or just a Freesync s/w layer on top of HDMI VRR (same as G-Sync over HDMI).

It was just pointing out that they had to lower their requirements to get any traction for G-Sync Ultimate. That's not unscathed.
"Traction" here meaning that the PC monitor industry hasn't been able to figure out how to make a proper PC HDR monitor for about five years now? What does that have to do with Nvidia? Do they make monitors too now?
 
That VESA standard was created by VESA for eDP applications. What AMD did was push VESA (which is a standardization organization consisting of a couple of dozen companies, Nvidia included) to make this same standard an optional part of the DP 1.2a specification. Hardly qualifies as "created the standard", don't you think?
At least to my understanding, Adaptive-Sync, while using the same technologies, is not the same implementation of them as in eDP. AMD created another implementation, tested it, and submitted it to VESA.
"Traction" here meaning that the PC monitor industry hasn't been able to figure out how to make a proper PC HDR monitor for about five years now? What does that have to do with Nvidia? Do they make monitors too now?
There have been several displays that spec-wise should fit the bill but chose not to implement the G-Sync module. Probably too expensive with all the added costs. DisplayHDR 600, requiring much less, makes it more feasible to add the G-Sync module plus its cooling costs.
 
There have been several displays that spec-wise should fit the bill but chose not to implement the G-Sync module. Probably too expensive with all the added costs.
There are zero displays which use FALD but don't use G-Sync h/w. The problem isn't the cost of the module, the problem is the cost of FALD, which makes these monitors way too expensive. The relaxation of requirements allowed the display manufacturers to put the same h/w into cheaper models - basically it was done because they asked Nv to allow this, since having G-Sync h/w inside usually means higher sales.
 
None of them were "open" in any way or form.

False.

https://en.wikipedia.org/wiki/FreeSync
The original FreeSync is based over DisplayPort 1.2a, using an optional feature VESA terms Adaptive-Sync.[9] This feature was in turn ported by AMD from a Panel-Self-Refresh (PSR) feature from Embedded DisplayPort 1.0,[10] which allows panels to control its own refreshing intended for power-saving on laptops.[11] AMD FreeSync is therefore a hardware–software solution that uses publicly-available protocols to enable smooth, tearing-free and low-latency gameplay.

FreeSync has also been implemented over HDMI 1.2+ as a protocol extension. HDMI 2.1+ has its own variable refresh rate system.[12]

Until HDMI 2.1 launched, there was G-sync and there was Freesync. One is a closed box, the other one is using an open standard.

HDMI 2.1 converged on a single format for VRR.

Monitors with DP converged on Freesync, which Nvidia re-branded as "G-Sync Compatible". But there are still DP monitors and older TVs that are G-Sync exclusive.

Monitors with HDMI <2.1 were always freesync.
 
They have loads of issues with FALD lags
They're specifically designed like that.
Complex algos I know, pain in the butt to handle.
It's expensive to make a proper FALD driving h/w
Dawg, a fucking tablet (a genuine sub-7mm-thick slate) now drives a 2k-zone FALD setup
(but handles it the same way FALD monitors do, i.e. haloing city).
what Gsync Ultimate h/w module is.
A fucking Altera FPGA, because man, the volumes are literal misery for high-end gaming monitors.
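The "complex algos" and haloing mentioned above can be illustrated with a toy model (the function name and zone sizes here are my own invention, purely a sketch): if each backlight zone is simply driven at the peak luminance of the pixels it covers, one small bright highlight lights the entire zone - which is exactly the halo around subtitles and cursors people complain about. Real FALD controllers are far more elaborate, but they are still fighting this geometry.

```python
def zone_backlight(frame, zone_w, zone_h):
    """Toy FALD driver: drive each backlight zone at the peak
    luminance (0.0-1.0) of the pixels it covers."""
    rows, cols = len(frame), len(frame[0])
    zones = []
    for zy in range(0, rows, zone_h):
        zone_row = []
        for zx in range(0, cols, zone_w):
            # Peak over all pixels belonging to this zone.
            peak = max(
                frame[y][x]
                for y in range(zy, min(zy + zone_h, rows))
                for x in range(zx, min(zx + zone_w, cols))
            )
            zone_row.append(peak)
        zones.append(zone_row)
    return zones

# One bright pixel on an otherwise black 8x8 frame, with 4x4 zones:
# the single highlight lifts its entire zone to full brightness
# while the other three zones stay dark - the backlight "halo".
frame = [[0.0] * 8 for _ in range(8)]
frame[1][1] = 1.0  # one bright subtitle-style pixel
zones = zone_backlight(frame, 4, 4)
print(zones)  # -> [[1.0, 0.0], [0.0, 0.0]]
```

Smaller zones shrink the halo but multiply the driving hardware cost, which is the trade-off the whole exchange above is arguing about.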
 
I said absorb, not absolve; absorb means to swallow inside.

I meant exactly what I wrote with the words I chose.

The rest of your post is personal opinion and only reflects the normal practices of companies trying to compete with each other's products (as opposed to mafia tactics such as GPP).
 
You should go and see what you're talking about before claiming that a tablet is capable of driving a FALD at 144Hz with VRR.

FYI, the new iPad Pro drives a very high-res 120Hz 2k-zone miniLED panel just fine.
It even glows like shit on text and clips blacks here and there much the same way uber-expensive joke monitors do...
 
What you've quoted proves what I've said.

Your pre-edited post said " Wikipedia is wrong" (!!!). Now you edited to say "what you quoted proves what I've said".

Ignoring the "elasticity" demonstrated to win arguments at any cost, you said that Freesync is far from an open standard. Wikipedia explicitly states that Freesync uses, and I quote:

AMD FreeSync is therefore a hardware–software solution that uses publicly-available protocols
 
False.

https://en.wikipedia.org/wiki/FreeSync


Until HDMI 2.1 launched, there was G-sync and there was Freesync. One is a closed box, the other one is using an open standard.

HDMI 2.1 converged on a single format for VRR.

Monitors with DP converged on Freesync, which Nvidia re-branded as "G-Sync Compatible". But there are still DP monitors and older TVs that are G-Sync exclusive.

Monitors with HDMI <2.1 were always freesync.

No, there is HDMI VRR and there is Freesync. One is an open standard, the other is proprietary. Nvidia and Microsoft support HDMI VRR with HDMI 2.0.
 
Your pre-edited post said " Wikipedia is wrong" (!!!). Now you edited to say "what you quoted proves what I've said".
Yeah because what you've quoted is actually right and proves what I've said.

Ignoring the "elasticity" demonstrated to win arguments at any cost, you said that Freesync is far from an open standard. Wikipedia explicitly states that Freesync uses, and I quote:
And this is where it's wrong, because it doesn't in fact use "publicly-available protocols" in the case of HDMI.

FYI, the new iPad Pro drives a very very highres 120Hz 2k zone miniLED panel just fine.
Does it have VRR?
 
No, there is HDMI VRR and there is Freesync. One is an open standard, the other is proprietary. Nvidia and Microsoft support HDMI VRR with HDMI 2.0.

I can't find anything supporting the concept of freesync being proprietary.

Freesync is simply AMD's trademarked implementation of VESA Adaptive-Sync, which is an open standard.
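For what it's worth, the refresh range an Adaptive-Sync/Freesync display advertises travels in plain EDID data, in the standard Display Range Limits descriptor (tag 0xFD). A minimal sketch of reading it (this ignores the EDID 1.4 offset flags that can extend the rates past 255 Hz, and the fake EDID block at the bottom is purely illustrative, not a valid full EDID):

```python
def freesync_range_from_edid(edid: bytes):
    """Scan the four 18-byte display descriptors in a base EDID block
    for the Display Range Limits descriptor (tag 0xFD) and return the
    (min_hz, max_hz) vertical refresh range it advertises."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start with three zero bytes, then the tag.
        if d[0] == 0 and d[1] == 0 and d[2] == 0 and d[3] == 0xFD:
            return d[5], d[6]  # min / max vertical field rate, Hz
    return None

# Fake 128-byte EDID block carrying a 48-144 Hz range descriptor
# in the first descriptor slot (offset 54) - illustration only.
edid = bytearray(128)
edid[54:54 + 6] = bytes([0x00, 0x00, 0x00, 0xFD, 0x00, 48])
edid[60] = 144
print(freesync_range_from_edid(bytes(edid)))  # -> (48, 144)
```

The point being: the range itself is ordinary, publicly documented EDID plumbing; the arguments above are about the s/w layers built on top of it.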
 
Freesync is simply AMD's trademarked implementation of VESA Adaptive-Sync, which is an open standard.
There is no VESA Adaptive-Sync on HDMI.

Also, the fact that something uses "publicly available protocols" doesn't mean that it's not proprietary. Can you point me to where I can freely download the Freesync s/w layer to freely use it in my project?
 
There is no VESA Adaptive-Sync on HDMI.

Also, the fact that something uses "publicly available protocols" doesn't mean that it's not proprietary.

You cannot change the definition of proprietary to suit your case. It's a trademark. AMD cannot claim propriety of open standards. Wikipedia states it uses open standards. But that excuse is indeed better than calling Wikipedia "wrong".

The HDMI Freesync version does not invalidate any of this discussion; as long as there exists a Freesync that uses VESA standards, you cannot claim it is proprietary. It's a trademark that covers both the free implementations and the proprietary one in older HDMI products.
 
Despite the protests, G-Sync will never become an industry standard. Manufacturers using the G-Sync branding on their own displays that already feature an alternative VRR implementation should not be taken as a sign of victory, when it is ZERO ADDED work for the other IHVs to support them. On the other hand, implementing two different variable refresh rate technologies is virtually an admission of defeat, since it adds redundancy to the overall SW/HW stack and it is harder to maintain two independent standards ...

In the end, no other IHVs will ever implement G-Sync, and most manufacturers don't care to participate either, hence the slowdown of releases with this technology. G-Sync will inevitably be phased out ...
 
I can't find anything supporting the concept of freesync being proprietary.

Freesync is simply AMD's trademarked implementation of VESA Adaptive-Sync, which is an open standard.

HDMI Freesync is proprietary. It is not compatible with HDMI VRR.
 