WhiningKhan
Regular
> I stand corrected, support for FreeSync was added via a firmware update in late 2020; it works alongside G-Sync Compatibility now.
And so much for "quality and exclusivity" as well on that front.
And salty fanboys show this more than ever. /shrugs
> You forgot the part where AMD created that VESA standard

That VESA standard was created by VESA for eDP applications. What AMD did was push VESA (a standardization organization made up of a couple of dozen companies, Nvidia included) to make that same standard an optional part of the DP 1.2a specification. That hardly qualifies as "created the standard", don't you think?
Not really, "Freesync" is supported on X/1 series OLEDs and Samsung TVs. I have no idea though if that's actually Freesync (as in the same HDMI Freesync which was made by AMD years ago) or just Freesync s/w layer on top of HDMI VRR (same as Gsync over HDMI).Only GSync supports them for now.
"Traction" here meaning that the PC monitor industry can't figure out how to make a proper PC HDR monitor for about five years now? What's that have to do with Nvidia? Do they make monitors too now?It was just pointing out that they had to lower their requirements to get any traction for G-Sync Ultimate. That's not unscathed.
> That VESA standard was created by VESA for eDP applications. What AMD did was push VESA (a standardization organization made up of a couple of dozen companies, Nvidia included) to make that same standard an optional part of the DP 1.2a specification. That hardly qualifies as "created the standard", don't you think?

At least to my understanding, Adaptive-Sync, while it uses the same technologies, is not the same implementation of those technologies as in eDP. AMD created another implementation of them, tested it and submitted it to VESA.
> "Traction" here meaning that the PC monitor industry hasn't been able to figure out how to make a proper PC HDR monitor for about five years now? What does that have to do with Nvidia? Do they make monitors too now?

There have been several displays that spec-wise should fit the bill but chose not to implement the G-Sync module. Probably too expensive with all the added costs. DisplayHDR 600 requiring much less makes it more feasible to add the G-Sync module plus its cooling costs.
> There have been several displays that spec-wise should fit the bill but chose not to implement the G-Sync module. Probably too expensive with all the added costs.

There are zero displays which use FALD but don't use G-Sync h/w. The problem isn't the cost of the module; the problem is the cost of FALD, which makes these monitors way too expensive. The relaxation of the requirements allowed display manufacturers to put the same h/w into cheaper models - basically it was done because they asked Nvidia to allow it, since having G-Sync h/w inside usually means higher sales.
> the problem is the cost of FALD

Wat.
None of them were "open" in any way or form.
> The original FreeSync is based over DisplayPort 1.2a, using an optional feature VESA terms Adaptive-Sync.[9] This feature was in turn ported by AMD from a Panel-Self-Refresh (PSR) feature from Embedded DisplayPort 1.0,[10] which allows panels to control their own refreshing, intended for power-saving on laptops.[11] AMD FreeSync is therefore a hardware–software solution that uses publicly-available protocols to enable smooth, tearing-free and low-latency gameplay.
> FreeSync has also been implemented over HDMI 1.2+ as a protocol extension. HDMI 2.1+ has its own variable refresh rate system.[12]
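For what it's worth, here's roughly how the "publicly-available protocols" part surfaces on the software side. This is only a minimal sketch, assuming a Linux box with libdrm; the /dev/dri/card0 path and the vrr_check.c file name are just examples, and whether the property shows up at all depends on the driver and the attached display. The kernel exposes a generic "vrr_capable" property on a connector when the panel advertises a variable-refresh range (DP Adaptive-Sync or HDMI VRR):

```c
/* Minimal sketch, assuming Linux + libdrm. Walks the KMS connectors of the
 * first GPU node and reports the generic "vrr_capable" property, which the
 * kernel sets when the attached display advertises a variable-refresh range.
 * Build: gcc vrr_check.c -o vrr_check $(pkg-config --cflags --libs libdrm)
 */
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);   /* first GPU node; adjust if yours differs */
    if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

    drmModeRes *res = drmModeGetResources(fd);
    if (!res) { fprintf(stderr, "no KMS resources on this node\n"); close(fd); return 1; }

    for (int i = 0; i < res->count_connectors; i++) {
        drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
        if (!conn)
            continue;

        drmModeObjectProperties *props =
            drmModeObjectGetProperties(fd, conn->connector_id, DRM_MODE_OBJECT_CONNECTOR);

        for (uint32_t j = 0; props && j < props->count_props; j++) {
            drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[j]);
            if (prop && strcmp(prop->name, "vrr_capable") == 0)
                printf("connector %u: vrr_capable = %llu\n",
                       conn->connector_id,
                       (unsigned long long)props->prop_values[j]);
            drmModeFreeProperty(prop);
        }
        drmModeFreeObjectProperties(props);
        drmModeFreeConnector(conn);
    }
    drmModeFreeResources(res);
    close(fd);
    return 0;
}
```

On the open kernel drivers, a display marketed as FreeSync and one marketed as G-Sync Compatible surface here the same way; the branding lives above this layer.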
> see FALD TVs

Have you seen them? They have loads of issues with FALD lag. It's expensive to make proper FALD-driving h/w, which is what the G-Sync Ultimate h/w module is.
> False.

What you've quoted proves what I've said.
> They have loads of issues with FALD lag

They're specifically designed like that.
> It's expensive to make proper FALD-driving h/w

Dawg, a fucking tablet (a genuine sub-7mm-thick slate) now drives a 2k-zone FALD setup.
> which is what the G-Sync Ultimate h/w module is.

A fucking Altera FPGA, because man, the volumes for high-end gaming monitors are literal misery.
I said absorb, not absolve; absorb means to swallow inside.
> Dawg, a fucking tablet now drives a 2k-zone FALD setup.

You should go and see what you're talking about before claiming that a tablet is capable of driving a FALD panel at 144 Hz with VRR.
> You should go and see what you're talking about before claiming that a tablet is capable of driving a FALD panel at 144 Hz with VRR.

(But it handles it the same way FALD monitors do, i.e. haloing city.)
> AMD FreeSync is therefore a hardware–software solution that uses publicly-available protocols

What you've quoted proves what I've said.
> False.
https://en.wikipedia.org/wiki/FreeSync
Until HDMI 2.1 launched, there was G-Sync and there was FreeSync. One is a closed box; the other uses an open standard.
HDMI 2.1 converged on a single format for VRR.
Monitors with DP converged on FreeSync, which Nvidia re-branded as "G-Sync Compatible". But there are still DP monitors and older TVs that are G-Sync exclusive.
Monitors with HDMI below 2.1 were always FreeSync.
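That convergence shows up at the OS level too. Below is a minimal sketch under the same assumptions as the earlier one (Linux + libdrm, the example /dev/dri/card0 node, a driver that exposes the generic KMS property, and nothing else holding DRM master); the vrr_enable.c name is likewise just an example. It looks for the "VRR_ENABLED" property on each CRTC and switches it on; the property name comes from the kernel, not from AMD or Nvidia marketing.

```c
/* Minimal sketch, same assumptions as above (Linux + libdrm, /dev/dri/card0,
 * a driver exposing the generic KMS property, DRM master not held elsewhere).
 * Finds the "VRR_ENABLED" property on each CRTC and sets it to 1.
 * Build: gcc vrr_enable.c -o vrr_enable $(pkg-config --cflags --libs libdrm)
 */
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Return the id of a named property on a DRM object, or 0 if it isn't exposed. */
static uint32_t find_prop(int fd, uint32_t obj_id, uint32_t obj_type, const char *name)
{
    uint32_t id = 0;
    drmModeObjectProperties *props = drmModeObjectGetProperties(fd, obj_id, obj_type);
    for (uint32_t i = 0; props && i < props->count_props; i++) {
        drmModePropertyRes *p = drmModeGetProperty(fd, props->props[i]);
        if (p && strcmp(p->name, name) == 0)
            id = p->prop_id;
        drmModeFreeProperty(p);
    }
    drmModeFreeObjectProperties(props);
    return id;
}

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

    drmModeRes *res = drmModeGetResources(fd);
    for (int i = 0; res && i < res->count_crtcs; i++) {
        uint32_t crtc = res->crtcs[i];
        uint32_t prop = find_prop(fd, crtc, DRM_MODE_OBJECT_CRTC, "VRR_ENABLED");
        if (!prop)
            continue;   /* driver or display without VRR support on this CRTC */

        if (drmModeObjectSetProperty(fd, crtc, DRM_MODE_OBJECT_CRTC, prop, 1) == 0)
            printf("CRTC %u: VRR_ENABLED set\n", crtc);
        else
            fprintf(stderr, "CRTC %u: could not set VRR_ENABLED (not DRM master?)\n", crtc);
    }
    drmModeFreeResources(res);
    close(fd);
    return 0;
}
```

In practice the compositor or driver control panel does this for you; the point of the sketch is only that the toggle being flipped is the same vendor-neutral property whether the monitor's box says FreeSync or G-Sync Compatible.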
> Your pre-edited post said "Wikipedia is wrong" (!!!). Now you've edited it to say "what you quoted proves what I've said".

Yeah, because what you've quoted is actually right and proves what I've said.
> Ignoring the "elasticity" demonstrated to win arguments at any cost: you said that FreeSync is far from an open standard. Wikipedia explicitly states that FreeSync uses, and I quote:

And this is where it's wrong, because it doesn't in fact use "publicly-available protocols" in the case of HDMI.
> FYI, the new iPad Pro drives a very, very high-res 120Hz 2k-zone miniLED panel just fine.

Does it have VRR?
No, there is HDMI VRR and there is FreeSync. One is an open standard, the other is proprietary. Nvidia and Microsoft support HDMI VRR on HDMI 2.0.
> FreeSync is simply AMD's trademarked implementation of VESA Adaptive-Sync, which is an open standard.

There is no VESA Adaptive-Sync on HDMI.
Also, the fact that something uses "publicly available protocols" doesn't mean that it's not proprietary.
I can't find anything supporting the claim that FreeSync is proprietary.
FreeSync is simply AMD's trademarked implementation of VESA Adaptive-Sync, which is an open standard.