AMD demonstrates Freesync, G-sync equivalent?

Amid the conflicting reactions to the manufacturer's statement that (VRR) overdrive functionality is part of the FreeSync architecture, I read your post as an argument that this is not the case.
What you quoted says nothing about pixel response time compensation.
 
They bless a monitor with a FreeSync label if it satisfies their quality requirements. That doesn't mean the GPU has influence over the internals of the monitor. It doesn't mean that the GPU needs to signal something magic to the monitor. It's a label of quality. And dealing with ghosting is apparently not part of that quality assessment.

If you think this (I fully agree), then why did you say that AMD's implementation "reaches beyond the cable and goes inside the monitor"?

AMD seems to disagree:

It says it very clearly:
  • A compatible AMD Radeon™ graphics card, an enabled AMD Catalyst™ graphics driver (100% GPU side, no interaction with anything else)
  • And an Adaptive-Sync-aware display (it just follows the standard, no magic sauce here)
There is nothing in the DP standard, nor in its Adaptive-Sync part, to control the monitor's overdrive or any other part of the monitor electronics. FreeSync is just AMD's name for whatever they are doing inside the GPU/driver, which right now seems to be nothing special apart from the option to enable or disable V-Sync outside the VRR range. They need to add frame duplication to this.

I think some of you are conflating the technical aspects of FreeSync/Adaptive-Sync with whatever crap a marketing person from AMD or BenQ said. The ghosting problem is clearly on the monitor side; AMD has no control over it, and the monitor maker should be the one to take the blame. Then we have the FreeSync label (lol, who cares), which means nothing in reality, because there is nothing from AMD in a monitor that carries it. Typical AMD marketing fuck-up, nothing interesting here.
 
What you quoted says nothing about pixel response time compensation.
It says nothing about the panel refresh range either, yet 9-240 Hz is touted as a FreeSync advantage. /kid

AMD has had a driver-driven LCD overdrive option for some time. I doubt it'll work reliably with VRR, but if it could be made profile-based, with profiles that users can create and share among themselves, it could help a bit.

It says it very clearly:
  • A compatible AMD Radeon™ graphics card, an enabled AMD Catalyst™ graphics driver (100% GPU side, no interaction with anything else)
  • And an Adaptive-Sync-aware display (it just follows the standard, no magic sauce here)
There is nothing in the DP standard, nor in its Adaptive-Sync part, to control the monitor's overdrive or any other part of the monitor electronics. FreeSync is just AMD's name for whatever they are doing inside the GPU/driver, which right now seems to be nothing special apart from the option to enable or disable V-Sync outside the VRR range.
Adding, removing, and rewording parts while saying "it says it very clearly" is contradictory.

They need to add frame duplication to this.
Surprise. Unlike overdrive, frame duplication is part of the DP standard or Adaptive-Sync, apparently?

I think some of you are conflating the technical aspects of FreeSync/Adaptive-Sync with whatever crap a marketing person from AMD or BenQ said. The ghosting problem is clearly on the monitor side; AMD has no control over it, and the monitor maker should be the one to take the blame. Then we have the FreeSync label (lol, who cares), which means nothing in reality, because there is nothing from AMD in a monitor that carries it. Typical AMD marketing fuck-up, nothing interesting here.
Great, so 'says it very clearly' is 'marketing crap' now. But if NVidia did something like this, it would warrant a class-action lawsuit?
 
AMD has had a driver-driven LCD overdrive option for some time.
I suspect you're referring to this, which is only available for video playback:


But that's just pixel shading. Pre-processing the signal sent to the monitor (a digital signal).

Analogue overdrive in an LCD is like sending a value of "280" to a pixel, whereas the graphics card connected to the monitor can only send a maximum of 255. That's an extreme example; it wouldn't necessarily always be an out-of-range value.

It's possible to come up with an always-on pixel shader that adjusts every frame, based on the prior frame, before delivering it to the monitor. It would be better if it also knew what the next frame is (which in video playback is trivial, which is probably why the functionality in the driver is only for watching video).
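As a sketch, such an always-on pass could look something like this (a minimal NumPy model; the single `gain` constant is a hypothetical simplification, since real panel overdrive uses per-transition lookup tables rather than one scalar):

```python
import numpy as np

def overdrive_frame(cur, prev, gain=0.5):
    """Push each pixel past its target in proportion to the frame-to-frame
    change, then clamp back into the 0-255 range the cable can carry
    (the out-of-range "280" from above simply clips to 255)."""
    cur = cur.astype(np.int16)
    prev = prev.astype(np.int16)
    boosted = cur + gain * (cur - prev)
    return np.clip(boosted, 0, 255).astype(np.uint8)

prev = np.full((2, 2), 100, dtype=np.uint8)  # darker previous frame
cur = np.full((2, 2), 200, dtype=np.uint8)   # brighter current frame
out = overdrive_frame(cur, prev)             # pixels overshoot to 250
```

Because the GPU can only emit 0-255, the overshoot has to be baked into the digital values it sends over the cable, which is exactly why this is pre-processing of the signal rather than true monitor-side overdrive.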
 
Having enough memory to store a few lines versus enough memory to store at least a couple of frames -- plus all the complexity required for G-Sync-like capabilities -- makes a lot of difference in a business where economies of scale are important. I don't think companies like Novatek, Realtek or MStar will bother designing and producing a scaler whose potential user base is limited to a small portion of 290/290X owners.
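Rough arithmetic illustrates the gap (the resolution, bit depth, and buffer counts here are illustrative assumptions, not figures from any actual scaler):

```python
# Line buffer vs. frame buffer for a hypothetical 2560x1440 panel at 24 bpp.
width, height, bytes_per_pixel = 2560, 1440, 3

line_buffer = 8 * width * bytes_per_pixel            # a few scan lines
frame_buffer = 2 * width * height * bytes_per_pixel  # a couple of full frames

print(line_buffer // 1024, "KiB")            # ~60 KiB: cheap on-chip SRAM
print(frame_buffer // (1024 * 1024), "MiB")  # ~21 MiB: wants external DRAM
```

A few lines fit comfortably in on-chip SRAM; a couple of full frames push the design toward external DRAM and extra controller logic, which is a large part of where the G-Sync module's added cost comes from.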

And yet the demand is already there, as evidenced by far more adaptive sync monitors coming to market than G-Sync monitors. Likely because the cost to produce one is only a few USD more than the scaler they are already using (probably a single-digit price differential between a scaler with and without adaptive sync support). Hence, why not? Versus paying Nvidia 100-150 USD on top of the scaler that is already being used in the monitor.

Gsync certainly does have an advantage in artificial benchmarks. But multiple review sites have noted virtually no difference when using it in... games. Which is what it was designed for in the first place. And that's despite looking very hard for the difference after seeing it in an artificial test.

I wouldn't be at all surprised if in a year there are more people using adaptive sync on AMD hardware than people using Gsync on Nvidia hardware, despite the huge discrepancy in GPU marketshare. Mainly due to the lower price (affordable even to those on a budget) and greater choice.

As an Nvidia user (currently) I am certainly NOT interested in buying a Gsync monitor. I would certainly be interested in using an adaptive sync monitor with my card if I could, however.

Regards,
SB
 
I suspect you're referring to this, which is only available for video playback:


But that's just pixel shading. Pre-processing the signal sent to the monitor (a digital signal).

Analogue overdrive in an LCD is like sending a value of "280" to a pixel, whereas the graphics card connected to the monitor can only send a maximum of 255. That's an extreme example; it wouldn't necessarily always be an out-of-range value.

It's possible to come up with an always-on pixel shader that adjusts every frame, based on the prior frame, before delivering it to the monitor. It would be better if it also knew what the next frame is (which in video playback is trivial, which is probably why the functionality in the driver is only for watching video).

Yes, this overdrive option has existed in CCC since, well, nearly since TFT panels have existed. Honestly, I've never understood how it works. It was there to compensate a bit for ghosting and inverse ghosting under certain conditions, but I've always gotten worse results when trying to play with it.
 
Surprise. Unlike overdrive, frame duplication is part of the DP standard or Adaptive-Sync, apparently?

Neither. If done, it's done in the GPU/driver, as processing of the video signal that then gets sent to the monitor using DP. There is a difference between the information being sent, and the medium/standard used to do it.
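A minimal sketch of what such GPU/driver-side frame duplication could look like (the 40 Hz floor matches the BenQ's FreeSync minimum; the function name and structure are hypothetical):

```python
def scanout_rate(render_fps, vrr_min_hz=40):
    """When the render rate drops below the panel's VRR floor, scan out
    each frame enough times that the panel still refreshes in range."""
    if render_fps >= vrr_min_hz:
        return render_fps, 1  # one scanout per frame, no duplication
    repeats = 1
    while render_fps * repeats < vrr_min_hz:
        repeats += 1
    return render_fps * repeats, repeats

# 25 FPS is below the 40 Hz floor: scan out each frame twice, at 50 Hz.
rate, repeats = scanout_rate(25)  # -> (50, 2)
```

None of this needs anything from DP beyond the ability to send a frame whenever one is ready, which is the point: it's processing before the cable, not a monitor-side feature.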

AMD has had a driver-driven LCD overdrive option for some time.

That option has nothing to do with the monitor's overdrive. It is, again, processing of the video signal that gets sent to the monitor.
 
TFT Central reviewed the BenQ monitor. Overall a great gaming monitor, save for the FreeSync implementation. The review gives the impression that BenQ and AMD are working to fix it.
From a monitor point of view the use of FreeSync creates a problem on the XL2730Z at the moment. The issue is that the AMA setting does nothing when you connect the screen over DisplayPort to a FreeSync system. This applies whether you are actually using FreeSync or not; you don't even need to have the option ticked in the graphics card settings for the problem to occur. As a result, the setting appears to be in the off state, and changing it to High or Premium in the menu makes no difference to real-world response times or performance. As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.

AMA/overdrive on/off comparison with RoG Swift:

benq_xl2730z.jpg

asus_rog_swift_pg278q.jpg



Their subjective opinion on the impact of AMA disabled:
However, at the moment (FreeSync) implementation is not perfect, as it disables the AMA (overdrive) function and so leads to reduced response times. In fact at the moment for gaming we feel you are probably better using the screen from an non-FreeSync system or reverting to a pre-FreeSync driver so that you can use the AMA function properly. Once BenQ and AMD fix this issue we will update the review accordingly as it should then mean FreeSync is a real benefit.
 
One question that has to be asked is: has this ghosting been blown out of proportion?
Many FreeSync reviews went by without anyone noticing it; it wasn't until someone decided to do slow-motion videos that it was even noticed. Is the ghosting really so bad that it's noticeable to the naked eye?
If you go back a few years, the best monitors were at similar ghosting levels and no one whined about them ghosting (in fact, every time faster panels come out, the old fastest ones suddenly "start ghosting" just because we know there are faster ones now).
 
One question that has to be asked is: has this ghosting been blown out of proportion?
Many FreeSync reviews went by without anyone noticing it; it wasn't until someone decided to do slow-motion videos that it was even noticed. Is the ghosting really so bad that it's noticeable to the naked eye?
If you go back a few years, the best monitors were at similar ghosting levels and no one whined about them ghosting (in fact, every time faster panels come out, the old fastest ones suddenly "start ghosting" just because we know there are faster ones now).

Blown out of proportion. Even reviews that note the ghosting in artificial testing do not note noticeable ghosting in actual games. Meaning they KNOW it has ghosting, and they KNOW what to look for, but they just don't see it when using actual games or the impact is so slight that they don't notice it when playing the games.

Hell, the BenQ has less ghosting than all three of my current monitors (all IPS panels). And a 4K monitor that is very popular right now due to its low price (~700 USD), large panel (40"), lack of PWM, and vibrant colors performs significantly worse in the UFO and racing-car tests, yet almost all of its users claim excellent response and gaming performance in an OCUK thread I was reading. And some of these people are comparing it to 120 Hz and 144 Hz gaming monitors that they own. That monitor is limited to 60 Hz with a horrible overdrive option that doesn't work.

Similar to die size, it's interesting to compare from a technical standpoint, but for the vast majority of people it's just not relevant.

Regards,
SB
 
AMA/overdrive on/off comparison with RoG Swift:

benq_xl2730z.jpg

asus_rog_swift_pg278q.jpg
That set of pictures isn't showing AMA/overdrive on/off. It's showing best case and worst case with AMA/overdrive on.

The article talks about a firmware fix combined with a driver update by AMD to solve the lack of AMA with FreeSync. I doubt anyone with this monitor will be getting a firmware fix they can install.
 
Hardwarecanucks, in their 'long-term FreeSync review', came up with some badly needed graphs that show pretty clearly what happens under 40 FPS on the BenQ FreeSync monitor.

FREESYNC-9.jpg


AMD's response, which rules out the notion (held by some) that this monitor runs at its highest refresh rate outside the FreeSync range:
On this particular display (the BenQ XL2730Z), the LCD flickers if you rapidly switch between 40Hz, 144Hz and back to 40Hz. We’ve set the V-Sync rate to 40Hz on this display when you fall below the range, which means you would see factors of 40 as the V-Sync jumps, but flickering is completely eliminated.

This is all subject to change, as we can modify these behaviors in the driver. We’re always looking at stuff like this to find new/better ways to deal with the particulars of an LCD, given that each one has its own characteristics.
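To make "factors of 40" concrete: with V-Sync locked to 40 Hz, a frame must be held for a whole number of refresh intervals, so the presented rate snaps to 40/n (hypothetical helper, assuming the 40 Hz below-range behavior AMD describes):

```python
import math

def vsync_presented_fps(render_fps, vsync_hz=40):
    """With V-Sync at vsync_hz, each frame is held for a whole number of
    refresh intervals, so the presented rate quantizes to vsync_hz / n."""
    intervals = math.ceil(vsync_hz / render_fps)
    return vsync_hz / intervals

# Rendering at 35 FPS below the FreeSync range presents at only 20 FPS.
vsync_presented_fps(35)  # -> 20.0
```

That quantization (40, 20, 13.3, ...) is the stutter trade-off AMD accepted to eliminate flicker on this particular panel.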
 
Sure, for that monitor. Most of us were just countering that it isn't due to the technology, but due to the implementation in the monitor. Prior to knowing, of course, there was all kinds of speculation as to what could or should happen when you're below the FreeSync cut-off.

So it's still possible that there are FreeSync monitors that operate at max frequency when below the cut-off. The BenQ isn't one of them. Perhaps none of the ones currently out do it, perhaps some do.

It's an open specification. So, similar to Windows PCs, there can be great variance in how the specification is applied. Likewise, there will be variance in how people handle it. The technologically well versed will obviously configure their games such that they rarely, if ever, drop below the cut-off. Those not technologically well versed aren't likely to notice anyway.

Hell, we still have people who think 30 Hz gaming leads to a better experience because it is closer to a cinema experience. :p

Regards,
SB
 
Typical. Trying to provoke? Let's see how quickly moderators respond ...

How exactly am I provoking? This forum was full of people telling everyone the G-Sync module was a must because of secret-sauce reasons.

OTOH this is not a G-Sync thread so you're off topic.
 