AMD demonstrates Freesync, G-sync equivalent?

For that one monitor, when it is outside of the adaptive sync band, I believe it operates as it would if it were a 144 Hz monitor without adaptive sync. Hence, below 40 Hz you get Vsync at a 144 Hz panel refresh, or no Vsync at a 144 Hz panel refresh. At least they allow you to turn Vsync off if you would rather have tearing than laggy input and/or judder, unlike G-Sync.

Regards,
SB
For the BenQ it's pretty clear, according to this article, that it drops to and stays at 40 Hz when the frame rate is under the VRR range.
From observations and measurements I've taken, the BENQ panel 'sticks' at 40 Hz when game FPS levels drop <40 FPS. For that situation, the BENQ panel behaves like a fixed 40 Hz refresh rate display, and does what you would expect if V-Sync is on or off (judder or tearing). I will say that since it is refreshing at a relatively low rate, that the judder / tearing is more pronounced than it would be on a regular 60 Hz LCD.
If it worked as a 144 Hz monitor in that case it would be great, as the frame rate would be more stable and tearing and judder wouldn't stay visible as long.
 
I don't see why it would have to revert to a 40 Hz panel, even if it doesn't do any prediction of the current average frame time.
Sure, if 25 ms (40 FPS) has passed since the last refresh it has to resend the current frame, but for a 144 Hz panel that scan should only take about 7 ms, after which we would be able to send out the next frame.
So, with frame times between 25 and 32 ms we would see some judder, but above that it should be perfect again. Of course, for a 75 Hz panel like the LG (spending 13 ms on a refresh) it would be worse.
If they start predicting the frame times (like it seems NVIDIA does, going by PCPer's description), they would do the extra refresh before the 25 ms has passed, so that they are not in the middle of a refresh when the next frame is ready. But that shouldn't have anything to do with which end of the cable the frame buffer sits on.
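To make the arithmetic above concrete, here is a minimal sketch of the timing argument (in Python, my own illustration, not anything a scaler actually runs), assuming the 40 Hz minimum and 144 Hz scan-out numbers from the post:

```python
# Rough sketch of the timing argument above (not any vendor's actual logic).
# Assumed numbers: 40 Hz minimum VRR rate and 144 Hz panel scan-out.

MAX_HOLD_MS = 1000 / 40    # 25 ms: panel must be refreshed before this elapses
SCAN_MS     = 1000 / 144   # ~6.9 ms: time to scan one frame out at 144 Hz

def waits_for_rescan(frame_time_ms):
    """Return how long a new frame must wait because the panel is busy
    re-scanning the previous frame (0 means no judder from this effect)."""
    if frame_time_ms <= MAX_HOLD_MS:
        return 0.0                      # new frame arrived inside the VRR window
    rescan_end = MAX_HOLD_MS + SCAN_MS  # forced repeat of the old frame ends here
    if frame_time_ms < rescan_end:
        return rescan_end - frame_time_ms  # arrived mid-rescan: must wait -> judder
    return 0.0                          # rescan already finished, frame goes out cleanly

for ft in (20, 27, 30, 33, 40):
    print(f"{ft} ms frame -> waits {waits_for_rescan(ft):.1f} ms")
```

Running it shows that only the roughly 25-32 ms band of frame times gets delayed by the forced re-scan, which is the judder window described above.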
 
If 15 months later -- with the backing of VESA, the largest scaler makers and the largest monitor makers -- they still can't imitate what the G-Sync module does, then maybe without similar specialized hardware it is not as simple as some would like to believe.

Besides, why bother with the hassle when reviewers -- with a few exceptions here and there -- all state it is "working as expected"? :rolleyes:
 
If 15 months later -- with the backing of VESA, the largest scaler makers and the largest monitor makers -- they still can't imitate what the G-Sync module does, then maybe without similar specialized hardware it is not as simple as some would like to believe.

Besides, why bother with the hassle when reviewers -- with a few exceptions here and there -- all state it is "working as expected"? :rolleyes:
Regarding "working as expected": Anandtech's puff piece was a pretty good indicator of their decline. I used to have higher expectations for them, they used to examine technology in depth, rather than halfheartedly throw articles together. I hope Tech Report does a better job.

One thing I am still waiting for someone to do is get a proper high speed camera setup to measure latencies. So many interesting experiments could be done if we were measuring what actually matters to the gaming experience.
 
PCLabs.pl provides side-by-side comparison videos between the BenQ FreeSync monitor and the ROG Swift at multiple FPS levels: 27, 35, 45, 75 and 144.

Original article (in Polish)
Google translation

They also confirm that the ghosting is gone as soon as FreeSync is deactivated.
In addition, ghosting on the screen disappears immediately once the FreeSync function is disabled and the monitor operates at a fixed refresh rate again. So what is the cause?
 
This seems to be most critical now because of the high minimum of the VRR window. I suspect it is because the standard has been made very accommodating, to allow an economy of effort in updating hardware that is based on a wide variety of manufacturers' non-variable designs.
I am not sure if there is a long-term barrier that prevents this from being lowered to something that isn't so obviously problematic. The standard's currently theoretical 9 Hz lower bound is below the range where I would consider ghosting to be the thing I need to worry about. What does the limit need to drop to in order to mitigate this objection? Sub-20?
 
Without some improvement in the way FreeSync handles sub-minimum frame rates, I think lowering the minimum of the VRR range -- even if the panel has the capability -- is a moot point.

A hypothetical FreeSync monitor with a 30 Hz minimum refresh rate, while improving the VRR range, will have flicker (similar to the flickering of an old movie projector) when under the minimum FPS, in addition to exacerbated judder (compared to a FreeSync monitor with a higher minimum refresh rate).

And this is unrelated to the ghosting issue.
 
But what happens with this FreeSync monitor and theoretical G-Sync monitor below the window? AMD’s implementation means that you get the option of disabling or enabling VSync. For the 34UM67 as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns again crop their head up. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ
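For what it's worth, the frame-repeat behaviour PCPer measured can be reproduced with a very simple rule. The sketch below is my own guess in Python, not NVIDIA's actual algorithm; the 50 Hz "target floor" is an assumption that happens to match the three measured points in the quote (29 -> 58, 25 -> 50, 14 -> 56):

```python
# Hedged sketch of the frame-repeat idea PCPer describes for the G-Sync module.
# The repeat rule and the 50 Hz floor are guesses; 50 Hz simply reproduces the
# three measured points quoted above (29 -> 58, 25 -> 50, 14 -> 56).

PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144
TARGET_FLOOR_HZ = 50   # assumed: the module seems to aim well above the bare minimum

def effective_refresh(fps):
    """Return (repeat_count, refresh_hz) for a given game frame rate."""
    if fps >= PANEL_MIN_HZ:
        return 1, fps                       # inside the VRR window: no repeats needed
    n = 1
    while fps * (n + 1) <= PANEL_MAX_HZ and fps * n < TARGET_FLOOR_HZ:
        n += 1                              # repeat each frame until the panel is
    return n, fps * n                       # scanning fast enough not to flicker

for fps in (29, 25, 14):
    n, hz = effective_refresh(fps)
    print(f"{fps} FPS -> draw each frame {n}x -> panel refreshes at {hz} Hz")
```

The interesting part of PCPer's description is not the arithmetic but where it runs: doing this in the module means the display needs its own copy of the last frame to repeat, hence the local frame buffer.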
 
Isn't the display matrix a frame-buffer, which can be refreshed indefinitely?
It's similar to DRAM. Pixels hold their value, but only for so long. If LCDs start to flicker below 30Hz, that 'so long' is apparently on the order of 33ms before it becomes a visible nuisance.
 
Okay. DRAM doesn't need to be rewritten, just refreshed. If the display matrix can do the same, as Jawed suggested, why would it not be the responsibility of the monitor manufacturer to maintain the appropriate refresh interval? Why should it be AMD's responsibility? It really sounds like Nvidia is doing R&D and bug-fixing for the monitor makers, in a fashion.
 
The ghosting problem is different from the below-the-refresh-rate problem... right? I think the ghosting problem isn't a FreeSync problem, but more that the monitor makers' implementation of adaptive sync isn't good enough. The low FPS, though... I previously asked why the monitor can't just double the refresh when it runs below the minimum, like 29 becoming 58, not knowing that this is what G-Sync is doing. My question is: can that feature be added transparently, or does it require an update to the standard? I believe it can be added transparently, but I'm not 100% sure.
I can also imagine that if it can't be implemented without updating the standard, maybe the GPU could do the same sort of thing (doubling, tripling, etc.) that the G-Sync module does? Is it actually possible for AMD to engineer a driver for current GPUs to do this? My mind says it should be possible; again, not 100% sure.
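Purely as a thought experiment on that driver-side idea, here is a sketch of what it could look like: the driver keeps the last completed frame in GPU memory and re-sends it whenever the next frame isn't ready before the panel's maximum hold time runs out. The callback names are made up for illustration, and whether current hardware exposes the hooks to do this is exactly the open question:

```python
import time

PANEL_MIN_HZ = 40                   # assumed monitor minimum VRR rate
MAX_HOLD_S = 1.0 / PANEL_MIN_HZ     # ~25 ms: the panel needs a scan before this elapses

def present_loop(wait_for_new_frame, send_to_panel):
    """Hypothetical driver-side low-framerate compensation.

    wait_for_new_frame(timeout) -> a finished frame, or None on timeout.
    send_to_panel(frame)        -> starts a scan-out of that frame.
    Both are made-up stand-ins for whatever hooks a real driver would use.
    (The clamp against the panel's maximum refresh rate is omitted for brevity.)
    """
    last_frame = None
    last_scan = time.monotonic()
    while True:
        remaining = MAX_HOLD_S - (time.monotonic() - last_scan)
        frame = wait_for_new_frame(timeout=max(remaining, 0.0))
        if frame is not None:
            last_frame = frame      # fresh frame arrived in time: scan it out now
        elif last_frame is None:
            continue                # nothing has been rendered yet, keep waiting
        # Either a new frame is ready, or the hold time expired; in the latter
        # case the previous frame is simply sent again from GPU memory.
        send_to_panel(last_frame)
        last_scan = time.monotonic()
```

If something like this is possible with existing scalers, no change to the Adaptive-Sync standard and no monitor-side buffer would be needed, since the repeated frame comes from the GPU's own framebuffer.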
 
Okay. DRAM doesn't need to be rewritten, just refreshed. If the display matrix can do the same, as Jawed suggested, why would it not be the responsibility of the monitor manufacturer to maintain the appropriate refresh interval? Why should it be AMD's responsibility? It really sounds like Nvidia is doing R&D and bug-fixing for the monitor makers, in a fashion.
In order to do that, theoretically you need to read the "value" back from an LCD cell and calculate/deduce the exact original voltage to be re-applied. There are some problems here that I can think of: can you actually read the content of a panel cell back? Can you do it fast enough for the whole panel? Doesn't the "value" change as the liquid crystal slowly untwists over time? Is the formula to deduce the original voltage from this value simple enough for the monitor circuitry?

What Jawed describes is likely an implementation of eDP panel self refresh, which does include a framebuffer inside the monitor's TCON. So technically it is not 0 Hz, but 0 FPS. And AMD deserves the flak because they are the ones who claim and advertise that FreeSync offers the same experience as G-Sync without the need for expensive additional hardware, whereas the monitor makers don't state anything beyond that their respective products support FreeSync.
 
And yet, somehow, laptops can hold an image with a 0Hz refresh rate ;)
There are cell phones (and I assume laptops as well) that have self-refresh controllers, but those controllers have some kind of RAM (probably eDRAM) built in, so they still continuously refresh just the same.
 