NVIDIA opens GSync

Relative to the power of even budget GPUs, it's not a lot of work. Relative to the power of GPUs someone who cares about GSync would buy, it's nothing.

You are correct, but that's quite stupid as well. Variable refresh rate enhances the experience at every performance tier, so it should never have been a "premium feature" in the first place.

This is not like having a large VRAM pool that only high-end cards can make use of, or ray-tracing hardware that would weigh too heavily on low/mid-range cards. Anyone benefits from not dropping straight to 30 FPS (or dealing with terrible tearing) when the hardware can't keep the framerate above 60 FPS.

I bet even the 15W 4C+GT2 Ice Lake is going to be great for gaming on UMPCs or Surface-like tablets. A 1 TFLOPS GPU + >50 GB/s from LPDDR4X + variable refresh rate will rock at 720p-900p.
 
I highly doubt that. NVIDIA had to scale their G-Sync HDR module to insane complexity in order to support those features, something they wouldn't have done if it weren't absolutely necessary.
I have my doubts. Remember "Hey motherboard makers, SLI is only possible if you buy a controller chip from us and put it on your boards"?
Everyone (apart from Asus) said "Er, I think we'll pass", and as if by magic NVIDIA engineers suddenly discovered SLI would work without the chip. Possibly the same magic that put a "don't disable SLI on boards whose manufacturers have paid NVIDIA money" whitelist in their drivers.
 
Samsung C32HG70 & GTX 1060:
- Works properly in the "Ultimate Engine" FreeSync 2 mode, not so much in "Standard Engine" (Hz is variable in Standard Engine too, but seems to be stuck at 13x-144 Hz 99% of the time). (Ultimate Engine's VRR range is 48-144 Hz, Standard's is 90-144 Hz, I think.)
- Doubles/triples (maybe even quadruples) the refresh rate whenever possible (e.g. both 40 FPS and 60 FPS run at 120 Hz)
- Some flickering in certain situations (so far noticed in Overwatch's borderless windowed mode; fullscreen works fine, and PUBG works fine in both borderless windowed and fullscreen)
- Works with HDR (tested with Hellblade: Senua's Sacrifice)
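The doubling/tripling behavior described above is low framerate compensation (LFC): the driver repeats each frame at an integer multiple so the panel's effective refresh stays inside (and near the top of) the VRR window. A rough sketch of the idea, using the 48-144 Hz "Ultimate Engine" range mentioned above; the exact selection logic here is my assumption, not the actual driver algorithm:

```python
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Pick a panel refresh rate for a given frame rate by repeating each
    frame at the largest integer multiple that still fits under vrr_max
    (frame multiplication / low framerate compensation)."""
    if fps >= vrr_max:
        return vrr_max                 # cap at the panel's maximum
    n = int(vrr_max // fps)            # largest multiple that fits
    refresh = n * fps
    # If even the multiplied rate falls under the window's floor, clamp
    # to the minimum (real drivers likely handle this more gracefully).
    return max(refresh, vrr_min)

# Matches the observation above: 40 FPS (3x) and 60 FPS (2x) both land on 120 Hz.
print(lfc_refresh(40.0))  # 120.0
print(lfc_refresh(60.0))  # 120.0
```

Picking the *largest* multiple that fits, rather than just enough to clear the floor, would also explain why 60 FPS gets doubled to 120 Hz even though 60 is already inside the 48-144 Hz window.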
 
Remember "Hey motherboard makers, SLI is only possible if you buy a controller chip from us and put it on your boards"?
Everyone (apart from Asus) said "Er, I think we'll pass", and as if by magic NVIDIA engineers suddenly discovered SLI would work without the chip. Possibly the same magic that put a "don't disable SLI on boards whose manufacturers have paid NVIDIA money" whitelist in their drivers.
Totally different beasts; there are too many variables that have to be handled here (HDR1000 + 120 Hz + VRR (1-120 Hz range) + full-array local dimming + variable overdrive).

In your SLI example, multi-GPU had already been shown to work without a dedicated controller, and it was obvious NVIDIA was pulling a marketing stunt. That is not the case with G-Sync HDR. The new, complex G-Sync HDR module already costs $500 alone to make, a complexity that is required to handle all those variables. Also, there are no VRR monitors supporting those features without a module. Monitor manufacturers are going to need to beef up their controllers if they wish to offer the same level of feature support.
 
GeForce GTX 980 Ti and below will not get Adaptive Sync support.
Yeah, I believe only Pascal and later support Adaptive Sync.

According to Hardware Unboxed, who tested a bunch of their Adaptive Sync monitors (all worked flawlessly), the NVIDIA implementation doesn't support HDMI. VRR with Adaptive Sync monitors is only available over DisplayPort. AMD does support VRR over HDMI.
 
The NVIDIA implementation doesn't support HDMI. VRR with Adaptive Sync monitors is only available over DisplayPort. AMD does support VRR over HDMI.
Yeah, I read HDMI support wasn't going to make the first round, though it may be added in future drivers.
 
The new, complex G-Sync HDR module already costs $500 alone to make, a complexity that is required to handle all those variables.

Hmm, so that would make it more expensive to produce than most (if not all?) of the GPUs NVIDIA sells to gamers. Source for this?
 
Hmm, so that would make it more expensive to produce than most (if not all?) of the GPUs NVIDIA sells to gamers. Source for this?
It's an FPGA from Altera (Intel) called the Arria 10 GX 480, and it comes with 3 GB of DDR4-2400 RAM. It sells separately online for $2000, but PCPer estimates NVIDIA's cost at around $500.

A mid-range option in the Arria 10 lineup, the GX480 provides 480,000 reprogrammable logic elements, as well as twenty-four 17.4 Gbps transceivers for I/O. Important for this given application, the GX480 also supports 222 pairs of LVDS I/O.

DRAM from Micron can also be spotted on this G-SYNC module. We can confirm that this is, in fact, a total of 3 GB of DDR4-2400 memory.
https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea
 
It's an FPGA from Altera (Intel) called the Arria 10 GX 480, and it comes with 3 GB of DDR4-2400 RAM. It sells separately online for $2000, but PCPer estimates NVIDIA's cost at around $500.

Well, at least an FPGA can be replaced by an ASIC later in this controller's lifetime, driving prices down (one can only hope).

Here's hoping for a decent enough FreeSync HDR implementation so we won't be tempted into buying any of these.
 
Well, at least an FPGA can be replaced by an ASIC later in this controller's lifetime, driving prices down (one can only hope).
Considering we never got ASICs for previous G-Sync versions, I wouldn't hold my breath.
Here's hoping for a decent enough FreeSync HDR implementation so we won't be tempted into buying any of these.
We already have it, and have had it for quite some time: it's called FreeSync 2 HDR. If something isn't decent, it's G-Sync Ultimate (or whatever the HDR version is now called), where you need to shove in 2-3 good gaming PCs' worth of money just for a display, and which is arriving a year later than it was supposed to.
 
Considering we never got ASICs for previous G-Sync versions, I wouldn't hold my breath.

I wouldn't either. The volumes for these monitors will be even lower than for the original G-Sync.


We already have it, and have had it for quite some time: it's called FreeSync 2 HDR. If something isn't decent, it's G-Sync Ultimate (or whatever the HDR version is now called), where you need to shove in 2-3 good gaming PCs' worth of money just for a display, and which is arriving a year later than it was supposed to.

Maybe my phrasing was wrong. I know of FreeSync 2 HDR; I just haven't followed how it actually performs.
And for now, I don't even care. I'd buy into HDR tech once it matures more, in 2 years maybe. For that hypothetical extra $500, G-Sync HDR had better be miles ahead of the competition by then. Also, I wouldn't hold my breath.
 
We already have it and have had it for quite some time, it's called FreeSync 2 HDR.
When held to the standard of a monitor's top features, it's barely there. To this moment there is barely any FreeSync 2 monitor with HDR1000 + LFC, let alone the other features (full-array local dimming + 144 Hz + variable overdrive).

I am all for a cheap monitor with those features, I would buy it in a heartbeat, but the fact is they are not available outside of these ridiculously expensive G-Sync panels. So I'll wait until further notice.
 
And for now, I don't even care. I'd buy into HDR tech once it matures more, in 2 years maybe. For that hypothetical extra $500, G-Sync HDR had better be miles ahead of the competition by then. Also, I wouldn't hold my breath.
Well, currently there are four DisplayHDR 1000 certified products; one of them is FreeSync and the rest are G-Sync. Granted, the features don't really match, as the FreeSync one is only 60-80 Hz and has fewer backlight zones, but it's also priced at half of what the G-Sync monitors cost while being a lot bigger (the Philips 43" is under $1k, the Asus/Acer 27" models are around $2k, and there's the HP 65", but that thing is priced at $5k).
 
Works fine here:

[attached screenshot: gsyncmgjac.png]


G-Sync + FreeSync displays work together without issues. My 4K 60 Hz monitor can do a 32-60 Hz range without issues, but freaks out if I set it to 30-60 (it can't handle LFC).
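The 30-60 failure fits how LFC engages: frame doubling only makes sense when the window's maximum is at least twice its minimum, otherwise a frame rate just below the floor has no doubled rate inside the window. A quick check (the exact 2x threshold is my assumption; tooling in the wild reportedly requires an even larger ratio before enabling LFC):

```python
def lfc_possible(vrr_min: float, vrr_max: float) -> bool:
    # Frame doubling needs 2 * vrr_min <= vrr_max; otherwise a frame rate
    # just under vrr_min has no doubled rate inside the VRR window.
    return vrr_max >= 2 * vrr_min

print(lfc_possible(32, 60))  # False: LFC never engages, range works fine
print(lfc_possible(30, 60))  # True: LFC kicks in (right at the edge), monitor freaks out
```

Which would explain why widening the range by just 2 Hz makes the monitor misbehave: it crosses the threshold where the driver starts doubling frames.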
 
Works fine here:

G-Sync + FreeSync displays work together without issues. My 4K 60 Hz monitor can do a 32-60 Hz range without issues, but freaks out if I set it to 30-60 (it can't handle LFC).
Somewhat related to this: you can only have one FreeSync/Adaptive Sync display active with NVIDIA, so if you're using more than one FreeSync-capable display, you have to pick which one to enable it on.
 
Samsung C32HG70 & GTX 1060:
- Works properly in the "Ultimate Engine" FreeSync 2 mode, not so much in "Standard Engine" (Hz is variable in Standard Engine too, but seems to be stuck at 13x-144 Hz 99% of the time). (Ultimate Engine's VRR range is 48-144 Hz, Standard's is 90-144 Hz, I think.)
- Doubles/triples (maybe even quadruples) the refresh rate whenever possible (e.g. both 40 FPS and 60 FPS run at 120 Hz)
- Some flickering in certain situations (so far noticed in Overwatch's borderless windowed mode; fullscreen works fine, and PUBG works fine in both borderless windowed and fullscreen)
- Works with HDR (tested with Hellblade: Senua's Sacrifice)
As an update after a couple more days of experience:
- Overwatch now always shows black-bar artifacts + blinking in borderless windowed mode
- Today alone the display randomly mode-switched 2 or 3 times while playing Overwatch in fullscreen; it returned to normal each time
- Sometimes when starting up Overwatch (in fullscreen mode) the display stays in HDR mode; I have to alt+tab to the desktop and back to get to SDR mode (usually works on the 1st try)
- Sometimes when alt+tabbing to fullscreen Overwatch the display stays black and complains about a "non-optimal resolution/refresh rate"; the only solution is alt+enter to windowed mode and manually setting the game back to fullscreen in the options (alt+enter back to fullscreen doesn't work)
 
As an update after a couple more days of experience:
- Overwatch now always shows black-bar artifacts + blinking in borderless windowed mode
- Today alone the display randomly mode-switched 2 or 3 times while playing Overwatch in fullscreen; it returned to normal each time
- Sometimes when starting up Overwatch (in fullscreen mode) the display stays in HDR mode; I have to alt+tab to the desktop and back to get to SDR mode (usually works on the 1st try)
- Sometimes when alt+tabbing to fullscreen Overwatch the display stays black and complains about a "non-optimal resolution/refresh rate"; the only solution is alt+enter to windowed mode and manually setting the game back to fullscreen in the options (alt+enter back to fullscreen doesn't work)
Have you tried fullscreen windowed?
 
rtings.com has a new article up.

https://www.rtings.com/monitor/tests/motion/g-sync-compatible

Updated Jan 17, 2019 By Adam Babcock

G-SYNC Compatibility Test

Using FreeSync Monitors with an NVIDIA Graphics Card

Since 2013, there have been two competing Variable Refresh Rate technologies: FreeSync, developed by AMD, and G-SYNC, developed by NVIDIA. Since the two technologies are not interchangeable, up until now it has been important to choose a monitor that uses the same technology as your graphics card. This changed on January 15th, 2019, when NVIDIA released version 417.71 of their driver, which enables FreeSync on NVIDIA graphics cards.

This update enables FreeSync support on any 10- and 20-series NVIDIA graphics card, but only over DisplayPort. While developing the new driver, NVIDIA tested over 400 different FreeSync displays and identified 12 monitors that meet their implementation standard.

The new driver allows you to enable FreeSync with any FreeSync display, even if it isn't officially supported. According to NVIDIA, the unsupported monitors displayed a range of FreeSync issues, from minor tearing and blur to screen blanking or motion duplication. But, also according to NVIDIA, the monitors displayed these issues on both AMD and NVIDIA graphics cards. Another issue lies in NVIDIA's requirements: they require FreeSync to be enabled by default on G-SYNC Compatible monitors. Since most monitors require you to enable FreeSync in the monitor's OSD, they are automatically disqualified, even if they had no issues.

Note that this driver technically implements the VESA Adaptive Sync standard, not FreeSync. FreeSync is AMD's proprietary implementation of the VESA Adaptive Sync standard. Since both use the same standard, this allows NVIDIA cards to work with FreeSync monitors. In this article, we use FreeSync instead of Adaptive Sync for simplicity's sake.

Test results
 