Nvidia G-SYNC

Well you have to admit that G-Sync is actually a pretty good name to describe the technology :D

Anyway, at a high level, the fundamental idea behind NVIDIA's G-Sync is pretty simple and straightforward (i.e., instead of having the monitor drive timing to the GPU, the GPU drives timing to the monitor). This basic idea has probably been thought about and expressed for many years (NVIDIA claims to have been working on G-Sync technology for several years now). The difficulty has been in actually implementing it. What spiked_mistborn suggested was to use a framebuffer in the monitor, but he never actually explained how to do that conveniently.
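To make the contrast concrete, here is a toy simulation of the two timing models. It's only a sketch with made-up frame times; the 60 Hz refresh and the ~144 Hz upper cap are my own assumptions, not anything from NVIDIA's materials:

```python
# Toy model: with a fixed-refresh monitor a finished frame waits for the
# next scan-out tick; with GPU-driven timing the scan-out starts when the
# frame is ready (subject to the panel's maximum refresh rate).
import math

def display_times_fixed(render_times_ms, refresh_ms=1000 / 60):
    """Frame becomes visible only at the next fixed refresh tick."""
    shown, t = [], 0.0
    for r in render_times_ms:
        t += r                                    # frame finishes at time t
        shown.append(math.ceil(t / refresh_ms) * refresh_ms)
    return shown

def display_times_variable(render_times_ms, min_interval_ms=1000 / 144):
    """Frame becomes visible as soon as it is ready (capped at ~144 Hz)."""
    shown, t, last = [], 0.0, float("-inf")
    for r in render_times_ms:
        t += r
        last = max(t, last + min_interval_ms)
        shown.append(last)
    return shown

frames = [14.0, 18.0, 15.0, 21.0]                 # per-frame render times in ms
print(display_times_fixed(frames))                # quantized to 16.67 ms steps
print(display_times_variable(frames))             # follows the render cadence
```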

Edit: the name "G-Sync" is actually not new at all. NVIDIA has been using a Quadro graphics technology called "G-Sync" for some years now. This newer G-Sync monitor technology is obviously quite different, and it appears that NVIDIA holds a trademark on the name.
 
The only funny thing is that he proposed the G-Sync name. Otherwise, such discussions (detailing exactly how it would work, with transfers occurring only when a new frame has been rendered) can be found from several years back. The idea is kind of obvious and actually quite old.
 
In "Psychological Types" of Jung, there is an interesting pass that I still remember and love:
it was the difference between Schopenhauer's theory of the world as 'illusion' and the speech of a mad guys recorded years before him, saying that the world does not exist etc.

...his answer was: the former made a full blown system about it, the latter was just chatting without any real work on.

(btw, this is why I *hate* software patents, as Abrash commented in DDJ about ~20 years ago...)
 
It's worth noting, if we're talking about Lightboost, that it's an either/or thing: G-Sync or Lightboost. The monitor supports both, but not at the same time, since Lightboost relies on a regular refresh, at least if you want to implement it easily.

In fact you can use a G-Sync monitor on an AMD card; the G-Sync feature is wasted, but Lightboost is doable.

Improvements to Lightboost are promised, but I suppose the focus is first on brightness, color, gamma accuracy, and maybe 144 Hz operation.
 
It should be relatively simple to implement a continuous blend from backlight strobing (at high frame rates) to an almost continuous backlight (at low frame rates). That would mean the typical LCD hold-type blur just grows continuously towards lower frame rates.

edit:
It would work like this:
(i) backlight off, pixel matrix is getting new values
(ii) backlight strobe
(iii) as long as there is no new frame, turn on backlight with the average brightness

That way one wouldn't need to know beforehand how long a frame will be displayed; it can be extended indefinitely (as long as the pixels can hold their value) because the backlight is not working with a predetermined duty cycle.
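To spell out my reading of that policy, here's a small toy sketch. The strobe length and hold level are made-up numbers, and the brief backlight-off period during the pixel update is left out for brevity:

```python
# Strobe-then-hold backlight: a short bright strobe right after the pixel
# matrix has been updated, then a low constant level until the next frame
# arrives, whenever that is. No duty cycle has to be chosen in advance.

STROBE_MS  = 1.5    # assumed strobe duration
STROBE_LVL = 1.0    # backlight level during the strobe
HOLD_LVL   = 0.1    # assumed "average brightness" hold level

def backlight_level(ms_since_frame):
    """Backlight level some milliseconds after the last frame arrived."""
    if ms_since_frame < STROBE_MS:
        return STROBE_LVL        # (ii) strobe
    return HOLD_LVL              # (iii) hold indefinitely until the next frame

for t in (0.5, 3.0, 20.0, 80.0):  # short frames stay strobed, long ones blur
    print(f"{t:5.1f} ms after last frame: backlight = {backlight_level(t)}")
```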
 
Nvidia demoed this at our uni a few weeks ago and it looked pretty sweet in motion. Variable frame rates still cause stutter though ;)

What we really need is to fix this problem in software. Game engines should try to predict the scene complexity of the next frame and adjust effects accordingly, similar to how Netflix adapts its quality, which works far better than YouTube any day.
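The closest thing engines actually do today is reactive rather than predictive: scale the render resolution (or effect quality) based on the last few frame times. Here's a rough sketch with made-up numbers, not any particular engine's scheme:

```python
# Feedback loop: nudge a resolution scale so the frame time converges on
# the 16.7 ms budget instead of trying to predict the next frame exactly.

TARGET_MS = 1000 / 60            # frame-time budget for 60 Hz

def update_scale(last_frame_ms, scale):
    """Gently correct the render scale based on the last frame's cost."""
    error = last_frame_ms / TARGET_MS      # > 1.0 means we were too slow
    scale /= error ** 0.25                 # small exponent to avoid oscillation
    return min(1.0, max(0.5, scale))       # clamp to a sane range

scale = 1.0
for ms in (15.0, 19.0, 22.0, 18.0, 16.0):  # simulated recent frame times
    scale = update_scale(ms, scale)
    print(f"frame took {ms:.1f} ms -> next render scale {scale:.2f}")
```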
 

If the framerate varies wildly, sure. But a drop from 60 Hz to 55 won't cause all the weird problems it causes now.

As for predicting the complexity of the next frame, that must be near impossible.
 
There is a huge difference between "exactly predicting what the next frame will look like" and "getting an idea of what the next frame might look like". Sometimes even simple extrapolation of motion vectors would give you an idea of what you might be looking at.

Even for Source engine games the frame rate is all over the place, from 50 to 300.
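For what it's worth, here is the kind of crude extrapolation I mean; the object list and the "complexity = covered screen area" proxy are purely my own illustration:

```python
# Reuse last frame's screen-space velocities to guess where objects will be
# next frame, and use their total covered area as a very rough load proxy.

def predict_positions(objects, dt):
    """objects: list of (x, y, vx, vy, area) from the last frame, in screen units."""
    return [(x + vx * dt, y + vy * dt, area) for (x, y, vx, vy, area) in objects]

def predicted_load(objects, dt):
    """Crude complexity estimate: total area still on screen after extrapolation."""
    moved = predict_positions(objects, dt)
    return sum(a for (x, y, a) in moved if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0)

last_frame = [(0.20, 0.50, 0.9, 0.0, 0.10),   # slow object, 10% of the screen
              (0.99, 0.50, 2.0, 0.0, 0.25)]   # fast object about to exit the screen
print(predicted_load(last_frame, dt=1 / 60))  # -> 0.1: the big object drops out
```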
 
I agree that this doesn't excuse developers from choosing techniques and optimizing engines for smooth frame generation, but it does solve the problem that there's a cliff at 16.6ms and if you miss it - even by a single microsecond - it looks like trash. That visual discontinuity needs to go away and I'm glad NVIDIA has done the legwork to make that happen.
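The arithmetic behind that cliff, assuming plain double-buffered v-sync on a 60 Hz panel:

```python
# With v-sync a frame is displayed for a whole number of 16.67 ms refresh
# intervals, so missing the deadline by any amount doubles its on-screen
# time; with a variable refresh it is shown for roughly its render time.
import math

REFRESH_MS = 1000 / 60

def vsync_display_ms(render_ms):
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (16.6, 16.7, 18.2, 33.4):
    print(f"render {render_ms:5.1f} ms -> v-sync shows it for "
          f"{vsync_display_ms(render_ms):5.1f} ms, variable refresh ~{render_ms} ms")
```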
 
Guru3D

Nvidia's G-SYNC is likely going to be exclusive to ASUS until late 2014. G-Sync is both a software and a hardware solution that will solve screen tearing and stuttering. A hardware daughterboard is placed into a G-Sync enabled monitor, where it does something very interesting. It is now reported that ASUS has signed an exclusive deal with Nvidia, something I would not be happy about.
In the first wave G-Sync will be introduced in monitors with a 120 to 144 Hz refresh rate. The first screen was already announced by ASUS, which is the VG248QE with a price tag of roughly 300 EUR (which might go upwards now).
This means that other manufacturers will have to wait until Q3/Q4 of 2014 before they can integrate it into their monitors. In the end, if this information is right, it will be a bad deal for consumers, as exclusivity drives prices upwards, which is exactly what you do not want to happen with new technology. If true, this is a very poor decision, as it will irritate the other manufacturers like BenQ, Philips and ViewSonic, and on top of that the exclusivity will hurt sales for sure, as people want diversity and not the option of just one monitor brand. So let's hope this was just a false rumor. More info on G-Sync can be found here.
G-Sync has been designed in good part in collaboration with ASUS... I can imagine they want some profit from it.
 
I was planning to upgrade to a 2560x1440 27" monitor but don't know if that's wise at this point. From all accounts g-sync really makes a difference. Asus exclusivity sucks and it's not clear whether the likes of Dell will even bother. Maybe I'll just go for the 2713hm anyway and upgrade again if gsync gains traction.
 
I'd say go for it now. G-Sync's main impact is to unsettle the public enough to force a common standard to emerge.
I guess in around 1.5-2 years' time you'll see monitors supporting a good standardized protocol, and quite possibly not G-Sync anymore.
 
You say go for it now, and you back that up with a statement saying it's better to wait.
Your thought process is very strange...
 
It only depends on how long you want to wait.
Something better is always around the corner. If he wants a new monitor now, then there's no reason to wait for a G-Sync enabled one, since this seems to be a transitional step at a premium price and not a good long-term investment.

That better?
 
The first gsync monitors will probably use TN panels to achieve the high refresh rates.

I'll stick to my old 24" until all the dust surrounding stereo 3D, OLED, gsync, etc. etc. settles a bit.
 
I've decided to purchase a G-sync monitor as soon as they are available. I am very sensitive to latency and tearing; this will transform my gaming experience.
 