Nvidia G-SYNC

I was reading reviews of the Eizo Foris VA 120 Hz monitor with its non-LightBoost-branded scanning-backlight tech, and immediately thought of G-Sync. I hope that Nvidia pushes this tech on some of the non-TN companies. Since you can now get a 23-24" IPS or IPS-like monitor for close to $200, I hope someone can make a scanning-backlight, G-Sync, non-TN 120 Hz+ monitor in 2014.
I'd prefer VA; IPS glow is annoying.
 
Sooo, is it possible to purchase any G-Sync monitors at retail/etail now? I'd really like that.
 
THine's V-by-One HS Brings NVIDIA G-SYNC Technology to 4K Displays.

THine Electronics, Inc., the global leader in high-speed serial interface technology and provider of mixed-signal large-scale integration semiconductors, today announced that its high-speed interface technology, V-by-One HS, will support NVIDIA G-SYNC Technology which synchronizes display refresh rates to the GPU, eliminating screen tearing and minimizing display stutter and input lag.

V-by-One HS Technology supports higher data transmission rates inside digital equipment like monitors while reducing the number of cables and connectors. The enhanced performance and lower cost of V-by-One HS have made it the internal interconnect of choice for major 4Kx2K televisions and PC monitors.

"THine's V-by-One HS technology has penetrated global display markets such as 4Kx2K televisions," said Mr. Kazutaka Nogami, President and CEO of THine Electronics. "We look forward to collaborating with NVIDIA and display OEMs to bring to market new high-resolution displays featuring THine's V-by-One HS and NVIDIA G-SYNC technology."
http://www.guru3d.com/news_story/th..._nvidia_g_sync_technology_to_4k_displays.html
 
I don't understand what this thing does or why it's needed :LOL:, and/or that's just the confusion of discovering a "global leader" completely unknown to the masses, with product and company names using creative capitalization and hyphenation.

But thanks, dear behind-the-scenes company.
With good and reasonably universal support, perhaps we'll see such stuff as *Sync working over DisplayPort over 60 GHz or 5.5 GHz WiFi.
 

It's funny because I was reading threads about this on other forums, and the whole time I was like: what are we even talking about here? What is this product? Cables? An interconnect like DP?

Looks like a new company (but "the global leader", lol) trying to get some attention.
 
So it's definitely a new company trying to enter the market.

I'm not saying their products are bad; they could be really excellent. I was just like: what is this, what is it for, and why have I never heard of them?
 
According to some of the info, they were offering products back in 2008, so they've been around for six years, possibly longer. So while not an "ancient" company, it will have its 7th birthday shortly! :LOL:

Googling "V-by-One" turns up a lot of hits, and after reading a few, they seem to be one of those companies out of the "limelight" whose products you might be using every day.

I think the "V-by-One" standard itself may be relatively new ...
 
It's simply an alternative to LVDS; I doubt their ICs had any trouble with dynamic changes of the clock speed before this, though.
 
V-Sync vs. G-Sync on the Asus ROG Swift PG278Q; leaked Nvidia driver offers a taste of mobile G-Sync


The difference between G-Sync and V-Sync is captured in the video below, albeit imperfectly — without a high-speed camera it's difficult to perfectly differentiate the two. Nonetheless, the V-Sync video on the left has a subtle jerkiness to it that the G-Sync video on the right lacks.

We confirmed that the pendulum demo ran perfectly on our Asus notebook, then fired up Skyrim to check a shipping title. Here’s what the frame times look like for Skyrim when G-Sync vs. V-Sync is used — bearing in mind, obviously, that this driver is still in alpha.

[Frame-time chart: Elder Scrolls Skyrim — V-Sync enabled only]

[Frame-time chart: Elder Scrolls Skyrim — G-Sync enabled]

The frame rate still jumps in a handful of places, but delivery and timing are much tighter with the G-Sync solution. That’s the advantage of the technology, and the idea of seeing it in enthusiast hardware is downright exciting. G-Sync just feels better in the majority of cases, especially if you only have a 60 Hz refresh rate on your LCD.
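
A rough way to see why the G-Sync frame-time plot is tighter: with fixed-refresh V-Sync, a finished frame has to wait for the next 60 Hz scanout boundary, so a steady 40 fps render pace shows up on screen as alternating ~16.7 ms and ~33.3 ms frames, while variable refresh just scans out each frame when it's ready. A minimal Python sketch, with illustrative numbers rather than measured data:

import math

SCANOUT = 1000.0 / 60.0  # fixed 60 Hz refresh interval, in ms

def present_times(render_times_ms, vsync=True):
    # Return on-screen presentation times for a list of per-frame render times.
    t, out = 0.0, []
    for r in render_times_ms:
        t += r  # frame finishes rendering at time t
        # V-Sync: wait for the next scanout boundary; G-Sync: show immediately
        out.append(math.ceil(t / SCANOUT) * SCANOUT if vsync else t)
    return out

frames = [25.0] * 6  # a steady 40 fps render pace
for label, flag in (("V-Sync", True), ("G-Sync", False)):
    times = present_times(frames, vsync=flag)
    print(label, [round(b - a, 1) for a, b in zip(times, times[1:])])
    # V-Sync alternates 16.7 / 33.3 ms; G-Sync delivers an even 25.0 ms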

Nvidia has declined to say exactly how they enable G-Sync on desktop hardware or precisely what the mobile solution looks like, which has fueled speculation that their solution is substantially identical to the Adaptive-Sync standard now baked into DisplayPort 1.2a.

Laptop users who want to experiment with the driver can find the link in the article below; the G-Sync option will appear in the Nvidia Control Panel.
http://www.extremetech.com/extreme/198603-leaked-nvidia-driver-offers-taste-of-mobile-g-sync
 
Nvidia's G-Sync goes mobile, adds features

The mobile version of G-Sync is distinct from the desktop version because laptops typically allow the GPU to connect to the display's control logic directly, without a display scaler chip standing in the way. As a result, Nvidia has managed to implement the mobile version of G-Sync without the custom module used in desktop displays. Instead, the GPU directly controls the display's behavior.

At least four different laptop makers will introduce G-Sync-capable laptops this week at the Computex trade show in Taipei, including Gigabyte, MSI, Asus, and Clevo. Gigabyte's offerings even include SLI multi-GPU mojo.

Since the G-Sync module isn't present, the GPU and driver software will combine to make sure these features work correctly. In fact, a GPU shader program will assist with LCD overdrive compensation, which Petersen told us is a "small amount of work for the GPU." Nonetheless, he claimed that the initial tuning of each LCD panel for the proper overdrive behavior in a variable-refresh setting is "a non-trivial effort," one for which Nvidia will continue to assume responsibility under the G-Sync banner.
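
For context, "overdrive" means driving a pixel past its target value for a frame so the slow liquid-crystal transition settles sooner; panels use tuned lookup tables indexed by (previous value, target value), and with variable refresh the right amount also depends on how long each frame stays up. A toy Python illustration of the idea only, not Nvidia's implementation; the gain constant is a made-up stand-in for a tuned per-panel table:

def overdrive(prev, target, gain=0.35):
    # Push the driven value past the target in proportion to the transition
    # size, then clamp to the panel's 8-bit range. Real panels use a tuned
    # 2D LUT -- that per-panel tuning is the "non-trivial effort" above.
    return max(0, min(255, round(target + gain * (target - prev))))

print(overdrive(32, 200))   # big rise: driven past 200 (clamps at 255 here)
print(overdrive(200, 180))  # small fall: driven slightly below 180 -> 173
print(overdrive(128, 128))  # no transition: unchanged -> 128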

In addition to expanding to laptops, G-Sync will be coming to a wider variety of desktop displays soon, many of which are likely to be announced or at least shown in some form this week at Computex. Nvidia provided us with a list of upcoming monitors from just two manufacturers, Asus and Acer.

Nvidia is also taking a page from AMD's FreeSync by adding the ability to disable variable refresh synchronization (vsync) when the frame rate from the graphics card ventures beyond the range of refresh intervals supported by the display. This option is available in Nvidia's latest 352.90 drivers, which we used in our GeForce GTX 980 Ti review.

Unlike AMD, though, Nvidia will not let go of synchronization when frame rates drop below the monitor's tolerance. Instead, Nvidia's implementation will only allow tearing when the frame rate exceeds the speed of the display. Doing so makes sense, I think, given the collision-avoidance logic Nvidia has built for low-refresh scenarios; it tends to handle that situation pretty well. Allowing tearing at really high frame rates should make games more responsive by letting the game loop execute as quickly as possible. Folks playing twitch shooters should appreciate this option.
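
Put together, the policy described above reduces to a simple per-frame decision. A sketch in Python; all names here are my own invention, and the frame-repeating branch is a stand-in for Nvidia's low-refresh collision-avoidance logic:

def refresh_strategy(fps, min_hz=30, max_hz=144, allow_tearing=True):
    # Sketch of the behavior described above (names are mine, not Nvidia's).
    if fps > max_hz:
        # Above the panel's range: optionally tear so the game loop runs flat out.
        return "vsync off (tearing)" if allow_tearing else f"cap at {max_hz} Hz"
    if fps >= min_hz:
        return "variable refresh: scan out each frame as it completes"
    # Below the panel's range: stay synchronized by repeating frames so the
    # effective refresh rate lands back inside the supported range.
    mult = -(-min_hz // max(fps, 1))  # ceiling division
    return f"repeat each frame x{mult} (effective {fps * mult} Hz)"

for fps in (200, 90, 20):
    print(fps, "fps ->", refresh_strategy(fps))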

One of the better kept secrets of G-Sync displays is the presence of an ultra-low motion blur (ULMB) mode on some monitors. This mode sacrifices variable refresh, but it promises greater clarity through the use of backlight strobing. I looked at it right here in my review of the Asus PG278Q, if you're curious. Some folks really appreciate the benefits of this mode, and Nvidia has decided to raise its profile by including it as an option in its driver control panel alongside G-Sync variable refresh.

http://techreport.com/review/28361/nvidia-g-sync-goes-mobile-adds-features
 
Yay windowed mode, about time! Now just do it properly (i.e. in the OS, not the driver) and standardize support across GPU/monitor brands and I'm totally there :)
 
NVIDIA HDMI GSYNC

So here's the thing: Nvidia won't really talk about it, but there has been a silent update to the G-Sync module. There is now a new revision, and guess what support has been added? Yes, it is HDMI G-Sync capable. It is, however, limited to HDMI 1.4, and that does put a few restrictions on Hz and resolution. Nvidia hasn't been keen on G-Sync over HDMI, but ASUS, for example, has been requesting this for a long time, and from what I heard, they have been pushing hard to get it supported over HDMI.

Now I lost track of which monitor it was that was HDMI G-Sync enabled, but browsing through the 300+ photos I took, I believe it was this model, the new 34" curved G-Sync monitor shown below. It is a ROG-branded 75 Hz prototype IPS panel at 3440x1440 resolution and does not show a model number just yet.
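
The HDMI 1.4 restriction comes down to its roughly 340 MHz TMDS pixel-clock ceiling. A quick back-of-the-envelope check in Python; the blanking figures are approximate CVT reduced-blanking values, so treat the exact totals as estimates:

HDMI14_MAX_CLOCK = 340e6  # approximate TMDS pixel-clock ceiling for HDMI 1.4, Hz

def fits_hdmi14(h_active, v_active, hz, h_blank=160, v_blank=41):
    # Required pixel clock = total pixels per frame (active + blanking) * refresh.
    # h_blank/v_blank are ballpark CVT reduced-blanking values, not exact.
    clock = (h_active + h_blank) * (v_active + v_blank) * hz
    return clock, clock <= HDMI14_MAX_CLOCK

for hz in (60, 75):
    clock, ok = fits_hdmi14(3440, 1440, hz)
    print(f"3440x1440@{hz}: ~{clock / 1e6:.0f} MHz ->",
          "fits HDMI 1.4" if ok else "exceeds HDMI 1.4")

By this estimate, 3440x1440 fits at 60 Hz (~320 MHz) but not at 75 Hz (~400 MHz), which is the kind of Hz/resolution restriction being described.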

http://www.guru3d.com/news-story/co...-amd-freesync-and-nvidia-gsync-over-hdmi.html
 
Yay windowed mode, about time! Now just do it properly (i.e. in the OS, not the driver) and standardize support across GPU/monitor brands and I'm totally there :)
Eh, not sure I would want variable vsync in windowed mode, as that would fuck with my mouse pointer. While you wouldn't necessarily have a mouse pointer in some games (i.e. shooters), I never play anything like that in windowed mode, as it has tended to screw with brightness/gamma settings, making it hard to see things in-game (and the window borders are disruptive to immersion and steal viewport area). So I tend to have a mouse pointer onscreen in windowed games; MMOs, RTSes, Diablo-like hack-and-slash games and so on work well windowed, and they all have a mouse pointer.

So...color me sceptical on this one. :)

Anyway, good on Nvidia for this one: instead of charging 150 dollars for a not-really-necessary FPGA/DRAM module, now they'll be able to charge 150 dollars for nothing. Profit$$$!
 
as that would fuck with my mouse pointer
That's another reason why the OS needs to be involved. Note that the OS does already play with refresh rate for some media applications, so this isn't unheard of.

(and the window borders are disruptive to immersion and steal viewport area). So I tend to have a mouse pointer onscreen in windowed games; MMOs, RTSes, Diablo-like hack-and-slash games and so on work well windowed, and they all have a mouse pointer.
It's not so much about actually running in a window as it is that exclusive full-screen mode is dying/dead. Everything ultimately needs to run through the OS compositor/swap-chain path, so the OS needs to understand and control stuff like variable vsync. Anything NVIDIA does in their driver (which I'm sure is very clever) is fundamentally done in the wrong place.
 
Hello,
I can't start a new thread, so I'm posting this here.

Not everyone may follow this, but anyone familiar with using an in-game fps limiter will understand.

First, sorry for my poor English; it's my second language. Please read my whole post, and I hope you will understand what I'm asking. I'm an SLI user and a G-Sync @ 144 Hz user: I have Titan X SLI and the ROG Swift monitor. I get high fps in games, but with SLI I still see some stutter/micro-stutter in many games, even in the 90+ fps range. A single card is generally butter smooth. I discovered how awesome RTSS is as an fps limiter. For example, in GTA5 with my settings, in town I'm in the 90 fps range but still see some stutter; capping at 80 fps helps a lot. In fact, with SLI I discovered that if you set an fps cap and the fps never goes below it, SLI feels smooth. So I run locked at 80 fps, and it's smooth. But when I go outside the city, fps drops below 80, and even at 70 fps I start to see stutter/micro-stutter again. If I lower the cap to 65 fps, everything is smooth again. But in the worst case in the game, fps can go as low as 55. Then again, if I set the cap to 50, it's smooth again. So to have a 100% smooth experience, I would have to cap at 50 fps, and that is a bit low when I can get 90+ fps elsewhere in the game.

So what I'm asking the programmers: is there a way to make a dynamic fps limiter?
It would go down as fps goes down, and go back up only when there is enough headroom that it won't fall right back down. I don't know if you see what I mean? A bit like Nvidia's "smooth vsync" option, in the sense that when it drops to 30 Hz, it won't go back up to 60 Hz until fps is high enough not to fall back to 30 Hz too soon. It would be fantastic if someone could do that. It's the only way to get a really smooth experience with G-Sync + SLI + 144 Hz. I'm very sensitive to any form of stutter, and I can't live without RTSS now, but it would be so great if a dynamic fps limiter were possible.

I was thinking: why not limit fps as a function of GPU 1 usage? I know you can sometimes be below 99% usage for other reasons (e.g. a CPU-bound situation), but consider this: if the limiter tries to maintain, say, 92% GPU 1 usage, you should end up a few fps below what you can achieve. So if I'm playing GTA5 at 90 fps at 99% usage, a dynamic limiter targeting 92% usage would cap at maybe 80 fps. If I move out of town and fps starts to drop, GPU usage climbs, so the limiter, trying to keep usage at 92%, lowers the cap; if I dropped to 70 fps, it would now cap at, say, 65 fps. Maybe that could be a way to build a dynamic fps limiter? There could be a floor of maybe 30 fps even if GPU usage is at 99%, and a ceiling of maybe 144 fps for when GPU usage is very low (e.g. in a video cutscene). I really think this is something G-Sync + SLI + 144 Hz needs. Of course fps will be a little lower, but with G-Sync, high fps doesn't matter if the frame pacing is bad. I hope someone can program this. It's probably not an easy task, but if anyone thinks they can do it, please let me know.
Thank you very much.
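
What's being asked for here is essentially a feedback controller on the frame cap: drop the cap quickly when GPU 1 nears saturation, raise it slowly and only with clear headroom (the "smooth vsync"-style hysteresis). A minimal Python sketch of the idea; read_gpu_usage() and set_fps_cap() are hypothetical hooks standing in for whatever RTSS/NVAPI plumbing a real tool would use:

import time

TARGET_UTIL = 92          # hold GPU 1 just below saturation, per the post
MIN_CAP, MAX_CAP = 30, 144
RAISE_HEADROOM = 8        # only raise the cap with clear headroom (hysteresis)

def dynamic_limiter(read_gpu_usage, set_fps_cap, cap=80):
    # Feedback loop: nudge the fps cap so GPU utilization hovers near
    # TARGET_UTIL. Both callbacks are hypothetical stand-ins, not a real API.
    while True:
        util = read_gpu_usage()          # percent, GPU 1 in an SLI setup
        if util > TARGET_UTIL:
            cap = max(MIN_CAP, cap - 5)  # fast down: kill stutter quickly
        elif util < TARGET_UTIL - RAISE_HEADROOM:
            cap = min(MAX_CAP, cap + 1)  # slow up: avoid bouncing right back
        set_fps_cap(cap)
        time.sleep(0.5)                  # re-evaluate a couple of times a second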
 