Nvidia G-SYNC

Yeah, I'd like it to be a standard. I'd like it to have been a standard years ago.

But it's not about high framerates, it's about fractional, arbitrary ones. It shouldn't matter whether the game can output 30, 60 or 54 fps: it sends the finished frame to the monitor and the monitor takes care of displaying it as fast as possible. It wouldn't have to be a fixed raster of n fps, just a minimum display latency that the monitor requires. The frame would be displayed shortly after the data is received, and then there can't be a new frame for e.g. 1/60 of a second - but e.g. 58 fps would be possible, with even or irregular spacing.
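A minimal sketch of that idea in code (hypothetical function names, not the actual G-Sync protocol): present each finished frame as soon as it is ready, respecting only the panel's minimum frame-to-frame interval instead of a fixed refresh raster.

```cpp
// Frame-pacing sketch (hypothetical names, not the actual G-Sync protocol).
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

struct Frame { int id; };

Frame render_next_frame(int id) {
    // Simulate a variable render time somewhere between ~10 and ~25 ms.
    std::this_thread::sleep_for(std::chrono::milliseconds(10 + (id * 7) % 16));
    return Frame{id};
}

void present_to_display(const Frame& f) {
    std::printf("presented frame %d\n", f.id);   // the panel scans it out right away
}

int main() {
    // The monitor's only constraint: no new frame for e.g. 1/60 of a second.
    const auto min_interval = std::chrono::microseconds(16667);
    auto last_present = Clock::now() - min_interval;
    for (int i = 0; i < 20; ++i) {
        Frame f = render_next_frame(i);              // 30, 54 or 60 fps, it doesn't matter
        std::this_thread::sleep_until(last_present + min_interval);
        present_to_display(f);                       // shown shortly after the data arrives
        last_present = Clock::now();
    }
}
```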
 
You're going to need a new monitor regardless of whether it's g-sync or something else.
How about I (and 100 million other PC gamers) just continue using the monitors we have right now, using the solutions we have had for the last 25 years?

...So, no, I won't. :)
 
Sounds awesome, but I'm out since I've just dropped £400 on a 3D nvidia monitor which was supposed to be pretty high end. If they release the mod for my monitor then I may go for it, but there's no way I'm buying another monitor. Hell, this monitor's the only thing locking me into NV at the moment.
 
This sounds like phenomenal tech, buried under NVIDIA proprietary nonsense, and for that reason it may not gain the traction it is due for quite some time. But make no mistake, this is the future of display technology.

What's up with the misinformation in this thread about Gsync only helping at high framerates? That's the complete opposite of the truth - the Antitruth! :LOL:

Being a Kepler owner, I for one will make sure my next monitor is gsync enabled.
 
A feature that will only work with a few select monitors, probably overpriced ones, and that people will only notice if the game is running at well over 60 FPS.
And getting ~100 FPS in the latest games requires hardware that is way too expensive for mere mortals... unless they start lowering IQ to get such framerates.

This is definitely not for me. I'm okay with 50-60FPS + VSync. In fact, I would give more importance to a monitor with a good panel with great contrast, brightness, color reproduction, etc. than something that is just very fast.
Actually, this should be perfect for you.
The tech doesn't mean the monitor has to go over 60 Hz; it's about properly displaying all those awkward moments when the frametime doesn't fit neatly within a refresh interval (16.6 ms).
 
Come again? You need over 144fps for the physics to work properly?!
The physics in games have independent refresh rates.. What does one thing have to do with another?
Quake 3 needed 125 fps; 333 fps was actually considered cheating, because it messed up the physics calculations and let you jump higher and further.

CoD had something similar and there were probably other games affected by the same thing.
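A toy illustration of why per-frame physics integration becomes framerate-dependent: a jump integrated with semi-implicit Euler reaches a different apex depending on the timestep. This is not Quake 3's actual movement code (the real quirk reportedly involved per-frame rounding of the velocity), and the gravity/velocity numbers below are made up, but it's the same class of problem.

```cpp
// Jump apex depends on the integration timestep (made-up numbers, illustrative only).
#include <cstdio>

double jump_apex(double fps) {
    const double dt = 1.0 / fps;
    const double g  = 800.0;      // made-up gravity, units/s^2
    double v = 270.0;             // made-up initial jump velocity, units/s
    double y = 0.0, apex = 0.0;
    while (y >= 0.0) {
        v -= g * dt;              // velocity first, then position: semi-implicit Euler
        y += v * dt;
        if (y > apex) apex = y;
    }
    return apex;
}

int main() {
    for (double fps : {60.0, 125.0, 333.0})
        std::printf("%3.0f fps -> jump apex %.2f units\n", fps, jump_apex(fps));
}
```

Run it and the apex climbs as the framerate goes up, which is the same direction as the Quake 3 behaviour described above.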

Adaptive V-Sync helps with only a specific problem. G-Sync is on another level completely.

Also, the GPU implementation shouldn't be a problem. This is definitely not something vendor-specific because of a different architecture or anything. I don't see why they would mind if Sony or MS implemented this. It doesn't magically jump onto AMD cards on the PC.
 
The significance of g-sync is removing differential clock syncing problems. While this is "neat" for a typical PC gaming setup, the real benefit is in VR/AR with head-mounted displays, where very high framerates are more common/required. I'd hope that you could, say, use your game engine as a clock source, which in turn could be derived from a sampled source (e.g. a MEMS clock, or sampling off of a CMOS sensor).
 
Why not just transfer and flip a frame whenever it is finished rendering...? (And not transfer anything while you're still drawing the next frame.)
 
How about I (and 100 million other PC gamers) just continue using the monitors we have right now, using the solutions we have had for the last 25 years?

...So, no, I won't. :)

Lol, what? Obviously if you don't want the functionality you don't have to buy it. Isn't that the case for everything ever sold? :D

Btw, I hope this isn't what Mark Rein thought was "the most amazing thing" made by nVidia. If so the man is easily impressed...
 
That's what he meant, yes. It appears to be one of those things that you really need to witness for yourself to truly appreciate, though. I think this could be a winner for Nvidia, and AMD will be forced to do something similar.
 
Come again? You need over 144fps for the physics to work properly?!
The physics in games have independent refresh rates.. What does one thing have to do with another?

It may be a reference to Quake 3, which has ever so slightly different movement behavior when run at certain framerates, like a constant 125 fps or 333 fps. Yes, the engine lets you jump slightly farther. That's just a Quake 3 thing though.

With that kind of monitor you can have the game run capped at 125 fps and the monitor will effectively sync to that; doing the same today would require creating a custom resolution (easy on a CRT, on a 120/144 Hz LCD I don't know). Other games have different caps: the Half-Life 1 engine went up to 100 fps, some are capped at 60 fps (Doom 3) and some even at 30 fps. And in a lot of games you can even choose your own cap in the console.

Possibly the GeForce driver itself can set a cap - what's announced is a 144 Hz maximum already.

I wonder about Linux support: will Nvidia patch Xorg and/or Wayland and enable the feature in their driver? Can the feature work even when just rendering the desktop (such as a Wayland or Windows 8 one)?
 
Why not just transfer and flip a frame whenever it is finished rendering...? (And not transfer anything while you're still drawing the next frame.)

Isn't that what this tech is about?
Sounds like it's doing just that, and the novelty is that the monitor waits for the frame too, instead of drawing a torn frame.

It would do wonders for 60Hz monitors too, or 72Hz (there are a few 72Hz IPS monitors, well at least one)

Having tried VSync on an 85 Hz monitor (a CRT), I think that yes, "stuttering" describes what I felt, with sharp jumps between 42.5 fps and 85 fps just from standing in place and spinning by pressing the left/right arrows on the keyboard. Triple buffering just gives you more steps.
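A quick back-of-the-envelope for those steps (my own arithmetic, not from the article): with double-buffered VSync every frame waits for the next refresh boundary, so the effective framerate is the refresh rate divided by the number of whole refreshes a frame takes.

```cpp
// VSync stepping on a fixed-refresh panel: 85 Hz snaps to 85, 42.5, 28.3... fps.
#include <cmath>
#include <cstdio>

double vsync_fps(double render_ms, double refresh_hz) {
    const double period_ms     = 1000.0 / refresh_hz;
    const double frames_waited = std::ceil(render_ms / period_ms);
    return refresh_hz / frames_waited;
}

int main() {
    for (double ms : {10.0, 12.0, 15.0, 20.0, 30.0})
        std::printf("%5.1f ms render -> %.1f fps at 85 Hz with VSync\n",
                    ms, vsync_fps(ms, 85.0));
}
```

A variable-refresh panel has no such steps, which is the whole point.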
So, to me high refresh and no vsync is nice (at 60 Hz the tearing is more frequent and uglier), but G-sync would appear to just solve everything.
And yes, if you have a 60 Hz panel and a choice of vsync or no vsync, you just live with it. It's still smoother than when we played 3D games on the Amiga and 386.
 
Btw, I hope this isn't what Mark Rein thought was "the most amazing thing" made by nVidia. If so the man is easily impressed...

Don't judge so quickly something you've never seen. This is a big deal. If anything, that is a pretty intelligent statement by Rein (assuming this is what he was referring to).
 
How about I (and 100 million other PC gamers) just continue using the monitors we have right now, using the solutions we have had for the last 25 years?

...So, no, I won't. :)

Well, the TR people said they were getting new monitors asap. I guess this must be good. With no tearing, no stutter, no lag, I can see this being radically more useful.
 
Don't judge so quickly something you've never seen. This is a big deal. If anything, that is a pretty intelligent statement by Rein (assuming this is what he was referring to).

I'm sure it's very impressive but I can think of half a dozen things that would have far greater impact on my gaming experience.
 
This seems like a software feature (or at least not part of 3D arch)...will move thread.

There is an actual hardware card that goes into the monitor, so it is not a software-only feature.


(Image: gsync-module_575px.png, the G-Sync module board.)
 
This sounds absolutely amazing; I'm honestly curious what the changes are on the monitor level. I casually looked into what it would take to do something like this some time ago, and couldn't figure out how you'd get it to work with existing LCD technology without serious issues.

This is the key sentence from the TR article: "Our engineers have figured out how to drive the monitors so the color continues to be vibrant and beautiful, so that the color and gamma are correct as timing fluctuates." What I'm also unsure about is what happens when the next frame suddenly takes much longer than the previous one. Does the G-Sync module send the previous frame to the monitor again after X milliseconds, where X depends on past framerate and the associated drive adjustments? If so, there would still be a small penalty for spikes, even if much less perceptible.
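A purely speculative sketch of that guess (hypothetical names and numbers, not NVIDIA's actual module logic): if no new frame arrives within some maximum hold time, the module repeats the previous frame, and a frame that finishes just after such a repeat would have to wait out the minimum interval - the small penalty for spikes mentioned above.

```cpp
// Speculative model of a timeout-based re-scan; nothing here is confirmed behaviour.
#include <chrono>
#include <cstdio>
#include <thread>

struct Frame { int id; };

// Hypothetical stand-ins: the link from the GPU and the panel drive electronics.
bool try_receive_frame(Frame*) { return false; }            // no frame arrives in this demo
void scan_out(const Frame& f)  { std::printf("scan out frame %d\n", f.id); }

int main() {
    using namespace std::chrono;
    const auto min_interval = microseconds(6944);   // ~144 Hz upper bound
    const auto max_hold     = milliseconds(33);     // longest the panel can hold one frame
    Frame current{0};
    auto last_scan = steady_clock::now();
    for (int i = 0; i < 2000; ++i) {
        Frame next{};
        if (try_receive_frame(&next) && steady_clock::now() - last_scan >= min_interval) {
            current = next;                           // fresh frame: show it right away
            last_scan = steady_clock::now();
            scan_out(current);
        } else if (steady_clock::now() - last_scan >= max_hold) {
            last_scan = steady_clock::now();          // timeout: repeat the previous frame
            scan_out(current);
        }
        std::this_thread::sleep_for(microseconds(100));
    }
}
```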

Also the power efficiency implications for mobile could be very significant. This is amazing technology and I hope it proliferates as fast and as widely as possible. While I understand why NVIDIA wants to make this proprietary, that's very unfortunate and I really hope it will become a standard sooner rather than later...

I've got a 120 Hz ASUS VG278H, which is very similar to the 24" model ASUS has committed to making a G-Sync revision of. It's a great monitor except for the inherent issues of TN and the colors, which are far from perfect (they could probably be improved by manual calibration), so I'll be very tempted to buy at least two of the upgraded 24" ones when they come out!
 