Nvidia G-SYNC

This sounds absolutely amazing. I'm honestly curious what the changes are at the monitor level; I casually looked into what it would take to do something like this a while ago and couldn't figure out how you'd get it to work with existing LCD technology without serious issues.

This is the key sentence from the TR article: "Our engineers have figured out how to drive the monitors so the color continues to be vibrant and beautiful, so that the color and gamma are correct as timing fluctuates." What I'm also unsure about is what happens when the next frame suddenly takes much longer than the previous one. Does the G-Sync module send the previous frame to the monitor again after X milliseconds, where X depends on the past framerate and the associated drive adjustments? If so, there would still be a small penalty for spikes, even if a much less perceptible one.
I don't think so. It will take some time to draw the repeated frame, and if the next frame completes in the meantime, you either get a tear or added lag. The only way it would work, as far as I can see, is for the last frame to stay on screen as long as the next frame isn't ready on the GPU.
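To make that concrete, here's a toy sketch of the "hold until the next frame is ready" timing in Python. The 144 Hz ceiling and the function are illustrative assumptions on my part, not anything NVIDIA has published:

[CODE]
import random

# Toy simulation of the "hold the last frame until the next one is ready" idea.
# The 144 Hz upper bound is an illustrative assumption, not NVIDIA's spec.

MIN_INTERVAL = 1000 / 144          # ms: fastest the panel can be scanned out

def refresh_times(frame_times_ms):
    """Each refresh fires when the next frame finishes rendering,
    but never sooner than the panel's minimum scan-out interval."""
    t = 0.0
    refreshes = []
    for render_ms in frame_times_ms:
        t += max(render_ms, MIN_INTERVAL)
        refreshes.append(round(t, 1))
    return refreshes

# Frame times fluctuating between ~8 ms (125 fps) and ~25 ms (40 fps):
print(refresh_times([random.uniform(8, 25) for _ in range(5)]))
[/CODE]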

NVIDIA says any Kepler GPU will work, which means they have been working on this for a long time and wanted to launch with a huge install base of compatible GPUs.

Also, the power-efficiency implications for mobile could be very significant. This is amazing technology, and I hope it proliferates as fast and as widely as possible. While I understand why NVIDIA wants to make this proprietary, it's very unfortunate, and I really hope it will become a standard sooner rather than later...
This is like PSR, but done better. I would think this is something others will be able to come up with as well. And for mobile you don't need a separate chip; this can be integrated into the SoC itself.
I've got a 120 Hz ASUS VG278H, which is very similar to the 24" model ASUS has committed to revising with G-Sync. It's a great monitor except for the inherent issues of TN and its colors, which are far from perfect (they could probably be improved with manual calibration), so I'll be very tempted to buy at least two of the upgraded 24" ones when they come out!
 
I don't think so. It will take some time to draw the repeated frame, and if the next frame completes in the meantime, you either get a tear or added lag. The only way it would work, as far as I can see, is for the last frame to stay on screen as long as the next frame isn't ready on the GPU.
The G-Sync module has DRAM, so I assumed the frame would simply stay there and all the necessary processing would be done on that chip as well. In fact, I'm skeptical there's that much happening on Kepler itself, although I'm sure there's the odd thing or two anyway.

This is like PSR, but done better. I would think this is something others will be able to come up with as well. And for mobile you don't need a separate chip; this can be integrated into the SoC itself.
Yup, here's hoping NVIDIA doesn't try to integrate it into Tegra, patent the hell out of it, then sue anyone else trying to do something similar. I'm curious how much prior art there is to this and how much choice there is in the low-level implementation details. But we'll see.
 
They've said it's supported on the GeForce GTX 650 Ti Boost and up, which is really a GTX 660 variant. So it's any Kepler, as long as it has a DisplayPort output.

It turns out the niceties are to be found in Embedded DisplayPort, which is the stuff a modern laptop uses to connect to its own internal display:
http://en.wikipedia.org/wiki/DisplayPort#eDP
First there's version 1.0, from December 2008. Woohoo!
It has advanced power-saving features, including seamless refresh rate switching.

Then version 1.3, published in February 2011, "includes a new Panel Self-Refresh (PSR) feature developed to save system power and further extend battery life in portable PC systems".

What's a "Panel Self-Refresh"? A two-year-old article deals with it at Hardware Secrets and there's even a couple pictures.
http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384/1
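For what it's worth, here's a toy model of the PSR idea as that article describes it: the timing controller keeps its own copy of the frame and refreshes the panel from local memory while the link and GPU idle. The class and names below are made up for illustration, not an actual eDP implementation:

[CODE]
# Toy model of Panel Self-Refresh: when nothing on screen changes, the source
# stops transmitting and the timing controller keeps refreshing the panel from
# its own local frame buffer. Purely illustrative, not an actual eDP implementation.

class PsrPanel:
    def __init__(self):
        self.local_buffer = None     # frame kept in the panel's own RAM
        self.link_active = True

    def submit_frame(self, frame):
        """Source has a new/changed frame: wake the link and update the local copy."""
        self.link_active = True
        self.local_buffer = frame

    def idle(self):
        """Source has nothing new: drop the link, keep refreshing locally."""
        self.link_active = False

    def refresh(self):
        # The panel always has something to scan out, whether the link is up or not.
        return self.local_buffer

panel = PsrPanel()
panel.submit_frame("static desktop image")
panel.idle()                                   # GPU and link can power down
print(panel.refresh(), panel.link_active)      # static desktop image False
[/CODE]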

 
The G-Sync module has DRAM, so I assumed the frame would simply stay there and all the necessary processing would be done on that chip as well. In fact, I'm skeptical there's that much happening on Kepler itself, although I'm sure there's the odd thing or two anyway.
There's not a lot on the GPU, but there's a minor bit.
Anand said:
G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
Anand said:
If the frame rate drops below 30 fps, the display will present duplicates of each frame.
That could be why the DRAM is there.
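If that's right, the module-side logic might look something like the toy sketch below: keep the last frame in the module's DRAM and re-scan it whenever the GPU hasn't delivered a new one within the panel's maximum hold time (~33 ms for a 30 Hz floor). This is purely my guess at the behaviour, not NVIDIA's implementation:

[CODE]
# Toy model of the "duplicate frames below 30 fps" behaviour quoted above.
# The 33.3 ms hold limit and the buffering in module DRAM are my assumptions.

MAX_HOLD_MS = 1000 / 30   # if no new frame arrives within ~33 ms, repeat the old one

def scanouts(frame_times_ms):
    """Return, for each panel refresh, whether it showed a new or a repeated frame."""
    events = []
    for render_ms in frame_times_ms:
        remaining = render_ms
        while remaining > MAX_HOLD_MS:      # GPU is late: refresh from the DRAM copy
            events.append("repeat (from module DRAM)")
            remaining -= MAX_HOLD_MS
        events.append("new frame")
    return events

print(scanouts([16.7, 40.0, 90.0]))   # a 60 fps, a 25 fps and an ~11 fps frame
[/CODE]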

Yup, here's hoping NVIDIA doesn't try to integrate it into Tegra, patent the hell out of it, then sue anyone else trying to do something similar. I'm curious how much prior art there is to this and how much choice there is in the low-level implementation details. But we'll see.

I don't think they'd have much leverage if they tried to patent troll. They are being exceedingly coy about this, so there must be some important secret sauce.
 
As G-Sync technology can enhance the appeal of a wide variety of NVIDIA products, and considering that it is a combined hardware and software solution, I don't think NVIDIA has plans to license this technology in the short term. In fact, Tom Petersen seemed to confirm that in the comments section here:

http://blogs.nvidia.com/blog/2013/10/18/g-sync/

That said, who knows what could happen in the future.

Here are some thoughts on G-Sync from John Carmack, Johan @ DICE, and Tim Sweeney:

http://www.youtube.com/watch?v=YvVnDTrfJ6M

Anand (among many others) seems to think this is a game-changing technology. He has a good writeup of G-Sync here:

http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
 
This looks really cool. The bit that worries me is this:

[URL=http://techreport.com/news/25531/nvidia-g-sync-matches-lcd-refresh-rate-to-gpu-render-rate]The Tech Report[/URL] said:
Update: Asus has pledged to offer a G-Sync-enabled version of its VG248QE monitor in the first half of 2014. The monitor will be priced at $399, which is a fair bit more than the $280 asking price attached to the current model.

If it adds $119 to the cost of a monitor, that's just too much. I really hope this leads to some sort of cheap, ubiquitous standard.
 
I've got a 120 Hz ASUS VG278H, which is very similar to the 24" model ASUS has committed to revising with G-Sync.

I've got the same monitor, but there's no way I'm shelling out to replace it any time soon. I'm just hoping they release the mod card for it, although I realise the chances are slim.

But yeah, this does sound like it could be a game changer.
 
For that price, I'd be just fine with the current "software" solutions. On my 120 Hz panel, adaptive v-sync is all I need during gameplay, and madVR for watching smooth 24 fps movies.
 
If it adds $119 to the cost of a monitor, that's just too much.
Lol wut! If I wasn't religiously anti-drug, I'd say I'd like some of what they're smoking!

You know, I'm not anti-progress; I'm all for newer, better stuff, really, and this is better stuff, except that it's Yet More Proprietary Nvidia Shit, which I'm totally against. After all these years, where is PhysX? There's no killer app, basically zero interest, and more importantly, no ubiquitous support for GPU-accelerated physics on any platform, much less all of them (these days that means just NV, ATI and Intel). NV has successfully held back progress through a process of divide-and-not-conquer. Impressive! Well done! *golfclap*

So... Fuck this shit. Seriously.
 
Technology moves forward in large part by allowing companies to make a profit off it. First as a proprietary technology, only later as a standard component. It has always been this way: Mantle or PhysX or TrueAudio. G-Sync is no different.

I get that this can be frustrating initially, but it's hard to argue that the drive to outsmart a competitor hasn't worked out over time. And if you don't like it, you don't even have to buy it: you can still game the way you always have. How cool is that?
 
Technology moves forward in large part by allowing companies to make a profit off it. First as a proprietary technology, only later as a standard component. It has always been this way: Mantle or PhysX or TrueAudio. G-Sync is no different.

I get that this can be frustrating initially, but it's hard to argue that the drive to outsmart a competitor hasn't worked out over time. And if you don't like it, you don't even have to buy it: you can still game the way you always have. How cool is that?

I think NVIDIA could have made more money by implementing PhysX in OpenCL or Compute Shaders. A lot more games would have used it, and since NVIDIA would have been in control and able to fine-tune it for their own architecture, it would have favored them in benchmarks, hence the competitive advantage.
 
If it adds $119 to the cost of a monitor, that's just too much. I really hope this leads to some sort of cheap, ubiquitous standard.

At $175 or $119, this is going to be niche. However, early adopters always pay for the privilege of being first. If this takes off as a niche product, I'm sure Nvidia and the rest of the industry would be motivated to broaden the market...
 
I think NVIDIA could have made more money by implementing PhysX in OpenCL or Compute Shaders. A lot more games would have used it, and since NVIDIA would have been in control and able to fine-tune it for their own architecture, it would have favored them in benchmarks, hence the competitive advantage.

OpenCL and Compute Shaders didn't yet exist when GPU PhysX was born. Even today, OpenCL still isn't ready for prime time.

Even if OpenCL were a viable option and a lot more games had adopted GPU-accelerated effects, there's absolutely no guarantee that it would disproportionately benefit nVidia. It's quite possible that AMD would've gained a lot off the back of nVidia's investment.

I'm not sure that's the case with G-Sync, though. JHH hyped up the time and effort his people spent on the solution, but he himself admitted that it was a relatively simple idea in the end.
 
OpenCL and Compute Shaders didn't yet exist when GPU PhysX was born. Even today, OpenCL still isn't ready for prime time.

Even if OpenCL were a viable option and a lot more games had adopted GPU-accelerated effects, there's absolutely no guarantee that it would disproportionately benefit nVidia. It's quite possible that AMD would've gained a lot off the back of nVidia's investment.

I'm not sure that's the case with G-Sync, though. JHH hyped up the time and effort his people spent on the solution, but he himself admitted that it was a relatively simple idea in the end.

OpenCL and Compute Shaders are viable options now.

Besides, what is NVIDIA gaining with PhysX now? Some publicity, sure (some good, some bad) but who buys a GeForce instead of a Radeon just for GPU-accelerated PhysX?

And how expensive is it to develop and maintain? Honestly, I'm not sure it's a net positive.
 
OpenCL and Compute Shaders are viable options now.

Compute Shaders are just as proprietary as CUDA, and OpenCL's maturity is certainly up for debate.

Besides, what is NVIDIA gaining with PhysX now? Some publicity, sure (some good, some bad) but who buys a GeForce instead of a Radeon just for GPU-accelerated PhysX?

And how expensive is it to develop and maintain? Honestly, I'm not sure it's a net positive.

Good questions. I'm sure nVidia pays people a lot of money to know the answers and set corporate strategy. PhysX presumably sells more than a few cards to avid gamers and, by extension, their mainstream friends. As much as we like to think we're above it all, marketing does matter to mainstream consumers.

I guarantee that if nVidia can convince reviewers to hype up G-Sync, it will move quite a few monitors, and GeForces too. The same goes for Mantle or TrueAudio or any other perceived competitive advantage.
 
Technology moves forward in large part by allowing companies to make a profit off it.
Nobody wants to stop NV from profiting from their own ideas.

First as a proprietary technology, only later as a standard component. It has always been this way: Mantle or PhysX or TrueAudio. G-Sync is no different.
I would argue the exact polar opposite is generally the case when it comes to PCs. Where's Creative and their proprietary sound tech these days? Gone. 3DNow!? Dead. RDRAM? Dead as a fucking doornail. Where's pretty much any other proprietary, vendor-specific tech right now? Dead and buried, that's where. Where's Intel with their Thunderbolt? It lives in Macs, sure, but it doesn't exactly prosper. Why? Because USB is free to use, and Thunderbolt costs (a lot of) money. (Shit... "Glide, where art thou noweth?" "I hath beenst slaineth!")

but it's hard to argue that the drive to outsmart a competitor hasn't worked out over time.
It has? Since when?

Would you care to mention some successful examples of stupidly expensive, gimmicky proprietary features (because that's what this is) ever outsmarting any competitors?

And if you don't like it, you don't even have to buy it
Thank you; I think I'll do just that, as I prefer not to have a $120 hardware dongle in my monitor that locks me into a single GPU vendor. That sounds like the dumbest move I could ever make, TBH. Pay (much) more, and have all my freedom taken away from me? No thanks!

USB, SSE, x64 and so on became successful and universal because they're NOT proprietary. The same thing goes for the entirety of the PC (except that Intel's been killing off all of its other competitors one by one over the years, but that's a different discussion). Proprietary = dead, or at best languishing. Free, and at least decently useful at its designed task = ubiquitous and popular and successful and... not dead. :p
 
OpenCL and Compute Shaders are viable options now.

Besides, what is NVIDIA gaining with PhysX now? Some publicity, sure (some good, some bad) but who buys a GeForce instead of a Radeon just for GPU-accelerated PhysX?

And how expensive is it to develop and maintain? Honestly, I'm not sure it's a net positive.


It creates brand differentiation, which is very important even if it's not personally important to you.

After all these years, where is PhysX? There's no killer app, basically zero interest, and more importantly, no ubiquitous support for GPU-accelerated physics on any platform, much less all of them (these days that means just NV, ATI and Intel). NV has successfully held back progress through a process of divide-and-not-conquer. Impressive! Well done! *golfclap*

So... Fuck this shit. Seriously.


As for where that killer app for PhysX is: it doesn't exist, and it won't exist until game developers find an economic reason to use physics in their game design. Consoles dictate what technologies are used in those design decisions, so who cares that there is no ubiquitous support for GPU-accelerated physics on all platforms when the software wouldn't use it regardless of whether it existed?

More to the point, why nail Nvidia to the cross for this? Why isn't AMD or Intel generously donating their own resources to this endeavor if it's so important?

It's pretty far-fetched to blame Nvidia for holding back progress. :rolleyes:
 