Nvidia G-SYNC

Discussion in 'Rendering Technology and APIs' started by DSC, Oct 18, 2013.

  1. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
  2. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
  3. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    I don't think so. It will take some time to draw the repeated frame, and if the next frame completes in the meantime, you either get a tear or a lag. The only way it would work, afaics, is if the last frame stays up as long as the next frame is not ready on the GPU.
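    The trade-off above can be sketched with a toy latency model (numbers and function names are illustrative, not from any spec): with a fixed refresh and v-sync, a finished frame waits for the next refresh tick; with variable refresh, the panel simply holds the last frame and scans out the new one as soon as it's done.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel

def fixed_refresh_latency(render_ms):
    """Extra wait with v-sync on: round up to the next refresh tick."""
    ticks = math.ceil(render_ms / REFRESH_INTERVAL_MS)
    return ticks * REFRESH_INTERVAL_MS - render_ms

def variable_refresh_latency(render_ms, min_interval_ms=1000 / 144):
    """With variable refresh the panel scans out as soon as the frame is
    done, limited only by the panel's maximum refresh rate."""
    return max(0.0, min_interval_ms - render_ms)

# A 20 ms frame on a 60 Hz panel waits ~13.3 ms for the next tick;
# with variable refresh it is displayed immediately.
print(fixed_refresh_latency(20.0))     # ~13.33
print(variable_refresh_latency(20.0))  # 0.0
```

    This is just the timing argument, of course; it says nothing about how the monitor side is actually implemented.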

    NV says any Kepler GPU will work, which means they have been on it for a long time now and wanted to lead with a huge install base of compatible GPUs.

    This is like PSR, but done better. I would think this is something others will be able to come up with as well. And for mobile you don't need a separate chip; this can be integrated on the SoC itself.
    I've got a 120Hz ASUS VG278H, which is very similar to the 24" model ASUS has committed to making a G-Sync revision of. It's a great monitor except for the inherent issues of TN and the colors, which are far from perfect (manual calibration could probably improve them), so I'll be very tempted to buy at least two of the upgraded 24" ones when they come out!
     
  4. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    The G-Sync module has DRAM, so I assumed the frame would simply stay there and all necessary processing would be done on that chip as well. In fact, I'm skeptical there's that much going on in Kepler itself, although I'm sure there's the odd thing or two anyway.

    Yup, here's hoping NVIDIA doesn't try to integrate it into Tegra, patent the hell out of it, then sue anyone else trying to do something similar. I'm curious how much prior art there is to this and how much choice there is in the low-level implementation details. But we'll see.
     
  5. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    They've said support is on the GeForce GTX 650 Ti Boost and up, which is really a GTX 660 variant. So it's any Kepler, as long as it has a DisplayPort.

    It turns out the niceties are to be found in Embedded DisplayPort (eDP), the interface a modern laptop uses to connect to its own internal display:
    http://en.wikipedia.org/wiki/DisplayPort#eDP
    First there's version 1.0 from December 2008. Woohoo!
    Then version 1.3, published in February 2011, which "includes a new Panel Self-Refresh (PSR) feature developed to save system power and further extend battery life in portable PC systems".

    What's a "Panel Self-Refresh"? A two-year-old article deals with it at Hardware Secrets and there's even a couple pictures.
    http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384/1
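    The gist of the article, reduced to a hypothetical sketch (class and method names are mine, not from the eDP spec): the panel's timing controller keeps its own copy of the last frame, refreshes the LCD from that local buffer while the image is static, and lets the GPU and the link power down until new content arrives.

```python
# Toy model of Panel Self-Refresh. Illustrative only: real PSR is a
# link-level protocol, not an API like this.
class Panel:
    def __init__(self):
        self.local_buffer = None
        self.link_active = True

    def enter_self_refresh(self, frame):
        self.local_buffer = frame   # TCON stores a copy of the last frame
        self.link_active = False    # GPU and display link can power down

    def exit_self_refresh(self):
        self.link_active = True     # new content: wake the link, stream again

    def scanout(self, frame=None):
        # While in self-refresh, the panel redraws from its own memory.
        return frame if self.link_active else self.local_buffer

panel = Panel()
panel.enter_self_refresh("static desktop")
print(panel.scanout())             # the panel refreshes itself, GPU idle
panel.exit_self_refresh()
print(panel.scanout("new frame"))  # back to streaming from the GPU
```

    The connection to G-Sync is the inverse problem: PSR holds a static frame to save power, while G-Sync holds the last frame until the GPU has a new one ready.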

     
  6. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    There's not a lot on the GPU, but there's a minor bit.
    That could be why the DRAM is there.

    I don't think they'd have much leverage if they tried to patent troll. They are exceedingly coy about this, so it must be important secret sauce.
     
  7. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    As G-Sync technology can enhance the appeal of a wide variety of NVIDIA products, and considering that it is a combined hardware and software solution, I don't think that NVIDIA has plans to license this technology in the short term. In fact, Tom Petersen seemed to confirm that in the comments section here:

    http://blogs.nvidia.com/blog/2013/10/18/g-sync/

    That said, who knows what could happen in the future.

    Here are some thoughts on G-Sync from John Carmack, Johan @ Dice, and Tim Sweeney:

    http://www.youtube.com/watch?v=YvVnDTrfJ6M

    Anand (among many others) seems to think this is a game-changing technology. He has a good writeup of G-Sync here:

    http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
     
  8. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    This looks really cool. The bit that worries me is this:

    If it adds $119 to the cost of a monitor, that's just too much. I really hope this leads to some sort of cheap, ubiquitous standard.
     
  9. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    I've got the same monitor, but there's no way I'm shelling out to replace it any time soon. I'm just hoping they release the mod card for it, although I realise the possibility is slim.

    But yeah this does sound like it could be a game changer.
     
  10. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    For that price, I'd be just fine with the current "software" solutions. On my 120Hz panel, adaptive v-sync is all I need during gameplay, plus madVR for watching smooth 24fps movies.
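    The 120Hz-panel point about smooth 24fps playback comes down to divisibility: each film frame must occupy a whole number of refreshes. A quick check (the function is just illustrative arithmetic):

```python
# Why 24 fps is judder-free on a 120 Hz panel but not on a 60 Hz one:
# each source frame has to be shown for an integer number of refreshes.
def repeats_per_frame(refresh_hz, content_fps):
    repeats = refresh_hz / content_fps
    return repeats, repeats.is_integer()

print(repeats_per_frame(120, 24))  # (5.0, True)  -> every frame shown 5x
print(repeats_per_frame(60, 24))   # (2.5, False) -> needs 3:2 pulldown
```

    Variable refresh sidesteps the divisibility requirement entirely, since the panel can simply refresh at the content's own cadence.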
     
  11. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    Lol wut! If I wasn't religiously anti-drug, I'd say I'd like some of what they're smoking!

    You know, I'm not anti-progress. I'm all for newer, better stuff, really, and this is better stuff, except that it's Yet More Proprietary Nvidia Shit, which I'm totally against. After all these years, where is PhysX? There's no killer app, basically zero interest, and more importantly, no ubiquitous support for GPU-accelerated physics on any platform, much less all of them (these days that means just NV, ATI, Intel). NV has successfully held back progress through a process of divide-and-not-conquer. Impressive! Well done! *golfclap*

    So... Fuck this shit. Seriously.
     
  12. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Technology moves forward in large part by allowing companies to make a profit off it. First as a proprietary technology, only later as a standard component. It has always been this way: Mantle or PhysX or TrueAudio. G-Sync is no different.

    I get that this can be frustrating initially, but it's hard to argue that the drive to outsmart a competitor hasn't worked out over time. And if you don't like it, you don't even have to buy it: you can still game the way you always have. How cool is that?
     
  13. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    I think NVIDIA could have made more money by implementing PhysX in OpenCL or Compute Shaders. A lot more games would have used it, and since NVIDIA would have been in control and able to finely tune it for their own architecture, it would have favored them in benchmarks, hence the competitive advantage.
     
  14. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    505
    Likes Received:
    189
    At $175 or $119, this is going to be niche. However, early adopters always pay for the privilege of being first. If this takes off as a niche product, I'm sure Nvidia and the rest of the industry will be motivated to broaden the market...
     
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    OpenCL and Compute Shaders didn't yet exist when GPU PhysX was born. Even today, OpenCL still isn't ready for prime time.

    Even if OpenCL were a viable option and a lot more games had adopted GPU-accelerated effects, there's absolutely no guarantee that it would disproportionately benefit nVidia. It's quite possible that AMD would've gained a lot off the back of nVidia's investment.

    I'm not sure if that's the case with G-Sync though. JHH hyped up the time and effort his people spent on the solution but he himself admitted that it was a relatively simple idea in the end.
     
  16. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    OpenCL and Compute Shaders are viable options now.

    Besides, what is NVIDIA gaining with PhysX now? Some publicity, sure (some good, some bad) but who buys a GeForce instead of a Radeon just for GPU-accelerated PhysX?

    And how expensive is it to develop and maintain? Honestly, I'm not sure it's a net positive.
     
  17. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Compute Shaders are just as proprietary as CUDA, and OpenCL's maturity is certainly up for debate.

    Good questions. I'm sure nVidia pays people a lot of money to know the answers and dictate corporate strategy. PhysX presumably sells more than a few cards to avid gamers and by extension, their mainstream friends. As much as we like to think we're above it all, marketing does matter to mainstream consumers.

    I guarantee that if nVidia can convince reviewers to hype up G-Sync it will move quite a few monitors and Geforces too. Same goes for Mantle or TrueAudio or any perceived competitive advantage.
     
  18. tritosine5G

    Newcomer

    Joined:
    Aug 29, 2010
    Messages:
    143
    Likes Received:
    1
  19. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    Nobody wants to stop NV from profiting from their own ideas.

    I would argue the exact polar opposite is generally the case when it comes to PCs. Where's Creative and their proprietary sound tech these days? Gone. 3DNow? Dead. RDRAM? Dead as a fucking doornail. Where's pretty much any other proprietary, vendor-specific tech right now? Dead and buried, that's where. Where's Intel with their Thunderbolt? It lives in Macs, sure, but it doesn't exactly prosper. Why? 'Coz USB is free to use, and Thunderbolt costs (a lot of) money. (Shit... "Glide, where art thou noweth?" "I hath beenst slaineth!")

    It has? Since when?

    Would you care to mention some successful examples of stupidly expensive, gimmicky proprietary features (because that's what this is) ever outsmarting any competitors?

    Thank you; I think I'll do just that, as I prefer not to have a $120 hardware dongle in my monitor that locks me into a single GPU vendor. That sounds like the dumbest move I could ever make, TBH. Pay (much) more and have all my freedom taken away from me? No thanks!

    USB, SSE, x64 and so on became successful and universal because they're NOT proprietary. The same goes for the entirety of the PC (except Intel's been killing off all of its other competitors one by one over the years, but that's a different discussion). Proprietary = dead, or at best, languishing. Free, and at least decently useful at its designed task = ubiquitous and popular and successful and... not dead. :razz:
     
    #59 Grall, Oct 19, 2013
    Last edited by a moderator: Oct 19, 2013
  20. steveOrino

    Regular

    Joined:
    Feb 11, 2010
    Messages:
    549
    Likes Received:
    242

    It creates brand differentiation, which is very important even if it's not personally important to you.


    As to where that killer app for PhysX is: it doesn't exist, and won't exist until game developers find an economic reason to use physics in their game design. Consoles dictate what technologies are used in those design decisions, so who cares that there's no ubiquitous support for GPU-accelerated physics on all platforms, when the software wouldn't use it even if it existed?

    More so, why nail Nvidia to the cross for this? Why aren't AMD or Intel generously donating their own resources to this endeavor, if it's so important?

    It's pretty far-fetched to blame Nvidia for holding back progress :roll:.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.