Nvidia G-SYNC

Discussion in 'Rendering Technology and APIs' started by DSC, Oct 18, 2013.

  1. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    I think it's a bit absurd to call it gimmicky. This technology effectively allows you to play every single game you own or will ever own with the same smoothness as a 100% locked 60fps (or higher) with less input lag - as long as you can generally stay over 40fps or so (with dips below being fine).

    Forgetting about framerates, it has the potential to make the game experience on a mid range NV GPU as good as or better than a high end AMD GPU. That's HUGE! Viewed in that way, the extra $120 sounds like a bargain!
     
  2. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
x64 and SSE meet some definition of proprietary; only three companies have made CPUs with them, to my knowledge. Importantly, NVIDIA is barred from using them, so they can't make a PC APU. Their being pushed out of the chipset business even prevented me from getting a motherboard with an NVIDIA IGP, lol (an AM3 mobo with a GeForce GT320M would have been sweet, maybe, somewhat)
     
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Let me know if you get an answer to that. I've been trying for a few years now!
     
  4. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    That made me smile. :)
     
    #64 rpg.314, Oct 19, 2013
    Last edited by a moderator: Oct 19, 2013
  5. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
It's what Samsung uses on their Exynos SoC; if someone needs more explanation, you could go look there too.
     
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Lol, I'm glad I helped brighten your day.
     
  7. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,018
    Likes Received:
    582
    Location:
    Taiwan
Personally I think it would be even better if we could ditch the current sequential protocol for displays (though this is probably not high on NVIDIA's priority list).

A sequential protocol is a relic of CRT times, when monitors did not have any internal storage. Now, however, LCD monitors already have "built-in" storage, so they don't really have to receive display information sequentially. If we had a block-based protocol, it would be possible to greatly reduce display latency for a tiler GPU (e.g. you could send a tile to the monitor right after it's done).

I guess display latency is probably not very high on the list of things to do for monitor vendors, but it'd be nice to have for VR goggles.
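
    To make the latency argument concrete, here is a toy model in Python. All the numbers (60 Hz refresh, 64 tiles per frame) are illustrative assumptions, not real hardware figures:

    ```python
    # Toy latency model: sequential scanout vs. a hypothetical block-based
    # display protocol. Numbers are illustrative assumptions only.

    FRAME_TIME_MS = 1000 / 60           # one 60 Hz refresh period
    TILES_PER_FRAME = 64                # assume the tiler GPU renders 64 tiles/frame
    TILE_TIME_MS = FRAME_TIME_MS / TILES_PER_FRAME

    def sequential_latency_ms():
        """Double-buffered sequential scanout: render the whole frame,
        flip, then transmit it over one more refresh period."""
        render = FRAME_TIME_MS
        scanout = FRAME_TIME_MS
        return render + scanout         # roughly two frames

    def block_based_latency_ms():
        """Block-based protocol: each tile is sent as soon as it is done,
        so a pixel waits roughly one frame of rendering plus one tile slot."""
        return FRAME_TIME_MS + TILE_TIME_MS

    print(f"sequential : {sequential_latency_ms():.1f} ms")   # ~33.3 ms
    print(f"block-based: {block_based_latency_ms():.1f} ms")  # ~16.9 ms
    ```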
     
  8. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
Tiles appearing at random times isn't so much what we're looking for in games, I think. What it would be good for, though, is the power-saving scenario. I have a CPU graph in a corner that updates at something like 2Hz, but that would force a power-saving system to send whole-screen updates more often. Ditto if you have an animated gif, an ad, etc., or a music player with a spectrum analyser and a little seek bar while the rest of the screen is entirely static.
     
  9. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,018
    Likes Received:
    582
    Location:
    Taiwan
Of course, for fullscreen animations (e.g. in a 3D game) it's not ideal to have tiles appear at random places. But even in such cases, where a traditional "v-sync" is required, this could still reduce latency.

In the current double-buffering scheme, the GPU renders the whole screen completely, then "flips" and sends the front buffer to the monitor. So even without considering the latency introduced by the monitor, we already have two frames of latency. Modern LCD monitors generally introduce even more.

Now, if a block-based protocol were available, a tiler GPU wouldn't have to double buffer (or at least not double buffer the whole screen): when it completes a block, it can just send it. The total latency is then reduced to one frame plus one block. The monitor can still display the whole thing sequentially; it only has to buffer a line of blocks, and many monitors already have such buffers for post-processing.

Of course, this disallows certain post-processing techniques, since the whole scene is not available after rendering. But it should still work for most local post-processing techniques (e.g. a simple Gaussian blur) by introducing a little more latency (a line of blocks, for example).

However, since this would create so many compatibility headaches, I think we probably won't see it except maybe in highly latency-sensitive applications (e.g. VR goggles).
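
    The "one frame plus one block" arithmetic, with the extra line of blocks a local filter would cost, can be sketched like this. The tile-grid size is an illustrative assumption:

    ```python
    # Sketch: latency of a block-based protocol with and without a local
    # post-process. A small filter (e.g. a 3x3 blur) on a tile row only needs
    # its neighbouring rows, so output is delayed by one extra row of blocks.
    # All numbers are illustrative assumptions.

    FRAME_TIME_MS = 1000 / 60
    TILE_ROWS = 8                       # assume an 8x8 grid of tiles per frame
    ROW_TIME_MS = FRAME_TIME_MS / TILE_ROWS

    def latency_ms(local_post_process: bool) -> float:
        """One frame of rendering, plus one row of blocks if a local
        post-process must wait for the next row before emitting output."""
        base = FRAME_TIME_MS
        return base + (ROW_TIME_MS if local_post_process else 0.0)

    print(f"no post-process : {latency_ms(False):.2f} ms")
    print(f"with local blur : {latency_ms(True):.2f} ms")
    ```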
     
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    ... as long as they give the R&D away for free?

Let's take the case of Creative: they created a new technology that didn't exist at the time. Everybody wanted it. It made a number of people very, very rich, and it validated the need to add sound to the PC as a standard feature. AppleTalk did the same for networking. FireWire did it for a universal plug. Glide introduced the concept of a software API on top of hardware, and nudged Microsoft into creating DirectX.

    It's irrelevant that that specific implementation faded. What matters is that somebody came up with a new technology and that it validated the need for it for a group of customers.

Going back to Creative: there were only bleeps before the first Sound Blaster, and nobody could have cared less. What do you suggest they should have done as a startup: go to Intel and Microsoft and demand a standard API for something that doesn't exist? For something for which nobody has quite figured out yet what needs to be done? Please help me out here: what should they have done, years before they became extremely successful, to make sure that they wouldn't become victims of integrated sound being simply good enough?

Yeah, they all became successful after somebody else showed them the way. And, no, it doesn't always work out. There are many ideas that turn out not to be so revolutionary in demand after all (e.g. HW PhysX, Thunderbolt, ...)

G-Sync as currently implemented may not become the standard of choice. But without Nvidia first showing that it is a significant improvement over the current state of the art, it's pretty much a given that it would take much longer to become the default way of doing things. Some people will be willing to spend the, let's be honest, relatively small extra sum to get this new feature. And probably rave about it. And then monitor makers and Intel and AMD will get together to bring the price down and make the tech available to everyone.
     
    #70 silent_guy, Oct 19, 2013
    Last edited by a moderator: Oct 20, 2013
  11. tritosine5G

    Newcomer

    Joined:
    Aug 29, 2010
    Messages:
    143
    Likes Received:
    1
I think photoluminescent TN can do over 10,000:1 real contrast, probably all-round better than IPS.

In a similar vein of research: http://www.tue.nl/en/publication/ep/p/d/ep-uid/122088/
- I'm not sure polarizers are needed to begin with?
     
    #71 tritosine5G, Oct 20, 2013
    Last edited by a moderator: Oct 20, 2013
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
An interesting point concerning G-Sync:

http://www.eurogamer.net/articles/digitalfoundry-nvidia-g-sync-the-end-of-screen-tear-in-pc-gaming

Also, a slow-motion demonstration:
http://www.youtube.com/watch?v=NffTOnZFdVs
     
  13. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
GPU PhysX was introduced in 8 games this year alone, a larger number than ever. There are 4 games already lined up for next year too.
     
  14. Sinistar

    Sinistar I LIVE
    Regular Subscriber

    Joined:
    Aug 11, 2004
    Messages:
    660
    Likes Received:
    74
    Location:
    Indiana
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Nope, no Ubisoft game has ever had GPU PhysX.
     
  16. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,884
    Likes Received:
    5,334
Just to point out, EAX started off as an open standard; its main rival, A3D, was proprietary. Only when Creative had won the battle did they make EAX proprietary.

That's not true either: Creative were successful by supporting an open standard and offering a superior open standard of their own.
     
  17. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
I had a Sound Blaster 1, which was compatible with the terrible AdLib card (FM synthesis only) but introduced PCM, which changed everything (even if only 8-bit). There were no standards at all in 1989; competitors simply started to make chips that were HW register compatible.

The first sound API was only introduced by Microsoft for Win95. EAX came a decade later.
     
  18. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,884
    Likes Received:
    5,334
Standards are different from APIs, and APIs didn't really exist in the DOS days.
AdLib was certainly a standard, as were General MIDI and Sound Blaster.
     
  19. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
Creative bought Ensoniq, which had developed Sound Blaster emulation for PCI cards running under pure DOS, and sold lots of rebadged Ensoniq cards. Which would be fine, but they freaking monopolized that feature (usable on the SB Live and Audigy 1 too).

So all the competing cards and integrated sound were permanently barred from old-style Sound Blaster compatibility (with ISA, nearly all sound cards were simply, transparently, Sound Blaster, SB Pro, SB Pro 2, etc. compatible by physically behaving like one, as silent_guy says).
This just made my life worse. I hate them for closing off and destroying what had been a decade-long, universal industry standard (I had a DOS/XP dual boot :razz:, useful for running low-level tools, but it could have seen some gaming if it weren't limited to silence and the PC speaker... and DOSBox was too slow in those days).
BTW the Creative/Ensoniq cards would give you General MIDI and Sound Blaster compatibility, but not AdLib (OPL2) or OPL3.
     
    #79 Blazkowicz, Oct 20, 2013
    Last edited by a moderator: Oct 20, 2013
  20. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
    Adlib and Soundblaster were de facto standards. Everybody else did it that way because the first/biggest mover did it that way. Like - oh - NVIDIA G-SYNC.

    MIDI was very different, it was a standard developed across the industry by discussion between the numerous competing companies involved.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.