Nvidia G-SYNC

Would you care to mention some successful examples of stupidly expensive, gimmicky, proprietary features (because that's what this is) ever outsmarting any competitors?

I think it's a bit absurd to call it gimmicky. This technology effectively allows you to play every single game you own or will ever own with the same smoothness as a 100% locked 60fps (or higher) with less input lag - as long as you can generally stay over 40fps or so (with dips below being fine).

Forgetting about framerates, it has the potential to make the gaming experience on a mid-range NV GPU as good as or better than on a high-end AMD GPU. That's HUGE! Viewed that way, the extra $120 sounds like a bargain!
 
x64 and SSE meet some definition of proprietary; only three companies have made CPUs with them, to my knowledge. Importantly, Nvidia is barred from using them, so they can't make a PC APU. Getting kicked out of the chipset business even prevented me from getting a motherboard with an Nvidia IGP lol (an AM3 mobo with a GeForce GT320M would have been sweet, maybe, somewhat)
 
More to the point, why nail Nvidia to the cross for this? Why isn't AMD or Intel generously donating their own resources to this endeavor, if it's so important?

It's pretty far-fetched to blame Nvidia for holding back progress :rolleyes:.

Let me know if you get an answer to that. I've been trying for a few years now!
 
They've said it's supported on the GeForce GTX 650 Ti Boost and up, which is really a GTX 660 variant. So it's any Kepler card, as long as it has a DisplayPort output.

It turns out the niceties are to be found in Embedded DisplayPort, which is what a modern laptop uses to connect to its own internal display:
http://en.wikipedia.org/wiki/DisplayPort#eDP
First there's version 1.0 from December 2008. woohoo!

Then version 1.3 published in February 2011 "includes a new Panel Self-Refresh (PSR) feature developed to save system power and further extend battery life in portable PC systems".

What's a "Panel Self-Refresh"? A two-year-old article deals with it at Hardware Secrets and there's even a couple pictures.
http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384/1


It's what Samsung uses on their Exynos SoC; if anyone needs more explanation, you could look there too.
 
Personally I think it'd be even better if we could ditch the current sequential display protocol (though this is probably not high on NVIDIA's priority list).

A sequential protocol is a relic from CRT times, when monitors did not have any internal storage. Now, however, LCD monitors already have "built-in" storage, so they don't really have to receive display information sequentially. If we made a block-based protocol, it would be possible to greatly reduce display latency for a tiler GPU (e.g. you could send a tile to the monitor right after it's done).
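
To make that a bit more concrete, here's a rough sketch of what a tile-update message on such a hypothetical block-based link might look like. Everything in it is an assumption for illustration (the field names, the 16x16 tile size); it's not based on any real DisplayPort/eDP packet format.

# Hypothetical tile-update message for a block-based display link.
# Purely illustrative: field names and the 16x16 tile size are assumptions,
# not any real DisplayPort/eDP format.
from dataclasses import dataclass

TILE_W, TILE_H = 16, 16            # assumed tile size in pixels

@dataclass
class TileUpdate:
    tile_x: int                    # tile column on the panel's tile grid
    tile_y: int                    # tile row
    frame_id: int                  # lets the panel detect an incomplete frame
    last_in_frame: bool            # marks the final tile of a frame
    pixels: bytes                  # TILE_W * TILE_H * 3 bytes of RGB payload

# The GPU would emit a TileUpdate as soon as a tile finishes rendering; the
# panel writes it into its own framebuffer and keeps refreshing itself from
# there, much like the Panel Self-Refresh idea mentioned earlier in the thread.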

I guess display latency is probably not very high on the list of things to do for monitor vendors, but it'd be nice to have for VR goggles.
 
Tiles appearing at random times is not so much what we're looking for in games, I think. What it would be good for, though, is the power-saving scenario: I have a CPU graph in a corner that updates at something like 2Hz, but that would force a power-saving system to send whole-screen updates more often. Ditto if you have an animated gif, an ad, etc., or a music player with a spectrum analyser and a little seek bar while the rest of the screen is entirely static.
 
Of course, for fullscreen animations (e.g. in a 3D game) it's not ideal to have tiles appear at random places. But even in such a case, where a traditional "v-sync" is required, this may still reduce latency.

In the current double-buffering system, the GPU renders the whole screen completely, then "flips" and sends the front buffer to the monitor. So even without considering the latency introduced by the monitor, we already have two frames of latency. Modern LCD monitors generally introduce even more.

Now, if a block-based protocol were available, a tiler GPU wouldn't have to do double buffering (or at least wouldn't have to double-buffer the whole screen): when it completes a block, it can just send it. So the whole latency is reduced to one frame plus one block. The monitor can still display everything sequentially (it only has to buffer a line of blocks, and many monitors already have such buffers for post-processing).
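
As a back-of-the-envelope check of that claim, here's the arithmetic with some assumed numbers (a 1080-line panel at 60 Hz, tiles 16 pixels tall; none of this comes from NVIDIA, it's just to put a scale on "one frame plus one block"):

# Rough latency comparison for the model described above.
# Assumed numbers: 1080 lines, 60 Hz refresh, 16-pixel-tall tile rows.
REFRESH_HZ  = 60
FRAME_MS    = 1000.0 / REFRESH_HZ           # ~16.7 ms per frame
TILE_ROWS   = 1080 // 16                    # 67 full rows of 16-pixel tiles
TILE_ROW_MS = FRAME_MS / TILE_ROWS          # time to transmit one row of tiles

double_buffered = 2 * FRAME_MS              # render a whole frame, then scan it out
block_based     = FRAME_MS + TILE_ROW_MS    # one frame of rendering + one block row

print(f"double buffered: {double_buffered:.1f} ms")   # ~33.3 ms
print(f"block based:     {block_based:.1f} ms")       # ~16.9 ms

So, under these assumptions, the worst-case latency is roughly halved before the monitor's own processing delay is even counted.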

Of course, this rules out certain post-processing techniques, as the whole scene would not be available after rendering. But it should still work for most local post-processing techniques (e.g. a simple Gaussian blur) at the cost of a little more latency (a line of blocks, for example).

However, since this creates so many compatibility headaches, I think we probably won't see it except maybe in highly latency-sensitive applications (e.g. VR goggles).
 
Nobody wants to stop NV from profiting from their own ideas
... as long as they give the R&D away for free?

I would argue the exact polar opposite is generally the case when it comes to PCs. Where's Creative and their proprietary sound tech these days? Gone. 3DNow!? Dead. RDRAM? Dead as a fucking doornail. Where's pretty much any other proprietary, vendor-specific tech right now? Dead and buried, that's where. Where's Intel with their Thunderbolt? It lives in Macs, sure, but it doesn't exactly prosper. Why? Coz USB is free to use, and Thunderbolt costs (a lot of) money. (Shit... "Glide, where art thou noweth?" "I hath beenst slaineth!")
Let's take the case of Creative: they created a new technology that didn't exist at the time. Everybody wanted it. It made a number of people very, very rich, and it validated the need to add sound to the PC as a standard feature. AppleTalk did the same for networking. FireWire did it for a universal plug. Glide introduced the concept of a software API on top of hardware, and nudged Microsoft into creating DirectX.

It's irrelevant that that specific implementation faded. What matters is that somebody came up with a new technology and that it validated the need for it for a group of customers.

Going back to Creative: there were only bleeps before the first Sound Blaster, and nobody could have cared less. What do you suggest they should have done as a startup: go to Intel and Microsoft and demand a standard API for something that didn't exist? For something for which nobody had quite figured out yet what needed to be done? Please help me out here: what should they have done, years before they became extremely successful, to make sure they wouldn't become victims of integrated sound being simply good enough?

USB, SSE, x64 and so on became successful and universal because they're NOT proprietary. The same goes for the entirety of the PC (except Intel's been killing off all of its other competitors one by one over the years, but that's a different discussion.) Proprietary = dead, or at best, languishing. Free, and at least decently useful at its designed task = ubiquitous and popular and successful and... not dead. :p
Yeah, they all became successful after somebody else showed them the way. And no, it doesn't always work out. There are many ideas that turn out to be not so revolutionary or in demand after all (e.g. HW PhysX, Thunderbolt, ...)

G-Sync as currently implemented may not become the standard of choice. But without Nvidia showing first that it's a significant improvement over the current state of the art, it's pretty much a given that it would take much longer for this to become the default way of doing things. Some people will be willing to spend the, let's be honest, relatively small extra sum to get this new feature. And probably rave about it. And then monitor makers and Intel and AMD will get together to bring the price down and make the tech available to everyone.
 
TN could beat up IPS rather easily,

Funnily enough, the way to do it is to restrict the viewing angles, then gain them back:

http://adsabs.harvard.edu/abs/2000SPIE.3955...70C
I think photoluminescent TN could do over 10,000:1 real contrast and would probably be all-round better than IPS..

In a similar vein of research, http://www.tue.nl/en/publication/ep/p/d/ep-uid/122088/ :
Highly-polarized photoluminescent polymer films based on poly(2,5-dialkoxy-p-phenyleneethynylene)s (PPE) were fabricated for use in a new family of liquid crystal displays (LCDs). As one relevant example, a back-lit twisted-nematic configuration of an LCD was built, in which one of the absorbing polarizers was replaced by a polarized PL film, characterized by a dichroic ratio in excess of 70. Such devices can exhibit a substantial improvement in brightness, contrast and viewing angle, since the polarized photoluminescent films can combine two separate features, i.e. the functions of a polarizer and an efficient color filter.
- I'm not sure polarizers are needed to begin with?
 
An interesting point concerning G-Sync:

Indeed, traditional bar-chart driven GPU reviews could almost become somewhat meaningless - even if an AMD card outperforms an Nvidia competitor by 10 or 15 per cent, if it doesn't have G-Sync, the problems with tearing and judder will still be there, but they won't be on what traditional metrics will tell you is the weaker card. By extension, we can also foresee that this will be a problem for Nvidia too - if the perceptual difference between, say, 45fps and 52fps, is ironed out to a great extent by G-Sync, we wonder if there will be enough differentiation in the product line-up to make the more expensive offering worth the additional premium. What is needed here is more testing on the impact of frame-rate differences on the gameplay experience - G-Sync may solve tearing and stuttering issues, but it can't address input lag, which does tend to be impacted by lower frame-rates (though this too can be addressed to a certain extent by developers).
http://www.eurogamer.net/articles/digitalfoundry-nvidia-g-sync-the-end-of-screen-tear-in-pc-gaming

Also, a slow-motion demonstration:
http://www.youtube.com/watch?v=NffTOnZFdVs
 
Let's take the case of Creative:
Just to point out, EAX started off as an open standard, while its main rival A3D was proprietary. Only when Creative had won the battle did they make it proprietary.

Going back to Creative: there were only bleeps before the first Sound Blaster,
Not true either; Creative were successful by supporting an open standard and offering a superior open standard of their own.
 
Just to point out, EAX started off as an open standard, while its main rival A3D was proprietary. Only when Creative had won the battle did they make it proprietary.

Not true either; Creative were successful by supporting an open standard and offering a superior open standard of their own.
I had a Sound Blaster 1, which was compatible with the terrible AdLib card (FM synthesis only) but introduced PCM, which changed everything (even if only 8-bit.) There were no standards at all in 1989. Competitors simply started to make chips that were HW register-compatible.

The first sound API was only introduced by Microsoft for Win95. EAX came a decade later.
 
Standards are different from APIs, and APIs didn't really exist in the DOS days.
AdLib was certainly a standard, as were General MIDI and Sound Blaster.
 
Creative bought Ensoniq, which had developed Sound Blaster emulation for PCI cards running under pure DOS, and sold lots of rebadged Ensoniq cards. Which would be fine, but they freaking monopolized that feature (it was usable on the SB Live and Audigy 1 too).

So all the competing cards and integrated sound were permanently barred from old-style Sound Blaster compatibility (with ISA, nearly all sound cards were simply, transparently Sound Blaster or SB Pro, SB Pro 2 etc. compatible by physically behaving like one, as silent guy says).
This just made my life worse. I hate them for closing off and destroying what was a decade-long, universal industry standard (I had a DOS/XP dual boot :p, useful for running low-level tools, but it could have seen some gaming if it weren't limited to silence and the PC speaker.. And DOSBox was too slow in those days.)
BTW the Creative/Ensoniq cards would give you General MIDI and Sound Blaster compatibility, but not AdLib (OPL2) or OPL3.
 
Standards are different from APIs, and APIs didn't really exist in the DOS days.
AdLib was certainly a standard, as were General MIDI and Sound Blaster.

AdLib and Sound Blaster were de facto standards. Everybody else did it that way because the first/biggest mover did it that way. Like - oh - NVIDIA G-SYNC.

MIDI was very different: it was a standard developed across the industry, through discussion between the numerous competing companies involved.
 