Nvidia G-SYNC

Actually that is not true at all. G-Sync doesn't require 60+ fps. In fact, one of the key selling points of G-Sync is that gameplay is smoother when framerate drops below 60 fps. G-Sync and 4K are also not mutually exclusive. I'm sure John Carmack is looking forward to both.

I meant that to say that he has always shown a very big concern for smoothness in gaming.
Before G-Sync, that translated into 60 or 120 FPS with vsync.

However, I will still reserve judgment until I see whether G-Sync is much better (or better at all) than Adaptive V-Sync at sub-60 FPS framerates.



Carmack is focusing on PC, iOS, and Android. He is not focusing on next gen consoles at all at this time.
:LOL:
Coincidence?

Nonetheless, Doom 4 will have to be released for the next-gens, otherwise it's suicide for the franchise. So he will have to eventually look at the code of the console versions.


On a side note, you are incredibly condescending towards these industry veterans. G-Sync has been universally praised as a game-changing technology (literally and figuratively) by not just these three individuals but by numerous tech hardware review sites as well.

I have an opinion on the statements of three people who appeared at an nVidia PR event to help promote a product. Credit where credit is due: all of them are among the most influential people in videogames in the world. Then again, they are all human and make mistakes, and as such I'm entitled to an opinion on whether they're right, half-right or wrong.

G-Sync will be a game-changing technology if and when it becomes widely adopted, no matter how much ass-licking or how many honest, positive opinions it gets.
Until then, it's a niche feature.
 
Hopefully it gets licensed so that AMD and Intel could support it with a mere driver update.

Variable Refresh Rate requires driving the display differently. I don't believe this is possible with a driver change. This is why I don't believe Gsync will work when driven by an Optimus laptop.
 
nVidia bought AGEIA in February 2008. OpenCL 1.0 was officially released 10 months later.

Sigh. You're comparing an API specification to an actual implementation. When do you suppose the original spec for CUDA was finalized? Hint: the 8800GTX launched in 2006.
 
Variable Refresh Rate requires driving the display differently.
Yeah, but the display driver and the control electronics are sitting in the display. DisplayPort already goes a long way toward decoupling the data transmission from the actual display. It really depends on the details of how flexibly the display pipelines in the GPU can be configured. It could be possible to enable support on existing DisplayPort-capable GPUs.
 
Variable Refresh Rate requires driving the display differently. I don't believe this is possible with a driver change. This is why I don't believe Gsync will work when driven by an Optimus laptop.

That's a real concern, but I don't know. I have a feeling that from the GPU side it's just about using the features already in eDP (which I hope current laptops are using), maybe sending a magic packet through the AUX channel, stuff like that.
Hacking video modes has always been done: non-standard CGA palettes, VGA Mode X, 8/16-bit computers that changed video modes in the middle of a frame, setting up any crazy/unintended resolution... all within the fixed-refresh, scanning paradigm, though.

The biggest thing nVidia has done is use the relationships with monitor vendors it built through "3D Vision" - which we can argue was proprietary/vendor lock-in, and that may piss us off.

If there's a tiny bit of hardware or wiring needed in the GPU's output section, I'll be glad to be proved wrong.
 
ToTTenTranz said:
I have an opinion on the statements of three people who appeared at an nVidia PR event to help promote a product. Credit where credit is due: all of them are among the most influential people in videogames in the world. Then again, they are all human and make mistakes, and as such I'm entitled to an opinion on whether they're right, half-right or wrong.
Yes, but a large portion of the tech world actually cares about their opinion...
 
I remain doubtful.
Variable frame rate on video implies terrible stuttering. Once you've seen it, you can never ignore it.
Blu-ray without 24 Hz (or a multiple of it) stutters.
TV with telecined content at 30 Hz stutters.
The only way G-Sync would work for me is if LCDs were able to change their refresh rate from 60 Hz to another constant rate, say 40 Hz or 50 Hz.
Now, video games are not exactly like camera content: output frames should be temporally correct at the moment they're displayed, so maybe that removes the stuttering. I'll have to test it to be convinced.
 
I would say in principle yes, but I guess one can give a large part of the credit to MS' swapchain design in DirectX, which is designed as a queue. :cry:
Wonderful! First the draw call crap and now this! Microsoft is looking more like a handicap than anything right about now!
 
With 100Hz refresh and vsync on, you immediately fall back to 50Hz when you don't quite make it.
I've seen multiple people mention frame rate getting cut in half when the refresh rate isn't met, but this isn't necessarily the case. To use your example, let's consider a 100 Hz refresh rate with frames taking 13 ms to render (roughly the frame time for 75 Hz).

The first frame misses the 10 ms refresh point and is delayed to 20 ms, adding 7 ms of latency. The second frame is ready at 26 ms, so it's displayed at 30 ms, and the third (ready at 39 ms) goes out at 40 ms. The fourth misses the 50 ms refresh and is shown at 60 ms, so the pattern settles into three new frames every four refreshes, with one refresh repeating the previous frame. That works out to roughly 75 Hz, not 50 Hz.

Granted there is stuttering and you need triple buffering, but with a high enough refresh rate the result should still be pretty good.

Note that I'm not arguing against G-Sync, just pointing out that having the frame rate cut in half is an oversimplification of what current technology is capable of.
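
To make the pattern easier to check, here's a quick sketch of my own (not from anyone in the thread) that simulates the numbers above: a fixed 100 Hz refresh, a GPU taking ~13.3 ms per frame, and triple buffering so the GPU never stalls. Each finished frame is shown at the next vblank.

[code]
# Sketch of vsync presentation with triple buffering. Assumes the GPU is never
# blocked, which holds here because it renders slower than the display refreshes.
REFRESH_MS = 10.0          # 100 Hz panel
FRAME_MS = 1000.0 / 75.0   # ~13.3 ms render time (the "75 Hz" example above)

def effective_fps(duration_ms=10_000.0):
    presented = 0
    render_done = FRAME_MS      # when the first frame finishes
    next_refresh = REFRESH_MS   # next vblank
    while next_refresh <= duration_ms:
        if render_done <= next_refresh:
            presented += 1              # a new frame is ready: flip it this vblank
            render_done += FRAME_MS     # GPU has already moved on to the next frame
        # else: this vblank repeats the previous frame (seen as a stutter)
        next_refresh += REFRESH_MS
    return presented / (duration_ms / 1000.0)

print(effective_fps())  # roughly 75 fps on average, not 50
[/code]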
 
I've seen multiple people mention frame rate getting cut in half when the refresh rate isn't met, but this isn't necessarily the case.
It's the case for the instantaneous frame rate. :p
But, yes, averaged over time, it's not the case for triple buffering.
 
Yes, that ridiculous habit of talking in frequency instead of duration or period is beyond me.

Your screen has a frequency (dumbly inherited from CRTs, it's true; I never got why DVI-D didn't get rid of it, let alone HDMI and DisplayPort), and your frames have a duration (time to compute).
Picture a 60/120/144 Hz signal and plot frame times against it to see how things may go; talking about instantaneous frequency is dumb.
 
... and CRT TVs with HDMI.
My father-in-law has one of these, and it's one of the irritating ones where the sound input doesn't work on that port.

As a current AMD owner, I'm super-excited for this g-sync tech and I hope it (or the logical successor to it) becomes a standard implementation. This is the way digital displays should've worked, without question. Here's hoping that the right people catch on and that it moves to ubiquity soon...
 
So does anyone know what happens when your rendering speed exceeds your monitor's refresh rate under G-Sync? Let's say you have a 60 Hz monitor and your frame rate is 100-130 fps?
 
So does anyone know what happens when your rendering speed exceeds your monitor's refresh rate under G-Sync? Let's say you have a 60 Hz monitor and your frame rate is 100-130 fps?

The way I understand it, the "60 Hz monitor" doesn't really exist under G-Sync.

I am sure the monitor has a maximum refresh rate, though. With a CRT you are limited by the speed of the electron beam that scans out the picture, but you don't have the same limitation with an LCD. As a matter of fact, with backlit LCD monitors I think you would be limited only by the switching speed of the crystals themselves plus any processing needed to drive the signal across the panel. I think it is theoretically possible to build LCDs with 600 Hz refresh rates today, but it isn't done because it isn't useful.

Put another way, for most games it would be pretty hard to hit the actual refresh limit of an LCD monitor that isn't arbitrarily tied to a given frequency. That said, I'm sure there is a maximum built into the hardware that caps the framerate beyond a certain point (much like you can currently lock your fps to 60 in most games).
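
If it helps to picture that cap, here's a tiny model of how I imagine a variable-refresh panel with a hard maximum would pace frames - purely an assumption on my part, not anything NVIDIA has documented: each frame is scanned out as soon as it's ready, but never sooner than the panel's minimum refresh interval after the previous one.

[code]
# Hypothetical model of a variable-refresh display with a maximum refresh rate.
# This is my own assumption, not NVIDIA's documented G-Sync behaviour.
MAX_REFRESH_HZ = 144
MIN_INTERVAL_MS = 1000.0 / MAX_REFRESH_HZ   # panel can't refresh faster than this

def present_times(frame_ready_times_ms):
    """Return when each frame would actually be scanned out."""
    out = []
    last = float("-inf")
    for ready in frame_ready_times_ms:
        # Show the frame when it's ready, but wait for the panel if the GPU is too fast.
        scanout = max(ready, last + MIN_INTERVAL_MS)
        out.append(scanout)
        last = scanout
    return out

# A GPU pumping out frames every 5 ms (200 fps) gets paced down to ~144 Hz:
print(present_times([5, 10, 15, 20, 25, 30]))
[/code]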
 
So does anyone know what happens when your rendering speed exceeds your monitor's refresh rate under G-Sync? Let's say you have a 60 Hz monitor and your frame rate is 100-130 fps?

Adaptive V-Sync already deals with locking/syncing when you reach the max framerate, and allowing an arbitrary framerate when you don't.
G-Sync can probably leverage that existing, already-working driver code and way of doing things.

If for some reason nothing works, e.g. an ill-behaved or incompatible old game, there's always the fallback to vsync off or maybe vsync on.
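
For what it's worth, this is roughly how I understand the Adaptive V-Sync idea; the function names here are hypothetical placeholders, not a real driver API, and the actual driver logic may well differ.

[code]
# Rough sketch of the Adaptive V-Sync idea: sync when the game can keep up with
# the panel's maximum, run unconstrained (accepting tearing) when it can't.
# set_vsync() and swap_buffers() are hypothetical placeholders, not a real API.
MAX_REFRESH_HZ = 60

def on_frame_finished(last_frame_time_ms, set_vsync, swap_buffers):
    fps = 1000.0 / last_frame_time_ms
    if fps >= MAX_REFRESH_HZ:
        set_vsync(True)    # at or above the cap: wait for vblank, no tearing
    else:
        set_vsync(False)   # below the cap: present immediately, avoid the 60 -> 30 drop
    swap_buffers()
[/code]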
 