AMD demonstrates Freesync, G-sync equivalent?

I'm curious if Twitch streaming and H.264 recording support variable frame rate as well. Because if not, I'm not sure how well the world will solve this discrepancy: you play, and all is nice, then you watch your own playback, and it goes to hell.
If you sample the frames on the GPU side at a fixed rate and use those to encode (while still sending a variable rate to the monitor, of course), you'd revert right back to a vsync-on situation.
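To illustrate, a rough Python sketch (made-up names, not any real capture API): sampling variably-timed frames onto a fixed encode clock duplicates some frames and drops others, which is exactly the vsync-on cadence.

```python
def resample_for_encode(frame_times, duration, encode_hz=60.0):
    """frame_times: sorted presentation timestamps in seconds."""
    encoded = []
    frame_idx = 0
    n_ticks = int(round(duration * encode_hz))
    for i in range(n_ticks):
        t = i / encode_hz
        # advance to the newest frame presented at or before this encode tick
        while frame_idx + 1 < len(frame_times) and frame_times[frame_idx + 1] <= t:
            frame_idx += 1
        encoded.append(frame_idx)  # encode (a copy of) that frame
    return encoded

# Frames presented at irregular times (variable refresh)...
print(resample_for_encode([0.0, 0.021, 0.037, 0.060, 0.090], duration=0.1))
# -> [0, 0, 1, 2, 3, 3]: frame 3 gets duplicated, frame 4 is dropped in this window
```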
 
Indeed. In effect, perceived and reported "butter-smooth" and "non-tearing" will be setup-specific, time-local incidents, which can't be distributed to other f-sync fellows. If, for example, Raptr were able to pass the individual frame timings to the capture medium, then I believe f-sync would become a really contagious experience.
 
It has more to do with the container than the encoding format. MKV, for example, supports variable frame rate. But recording/storage is one thing, whether the player supports replaying variable frame rate is another.
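For illustration, a minimal Python sketch (hypothetical names, not a real muxer API) of what a VFR-capable container boils down to: each frame carries its own presentation timestamp instead of an implied fixed rate.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    frame_id: int
    pts_ms: int  # presentation timestamp in milliseconds

def mux_vfr(frame_times_s):
    """Store each frame with its own timestamp instead of assuming a fixed rate."""
    return [Packet(i, round(t * 1000)) for i, t in enumerate(frame_times_s)]

packets = mux_vfr([0.0, 0.021, 0.037, 0.060, 0.090])
# A player that honours per-packet PTS can replay the original cadence;
# one that assumes a constant rate cannot.
```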
 
Once it is in the container, it's possible to cure the display side any time you get an f-sync monitor. Imagine if all images had been stored at 6 bits because TFTs didn't support any more. Uhm, well, at times I do suffer from 8-bit images on my 10-bit display. :p Storing the frame times is comparatively cheap.
 
I'm guessing that if a video of a FreeSync monitor running looks OK, streaming will be OK.

It will be "okay" the same as now. But f-sync wanted to solve something, and preferably not just for your own subjective moment in time. If the experience is shareable, I think it will produce much-needed enthusiasm for accelerated adoption. Same goes for VR. If you could stream 3D to your buddies, it would produce a great deal of additional enthusiasm. But that is much more difficult than passing "just" variable frame times.
 
Terminology clarification: monitors are "Adaptive Sync". FreeSync is what AMD GPUs do. Monitors may be FreeSync certified, but they don't have to be, in order to work with FreeSync.
 
Maybe judder-free motion will encourage devs to deactivate motion blur in some games.

Theoretically, the dose of motion blur should be dependent on the actual framerate, and it can be used in some games to hide judder (replaced by an artistic blur). :yep2:
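A rough Python sketch of that idea (the numbers and the shutter_fraction parameter are made up): the blur streak length scales with the actual frame time, so slower frames get proportionally more blur.

```python
def blur_length(pixel_velocity, frame_time_s, shutter_fraction=0.5):
    """Blur streak length in pixels for this frame ("180-degree shutter" = 0.5)."""
    return pixel_velocity * frame_time_s * shutter_fraction

print(blur_length(1200.0, 1 / 60))  # ~10 px at 60 fps
print(blur_length(1200.0, 1 / 30))  # ~20 px at 30 fps: more blur when slower
```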
 
As I understand FreeSync, it reverts to the monitor's default refresh setting (60, 75, 120 Hz or whatever it is) when the framerate gets below a certain fps. It seems in most cases to be around 30-40 Hz/fps. In these cases the FreeSync control setting can handle this in two different ways.

1. Let v-sync take over = good against tearing, but bad for input lag.
2. V-sync off = good for input lag, but bad for tearing.

Correct?

But here's a thought.
Let's say you have a monitor that can do variable refresh between 40 and 80 Hz. Instead of going v-sync at 39 fps, why not just double the refresh rate to 78 Hz?
This way you could, in theory, even have a decent experience at 25 fps, since the refresh rate would then be 50 Hz, without the risk of flickering or dropouts.

Yes?
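For what it's worth, a Python sketch of the doubling idea described above, assuming a hypothetical 40-80 Hz panel range; it's just the arithmetic of the suggestion, not how FreeSync is specified.

```python
PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 80  # hypothetical panel range

def refresh_for(fps):
    """Pick a refresh rate inside the panel range by repeating frames."""
    refresh = fps
    repeats = 1
    while refresh < PANEL_MIN_HZ and refresh * 2 <= PANEL_MAX_HZ:
        refresh *= 2   # show every frame twice (or 4x, ...) instead of
        repeats *= 2   # falling back to fixed-rate v-sync
    return refresh, repeats

print(refresh_for(39))  # (78, 2): 39 fps shown at 78 Hz, each frame twice
print(refresh_for(25))  # (50, 2): 25 fps shown at 50 Hz
print(refresh_for(60))  # (60, 1): already inside the range
```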
 
No. Free-sync == variable v-sync. It passes the frame-present event as a clock signal to the monitor, refreshing the contents of the screen at those times. Whether the monitor has to pulse the display panel to maintain a consistent state between content refreshes is somewhat unrelated and up to the monitor producer. Whether this requires extra frame buffers in the monitor, or whether the display-panel matrix is used as a frame buffer, is also up to the producer. In theory, games could render a frame ahead and determine the variable time a frame has to be displayed for a correct variable shutter.
 
To add to the previous post, the mechanism is such that during display initialization (boot-up or resolution change) FreeSync reads the full range of timings available to the panel from the EDID and stores it; if a frame finishes rendering anywhere within that interval, the VBLANK signal is sent and the displayed frame is updated.
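A minimal Python sketch of that mechanism as described (hypothetical names, not actual driver code): read the panel's range from the EDID once, then refresh whenever a frame lands inside the allowed window.

```python
class VrrScheduler:
    def __init__(self, edid_min_hz, edid_max_hz):
        # read once at boot-up / mode change
        self.min_frame_s = 1.0 / edid_max_hz  # earliest allowed refresh
        self.max_frame_s = 1.0 / edid_min_hz  # latest allowed refresh
        self.last_refresh = 0.0

    def on_frame_ready(self, now):
        elapsed = now - self.last_refresh
        if elapsed < self.min_frame_s:
            return "hold until the minimum frame time has passed"
        if elapsed > self.max_frame_s:
            return "panel already had to self-refresh; send on the next slot"
        self.last_refresh = now
        return "send VBLANK now, frame displayed immediately"

sched = VrrScheduler(edid_min_hz=40, edid_max_hz=144)
print(sched.on_frame_ready(0.013))  # 13 ms after the last refresh -> inside the window
```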
 
You need 1/39 of a second to compute that image, but would want to display it within 1/78 of a second? Sorry to put it in those terms.

Ethatron's theoretical solution is interesting, but... aren't you introducing a frame of latency where the latency is variable? So the input lag gets dynamic, and that feels pretty unholy :p
 
@Blazkowicz Input lag is already dynamic because it includes the delta-time between render-started and frame-shown. Yes, you get one frame more lag, but the lag is not more dynamic. You'd get perfect double-buffer rendering, nothing magically more, with all the inherent disadvantages of DB. :)
If you want to look into some magic, then you could take the latency of input devices into account - I mean the latency to actually change the direction of your input, which is pretty high for an analogue stick, for example. Even key-up/down is not instantaneous, and a keyboard that measures key presses analogously would know that a key will be pressed before it passes over the activation threshold - so you could predict the input for the next frame. It might need a bit of R&D, but the limitations in the reaction of input devices and humans seem significant enough to try to make something out of it.
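Purely a sketch of that prediction idea, in Python (everything here is hypothetical, including the analogue-travel keyboard): extrapolate the key travel to estimate when it will cross the activation threshold.

```python
def predict_activation(travel_samples, threshold=0.6):
    """travel_samples: list of (time_s, key_travel 0..1). Linear extrapolation."""
    (t0, x0), (t1, x1) = travel_samples[-2], travel_samples[-1]
    velocity = (x1 - x0) / (t1 - t0)
    if velocity <= 0 or x1 >= threshold:
        return None  # key is not moving toward activation (or already past it)
    return t1 + (threshold - x1) / velocity  # predicted activation time

# Key is 30% pressed and moving down quickly: predicted to cross the
# threshold roughly 3 ms after the last sample, before it actually fires.
print(predict_activation([(0.000, 0.10), (0.002, 0.30)]))
```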
 

Alright.
I use a 290X with a 120 Hz Eizo Foris monitor, which I set to cap at 60 fps via frame limiting in RivaTuner server. It turns out that I get the overall "smoothest" gaming experience this way. Of course, it would have been even better if I could cap at 120 fps, but newer AAA titles are very demanding and sometimes unoptimized (I'm looking at you, Far Cry 4).

Anyway, since I cap at 60 fps while using a 120 Hz monitor, the monitor refreshes at twice the speed my graphics card can spit out new frames, right? But how does that actually work? Is it the monitor's hardware that is smart enough to know when to "reuse" the last frame? Or is it the graphics card that double-buffers? Otherwise I would experience flickering, right?
 
The monitor reports to the GPU that it's a 120Hz capable monitor. When the user selects a 120Hz refresh rate, the GPU has to send images to the monitor at 120Hz, at precise timings, otherwise the monitor may complain.

Since you've set a frame cap at 60 fps, the front end of the GPU will throttle rendering, but the back end will send each image twice.

So, yes, you have two buffers, one to display and one to render to, but I don't think that's generally considered double buffering? IMO double buffering has always been reserved for the case where vsync is enabled. When you use a frame limiter, you're not using vsync.
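A toy Python sketch of that scan-out behaviour (not actual driver code): with a fixed 120 Hz refresh and a 60 fps cap, the back end simply re-sends the front buffer, so each frame goes out twice.

```python
REFRESH_HZ = 120
FPS_CAP = 60

front_buffer = None
scanned_out = []

for tick in range(8):                        # eight 120 Hz refresh slots
    if tick % (REFRESH_HZ // FPS_CAP) == 0:  # a new frame is ready every 2nd slot
        front_buffer = f"frame{tick // 2}"
    scanned_out.append(front_buffer)         # otherwise re-send the same buffer

print(scanned_out)
# ['frame0', 'frame0', 'frame1', 'frame1', 'frame2', 'frame2', 'frame3', 'frame3']
```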
 
Thanks for clarifying, Silent Guy :)
So, if I understand this correctly, you could, at least in theory, use the same technique to reuse the last displayed frame in my earlier 39 fps / 78 Hz example in a similar way. But in this case each specific monitor's logic has to tell the graphics driver that it will need a doubled frame across a wide span of frame-rate cycles.

I realize that this is probably not what FreeSync is. But maybe it could work.
 
I'm not sure what it is or how it works, but I got to see a live demo of it on my recent visit to AMD and it was most impressive to me! The smoothness of v-sync without the harsh framerate restrictions of v-sync is what it looks like in real life, and I likes it!
 
Looks like BenQ is the first one to actually deliver FreeSync; their XL2730Z is already listed in several European stores, though not yet available. In Finland some sites estimate availability at somewhere between 12 and 31 days.
 

Most monitors presented at CES should be available before the end of the quarter, so that means somewhere between today and March. That said, AMD is not controlling the retail process; it's the brands that manufacture those monitors who control it, of course.

I'm happy to see BenQ open the fight anyway...

On another note...


I don't want to troll or create controversy, but this article from this troll site may be interesting to note: http://wccftech.com/nvidia-gsync-mobility-confirmed-require-dedicated-module-raises-questions/

Remember that AMD used a laptop for that first demonstration because eDP already supported Adaptive-Sync. At CES, NVIDIA partners showed a laptop powered by G-Sync. Remember that NVIDIA has confirmed, again and again, that they will not support Adaptive-Sync from the DisplayPort VESA standard. That raises a question: is the laptop powered by G-Sync using a G-Sync module, or just Adaptive-Sync from eDP?
 