AMD's "freesync" could come to next gen consoles?

Firmware updates? Pft... Future product sales opportunity, more likely. :rolleyes:

Anyways, I'd rather devs focus on a consistent input & visual response.
 

Since when does one answer a multiple-choice question with "Yes." or "No."?

Anyway, just found this recent video where one of the AMD guys says:

Robert Hallock said:
[...] Any AMD Radeon graphics card today that supports FreeSync over DisplayPort, will also be able to do it over HDMI. You don't need any special cables. All you would need is an HDMI monitor that supports FreeSync in the same way that today you'd need a DisplayPort enabled monitor that supports AMD FreeSync. [...]


:smile2:

So, let's hope the Radeon GPUs in the PS4 and Xbox One are amongst the ones that already support FreeSync, so that they could offer FreeSync support as soon as HDMI FreeSync monitors become available (though that would probably still require software updates for both the PS4 and Xbox One).

Let's hope Sony and Microsoft will do it :smile2:.
 
Not sure how that would affect existing games. Any games that use the buffer-flip timing to adapt post-processing or resolution might have problems, unless there was a legacy mode?
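
To make that concern concrete, here's a purely hypothetical sketch (every name is invented, not from any console SDK) of the kind of heuristic such a game might use. The whole thing hinges on the flip interval being fixed, which is exactly the assumption a system-level variable-refresh override would quietly break:

Code:
#include <algorithm>

// Hypothetical dynamic-resolution heuristic: the game assumes a fixed
// v-sync interval and scales its render resolution based on how close
// each frame came to missing the flip. All names here are invented.
class DynamicResolutionScaler {
public:
    explicit DynamicResolutionScaler(double vsyncIntervalMs)
        : budgetMs_(vsyncIntervalMs) {}

    // Called once per frame with the measured GPU time for that frame.
    void OnFrameComplete(double gpuTimeMs) {
        if (gpuTimeMs > budgetMs_ * 0.95)       // nearly missed the flip
            scale_ = std::max(0.5, scale_ - 0.05);
        else if (gpuTimeMs < budgetMs_ * 0.75)  // plenty of headroom
            scale_ = std::min(1.0, scale_ + 0.01);
    }

    double Scale() const { return scale_; }  // fraction of native resolution

private:
    double budgetMs_;    // assumes the display flips at a FIXED interval
    double scale_ = 1.0;
};

Under a forced variable-refresh mode, that "budget" stops being a hard deadline, so a scaler like this could end up chasing a target that no longer exists.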

I'm guessing this is something to look forward to for next gen rather than this.
 
Not sure how that would affect existing games. Any games that use the buffer-flip timing to adapt post-processing or resolution might have problems, unless there was a legacy mode?

Why would it affect existing games at all? It wouldn't.

On the PC you need to enable FreeSync in the driver control panel AFAIK.

On the consoles though, the games themselves would probably have to enable FreeSync via their own game code.

So only new games that enable FreeSync in their game code would use it. And existing games would still work as before without any difference, unless the devs patched them to enable FreeSync.

Simple as that and no issue at all (I guess).
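
If it did work that way, the opt-in could be as small as a capability check at boot. A purely illustrative sketch (neither Sony nor Microsoft has published anything like this, so every identifier below is invented):

Code:
// Invented pseudo-API for a per-game FreeSync opt-in.
struct DisplayCaps {
    bool variableRefreshSupported;  // false on ordinary fixed-refresh TVs
    int  minRefreshHz;
    int  maxRefreshHz;
};

bool TryEnableVariableRefresh(const DisplayCaps& caps) {
    // An old, unpatched game simply never calls this, so it keeps the
    // fixed-refresh v-sync behaviour it shipped with.
    if (!caps.variableRefreshSupported)
        return false;  // fall back to plain v-sync
    // A new or patched game opts in explicitly at this point, e.g.:
    // system::SetPresentMode(PresentMode::VariableRefresh);  // invented call
    return true;
}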

I'm guessing this is something to look forward to for next gen rather than this.

A little bit more optimism, please. If even the users doubt this could happen, then the manufacturers will probably just go along with that doubt and not bother.

So:

Believe!

:yes:
 
since FreeSync would bring both: more consistent input and visual response.
I see FreeSync as a solution for tearing, but not necessarily as a way of capping framerates to something sensible.

While it opens things up for devs to choose a higher cap than 30 but lower than 60, the tech itself doesn't lend itself to consistency. Consistency is still something devs will have to choose to target, so it will still be a question of 30 fps (& backward compatibility for older displays) + better visuals, or a higher framerate + all the associated caveats.

i.e. the devs still have to aim for a target (lest you have wildly varying framerates, sans tearing & double buffering). F/G-sync is just a ticket out of triple buffering or tearing.

It's easier on PC where you can just throw everything you've got at the game & decide what framerate to cap it at (w/ drivers, 3rd party tools etc.) to avoid the variance in frametimes, but console games still need to be tailored.
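
For what it's worth, the "decide what framerate to cap it at" part is cheap to sketch. A minimal example of a sleep-based limiter, with a hypothetical UpdateAndRender() standing in for the real per-frame work:

Code:
#include <chrono>
#include <thread>

// Pace every frame to a 45 fps target (a cap that only makes sense on a
// variable-refresh display) so frametimes stay even instead of bouncing
// around with the load.
int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / 45.0));  // ~22.2 ms per frame

    auto deadline = clock::now() + frameBudget;
    for (int frame = 0; frame < 450; ++frame) {  // stand-in for the main loop
        // UpdateAndRender();  // hypothetical per-frame work

        // Sleep until the deadline so frames are presented evenly rather
        // than as fast as the hardware happens to finish each one.
        std::this_thread::sleep_until(deadline);
        deadline += frameBudget;
    }
}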
 
Also, Adaptive Sync doesn't address control-input variability, only display anomalies caused by variable frame rates. It's a bandage for something many developers are unwilling to address in their games (a virtually locked 60/30 on consoles) and/or for people who won't pick sensible graphics options for their games on PC (hence the potentially wildly variable frame rates).

On PC I always adjust games for responsive controls, which ultimately means that everything is at least 60 Hz. Hence Adaptive Sync, while nice if I wanted to push to see how good a game could look, is ultimately pointless for me as I'll be at the refresh cap for my monitor regardless.

On console, it could offer at least more visual consistency when a developer isn't very good and/or disciplined enough to code their game such that it runs at a consistent 60/30 Hz. But that would still leave the games with unsatisfactory control input. Which, along with many games being 30 fps (hence crap controls anyway), is why I rarely game on consoles anymore.

Regards,
SB
 
I see FreeSync as a solution for tearing, but not necessarily as a way of capping framerates to something sensible.

While it opens things up for devs to choose a higher cap than 30 but lower than 60

That would not be the only benefit of FreeSync.

It also improves situations where a game is capped at, say, 60 fps but occasionally drops to 52 fps.

Without FreeSync that is a big issue and very noticeable. But with FreeSync it's much less noticeable and much smoother.

The same would go for games capped at 30 fps but occasionally dropping to, say, 28 fps. With FreeSync (and AMD's Low Framerate Compensation) this would be much less noticeable and a lot smoother. Not that anyone would want a game to drop to 28 fps, but just saying.
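
To put rough numbers on those two cases (the 40 Hz panel minimum below is just an assumed figure; actual FreeSync ranges differ per monitor):

Code:
#include <cstdio>

// Back-of-the-envelope frametime numbers for the cases above.
int main() {
    // 52 fps on a fixed 60 Hz display: frames can only be shown on 16.7 ms
    // boundaries, so the stream becomes an uneven mix of 16.7 and 33.3 ms.
    std::printf("52 fps, v-sync @ 60 Hz: frames held for 16.7 or 33.3 ms\n");
    std::printf("52 fps, FreeSync      : every frame held for %.1f ms\n",
                1000.0 / 52.0);  // 19.2 ms, perfectly even

    // 28 fps is below the assumed 40 Hz panel minimum (25 ms max interval),
    // so Low Framerate Compensation refreshes the panel twice per frame.
    std::printf("28 fps with LFC       : panel runs at %.0f Hz, each frame "
                "scanned out twice\n", 28.0 * 2.0);
}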

And FreeSync also helps with input lag, since a finished frame can be displayed immediately instead of waiting for the next fixed refresh boundary.

F/G-sync is just a ticket out of triple buffering or tearing.

You make it sound as if that would not be a huge benefit, even though it definitely would be.
 
Can the software (the game) detect if FreeSync is enabled?

If not, then for games using any type of v-sync, a toggle to disable it shouldn't be that hard. Some console games already have one.

/start_dreaming
There you go: a super easy solution for each case, everyone is happy, and every console game can benefit from adaptive sync. :yes:
/end_dreaming

:no:
 
The way I see it, the more freedom the developers have in using this or that tech, the better.

If a game with the desired IQ is capable of running on the hardware at 45 Hz (which is still 50% faster than 30) and with no tearing, why shouldn't it?
The current 60-or-30 dilemma seems like an awfully limited set of choices.
 
It also improves situations where a game is capped at, say, 60 fps but occasionally drops to 52 fps.

It solves variation in input response?

You make it sound as if that would not be a huge benefit, even though it definitely would be.

My point is that the devs should still develop with a target in mind and try to keep it, otherwise you'd simply have the game fluctuating from 0-60, just without tearing.

If a game with the desired IQ is capable of running on the hardware at 45 Hz (which is still 50% faster than 30) and with no tearing, why shouldn't it?
The current 60-or-30 dilemma seems like an awfully limited set of choices.

Of course. They clearly do have more choices. They still need to strive for consistency at something though.
 
Well, essentially you're no worse off than turning off v-sync, so if the framerate dives, you're still dealing with the lag that comes with that.

So sure, less egregious, but I'd rather the devs still strive for consistency at a target.
 
Why would it affect existing games at all? It wouldn't.

On the PC you need to enable FreeSync in the driver control panel AFAIK.

On the consoles though, the games themselves would probably have to enable FreeSync via their own game code.

So only new games that enable FreeSync in their game code would use it. And existing games would still work as before without any difference, unless the devs patched them to enable FreeSync.

Yeah, that's what I was getting at. Some games - the ones that use the timing data from the predetermined buffer flip to run as intended - might not play nice with a "driver control panel" style override enforced in an update.

So it might need to be something developers enable optionally in new or updated games. Trouble is, it might also need a hardware revision - then you're feature-splitting the systems and asking developers to potentially pass two sets of testing.

A little bit more optimism, please. If even the users doubt this could happen, then the manufacturers will probably just go along with that doubt and not bother.

It's probably the kind of thing that'd get enabled if it was there, especially if one manufacturer could use it for bragging rights or one was scared the other would.

Thing is, it's an extension to HDMI at the moment (and so there's little motivation to support it) and PS4Bone were set in stone a long time ago. Their HDMI might not even have been developed with something like this in mind.
 
Well, essentially you're no worse off than turning off v-sync, so if the framerate dives, you're still dealing with the lag that comes with that.

Yes, this. This is why professional FPS gamers (when I was still semi-professional at least) wouldn't just turn off v-sync. They'd turn down graphics settings as low as possible to get the highest FPS possible in order to have the most consistent and responsive control input.

Regards,
SB
 
Well, essentially you're no worse off than turning off v-sync, so if the framerate dives, you're still dealing with the lag that comes with that.

So sure, less egregious, but I'd rather the devs still strive for consistency at a target.

Would certainly help with that 16 ms <-> 33 ms <-> 16 ms judder though! I've nobbed around on a G-sync laptop(!) and it continued to appear smooth into the 40s (frame rates I'd normally associate with being a little bit juddery). Can't say it felt more responsive, as such, but it definitely felt nicer and 'cleaner' in motion (your eyes could track things a touch more comfortably).
 