AMD's "FreeSync" could come to next gen consoles?

Yeah, that's what I was getting at. Some games that use the timing data from the predetermined buffer flip to run as intended might not play nice with a "driver control panel" style override enforced in an update.

Those would be fine either way. If they're locked at 30 fps currently, they'll still be locked at 30 fps with adaptive sync enabled. If they're uncapped, they'll still be uncapped. It's no different than adaptive sync on PC. Some games ported from console to PC have their physics and animation tied to the frame rate and are locked at 30 fps. Whether adaptive sync is on or off doesn't change that.

If Microsoft or Sony enable adaptive sync in the OS, then it'll get rid of tearing as long as the display supports adaptive sync and the frame rate stays within the display's supported adaptive sync range. No changes have to be made to any games.

Regards,
SB
 
No changes have to be made to any games.

Yeah, thx, after reading your post and thinking some more about it, you're actually probably right :yes::smile2:.

That would make adding FreeSync support to PS4 and Xbox One even more of a no-brainer :cool:.

Sony and Microsoft really should do it, it would be awesome.
 
Those would be fine either way. If they're locked at 30 fps currently, they'll still be locked at 30 fps with adaptive sync enabled. If they're uncapped, they'll still be uncapped. It's no different than adaptive sync on PC. Some games ported from console to PC have their physics and animation tied to the frame rate and are locked at 30 fps. Whether adaptive sync is on or off doesn't change that.

If Microsoft or Sony enable adaptive sync in the OS, then it'll get rid of tearing as long as the display supports adaptive sync and the frame rate stays within the display's supported adaptive sync range. No changes have to be made to any games.

Some games, for example, query the GPU for the time scan-out begins, to determine things such as how long the last frame took so the next one can be adapted (AA solution, resolution cuts, etc.), or to know exactly where in the frame (vertical position) a tear will occur, so the game can tear immediately and then wait for a less sensitive part of the screen.

Console games can be more tightly tied to the timing of the output hardware than PC games. If you change the timing of the outputs, you can fundamentally change the behaviour of the software. What if a display defaults to cycling at 144 Hz? 75 Hz? What if your game is hardwired to assume two updates = 33 ms? Games won't even have been tested for compatibility with non-60 Hz outputs, or with the absence of a rigid output period.

Not everything from the PC experience of FreeSync will translate directly to a hypothetical FreeSynced PS4Bone. Different assumptions are made when writing and testing the software. There would need to be a 60 Hz locked legacy mode to guarantee software compatibility.
 
Again, nothing would need to be changed. Unless you think there's suddenly going to be industry-standard TVs that don't conform to either 60 Hz or 50 Hz? Adaptive sync doesn't change what a display's EDID will report or support with regards to an HDMI connection. A 60 Hz adaptive sync TV still supports 60 Hz, so a developer can still target a display with 60 Hz timing. When the display actually refreshes relative to when video out occurs is irrelevant to the program in this case, because it's still based on 60 Hz.

What you describe is certainly an issue on PC, where you have 144 Hz monitors that actually do have 144 Hz timings with regards to video out from a connected device. But for televisions, even 120 Hz TVs still conform to 60 Hz timings, because the devices they are hooked up to are limited to 60 Hz, except in the case of HTPCs (which then find themselves limited to 60 Hz and not 120 or 240 or whatever). That isn't changing, and adaptive sync won't change that either.

What you are describing will only happen if the industry standard for televisions changes such that both devices that output video and devices that accept video change to enable higher refresh rates. And even then, all the consoles have to do is limit what is reported to legacy games to the old standard of 60 Hz.

Regards,
SB
 
What you describe is certainly an issue on PC, where you have 144 Hz monitors that actually do have 144 Hz timings with regards to video out from a connected device. But for televisions, even 120 Hz TVs still conform to 60 Hz timings, because the devices they are hooked up to are limited to 60 Hz, except in the case of HTPCs (which then find themselves limited to 60 Hz and not 120 or 240 or whatever). That isn't changing, and adaptive sync won't change that either.

What you are describing will only happen if the industry standard for televisions changes such that both devices that output video and devices that accept video change to enable higher refresh rates. And even then, all the consoles have to do is limit what is reported to legacy games to the old standard of 60 Hz.

Well, there already are some HDTVs and UHDTVs that can do up to 1920x1080 @ 120 Hz natively, see:

http://www.blurbusters.com/overclock/120hz-pc-to-tv/

;)

By the way, found the following page on AMD's website:

http://support.amd.com/en-us/search/faq/219

Looks like PS4 and Xbox One should be good to go for FreeSync, shouldn't they?
 