Is adaptive vsync going to save next-gen consoles?

Do you think adaptive vsync will help consoles against PCs?

  • Yes

    Votes: 4 30.8%
  • No

    Votes: 5 38.5%
  • I dunno

    Votes: 4 30.8%

  • Total voters: 13

Globalisateur

The main advantage of PC gaming is probably the possibility of having 60fps in AAA games. I am kind of tired of blurry 30fps console gaming, coupled with high input lag, and I think that ~45fps without judder could be an ideal compromise.

For instance, even the judder-heavy Infamous FL at ~45fps is much better than the 30fps-capped game. I can't even imagine how great this game would be with an adaptive framerate!

I think it's a pity most AAA games are capped at 30fps, because many of them run at a solid or even nearly locked framerate, so we lose a lot of frames and some responsiveness just to have less judder (which is a bad compromise IMO).

Take The Division and Far Cry Primal on consoles. Those games are basically locked at 30fps. That's really a shame, because in that type of open-world game we know from experience (PC tests) that a locked 30fps could well mean a game that would often run at ~40 or ~45fps uncapped, and often higher.

That's where an adaptive framerate could really help console gaming a lot, more than PC (PC already has it anyway): a judder-free framerate of ~45fps means 50% more frames and lower input lag... and it's totally APU-free!

That could make next-gen console gameplay 'feel' like 60fps PC gaming in all AAA games and encourage customers to keep buying those consoles instead of buying a new GPU. I know it would for me. I think the gap from 30fps to 45fps is much more significant than from 45fps to 60fps, because of the law of diminishing returns.
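To put some rough numbers on that (just a back-of-envelope sketch, assuming a 60Hz panel for the fixed-refresh case):

```python
# Back-of-envelope frame-time arithmetic for the 30 -> 45 -> 60 fps argument.

def frame_time_ms(fps):
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 45, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# Going 30 -> 45 fps shaves ~11.1 ms off every frame,
# going 45 -> 60 fps shaves only ~5.6 ms more: diminishing returns.
print("30 -> 45 saves", round(frame_time_ms(30) - frame_time_ms(45), 1), "ms")
print("45 -> 60 saves", round(frame_time_ms(45) - frame_time_ms(60), 1), "ms")

# On a fixed 60Hz panel, an uncapped ~45fps game has to hold each frame for
# either one or two refresh intervals (16.7 ms / 33.3 ms alternating): judder.
# With adaptive sync the panel refreshes when the frame is ready, so every
# frame is held for roughly its real render time (~22.2 ms): no judder.
```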

This technology will obviously be optional; FreeSync TV adoption will probably be slow, so devs will still be forced to target a solid 30fps, at least for the first batch of next-gen consoles.

So yes, in a way, I think adaptive framerates like FreeSync or G-Sync are going to help the next consoles be even more relevant against PCs than they are now. This gen, high-end PCs already have an impressive advantage: adaptive vsync. Console games tragically don't have it currently, and most probably won't get it before next gen.
 
I would say adaptive sync will not be relevant; I think the main selling point, from a numbers perspective, will be ~60fps @ 1080p (or so). We can already foresee what kind of improvements a 10nm process might yield (if next-gen consoles are even lucky enough with deadlines to use it; after Spencer's latest comments it's even more questionable). Current jumps in transistor scaling won't be enough to achieve 4K@60 with the jump in graphics/production values needed to give the general population a next-gen feeling, so devs will target accordingly.

Of course it would be a nice option, but console games are built with framerate targets, especially later iterations. As you said, TV adoption will be slow (if it happens at all, given its absence from TV roadmaps and the fact that a TV's main purpose is playing fixed-framerate video).

Personally I see adaptive sync as a kind of funny band-aid on the PC side. With that kind of spec difference (Jaguar + a downclocked 7870/7700 on a bandwidth-contested bus), you would expect even an i3 + 7870 to have no problem holding a locked 60fps in ports of console games that hold a steady 30, and lately that's not even true with much stronger configs.
 
While I'd like that to be the case, you need TV manufacturers to start supporting adaptive sync (G-Sync or FreeSync) now. Nobody is going to care if the active userbase of adaptive-sync displays is <10%. Another way to solve the same issue is to do what the guys at RAD did with The Order; quoting directly:
Pessino’s mentioned that the game’s fluid performance is a combination of two elements.

Firstly, not only is the 30 FPS framerate fully stable, with no dips under that threshold, but the development team also implemented post-processing filters aimed at making it feel even more fluid, like temporal anti-aliasing, which Pessino calls “extremely effective.”

Secondly, the game’s rendering frame rate and simulation frame rate (basically the frame rate at which all the inner workings of the gameplay are calculated) are the same, and there are zero frames of latency between the player’s input and the action on the screen.

Pessino mentioned that many games implement one, two, or even three frames of lag after the player inputs a command, and that makes the game feel more sluggish.

On the other hand, for The Order: 1886 Ready at Dawn decided not to resort to that development trick, so the player’s action is executed during the same frame as it is prompted (pretty much as it happens with the most responsive fighting games).

This, according to Pessino, requires somewhat more complex engineering compared to systems that involve a slightly delayed response, but it contributes to making the game feel much more fluid and responsive.

I'm guessing what he's saying is that because they know they are going to hit their target framerate 99.9% of the time, they are not buffering frames ahead (which would otherwise add input latency).
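A crude way to picture the difference (purely an illustrative pseudo-loop with made-up function names, not RAD's actual engine code):

```python
def loop_same_frame(poll_input, simulate, render, present, running):
    """Input is sampled, simulated and rendered inside the same frame:
    zero frames of added latency, as described for The Order."""
    while running():
        cmd = poll_input()      # read the pad at the start of the frame
        state = simulate(cmd)   # simulation runs at the same rate as rendering
        present(render(state))  # the player's press shows up this very frame


def loop_buffered(poll_input, simulate, render, present, running, depth=2):
    """Rendered frames are queued 'depth' deep before being shown: pacing is
    easier to keep smooth when frame times wobble, but every queued frame is
    one extra frame of lag between the press and the screen."""
    queue = []
    while running():
        cmd = poll_input()
        queue.append(render(simulate(cmd)))
        if len(queue) > depth:       # only flip once the queue has filled up,
            present(queue.pop(0))    # so what's shown is 'depth' frames old
```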
 
I would say adaptive sync will not be relevant; I think the main selling point, from a numbers perspective, will be ~60fps @ 1080p (or so). We can already foresee what kind of improvements a 10nm process might yield (if next-gen consoles are even lucky enough with deadlines to use it; after Spencer's latest comments it's even more questionable). Current jumps in transistor scaling won't be enough to achieve 4K@60 with the jump in graphics/production values needed to give the general population a next-gen feeling, so devs will target accordingly.

Of course it would be a nice option, but console games are built with framerate targets, especially later iterations. As you said, TV adoption will be slow (if it happens at all, given its absence from TV roadmaps and the fact that a TV's main purpose is playing fixed-framerate video).

Personally I see adaptive sync as a kind of funny band-aid on the PC side. With that kind of spec difference (Jaguar + a downclocked 7870/7700 on a bandwidth-contested bus), you would expect even an i3 + 7870 to have no problem holding a locked 60fps in ports of console games that hold a steady 30, and lately that's not even true with much stronger configs.

I agree with both bolded parts (so I don't fully understand your point). Current AAA locked-30fps games (like Destiny, Far Cry, The Division) already achieve ~40 or ~45fps at 1080p, and they will probably do the same at 4K next gen too. That's where adaptive framerates come to the rescue: it's 50% more frames for free, which is and will remain significant.

My point is that ~45fps + adaptive vsync is very similar to 60fps: it's the 'feel' of 60fps, and much better than a locked 30fps. An impressive upgrade from 30fps, and CPU/GPU free. Next gen, instead of blurry locked 30fps on consoles vs 60fps on PC, we'll have adaptive ~45fps on consoles and still 60fps on PCs.

While I'd like that to be the case, you need TV manufacturers to start supporting adaptive sync (G-Sync or FreeSync) now. Nobody is going to care if the active userbase of adaptive-sync displays is <10%. Another way to solve the same issue is to do what the guys at RAD did with The Order, quoting directly


I'm guessing what he's saying is that because they know they are going to hit their target framerate 99.9% of the time, they are not buffering frames ahead (which would otherwise add input latency).

Well first, next gen isn't here yet. I think it won't come before 2020 (at least for PS5). And at the beginning it will only be optional and won't change how devs make their games (capped at 30fps by default). All GPU manufacturers now include adaptive vsync by default, so I don't see how the GPUs inside future consoles couldn't have it. It would be a shame not to enable it on the output (HDMI 2.0 most probably). And AMD has already shown that FreeSync over HDMI is possible and quite easy, and it's already planned with all the major monitor companies.

http://wccftech.com/amd-talks-freesync-in-2016-displayport-1-3-hdmi-2-0a/

[slide: RTG_Page_29-635x357.jpg]


Low input lag at 30fps in some games like Destiny, Bloodborne or The Order is a start, but not enough. I think we need 50% more frames (on average) to bring clarity in motion up to an acceptable level and maybe reduce the need for over-the-top motion blur; hopefully :).
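Rough illustration of why the extra frames matter for motion clarity (back-of-envelope only, assuming a sample-and-hold panel with no strobing and eye-tracked motion):

```python
# On a sample-and-hold display a moving object smears across roughly
# (speed in px/s) * (time the frame is held on screen) pixels.
def hold_blur_px(speed_px_per_s, fps):
    return speed_px_per_s / fps

speed = 1800  # e.g. a fast pan crossing a 1080p screen in about a second
for fps in (30, 45, 60):
    print(f"{fps} fps: ~{hold_blur_px(speed, fps):.0f} px of smear per frame")
# 30 fps -> ~60 px, 45 fps -> ~40 px, 60 fps -> ~30 px:
# the 30 -> 45 step alone removes a third of the smear, for frames the game
# was often already rendering anyway.
```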

It could be a very interesting feature of future consoles: all games having adaptive vsync by default could well be a selling point; I know it would be for me. To put it another way: if PS5 doesn't ship with FreeSync enabled by default in all major AAA games, I will be very disappointed and might well not buy the console. Blurry 30fps at 4K will not be tolerable in 2020, I'm afraid. I'll most probably buy a PC equipped with adaptive framerates if that's the case.
 
Isn't the real problem here the fact that no TVs in the near or far future are expected to support any type of adaptive V-Sync? Not much point in consoles supporting it just for the few people who hook them up to new monitors, which must be a minuscule minority. Which consequently means that no, adaptive v-sync will not save next gen consoles as, even if they did support it, not many people would be taking advantage of it.

Unless I missed a memo and newer TVs will in fact support it, in which case I stand corrected. I have seen absolutely nothing regarding this and I read a lot about new TVs. 2016 panels for sure don't. These days it's all about 4K and HDR and never have I seen adaptive v-sync mentioned.
 
IMO, a very, very large percentage of console gamers is satisfied with how games look and play on consoles. Additionally, since there is no big market push for it, TV manufacturers would be very slow to adopt this feature [it needs new scalers], and users upgrade TVs even less often.

So no, adaptive vsync is not going to save next-gen consoles because they don't need saving.
 
I agree they don't need saving. Silly phrasing by globalisateur!

Regarding adoption of flexible syncing, is it expensive to include? In this chicken-and-egg situation, we need the consoles (and other devices) to add the feature so TV makers introduce some models that support it. Eventually it'd get some use. But if console companies wait for TVs to support it, it'll never happen.
 
I'll happily admit the thread title reflects my personal opinion on the matter. To others it may sound a bit clickbaity; the poll question is better suited to this subject, more neutral.
 
Sony can do it, the rest will follow. Not many will care though, sorry
There's actually a reason for Sony to do it, as they can integrate with their TVs and have a USP; they've already made a PlayStation TV before. Imagine if the Bravia line supported soft-sync when no one else did. Every PS buyer out there looking for a new TV would see a mass of very similar displays, all SmartTV-equipped and beautiful and covered in silly marketing buzzwords, and then one brand that actually offers something tangibly different for their hobby. I know that'd be a decent factor in my own purchasing decision. If nothing else they'd get a lot of marketing from the tech press covering the feature and telling everyone it's only available on Bravia sets.
 
As long as they stay out of OLED in their TV business, no serious IQ lover will think about Sony when buying a new TV. Not now that Panasonic is back in the game and Samsung is rumoured to rejoin, both of which will just open the gates to lower pricing and get everyone else producing their own OLED TVs.
LCD needs to die. Yesterday.
 
I have an off-topic curse, it seems.

One of my best mates has the new 55-inch LG OLED TV. The motion resolution is atrocious. Also, the LG OLED panel displays ghosting (not burn-in, actual ghosting). I took 60fps video of it; here is a screen cap (it's from the new X-Files, first episode).
[screen cap: 3zv2kcQ.jpg]

Also, you are looking at 50ms+ input lag, plus banding problems (not that big a deal to me) as well as screen artefacts in shadowy areas.

The Bravia W8 series has lower input lag, better color accuracy and higher motion resolution (with backlight strobing), and it costs less than a third of the price. OLED at this moment is not the be-all and end-all solution for televisions IMO. For movies, however...


On-topic: shouldn't it be included in every TV? Wasn't it part of the new HDMI or DP standard? And thus, shouldn't all certified TVs support that part of the spec? I will look into it.
 
As long as they stay out of OLED in their TV business, no serious IQ lover will think about Sony when buying a new TV. Not now that Panasonic is back in the game and Samsung is rumoured to rejoin, both of which will just open the gates to lower pricing and get everyone else producing their own OLED TVs.
LCD needs to die. Yesterday.
Well yeah, it's a case of all other things being equal. If Sony try to compete against better sets with just soft-sync, it probably won't work for them.
 
I think that's only FreeSync, and it specifically requires DisplayPort. I'm thinking this needs to become an industry standard at least a year before next gen starts in order for a move towards adaptive sync across all games to happen. Personally, I think the technology is excellent and will give devs more freedom.
 
I definitely think future consoles and TVs will support adaptive sync through HDMI, because why the hell not?
However, I voted "no" because I think adaptive sync will become standard in PCs + monitors well before the new consoles + supporting TVs arrive on the market, so it'll just be something everyone has.
 
The problem is that adoption of FreeSync TVs will be slow, so next gen people will still be playing on 1080p or 4K TVs that don't have the feature. Devs will design their games to run at fixed framerates just as they always have, because that's what most people will have.
 
Games designed for 60fps and 30fps now still have fluctuating framerates and could benefit from soft-sync. It shouldn't be too much to present an option to lower the refresh rate for those on soft-sync, to say 48Hz for a 60fps game, and offer a rock-steady refresh for those who prefer it.
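Something like that could even be surfaced as a simple option (hypothetical sketch; the function name, thresholds and VRR window are all made up):

```python
def pick_refresh_cap(worst_frame_ms, vrr_min_hz=40, vrr_max_hz=60):
    """Pick the highest steady refresh the game can actually hold inside the
    display's variable-refresh window, instead of hard-locking to 30 or 60."""
    sustainable_hz = 1000.0 / worst_frame_ms
    if sustainable_hz >= vrr_max_hz:
        return vrr_max_hz          # comfortably hits 60 -> keep the 60 cap
    if sustainable_hz < vrr_min_hz:
        return 30                  # below the VRR window -> classic 30 cap
    return int(sustainable_hz)     # e.g. ~21 ms worst frames -> cap near 48Hz

print(pick_refresh_cap(21.0))  # ~47, the kind of 48Hz option described above
```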
 
Do TVs & Blu-ray players universally support 24/48Hz these days, or is it only certain models?
 