Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

If Sony doesn't provide VRR by the end of the year, one would assume that the PS5 hardware is incapable of VRR support, and that Sony has made a grave mistake in mentioning it.
There is no hardware required to implement VRR, it's entirely a software (firmware) implementation. It took Sony a year-and-a-sodding-half to implement suspend-and-resume on PS4 (firmware 2.50 - March 2015), a feature they showed off at PS4's reveal in February 2013. Like all teams, Sony will prioritise features and I would bet that the reason that VRR hasn't been delivered is that PS5 telemetry shows very few people have them hooked up to VRR-capable displays. There may also be some embarrassment if Sony do not have a good offering of TVs supporting VRR, so that may be a factor.

On a software/firmware level, if both devices support VRR then during the connection negotiation phase, the display will send an index table of supported resolutions and frame rates. Once in receipt of this, the host (console/PC) can request a new refresh rate using the HDMI command line. It's kind of idiot proof. :yep2:
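Roughly how I'd sketch that handshake (purely illustrative Python -- the names and structures are mine, not the actual HDMI 2.1/EDID wire format):

```python
# Toy model of the negotiation described above: the display advertises its
# capabilities, the host clamps any requested rate to the advertised window.
from dataclasses import dataclass

@dataclass
class DisplayCapabilities:
    """What the display reports during the connection handshake."""
    resolutions: list          # supported (width, height) pairs
    vrr_min_hz: int            # lowest refresh the panel can hold
    vrr_max_hz: int            # highest refresh the panel can hold

def negotiate_refresh(display: DisplayCapabilities, requested_hz: float) -> float:
    """Host side: clamp the requested rate to the display's VRR window."""
    return min(max(requested_hz, display.vrr_min_hz), display.vrr_max_hz)

tv = DisplayCapabilities(resolutions=[(3840, 2160), (1920, 1080)],
                         vrr_min_hz=48, vrr_max_hz=120)
print(negotiate_refresh(tv, 45))   # 48 -- below the panel's VRR floor
print(negotiate_refresh(tv, 90))   # 90 -- inside the window, passed through
```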
 
Like all teams, Sony will prioritise features and I would bet that the reason that VRR hasn't been delivered is that PS5 telemetry shows very few people have them hooked up to VRR-capable displays. There may also be some embarrassment if Sony do not have a good offering of TVs supporting VRR, so that may be a factor.

Well maybe if Sony added VRR to their TV sets then PS5 users would have a higher percentage of VRR-capable displays. :LOL:
 
The problem with VRR is precisely that. There is no benefit in any way or form to supporting it for the existing user base other than the small selection of users that have it available. So in general there would only be an advantage if Sony themselves also want to push TVs that support it. There is even a risk that it would hurt more than it helps.

In that respect the 40FPS mode makes more sense than supporting VRR. I think even with VRR you want a stable framerate, and scaling resolution dynamically is preferable to varying the framerate.
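Worth noting the 40fps modes we've seen are tied to 120Hz output, and the arithmetic is the appealing part: 1000/40 = 25 ms, which is exactly three 8.33 ms scans at 120Hz, so every frame persists for the same number of refreshes. A trivial check (illustrative Python, nothing console-specific):

```python
# A frame rate paces evenly on a fixed-refresh display when the refresh
# rate is an integer multiple of it (each frame held for N whole scans).
def paces_evenly(target_fps: int, refresh_hz: int) -> bool:
    return refresh_hz % target_fps == 0

for refresh in (60, 120):
    for fps in (30, 40, 45, 60):
        verdict = "even" if paces_evenly(fps, refresh) else "judders"
        print(f"{fps:>2} fps @ {refresh:>3} Hz: {verdict}")
# 40 fps is even at 120 Hz (3 scans per frame) but judders at 60 Hz,
# which is why 40 fps modes are gated to 120 Hz-capable displays.
```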
 
The problem with VRR is precisely that. There is no benefit in any way or form to supporting it for the existing user base other than the small selection of users that have it available. So in general there would only be an advantage if Sony themselves also want to push TVs that support it. There is even a risk that it would hurt more than it helps.

I disagree. They could have had support for VRR of some form since 2018 like other platforms have had without risking anything.
 
So give me one example where it would have been a benefit, and particularly one where you wouldn’t have been better off with dynamic resolution scaling, for instance?
 
So give me one example where it would have been a benefit, and particularly one where you wouldn’t have been better off with dynamic resolution scaling, for instance?

A couple of places spring to mind:

- Where frame rate is dropping due to CPU
- Where frame rate is dropping largely due to front end stuff on the GPU (so resolution scaling has limited effect)
- Where below a certain resolution effects begin to break down and look really bad
- Where you would otherwise resort to dropping vsync / tearing

That covers quite a lot of areas actually!

Edit: - where you want to run at as high a frame rate as possible e.g. above 60 but you're never going to be able to hold 120
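To put toy numbers on the first two bullets (a made-up model, not any real engine's DRS): resolution scaling only shrinks the resolution-bound slice of GPU time, so a CPU-bound or front-end-heavy frame blows the budget no matter how far you drop resolution, and VRR (or tearing) is the only thing left that softens the miss.

```python
# Hypothetical frame-budget model: GPU cost = fixed front-end work plus
# resolution-bound work that scales (roughly) with pixel count; the frame
# takes as long as its slowest stage.
BUDGET_MS = 16.7  # 60 fps target

def frame_time_ms(cpu_ms, gpu_fixed_ms, gpu_scaled_ms, res_scale):
    gpu_ms = gpu_fixed_ms + gpu_scaled_ms * res_scale
    return max(cpu_ms, gpu_ms)

# GPU-bound frame: DRS at 70% pixel area rescues the 16.7 ms budget.
print(frame_time_ms(cpu_ms=10, gpu_fixed_ms=4, gpu_scaled_ms=18, res_scale=0.7))  # ~16.6

# CPU-bound frame: even at 50% area the CPU still misses the budget,
# so without VRR the frame waits for the next vsync at 33.3 ms.
print(frame_time_ms(cpu_ms=20, gpu_fixed_ms=4, gpu_scaled_ms=18, res_scale=0.5))  # 20.0
```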
 
I disagree. They could have had support for VRR of some form since 2018 like other platforms have had without risking anything.

It's certainly a weird omission to be sure. You can't help but wonder if Sony have a particular implementation of VRR in mind. Ideally you should not require individual games to support VRR; it should be automatic, but that is probably trickier.
 
A couple of places spring to mind:

- Where frame rate is dropping due to CPU
- Where frame rate is dropping largely due to front end stuff on the GPU (so resolution scaling has limited effect)
- Where below a certain resolution effects begin to break down and look really bad
- Where you would otherwise resort to dropping vsync / tearing

That covers quite a lot of areas actually!

Edit: - where you want to run at as high a frame rate as possible e.g. above 60 but you're never going to be able to hold 120
Agreed; in the situation that resolution or compute is not the bottleneck for performance, you can still eat frame rate losses from the CPU. The GPU can only “catch up” so much, so to speak. I’d rather just have a smoother experience with a variable frame rate than trying to bottom out the GPU to chase an issue on the CPU side.
 
Bit of a chicken&egg kind of situation here!
I mean, sort of. But if they are going to have their official specs include HDMI 2.1 (of which VRR is a feature) and claim VRR as a feature, they should probably support that feature on the hardware they are advertising it on. Even if their displays don't.

Official Playstation Blog
Scroll all the way down to the bottom and look at the Video Out section. That's from March of 2020.
Video Out Support of 4K 120Hz TVs, 8K TVs, VRR (specified by HDMI ver.2.1)
 
So give me one example where it would have been a benefit, and particularly one where you wouldn’t have been better off with dynamic resolution scaling, for instance?

Tales of Arise.

In graphics mode, the PS5 runs pretty consistently around 45fps. There's enough power to go way over 30, but not enough to get anywhere near 60. As Audi pointed out in one of the DF Weekly episodes, this game uses an art and texture style which doesn't benefit all that much from increases in resolution. Capping graphics mode at 30fps would provide minimal visual improvement while reducing framerate considerably. And there's already a 60fps mode that holds its target very well, so you don't need VRR for that. Ultimately, in this game, for people who want just a little extra visual fidelity without dropping all the way down to the very unpleasant 30hz range, VRR would have been a huge benefit.
 
So give me one example where it would have been a benefit, and particularly one where you wouldn’t have been better off with dynamic resolution scaling, for instance?

Dynamic resolution takes effort to design/implement and isn't perfect as its implementation is only as good as the devs who coded it.

If you are targeting 60 fps but 30% of your frames take on average 20 ms to render during some scenes, then without VRR your framerate would hit ~46 fps while with VRR it would average ~56 fps. If every frame took 20 ms to render, that would produce a smooth framerate of 50 fps with VRR, but without VRR you would get either a locked 30 fps (double buffered) or a juddering ~50 fps with frame times alternating between 16.7 ms and 33.3 ms (triple buffered).
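Quick sanity check on those numbers, assuming double-buffered vsync on a 60 Hz panel rounds each frame time up to the next refresh boundary (triple buffering would land in between, per the juddering figure above):

```python
# Compare average fps with vsync quantisation vs VRR presenting each
# frame as soon as it is rendered. Illustrative only.
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per scan at 60 Hz

def avg_fps(frame_times_ms, vrr):
    if not vrr:  # round each frame up to a whole number of refresh intervals
        frame_times_ms = [math.ceil(t / REFRESH_MS) * REFRESH_MS
                          for t in frame_times_ms]
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

mixed = [REFRESH_MS] * 7 + [20.0] * 3   # 30% of frames slip to 20 ms
print(f"{avg_fps(mixed, vrr=True):.0f}")    # ~57 fps (the ~56 above)
print(f"{avg_fps(mixed, vrr=False):.0f}")   # ~46 fps

steady = [20.0] * 10                    # every frame takes 20 ms
print(f"{avg_fps(steady, vrr=True):.0f}")   # 50 fps, evenly paced
print(f"{avg_fps(steady, vrr=False):.0f}")  # 30 fps, double buffered
```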

VRR, dependent on circumstance, can remove a lot of the penalties that come from missing your frame time. It may allow for more nuanced use of dynamic resolution or other solutions that degrade IQ to maintain targeted framerates.
 
There is no hardware required to implement VRR, it's entirely a software (firmware) implementation.

Isn't HDMI 2.1 a hardware compliance requirement for supporting VRR? Hence my comment.

The problem with VRR is precisely that. There is no benefit in any way or form to supporting it for the existing user base other than the small selection of users that have it available. So in general there would only be an advantage if Sony themselves also want to push TVs that support it. There is even a risk that it would hurt more than it helps.

Sony doesn't need to take another consumer 'L' on going back on promises.
 
Agreed; in the situation that resolution or compute is not the bottleneck for performance, you can still eat frame rate losses from the CPU. The GPU can only “catch up” so much, so to speak. I’d rather just have a smoother experience with a variable frame rate than trying to bottom out the GPU to chase an issue on the CPU side.

But that was much more of a last-gen problem, wasn’t it? Not so much an issue this gen, where the CPU is better balanced against the GPU, and they can even surrender power to each other to balance out further.

And a game that runs best around 45fps means that the vast majority of people get shafted a bit, and that will remain so for quite a while. So that shouldn’t have happened really. But I guess games end up like that sometimes and in those cases, well.

I also understand the 90-120 frames example, I guess it does make some sense there.

Mind you, I agree that when Sony announces something like VRR they should implement it anyway. But it is going to mean that most games will have to have a good 30 and 60 (or 60 and 120) mode, and then on top of that maybe a 40 mode (for 120Hz-capable displays) and then perhaps an unlocked 45-ish mode for VRR … not a great situation.

I am also not sure I buy that dynamic resolution is still hard to do in this day and age, though.
 
But that was much more of a last-gen problem, wasn’t it?

That's when VRR became available: last generation, with the mid-gen updates.

and then on top of that maybe a 40 mode (for 120Hz-capable displays)

Can you name a 120 Hz HDR-capable display that does NOT support VRR (that is NOT a Sony)?

I do agree with you about the various modes; they simply should not do a 40 FPS mode at all, instead it should be VRR.
 
Since I've just been looking at TVs, I was shocked by how many HDR smart TVs are HD-ready (aka 1366x768). I honestly expected 1080p to be the lowest available.
 
But that was much more of a last-gen problem, wasn’t it? Not so much an issue this gen, where the CPU is better balanced against the GPU, and they can even surrender power to each other to balance out further.

And a game that runs best around 45fps means that the vast majority of people get shafted a bit, and that will remain so for quite a while. So that shouldn’t have happened really. But I guess games end up like that sometimes and in those cases, well.

I also understand the 90-120 frames example, I guess it does make some sense there.

Mind you, I agree that when Sony announces something like VRR they should implement it anyway. But it is going to mean that most games will have to have a good 30 and 60 (or 60 and 120) mode, and then on top of that maybe a 40 mode (for 120Hz-capable displays) and then perhaps an unlocked 45-ish mode for VRR … not a great situation.

I am also not sure I buy that dynamic resolution is still hard to do in this day and age, though.
Yeah, developers should target a solid 30/60 fps, with no dips in frame rate.
But if the target is 120 FPS, that's a very difficult frame rate to sustain, so it's better to let the framerate drop than have the GPU catch up, since you're unlikely to notice 120 dropping to 90, but you'll notice if it drops to 60. Typically with VRR, however, once you get above 60 it's going to feel very smooth regardless of those fluctuations, so I'd rather have that than a variable resolution.
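To put numbers on why drops above 60 are hard to notice with VRR: the absolute frame-time swing shrinks as the base rate rises.

```python
# Frame-time delta between two frame rates (smaller = harder to perceive).
def delta_ms(from_fps, to_fps):
    return 1000 / to_fps - 1000 / from_fps

print(f"{delta_ms(120, 90):.1f} ms")  # 2.8 ms swing for 120 -> 90 fps
print(f"{delta_ms(120, 60):.1f} ms")  # 8.3 ms swing for 120 -> 60 fps
print(f"{delta_ms(60, 45):.1f} ms")   # 5.6 ms swing for 60 -> 45 fps
```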
 