Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

VRR on PS5 brings its advantages and disadvantages to the system, if DF is to be believed (my TV doesn't do VRR). Having had it on PC for over seven years, it's a welcome addition for games that benefit from it.
It's no magic though; it doesn't truly improve performance in the sense that the hardware unlocked more flops or something. It's merely an uncapped framerate, where looking at the sky or when not much is going on the framerate goes up, just without the tearing and judder issues. Studios target fps caps for a reason… VRR has good use, for example, when your 60fps title dips below 60 (say between 52 and 60fps), where VRR can keep the experience feeling acceptably close to 60fps.

There's no extra GPU power, higher clocks or more teraflops; performance doesn't improve on that level. It's merely a display function that changes how those frames are presented.
No idea what happened over the last few days, but people are somehow going nuts over something like VRR/FreeSync/G-Sync. This tech has existed since at least 2013 and has rarely been used in benchmarks to gauge performance. It's as if the feature has flown under the radar for almost a decade for quite a large group of gamers, which is a scary thought considering the media/internet coverage of the last two decades.
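To put the 52-60fps point above in concrete terms, here's a rough Python sketch of why the same ~52fps output feels worse on a fixed 60Hz panel than on a VRR one. The 19.2ms render time and the simplified presentation model (no vsync back-pressure or buffering) are just assumptions for illustration, not how any real display pipeline works:

```python
import math

# Rough sketch, not a real pipeline model: assumes the next frame starts
# rendering immediately after the previous one and uses a constant 19.2 ms
# render time (~52 fps) purely as an example.
FIXED_REFRESH_MS = 1000 / 60          # 16.67 ms per refresh on a 60 Hz TV
frame_times_ms = [19.2] * 10          # ~52 fps worth of render times

def present_fixed_vsync(frame_times):
    """On a fixed 60 Hz panel with vsync, a finished frame waits for the next
    16.67 ms refresh boundary, so some frames stay on screen for two refreshes."""
    t, shown_at = 0.0, []
    for ft in frame_times:
        t += ft                                              # frame finishes rendering
        shown_at.append(math.ceil(t / FIXED_REFRESH_MS) * FIXED_REFRESH_MS)
    return shown_at

def present_vrr(frame_times):
    """With VRR, the panel refreshes when the frame is ready (inside the VRR
    window), so the on-screen cadence matches the render cadence."""
    t, shown_at = 0.0, []
    for ft in frame_times:
        t += ft
        shown_at.append(t)
    return shown_at

fixed = present_fixed_vsync(frame_times_ms)
vrr = present_vrr(frame_times_ms)
print([round(b - a, 1) for a, b in zip(fixed, fixed[1:])])   # mix of 16.7 / 33.3 ms -> judder
print([round(b - a, 1) for a, b in zip(vrr, vrr[1:])])       # steady 19.2 ms -> smooth
```

Same average framerate in both cases; the difference VRR makes is the even spacing of the refreshes.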
 
Studios target fps caps for a reason…
The reason is tearing, which VRR fixes. IMO it can make quite a difference (like Miles/Ratchet fidelity mode going from 30fps to around 45, or performance RT from around 60 to around 80-85). BTW, apparently in Miles the PS5's power consumption jumps from 200W to over 220W when uncapped.
 
The reason is tearing, which VRR fixes. IMO it can make quite a difference (like Miles/Ratchet fidelity mode going from 30fps to around 45, or performance RT from around 60 to around 80-85). BTW, apparently in Miles the PS5's power consumption jumps from 200W to over 220W when uncapped.

Yes indeed, it's uncapped framerates: where not too much is happening (non-heavy scenes) the fps ramps up, increasing the load, especially on the CPU.
You won't get more frames in scenes where the framerate never went much, or at all, above 60 (the target fps/max load normally).
It's a welcome feature, but it's no magic or 'extra performance'.
 
Yes indeed, it's uncapped framerates: where not too much is happening (non-heavy scenes) the fps ramps up, increasing the load, especially on the CPU.
You won't get more frames in scenes where the framerate never went much, or at all, above 60 (the target fps/max load normally).
It's a welcome feature, but it's no magic or 'extra performance'.
If it's going from 60 to 80-85fps, I would say you are getting extra performance.
It's just that we're used to saying CPU- and GPU-limited, whereas this was display-limited.
It's taking away that bottleneck.
 
Yes indeed, it's uncapped framerates: where not too much is happening (non-heavy scenes) the fps ramps up, increasing the load, especially on the CPU.
You won't get more frames in scenes where the framerate never went much, or at all, above 60 (the target fps/max load normally).
It's a welcome feature, but it's no magic or 'extra performance'.
Where there is nothing on screen it's around 100fps, so it's obvious there was big headroom in Miles and Ratchet (two games that IMO could massively gain from a similar approach are Forbidden West's fidelity mode and Forza Horizon 5's fidelity mode).
 
Every game that is almost locked at 30 or 60fps will benefit from VRR if implemented well.
And though it's been available for a long time on PC monitors, it's only been added relatively recently on TVs, and that's nice.
Anything that can improve the gamer's experience is welcome.
 
Every game that is almost locked at 30 or 60fps will benefit from VRR if implemented well.

Absolutely, especially when UE5 and other high-end engines come into play.
It's just that some have the false perception that they are going from 40 to 120fps; yeah, you do in certain situations, but that's hardly going to be a locked framerate.
I wonder now if VRR/FreeSync/G-Sync will become the norm when benchmarking?
 
I wonder now if VRR/FreeSync/G-Sync will become the norm when benchmarking?

The interesting part is when the framerate is unlocked; it will make comparisons even more informative, if it can be measured properly. Because right now, when we have games capped at 60fps and, for example, one console drops to the 50s while the other holds 60, we suppose the latter could go higher with an uncapped framerate, but we can't know for sure.
It will make comparisons with PC GPUs more interesting too.
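As a rough illustration of what an uncapped capture adds (the frame-time numbers below are entirely made up, not real measurements from any game or console):

```python
# Made-up numbers, just to show the point above: with a 60 fps cap, "one is
# locked at 60, the other drops to the low 50s" only gives a lower bound on
# the gap; with uncapped frame-time captures you get an actual ratio.

def avg_fps(frame_times_ms):
    """Average fps over a capture = number of frames / total capture time."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

console_a = [12.5, 13.1, 12.8, 14.0, 13.3]   # hypothetical uncapped capture (~76 fps)
console_b = [11.0, 11.6, 11.2, 12.1, 11.8]   # hypothetical uncapped capture (~87 fps)

print(f"A: {avg_fps(console_a):.1f} fps, B: {avg_fps(console_b):.1f} fps, "
      f"ratio B/A: {avg_fps(console_b) / avg_fps(console_a):.2f}")
# A capped-at-60 capture would show both as 'locked 60' and hide this difference.
```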
 
The interesting part is when the framerate is unlocked; it will make comparisons even more informative, if it can be measured properly. Because right now, when we have games capped at 60fps and, for example, one console drops to the 50s while the other holds 60, we suppose the latter could go higher with an uncapped framerate, but we can't know for sure.
It will make comparisons with PC GPUs more interesting too.
Now what we really just need is the ability to actually measure the refresh rate of the television and the ability to compensate for LFC activation. With that we will have a rather accurate understanding of console frame times. We are working on it but it will take time.
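A rough way to picture the LFC compensation problem being described (this is not anyone's actual method; the 48Hz window and the frame-repeat logic are assumptions purely for illustration):

```python
# Illustrative sketch of the LFC issue only: when a game frame is slower than
# the panel's VRR window allows, the frame gets repeated, so the refresh
# interval measured at the TV is a fraction of the real game frame time.

VRR_MIN_HZ = 48                               # e.g. a 48-120 Hz HDMI 2.1 panel (assumed)
MAX_INTERVAL_MS = 1000 / VRR_MIN_HZ           # ~20.8 ms: slowest refresh the panel accepts

def panel_refreshes_for_frame(frame_time_ms):
    """Return (multiplier, panel_interval_ms): how many panel refreshes one
    game frame is spread across once LFC kicks in."""
    multiplier = 1
    while frame_time_ms / multiplier > MAX_INTERVAL_MS:
        multiplier += 1                       # repeat the frame once more
    return multiplier, frame_time_ms / multiplier

def undo_lfc(panel_interval_ms, multiplier):
    """The 'compensation' step: recover the game frame time from what was
    measured at the panel, assuming the multiplier can be identified."""
    return panel_interval_ms * multiplier

for ft in (16.7, 25.0, 40.0):                 # 60 fps, 40 fps and 25 fps frames
    m, interval = panel_refreshes_for_frame(ft)
    print(f"{ft:.1f} ms frame -> LFC x{m}, panel refreshes every {interval:.1f} ms, "
          f"recovered frame time {undo_lfc(interval, m):.1f} ms")
```

The hard part, as noted, is telling a genuine fast frame apart from an LFC-doubled slow one when all you can observe is the panel's refresh cadence.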
 
Now what we really just need is the ability to actually measure the refresh rate of the television and the ability to compensate for LFC activation. With that we will have a rather accurate understanding of console frame times. We are working on it but it will take time.

Then we'd have better comparisons going forward between platforms, especially XSX vs PS5, but also RX 6600 XT/2070 vs consoles. You have a lot to do going forward ;)
Will all platform benchmarks go with uncapped fps from now on?
 
Now what we really just need is the ability to actually measure the refresh rate of the television and the ability to compensate for LFC activation. With that we will have a rather accurate understanding of console frame times. We are working on it but it will take time.
Have you considered intercepting the HDMI signals from the consoles which trigger timing changes in the TV's panel? It's not the same thing, but it may be easier. Trying to measure the panel's actual refresh rate could be tricky when accommodating all the variations of the fundamental panel technologies.
 
Have you considered intercepting the HDMI signals from the consoles which trigger timing changes in the TV's panel? It's not the same thing, but it may be easier. Trying to measure the panel's actual refresh rate could be tricky when accommodating all the variations of the fundamental panel technologies.

Would that be with a logic analyzer or oscilloscope? Or is there something similar to wireshark (dumpcap) that could be used?
 