Ratchet & Clank: Rift Apart [PS5, PC]

From NXGamer's video, the framerate only stumbles below 40FPS when the game is doing or preparing all those zone transitions.

I wonder how much more performance the game can push in fidelity mode when not hindered by regular vsync.
40FPS is already a whopping 33% upgrade in performance over the previous 30FPS limit. And with VRR it looks like this could still go up.
Going from a 33ms frame budget at 30FPS down to 25ms at 40FPS means they were leaving at least 8ms "on the table" at 30FPS, where they could have added a bunch more stuff.
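For reference, the arithmetic is just plain frame-budget math (a quick sketch, nothing game-specific):

```python
# Frame-time budgets for the framerates being discussed.
def frame_budget_ms(fps: float) -> float:
    """Time available per frame at a given framerate, in milliseconds."""
    return 1000.0 / fps

budget_30 = frame_budget_ms(30)   # ~33.3 ms
budget_40 = frame_budget_ms(40)   # 25.0 ms

# Headroom the 30FPS cap leaves unused if the game already renders in ~25 ms.
headroom = budget_30 - budget_40  # ~8.3 ms
print(f"30FPS budget: {budget_30:.1f} ms, 40FPS budget: {budget_40:.1f} ms, headroom: {headroom:.1f} ms")
```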

Perhaps Insomniac's plan was to develop the game for HDMI 2.1 TVs from the start, and 30FPS was just a "compatibility mode" for older / lower-end TVs.


TVs having a minimum VRR refresh rate of around 40Hz reduces the usability.
AFAIK it doesn't, because most TVs with VRR through HDMI 2.1 are also 120Hz panels. If the full refresh rate can be lowered, then so can the bottom end of the synced framerate, because the refresh only needs to be an integer multiple of the framerate.
A 35FPS framerate can be done through a 120Hz panel lowered to 70Hz (just send the same frame for every 2 refresh cycles), 38FPS is just 76Hz with 1 frame per 2 cycles, etc.
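This is essentially the same trick as low-framerate compensation. A toy sketch of the multiplier math, assuming a hypothetical 48-120Hz VRR window (the exact window varies per TV):

```python
# Toy low-framerate-compensation calculation: repeat each frame enough times
# that the effective refresh rate lands inside the panel's VRR window.
# The 48-120 Hz window here is an assumption, not the spec of any specific TV.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 120

def refresh_for_framerate(fps):
    """Return (repeats per frame, resulting refresh rate) for a given framerate."""
    for repeats in range(1, 10):
        refresh = fps * repeats
        if VRR_MIN_HZ <= refresh <= VRR_MAX_HZ:
            return repeats, refresh
    raise ValueError(f"{fps} FPS cannot be mapped into the VRR window")

print(refresh_for_framerate(35))  # (2, 70)  -> same frame sent twice, panel at 70Hz
print(refresh_for_framerate(38))  # (2, 76)
print(refresh_for_framerate(45))  # (2, 90)
```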
 
Isn’t VRR supposed to “just work”?
It does, but the downside to VRR is that you can end up with a very variable refresh rate, resulting in frame pacing issues.

If your game's frame rate varies between, say, 30fps and 50fps, and is prone to dips in combat, which happens a lot, that's going to be a rough experience. You're trading one problem for another. Games would definitely benefit from an over-arching system that monitors performance and holds a consistent frame rate target rather than letting it flap around at the mercy of whatever the renderer can crank out - not to mention a whole bunch of games still have separate systems like physics and logic tied to frame rate.

With more options comes more complexity in engine decisions, such as when it is beneficial to sacrifice resolution to maintain frame rate, or sacrifice frame rate to maintain resolution.
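As a rough illustration of that trade-off (not how any specific engine does it), a dynamic-resolution controller can be as simple as this sketch, with made-up target and scale bounds:

```python
# Sketch of a dynamic-resolution controller: watch recent frame times and
# trade render resolution for a stable frame rate. Target and bounds are
# placeholder values, not taken from any real engine.
from collections import deque

TARGET_MS = 1000.0 / 40.0          # assumed 40FPS target
SCALE_MIN, SCALE_MAX = 0.6, 1.0    # hypothetical resolution scale range

class DynamicResolution:
    def __init__(self, window: int = 30):
        self.samples = deque(maxlen=window)  # rolling window of frame times (ms)
        self.scale = SCALE_MAX

    def on_frame(self, gpu_ms: float) -> float:
        self.samples.append(gpu_ms)
        avg = sum(self.samples) / len(self.samples)
        if avg > TARGET_MS * 0.95:     # close to or over budget: shed resolution
            self.scale = max(SCALE_MIN, self.scale - 0.05)
        elif avg < TARGET_MS * 0.80:   # comfortably under budget: claw it back
            self.scale = min(SCALE_MAX, self.scale + 0.02)
        return self.scale
```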

And some TVs have specific issues, like the LG C9 and CX.
 
It does, but the downside to VRR is that you can end up with a very variable refresh rate, resulting in frame pacing issues.


That's where frame rate caps come in. There's no real downside at all to VRR when combined with a frame rate cap. It's all upside.
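A cap is conceptually just a limiter in the render loop. A minimal sketch with a hypothetical 40FPS cap (real engines use more precise waits, but the idea is the same):

```python
# Toy frame limiter: hold the loop to a fixed frame time so VRR only has to
# track small variations instead of wild swings.
import time

CAP_FPS = 40
FRAME_BUDGET = 1.0 / CAP_FPS  # seconds per frame

def run_capped(render_frame, frames: int = 300):
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # burn off leftover budget
        else:
            deadline = time.perf_counter()  # overran: resync instead of rushing
```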
 
Most newer games that I've played on my PC constantly dip below 60fps (I set a 60fps cap, though). I play games on a normal 60Hz TV (without VRR). So far I don't feel the need for VRR since I don't really notice the tearing. Maybe I'm not as sensitive: when DF did an analysis of a game with tearing (AC Valhalla, for example) and pointed out where the tearing happens, I didn't really notice it unless I paused the video.
I'm saying this because while I do like VRR, it does come with a drawback: on the majority of displays it is not compatible with BFI, and I need my BFI. Maybe on a more expensive display you can get good enough motion clarity without BFI, but that type of display is usually out of my budget range, so if someday a cheap TV has both VRR and BFI options (assuming you can't activate both at the same time), then I would likely enable BFI and leave VRR off. Maybe if I encounter a situation where the tearing is really bad and the game supports VRR, I might enable it, but right now I haven't encountered a game that made me wish for VRR.
Having said that, the most taxing game that I've played is Horizon, with a frame rate between 45 and 55. At that framerate, I didn't notice any tearing (and no, I don't use vsync).
As for BFI, the difference in motion clarity is noticeable enough for me in every situation that I always turn it on. I can feel the flickering when there is a really big uniform bright patch, but fortunately I can adapt to it (when I turn BFI off and then back on, I can feel the flickering for a while, but eventually my eyes adapt). Turning BFI off feels like I'm losing focus when there is fast movement on screen or, worse, a whole-screen pan.
Of course the ideal setup would be a VRR display with great motion clarity without BFI, or a VRR display where you can also use BFI at the same time. But like I've said before, that type of display will probably be too expensive for me.
 

A full technical review with footage from later in the game.

Interesting that he compares it to Robots, the 2005 movie: the lack of PBR makes the shading in the movie look dated. Polycount is better in the movie but not by that much, and the same goes for anti-aliasing, motion blur and depth of field. But it is comparable.

 
Yeah, the overall look of R&C is better, but technically there are still things that are out of reach in real time, like polycounts: round and spherical things are actually round in the movie, while in R&C, if you zoom in, you can still see poly edges. But that's to be expected.
 
They aren't mucking around with the ultimate RT settings. :runaway:
 
Looks surprisingly tame for a true next gen game.

I believe Medium settings would be equivalent to PS5, while Very Low will probably look terrible to make it run on old hardware (pretty impressive they got it running on hardware like that, and without an SSD).

And high RT will likely have better quality than PS5, with ray-traced reflections, shadows and RTAO. There will probably be a medium RT setting as well that offers RT reflections at PS5 quality.

Overall, I think it looks promising and shows good scaling across a wide range of HW. Can't wait for @Dictator's analysis!
 