Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Nixxes released another patch for Horizon FW. It claims to improve texture streaming and texture quality while also reducing memory usage. @Flappy Pannus I haven't gotten the chance to test it out yet, but I wonder if it fixes the texture issues?

Very interesting. I hit clear VRAM overflow issues on the 4070 Ti at max settings, so I'll see if this makes a difference later.
 
Very interesting. I hit clear VRAM overflow issues on the 4070 Ti at max settings, so I'll see if this makes a difference later.

I can't say for sure as I'm not testing like-for-like areas, since I'm just watching my wife's playthrough, but based on a couple of sessions now, including some cut scenes and seemingly graphically intensive areas, I'd say they have vastly improved the memory management.

I haven't seen any of the previously occurring massive drops into the 30's although I suspect some areas still hit memory limits, as I see more graceful drops now from a roughly average 100fps into the low 60's in very intense situations (without DRS). That's with FG on. But turning it off in those most intense moments barely impacts the frame rate. This is very similar to the behaviour I saw in R&C where performance would just lower gracefully (no horrific drops or stutters) when the VRAM limits are hit. And turning on FG made little to no difference presumably due to the additional pressure it puts on VRAM itself. So based on experience so far this is much better and the game remains completely playable (over 60fps 99.99% of the time) with everything set to max now.

Next time I see what appears to be a VRAM-limited area, judging by the performance hit, I'll try dropping textures to High to see if it has an impact.
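If you want to confirm rather than guess, watching device-wide VRAM usage while the game runs makes the limit fairly obvious: usage pins at the card's capacity and frame times degrade. An Afterburner/RTSS-style overlay shows this without any code, but as a rough sketch of logging it yourself via NVML (the monitoring library that ships with the NVIDIA driver; the GPU index 0 and one-second poll are my own assumptions):

```cpp
// Minimal sketch: poll device-wide VRAM usage once a second via NVML
// (ships with the NVIDIA driver; link with -lnvidia-ml). GPU index 0 assumed.
#include <nvml.h>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    if (nvmlInit_v2() != NVML_SUCCESS) return 1;

    nvmlDevice_t gpu;
    nvmlDeviceGetHandleByIndex_v2(0, &gpu);

    for (;;) {
        nvmlMemory_t mem{};
        nvmlDeviceGetMemoryInfo(gpu, &mem);
        // When 'used' sits at (or right under) 'total', the streaming system
        // is out of headroom and anything extra spills into system RAM.
        std::printf("VRAM: %.2f / %.2f GB\n", mem.used / 1e9, mem.total / 1e9);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}
```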
 
Frame gen can use quite a bit more VRAM, so if you're not getting a huge performance increase from it, you might get more of a boost by turning it off and giving the game some extra VRAM.
 
It's a shame Hellblade isn't releasing with the large improvements Epic showcased in 5.4, especially the new multithreading advancements.
It could always get an update later down the road if the game sells well enough. Sadly, that's what happens with gaming: by the time 5.4 games are ready, they will be at 5.8 or 6.0.
 
Frame gen can use quite a bit more VRAM, so if you're not getting a huge performance increase from it, you might get more of a boost by turning it off and giving the game some extra VRAM.

Yes, agreed. In a game that's borderline on VRAM like this is on a 12GB GPU, FG can help in some scenes for sure, but I think it often nets you nothing except extra latency. I'll probably leave it off here like I did with Rift Apart. Fortunately, base performance is more than good enough without it in this case.
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned on frame gen to hit 30fps.
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned on frame gen to hit 30fps.

On PC, I try to get higher than 60 mostly because of the reduction in latency, even if 60 is perfectly playable. Running at 120 with frame generation isn't giving you a better gameplay experience; it's not useless, but it's far from a must-have feature.
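Rough numbers illustrate the distinction (assuming a 60fps base, 2x interpolation, and ignoring the small extra delay the interpolation step itself adds):

```cpp
// Back-of-the-envelope sketch: frame generation doubles the presented frame
// rate, but input is still only sampled on internally rendered frames, so the
// latency floor doesn't move the way it does with a real 120fps.
#include <cstdio>

int main() {
    const double baseFps = 60.0;   // internally rendered frames
    const double fgFactor = 2.0;   // 2x frame generation

    std::printf("Presented frame interval: %.1f ms\n", 1000.0 / (baseFps * fgFactor));
    std::printf("Input sample interval:    %.1f ms\n", 1000.0 / baseFps);
    std::printf("Native 120fps interval:   %.1f ms\n", 1000.0 / 120.0);
}
```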
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned on frame gen to hit 30fps.

I don't agree with this. FG has never been claimed to work at 15fps; in fact AMD specifically recommends 60fps as a minimum, and I'm not sure if Nvidia has an official recommendation, but DF recommends 40fps there. Personally I think a good implementation of DLSS FG can work well at below a 40fps base, at least on a controller. I'm about 70 hours into CP2077 at the moment, all of which has been with FG on and a frame rate averaging in the 70's, with regular drops into the 60's. However the experience is superb. It's extremely smooth and plenty responsive on a controller. It feels far smoother than running in the 40's with FG off, so there it is a very clear performance boost. My more limited experiences with Witcher 3 and A Plague Tale are similar.

However, FG isn't a magic bullet, and in situations where you are, for example, close to your max refresh rate limit without it, or, as in the case I describe above with Forbidden West, very close to or beyond VRAM limits without it, turning it on can be a net negative.

Bear in mind that the negative frame generation scenario I describe in FW is a pretty specific one and also one that is easily overcome. We are talking about a GPU that's powerful enough to max out the game on every setting at a fairly high resolution (3840x1600, DLSS Q) but that is also VRAM constrained enough at those settings to hit limits. If I were using a 4070 Ti Super with 16GB for example, I wouldn't be VRAM limited and could likely whack on frame generation with no concerns. Even on my GPU, in areas that are not VRAM limited, turning on FG can take me from a variable 80's to a locked 116, which I definitely see as a win (given no noticeable reduction in responsiveness for me personally). Unfortunately, you then suffer in areas where the VRAM limit hits, so it's best to leave it off at my settings. However, a straightforward way to overcome this would simply be to knock textures down to High from Very High and run with FG on all the time for a healthy net performance gain up to my monitor's max refresh rate.
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned on frame gen to hit 30fps.
So people who use Frame Gen to go from 60fps to 120fps aren't actually getting a performance increase? That's just all in their heads?

Your argument boils down to 'it's not always useful in every situation, therefore it can't be called a performance increase', and that's just reductive.

This just feels like the old 'it's not real frames so doesn't count' rhetoric all over again that we had to battle through with reconstruction.
 
Yes, I have tried. And let me try to be clear again: calling it a performance increase is not correct at all IMO, and it's irritating to me when people try to defend it as a performance increase.

It can be a minor motion quality improvement at already high enough frame rates, smoothing visual perception, and that's it. But calling it a performance increase? That's just disingenuously misleading, more so from people who should know better but have a PR/image agenda to defend.

Also, the example I gave was deliberately on the edge to try to make it clear why it should not be called a performance increase.
 
Yes, I have tried. And let me try to be clear again: calling it a performance increase is not correct at all IMO, and it's irritating to me when people try to defend it as a performance increase.

It can be a minor motion quality improvement at already high enough frame rates, smoothing visual perception, and that's it. But calling it a performance increase? That's just disingenuously misleading, more so from people who should know better but have a PR/image agenda to defend.

Also, the example I gave was deliberately on the edge to try to make it clear why it should not be called a performance increase.

It is and can be a performance increase.

If a game feels smoother/better with it on than it does with it off, that is an increase in performance.
 
To me it does not, since the latency of the internal rendering and the game physics simulation is still the same as it is with it off, so in the end, and in practice, there's no performance increase.
 
Yes, I have tried. And let me try to be clear again: calling it a performance increase is not correct at all IMO, and it's irritating to me when people try to defend it as a performance increase.

It can be a minor motion quality improvement at already high enough frame rates, smoothing visual perception, and that's it. But calling it a performance increase? That's just disingenuously misleading, more so from people who should know better but have a PR/image agenda to defend.

Also, the example I gave was deliberately on the edge to try to make it clear why it should not be called a performance increase.

You're equating "more performance" with lower input latency, which is certainly one way of doing it.

But it's equally valid to consider more frames as higher performance, given they increase both visual fluidity and anti-aliasing quality.

Certainly the benefits from FG in CP2077 are far from minor. On my GPU at my chosen settings, it's the difference between playable and unplayable. The same holds for Witcher 3 in my admittedly limited testing.
 
Let's agree to disagree then, since I don't consider smoother visual fluidity and negligible anti-aliasing (and the introduced artifacts as well) to equate to "more performance"; to me those fall under visual quality trade-offs.
 
To me it does not, since the latency of the internal rendering and the game physics simulation is still the same as it is with it off, so in the end, and in practice, there's no performance increase.
Lots of games, dare I say most games, don't simulate physics at the full output framerate. They calculate physics at some usually arbitrary tick rate (sometimes a fixed number, sometimes a fraction of the refresh rate or framerate) and interpolate the output to the rendered framerate. What frame generation does is apply this type of methodology to the rendered framerate, interpolating it up to a higher output. If we are going to tie "performance" to "physics simulation" with the caveat that interpolation cannot be factored in, then I fear that we have a fair amount of 30 and perhaps 15fps games.

No one has complained about physics and animation being interpolated even though it's been happening for years.
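For anyone who hasn't seen it spelled out, that decoupling is the classic fixed-timestep loop. A bare-bones sketch (the toy World state and the 30Hz tick are placeholders, not any particular engine's numbers):

```cpp
// Bare-bones fixed-timestep loop: physics ticks at 30 Hz, rendering happens
// as often as the loop spins, drawing a state interpolated between the two
// most recent ticks.
#include <chrono>

struct World { float x = 0.f, v = 10.f; };                 // toy 1D state

World lerp(const World& a, const World& b, float t) {
    return { a.x + (b.x - a.x) * t, b.v };
}

void simulate(World& w, float dt) { w.x += w.v * dt; }     // fixed-rate step
void render(const World&) { /* draw at display rate */ }

int main() {
    using Clock = std::chrono::steady_clock;
    constexpr float tick = 1.0f / 30.0f;                   // 30 Hz simulation
    World prev, curr;
    float accumulator = 0.f;
    auto last = Clock::now();

    for (int frame = 0; frame < 100000; ++frame) {         // stand-in for the game loop
        auto now = Clock::now();
        accumulator += std::chrono::duration<float>(now - last).count();
        last = now;

        while (accumulator >= tick) {                      // run 0..n fixed ticks
            prev = curr;
            simulate(curr, tick);
            accumulator -= tick;
        }

        // The displayed frame rate can be far higher than the 30 Hz the
        // simulation actually runs at; the gap is filled by interpolation.
        render(lerp(prev, curr, accumulator / tick));
    }
}
```

Frame generation applies the same idea one level up: the "simulation" is the rendered frames, and the interpolated output is what actually gets presented.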
 
Let's agree to disagree then, since I don't consider smoother visual fluidity and negligible anti-aliasing (and the introduced artifacts as well) to equate to "more performance"; to me those fall under visual quality trade-offs.
More performance is not a strictly defined term; it involves many factors, chief among them:
1- Motion smoothness: this involves higher frame rates, good frame pacing, no frame drops, a stutter-free experience, and no tearing.
2- Higher responsiveness: this involves lower input latency.

You will notice that motion smoothness is the largest factor at play here.

The typical way we've come to experience higher performance on PC is to have frame rates unlocked and let the game run free; this achieves motion smoothness through more motion samples, and also lower latency, since more samples come with more chances to sample input.

However, several games don't really behave that way; they have an almost fixed latency irrespective of frame rate, leaving you with only the option to increase motion smoothness.

This methodology also doesn't guarantee great performance, as bad frame pacing, frame drops and stuttering can destroy any motion smoothness and latency gains. Tearing can also distort motion smoothness.

On consoles, they cap frame rates to get rid of bad frame pacing, frame drops and stutters; this sacrifices latency. They also deploy various sync techniques to get rid of tearing.

Frame generation (DLSS 3) attempts to increase performance through higher frame rates and by lessening the impact of frame drops and stutters, but it doesn't improve latency, which stays fixed. It also comes in handy when you are CPU limited (whether due to single-threaded code or an insufficiently powerful CPU); in these situations frame generation is your only option for any hope of faster performance.

You also get much smoother cinematics, cut scenes and animation sequences where the player is deprived of control.

Consider this situation: you are in a zombie game, numerous zombies swarm your screen, and your fps buckles as the game is CPU limited. You use an explosive weapon to gun them all down, your screen is filled with fire and explosions, and your fps buckles further due to the increased alpha and particles; you feel the severe frame drop, and the smoothness of the presentation drops below acceptable levels. In this situation frame generation will maintain a smoother presentation through it all, and while the response to your controls remains the same, enemies and particles are shown in a much smoother way to your eyes.
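As a very rough illustration of the mechanism (real frame generation like DLSS 3 or FSR 3 interpolates with motion vectors and optical flow rather than a naive pixel average, but the pacing consequence is the same):

```cpp
// Crude sketch of interpolation-based frame generation: synthesize a frame
// between two rendered frames and present it first, doubling presented fps.
// The newest rendered frame has to be held back until the generated one is
// shown, which is why smoothness goes up while input latency does not.
#include <cstddef>
#include <cstdint>
#include <vector>

using Frame = std::vector<std::uint8_t>;          // flat RGBA pixel buffer

Frame interpolate(const Frame& prev, const Frame& next) {
    Frame mid(prev.size());
    for (std::size_t i = 0; i < mid.size(); ++i)
        mid[i] = static_cast<std::uint8_t>((prev[i] + next[i]) / 2);
    return mid;
}

void present(const Frame&) { /* hand the frame to the swap chain */ }

void presentWithFrameGen(const Frame& prev, const Frame& next) {
    present(interpolate(prev, next));             // generated in-between frame
    present(next);                                // then the real rendered frame
}

int main() {
    Frame a(1920 * 1080 * 4, 0), b(1920 * 1080 * 4, 255);  // two dummy frames
    presentWithFrameGen(a, b);
}
```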
 
Lots of games, dare I say most games, don't simulate physics at the full output framerate. They calculate physics at some usually arbitrary tick rate (sometimes a fixed number, sometimes a fraction of the refresh rate or framerate) and interpolate the output to the rendered framerate. What frame generation does is apply this type of methodology to the rendered framerate, interpolating it up to a higher output. If we are going to tie "performance" to "physics simulation" with the caveat that interpolation cannot be factored in, then I fear that we have a fair amount of 30 and perhaps 15fps games.

No one has complained about physics and animation being interpolated even though it's been happening for years.

Yeah, I had a similar thought. If we define performance as “number of time steps simulated”, then surely all of the caching and reuse happening today means that displayed fps is almost always higher than the simulation frequency. If a shadow map, irradiance cache or BLAS isn’t updated for a few frames, shouldn’t those be considered “fake” frames too?
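In sketch form, that kind of amortized update looks something like this (the names and the four-frame interval are illustrative, not taken from any particular engine):

```cpp
// Sketch of the amortized-update point above: expensive work (a stand-in
// "shadow map" rebuild) runs every N frames and its cached result is reused
// in between, so most displayed frames contain data that wasn't recomputed
// that frame.
struct ShadowMap { int builtOnFrame = -1; };

void rebuildShadowMap(ShadowMap& sm, int frame) { sm.builtOnFrame = frame; }
void drawScene(const ShadowMap&) { /* samples the cached shadow map */ }

int main() {
    constexpr int kShadowUpdateInterval = 4;   // refresh once every 4 frames
    ShadowMap shadows;
    for (int frame = 0; frame < 1000; ++frame) {
        if (frame % kShadowUpdateInterval == 0)
            rebuildShadowMap(shadows, frame);
        drawScene(shadows);                    // reuses "stale" data on 3 of 4 frames
    }
}
```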

Frame gen is just another trick in a big bag of tricks. If it works well, I see no reason to dismiss it. The only issue I have with it is the over-eager marketing presenting generated frames as equivalent to rendered ones.
 
The real issue with FG is how inconsistent it is between games.

With some games feeling noticeably heavier with FG enabled and some feeling drastically smoother and more responsive with it enabled.

And each person will feel it to varying levels.

I think Portal RTX is a sluggish POS with FG enabled, but CP2077 (with PT enabled) feels better and more responsive with it on than with it off.
 