Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Nixxes released another patch for Horizon FW. It claims to improve texture streaming and texture quality while also reducing memory usage. @Flappy Pannus I haven't gotten the chance to test it out yet, but I wonder if that fixes the texture issues?

Very interesting. I hit clear VRAM overflow issues on the 4070 Ti at max settings, so I will see if this makes a difference later.
 

I can't say for sure as I'm not testing like-for-like areas, since I'm just watching my wife's playthrough, but based on a couple of sessions now, including some cutscenes and seemingly graphically intensive areas, I'd say they have vastly improved the memory management.

I haven't seen any of the previously occurring massive drops into the 30s, although I suspect some areas still hit memory limits, as I see more graceful drops now from a roughly average 100fps into the low 60s in very intense situations (without DRS). That's with FG on. But turning it off in those most intense moments barely impacts the frame rate. This is very similar to the behaviour I saw in R&C, where performance would just lower gracefully (no horrific drops or stutters) when the VRAM limits are hit, and turning on FG made little to no difference, presumably due to the additional pressure it puts on VRAM itself. So based on my experience so far, this is much better, and the game remains completely playable (over 60fps 99.99% of the time) with everything set to max now.

Next time I see what appears to be a VRAM-limited area based on the performance hit, I'll try dropping textures to High to see if it has an impact.
 
Frame gen can use quite a bit of extra VRAM, so if you're not getting a huge performance increase from it, you might get more of a boost by turning it off and giving the game that extra VRAM.
 
It's a shame Hellblade isn't releasing with the large improvements Epic showcased in 5.4, especially the new multithreading advancements.
It could always get an update later down the road if the game sells well enough. Sadly, that is what happens with gaming: by the time 5.4 games are ready, Epic will be on 5.8 or 6.0.
 
Frame gen can use quite a bit of extra VRAM, so if you're not getting a huge performance increase from it, you might get more of a boost by turning it off and giving the game that extra VRAM.

Yes, agreed. In a game that's borderline on VRAM like this is on a 12GB GPU, FG can help in some scenes for sure, but I think it often nets you nothing except extra latency. I'll probably leave it off here like I did with Rift Apart. Fortunately, base performance is more than good enough without it in this case.
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned frame gen on to hit 30fps.
 

On PC, I try to get higher than 60 mostly because of the reduction in latency, even if 60 is perfectly playable. Running at 120 with frame generation isn't giving you a better gameplay experience. It's not useless, but it's far from a must-have feature.
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned frame gen on to hit 30fps.

I don't agree with this. FG has never been claimed to work at 15fps; in fact, AMD specifically recommends 60fps as a minimum, and while I'm not sure if Nvidia has an official recommendation, DF recommends 40fps there. Personally I think a good implementation of DLSS FG can work well below a 40fps base, at least on a controller. I'm about 70 hours into CP2077 at the moment, all of which has been with FG on and a frame rate averaging in the 70s, with regular drops into the 60s. However, the experience is superb. It's extremely smooth and plenty responsive on a controller. It feels far smoother than running in the 40s with FG off, so there it's a very clear performance boost. My more limited experiences with Witcher 3 and A Plague Tale are similar.

However, FG isn't a magic bullet, and in situations where you are, for example, close to your max refresh rate limit without it, or, as in the case I describe above with Forbidden West, very close to or beyond VRAM limits without it, turning it on can be a net negative.

Bear in mind that the negative frame generation scenario I describe in FW is a pretty specific one, and also one that is easily overcome. We are talking about a GPU that's powerful enough to max out the game on every setting at a fairly high resolution (3840x1600, DLSS Q) but that is also VRAM-constrained enough at those settings to hit limits. If I were using a 4070 Ti Super with 16GB, for example, I wouldn't be VRAM limited and could likely whack on frame generation with no concerns. Even on my GPU, in areas that are not VRAM limited, turning on FG can take me from a variable 80s to a locked 116, which I definitely see as a win (given no noticeable reduction in responsiveness for me personally). Unfortunately, you then suffer in areas where the VRAM limit hits, so it's best to leave it off at my settings. However, a straightforward way to overcome this would simply be to knock textures down to High from Very High and run with FG on all the time for a healthy net performance gain up to my monitor's max refresh rate.
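To put some rough numbers on why the base frame rate matters so much here, below is a back-of-the-envelope sketch. It assumes FG roughly doubles the presented frame count and holds back about one base frame so the interpolated in-between frame can be constructed; those are simplifying assumptions, and real implementations (DLSS FG, FSR 3, Reflex/Anti-Lag) won't match these exact numbers.

```python
# Rough illustration only: assumes frame generation doubles the presented
# frame count and holds back roughly one base frame so the interpolated
# in-between frame can be built. Real pipelines (DLSS FG, FSR 3, Reflex)
# will not match these exact numbers.

def frame_metrics(base_fps: float, fg_on: bool) -> dict:
    base_frame_ms = 1000.0 / base_fps
    if fg_on:
        presented_fps = base_fps * 2       # idealised 2x, ignores FG overhead
        added_latency_ms = base_frame_ms   # hold one base frame for interpolation
    else:
        presented_fps = base_fps
        added_latency_ms = 0.0
    return {
        "presented_fps": presented_fps,
        "presented_interval_ms": 1000.0 / presented_fps,
        "approx_input_latency_ms": base_frame_ms + added_latency_ms,
    }

for base in (15, 40, 60):
    native = frame_metrics(base, fg_on=False)
    fg = frame_metrics(base, fg_on=True)
    print(f"base {base:>2}fps: native ~{native['approx_input_latency_ms']:.0f}ms latency | "
          f"FG -> {fg['presented_fps']:.0f}fps presented, ~{fg['approx_input_latency_ms']:.0f}ms latency")
```

With those crude assumptions, a 15fps base lands at roughly 130ms of input latency despite presenting 30fps, while a 60fps base presents 120fps at only around 17ms more latency than native 60, which is why the two scenarios feel so different.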
 
Whoever calls frame generation a performance increase should be forced to run games internally at 15fps and see if they would still call it a performance increase just because they turned frame gen on to hit 30fps.
So people who use Frame Gen to go from 60fps to 120fps aren't actually getting a performance increase? That's just all in their heads?

Your argument boils down to 'it's not always useful in every situation, therefore it can't be called a performance increase', and that's just reductive.

This just feels like the old 'it's not real frames so doesn't count' rhetoric all over again that we had to battle through with reconstruction.
 
Yes, I have tried. And let me try to be clear again: calling it a performance increase is not correct at all IMO, and it's irritating to me when people try to defend it as a performance increase.

It can be a minor improvement in motion quality when frame rates are already high enough, smoothing visual perception, and that's it. But calling it a performance increase? That's just disingenuously misleading, more so from people who should know better but have a PR/image agenda to defend.

Also, the example I gave was deliberately on the edge to make it clear why it should not be called a performance increase.
 
It is and can be a performance increase.

If a game feels smoother/better with it on than it does with it off, that is an increase in performance.
 
Yes, I have tried. And let me try to be clear again: calling it a performance increase is not correct at all IMO, and it's irritating to me when people try to defend it as a performance increase.

It can be a minor improvement in motion quality when frame rates are already high enough, smoothing visual perception, and that's it. But calling it a performance increase? That's just disingenuously misleading, more so from people who should know better but have a PR/image agenda to defend.

Also, the example I gave was deliberately on the edge to make it clear why it should not be called a performance increase.

You're equating "more performance" with lower input latency, which is certainly one way of doing it.

But it's equally valid to consider more frames as higher performance, given they increase both visual fluidity and anti-aliasing quality.

Certainly the benefits from FG in CP2077 are far from minor. On my GPU at my chosen settings, it's the difference between playable and unplayable. The same holds for Witcher 3 in my admittedly limited testing.
 
Let's agree to disagree then, since I don't consider smoother visual fluidity and negligible anti-aliasing gains (plus the introduced artifacts) to equate to "more performance"; to me those fall under visual quality trade-offs.
 
To me it does not, since the latency of the internal rendering and game physics simulation is still the same as when it is off, so in the end, in practice, there's no performance increase.
Lots of games, dare I say most games, don't simulate physics at the full output framerate. They calculate physics at some usually arbitrary tick rate (sometimes a fixed number, sometimes a fraction of the refresh rate or framerate) and interpolate the output to the rendered framerate. What frame generation does is apply this same methodology to the rendered framerate, interpolating it up to a higher output. If we are going to tie "performance" to "physics simulation" with the caveat that interpolation cannot be factored in, then I fear we have a fair number of 30fps and perhaps even 15fps games.

No one has complained about physics and animation being interpolated even though it's been happening for years.
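For anyone unfamiliar with the pattern being described, here's a minimal sketch of a fixed-timestep simulation with interpolated presentation, in the spirit of the classic "fix your timestep" loop. The 30Hz tick rate, the frame times, and the toy physics step are all illustrative, not taken from any particular engine.

```python
# Minimal sketch of a fixed-timestep simulation with interpolated
# presentation ("fix your timestep" style). The tick rate, frame times
# and toy physics step are made up for illustration.

TICK_DT = 1.0 / 30.0                    # physics advances at a fixed 30 Hz

def physics_step(position: float, dt: float) -> float:
    return position + 5.0 * dt          # move at 5 units/second

def render_loop(frame_times: list[float]) -> None:
    prev_pos = curr_pos = 0.0
    accumulator = 0.0
    for frame_dt in frame_times:        # variable render-frame durations
        accumulator += frame_dt
        # Run as many fixed physics ticks as the elapsed time allows.
        while accumulator >= TICK_DT:
            prev_pos = curr_pos
            curr_pos = physics_step(curr_pos, TICK_DT)
            accumulator -= TICK_DT
        # Blend between the two most recent physics states for display,
        # the same interpolation idea frame generation applies to whole
        # rendered frames.
        alpha = accumulator / TICK_DT
        shown = prev_pos * (1.0 - alpha) + curr_pos * alpha
        print(f"displayed position: {shown:.3f}")

# e.g. a 120fps presentation fed by a 30 Hz simulation
render_loop([1.0 / 120.0] * 12)
```

The displayed values advance smoothly at 120fps even though the simulation only ticks 30 times per second, which is exactly the interpolation point being made above.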
 