Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

I suspect that the change to the transformer model is setting the stage for Nvidia's future neural rendering plans and was motivated by more than just the pursuit of better upscaling quality. Maybe a future version of DLSS will have post-processing handled directly by the transformer model. It could also make it easier for DLSS to accept additional inputs beyond color, depth, motion vectors, and normals, and the transformer model might have a longer and better "memory" of the game world by keeping more temporal data. For example, Reflex 2 Frame Warp provides camera position as an input and presumably uses the transformer model to inpaint disocclusions; this might not have been possible with the old CNN model. It's also possible that other neural rendering tech like NRC and neural materials could provide extra data to guide the DLSS transformer in the future.
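Purely to make "additional inputs beyond color, depth, motion vectors, and normals" concrete, here's a toy PyTorch sketch of stacking those buffers plus a few frames of history into one tensor a network could consume. Every channel count and name here is made up for illustration; it is not the actual DLSS input format.

```python
# Hypothetical illustration only: per-pixel auxiliary buffers plus a short frame
# history stacked into a single network input. The channel layout is invented here
# and has nothing to do with the real DLSS interface.
import torch

H, W = 540, 960                      # toy low-res render target

color   = torch.rand(3, H, W)        # RGB of the current frame
depth   = torch.rand(1, H, W)        # scene depth
motion  = torch.rand(2, H, W)        # per-pixel motion vectors (x, y)
normals = torch.rand(3, H, W)        # surface normals
history = torch.rand(4, 3, H, W)     # last 4 fed-back frames, i.e. the "memory"

# Stack along the channel axis: 3 + 1 + 2 + 3 + 4*3 = 21 channels.
net_input = torch.cat([color, depth, motion, normals, history.flatten(0, 1)], dim=0)
print(net_input.shape)               # torch.Size([21, 540, 960])
```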
 
I doubt Nvidia will share, but I'm curious what new connections are enabled by the transformer approach. I'm still very fuzzy on ML stuff, but at a high level transformers are meant to find connections between “distant” data points - e.g. two words in a paragraph that are far apart but help provide context. CNNs, on the other hand, are wired such that data points that are close together have more influence (like neighboring pixels in an image).

So if CNNs have reached their limit with DLSS 3, I wonder what “distant” data Nvidia is pulling in to further enhance the model. Is the color of a pixel really influenced by other pixels halfway across the screen? Unlikely. Maybe the transformer model is pulling in a lot more temporal data to help with things like disocclusion in the Alan Wake ceiling fan example. So “distant” here can mean far away in time.
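To make the "local vs. distant" wiring difference concrete, here's a tiny PyTorch toy of my own (nothing to do with DLSS internals): a 3x3 convolution only mixes each pixel with its immediate neighbours per layer, while a single self-attention layer lets any token influence any other, whether "distant" means across the screen or across stacked frames in time.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 16, 32, 32)         # (batch, channels, height, width) toy feature map

# CNN wiring: a 3x3 kernel means each output pixel only sees its 8 neighbours;
# long-range influence only builds up by stacking many such layers.
conv = nn.Conv2d(16, 16, kernel_size=3, padding=1)
local_out = conv(x)

# Transformer wiring: flatten pixels into tokens, then one self-attention layer
# lets every pixel attend to every other one, however far apart they are.
tokens = x.flatten(2).transpose(1, 2)  # (batch, 1024 tokens, 16 channels)
attn = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
global_out, weights = attn(tokens, tokens, tokens)

print(local_out.shape, global_out.shape, weights.shape)
# weights is (1, 1024, 1024): every token has a learned connection to every other token.
```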

From what I remember, the conjecture was that what Nvidia was really using ML for is the analysis of the motion vectors and the native frame?

My understanding is that vision transformers, at least when used for ML vision tasks, are better able to contextually analyze the image as a whole. In theory, my guess is that this improves the above analysis and the detection of elements in the native frame.

For example, if you look at Nvidia's DLSS 3 and what they used the optical flow accelerator for, it's evident why that could have been replaced by the new ML model as well.
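For context on what the optical flow accelerator was doing in DLSS 3: it produced a per-pixel flow field between two frames, which the frame generation step then uses to warp pixels toward where they have moved. Below is a rough toy of just that warping step (random placeholder flow, not Nvidia's actual pipeline), which hopefully shows the kind of output a learned model would have to produce if it replaced the fixed-function unit.

```python
import torch
import torch.nn.functional as F

H, W = 270, 480
prev_frame = torch.rand(1, 3, H, W)     # previously rendered frame
flow = torch.randn(1, 2, H, W) * 0.01   # placeholder optical flow in normalized coords

# Identity sampling grid in [-1, 1], then offset it by the flow field.
ys, xs = torch.meshgrid(torch.linspace(-1, 1, H), torch.linspace(-1, 1, W), indexing="ij")
base_grid = torch.stack([xs, ys], dim=-1).unsqueeze(0)   # (1, H, W, 2), (x, y) order
grid = base_grid + flow.permute(0, 2, 3, 1)

# Warp the previous frame toward where the flow says its pixels have moved.
warped = F.grid_sample(prev_frame, grid, mode="bilinear", align_corners=True)
print(warped.shape)                     # torch.Size([1, 3, 270, 480])
```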


 
What I'd realistically expect is that with the 6000 series there will be a node shrink and a notable jump in performance. Alongside that, you'll likely be looking at a new gen of consoles.

That should align nicely with native RT being the norm in AA/AAA games. By then, hopefully PT will be where RT is now in terms of availability in games.

My only worry is UDNA and its focus on RT performance. We need AMD to seriously step up.
 
What I'd realistically expect is that with the 6000 series there will be a node shrink and a notable jump in performance. Alongside that, you'll likely be looking at a new gen of consoles.

That should align nicely with native RT being the norm in AA/AAA games. By then, hopefully PT will be where RT is now in terms of availability in games.

My only worry is UDNA and its focus on RT performance. We need AMD to seriously step up.

It’ll be several years after the new consoles launch before games start taking advantage of the new hardware, so probably 4-6 years from now. It's hard to imagine AMD not having great RT and upscaling performance by then.
 

This stuff drives me a bit mad. In so many games, cuts between scenes are more stuttery on PC. You can literally see in the clip that the PS5 version is a bit ahead of the PC version, as the PC stutters so hard that it falls behind.

PS: if the timestamp isn't working, the cutscene in question is at 4 minutes.
 

This stuff drives me a bit mad. In so many games, cuts between scenes are more stuttery on PC. You can literally see in the clip that the PS5 version is a bit ahead of the PC version, as the PC stutters so hard that it falls behind.
It stutters every time you hit something, so maybe the cutscene stutter is working as intended and the console's bugged.

I'm joking, obviously, but I'd better say it ;)
 

This stuff drives me a bit mad. In so many games, cuts between scenes are more stuttery on PC. You can literally see in the clip that the PS5 version is a bit ahead of the PC version, as the PC stutters so hard that it falls behind.
Yes, and more people need to call this stuff out. This is 100% shader compilation stuttering, which is pathetic because it's a cutscene, ffs. All the PSOs should be known ahead of time.

Ninja Gaiden 2 Black has the same crap, despite being UE5 with, presumably, PSO precaching...
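To spell out why "all the PSOs should be known ahead of time" matters: the hitch comes from compiling a pipeline the first time a draw needs it instead of during loading. Here's a toy Python analogy (compile_pso is a made-up stand-in for the driver's shader compilation, not Unreal's or D3D12's real API):

```python
import time

PSO_CACHE = {}

def compile_pso(key):
    # Stand-in for the driver compiling shaders into a pipeline state object.
    time.sleep(0.05)                  # ~50 ms: easily a few dropped frames
    return f"pipeline::{key}"

def draw(material_key):
    # Lazy path: compile on first use -> a visible hitch mid-cutscene.
    if material_key not in PSO_CACHE:
        PSO_CACHE[material_key] = compile_pso(material_key)
    return PSO_CACHE[material_key]

def precache(material_keys):
    # What precaching should do: pay the cost behind a loading screen,
    # so every draw during the cutscene is just a dictionary lookup.
    for key in material_keys:
        PSO_CACHE.setdefault(key, compile_pso(key))

cutscene_materials = ["hero_skin", "cloth", "hair", "env_lit"]
precache(cutscene_materials)          # done while loading, off the critical path
start = time.perf_counter()
for m in cutscene_materials:
    draw(m)                           # no compiles left to trigger
print(f"frame draw time: {(time.perf_counter() - start) * 1000:.2f} ms")
```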
 
Cutscene stutters!? Haha that’s a new low.
It's been happening since at least the PS4 generation. PS2 games, on the other hand, had perfect cuts without any stuttering. Kingdom Hearts, Metal Gear Solid, God of War and many others had more fluid cinematics than most modern games.

I mean, have you seen the new Lords of the Fallen? That game looks like it's going to crash every time the camera cuts. It's horrible.


That's unacceptable.
 
It's been happening since at least the PS4 generation. PS2 games, on the other hand, had perfect cuts without any stuttering. Kingdom Hearts, Metal Gear Solid, God of War and many others had more fluid cinematics than most modern games.
Yeah, but they had no shaders, so why even bring them up? If you want to identify other tech that did it better, you need to have the same on-screen results: complex PBR materials running on massively parallel GPUs, portable across different hardware.
 
Yeah, but they had no shaders, so why even bring them up? If you want to identify other tech that did it better, you need to have the same on-screen results: complex PBR materials running on massively parallel GPUs, portable across different hardware.
I bring them up because we have hardware that's a thousand times more powerful but there are still regressions. And it's not like it's all games. We have Naughty Dog, Kojima Productions, Rockstar, Santa Monica and many others that don't have this problem.

But it's still too widespread.
 
I bring them up because we have hardware that's a thousand times more powerful but there are still regressions.
It's not a regression though. There's a whole new thing the hardware is having to do, creating a new problem. It's like asking Olympic sprinters to carry a 50kg backpack in their races and then complaining that they aren't as fast at the 100m as athletes from the 90s.

And it's not like it's all games. We have Naughty Dog, Kojima Productions, Rockstar, Santa Monica and many others that don't have this problem.
That's a relevant comparison, as it's looking at the same problem being faced and apparently solved. Now the Olympians are still having to carry a 50kg backpack and they're all really slow, except the athletes from Zimbabwe and Senegal, who are just as fast as the 90s runners, and everyone can wonder how they do it.
 
I bring them up because we have hardware that's a thousand times more powerful but there are still regressions.
That's not really a new thing. New paradigms come with their own positives but also negatives.

The switch from 2D gaming to 3D gaming had its fair share of regressions in many ways, for instance.
 
I bring them up because we have hardware that's a thousand times more powerful but there are still regressions. And it's not like it's all games. We have Naughty Dog, Kojima Productions, Rockstar, Santa Monica and many others that don't have this problem.

But it's still too widespread.

One problem is that hardware isn't uniformly "a thousand times more powerful" across the board. Some aspects of hardware have barely increased in performance, if at all, while others have increased far more than the average. These stutters, regardless of the type, would be far better mitigated (if not nonexistent) if hardware actually were uniformly faster in that way.

People want to complain about lazy devs and so on, but that undersells the challenge of working around how hardware is actually progressing.
 
Those Lords of the Fallen cutscene stutters are probably not from shader compilation, but from loading subsequent level sequences in response to an event at the end of the running sequence.
 

Spider-Man 2 Debuts to 'Mixed' Steam Reviews Amid Serious PC Performance Problems


IGN said:
Currently, 55% of Spider-Man 2's Steam reviews are positive. “Despite having a high-end GPU and running the latest Nvidia drivers (566.36), the game frequently crashes,” said one RTX 4090 user. Another read: “The game is completely unplayable on PC. The game crashes to desktop every five minutes. I have already requested a refund.”

"Hold off on buying until they get a couple of stabilization patches out because holy hell," one reviewer added. "To say this is 'rough' is an understatement. Lighting doesn't load in some cutscenes, those same scenes run at seconds-per-frame, audio desync issues up the wazoo, freezing, stuttering, and just about every other performance issue I can think of.

Sigh. I think Nixxes is really just being stretched too thin.
 
That's shocking and tragic. Nixxes had such a good reputation and did such a good job. Now that Sony has bought them, they're struggling??
 