Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

You can argue that the cost of those stable shadows isn't worth it at the moment for this particular game, fine, but there's no denying that 'large blocks of pixels blinking in and out of existence' is a particularly egregious graphical defect we've just learned to accept as a limitation of shadow maps.
Large blocks of pixels? Hyperbole much? The number one problem with Alan Wake, at least on console, is image quality and the terrible aliasing. Very few people outside forums like these are noticing shadow maps vs RT shadows.
They don't ruin the entire presentation of AW2, for sure, but it's a longstanding defect that is well overdue to be addressed. Throughout the history of game graphics there have been a multitude of effects that, taken in isolation, are invariably 'minor', and whose solutions (such as soft shadows) have always been derided as having an egregious cost relative to their benefits when first introduced. It's when they become commonplace, and especially when combined with other advanced rendering methods, that we come to accept them as necessary to construct a realistic image as a whole.
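For anyone unfamiliar with where those blocky artefacts come from in the first place: with a plain shadow map, every screen pixel is lit or shadowed by a single binary depth compare against one shadow-map texel, so the texel grid of the light's view shows through, and percentage-closer filtering (PCF) is the classic soft-shadow mitigation alluded to above. A minimal sketch in Python standing in for shader code (the depth values, bias, and 4x4 map size are made up for the example):

```python
# Toy illustration: hard shadow-map test vs. 3x3 percentage-closer filtering (PCF).
# The depth map, bias and resolution here are invented purely for the example.

SHADOW_MAP = [            # depths stored from the light's point of view
    [0.20, 0.20, 0.90, 0.90],
    [0.20, 0.20, 0.90, 0.90],
    [0.20, 0.20, 0.90, 0.90],
    [0.20, 0.20, 0.90, 0.90],
]
BIAS = 0.005              # small offset to avoid self-shadowing ("shadow acne")

def hard_shadow(x, y, fragment_depth):
    """Single binary compare: every pixel that maps to the same texel gets the
    same 0/1 answer, which is the blocky stair-stepping being discussed."""
    return 0.0 if fragment_depth - BIAS > SHADOW_MAP[y][x] else 1.0

def pcf_shadow(x, y, fragment_depth):
    """3x3 PCF: average the compares of neighbouring texels so the transition
    between lit and shadowed becomes a gradient instead of a hard texel edge."""
    total, samples = 0.0, 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            sx = min(max(x + dx, 0), len(SHADOW_MAP[0]) - 1)
            sy = min(max(y + dy, 0), len(SHADOW_MAP) - 1)
            total += 0.0 if fragment_depth - BIAS > SHADOW_MAP[sy][sx] else 1.0
            samples += 1
    return total / samples

# A fragment at depth 0.5 sitting on the lit/shadowed boundary:
print(hard_shadow(1, 1, 0.5))  # 0.0 or 1.0, nothing in between
print(pcf_shadow(1, 1, 0.5))   # a fractional value, i.e. a softened edge
```

Ray-traced shadows sidestep the problem entirely because there is no light-space texel grid to quantise against in the first place.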
There's more to "graphics" than shadow maps and these insignificant minor defects. Imagine wasting all that performance to implement path tracing when, the moment the character moves, all semblance of believability is eradicated. Wasting performance on barely noticeable path tracing is funny when the physics in the game are a giant regression from Control. Hair physics in AW2? Awful. Cloth physics? Not great. Animations? Awful, with a noticeable dose of uncanny valley. The world of AW2 is sterile and static. Before I can even bother myself with shadow maps, I have to ignore all the things I mentioned and more. The closer you get to real-time graphics, the more the minor details stick out. I know many are wowed by AW2 but I'm certainly not one of them. The more I see, the less impressed I become. I said earlier that I wouldn't purchase the game, but I got it for free. The lack of cohesion just kills this game for me. When I look at the performance cost for this glorified corridor shooter with tank controls, it's hard to be impressed.
 
100% agree. AW2 represents one of the worst implementations of path tracing so far. The visual difference between the two modes is not "transformative" as some would say, and the performance cost is laughable. In regular gameplay, the differences are not noticeable to a majority of people. Its impact is frankly insignificant. It's one of those things that is done for the sake of doing it rather than out of practicality. People are pointing out shadows and I can't help but chuckle.

Failing eyesight is nothing to chuckle about.

 
Failing eyesight is nothing to chuckle about.

I’ve watched the video and I’m not impressed. The game looks fine, good even. Nothing outstanding from my perspective when the scope of the game is factored in. More importantly, I own the game and, like I said, I’m not impressed. Funny how different people can have different opinions.

Anyway, on a different note: lows of 648p for Series X and PS5, 432p on Series S, with frame rate lows of 0 fps. Laughable really, I mean why even bother? The image quality looks so bad, especially when combined with the upscaling artifacts in motion.
 
Daniel Owen tested the new Starfield patch. NVIDIA GPUs gained a massive 27% uplift, AMD gained 5%. So yeah, Bethesda is indeed optimizing the game for NVIDIA GPUs.

But was it the 45-50% improvement like some people were expecting it to be? We knew it was underperforming; we didn’t know by how much, but the way many of you were speaking about it, it should be significantly faster than AMD’s offerings based on past performance.
 
But was it the 45-50% improvement like some people were expecting it to be? We knew it was underperforming; we didn’t know by how much, but the way many of you were speaking about it, it should be significantly faster than AMD’s offerings based on past performance.
Who said that?
 
Who said that?
I’m not going to name names. The usual suspects were (repeatedly) beating up on the string of UE5 titles that didn’t perform well, and then Starfield came out and they started the same thing with the cherry-picking of screenshots.
 
@DavidGraham Game just required a little more optimization for Nvidia GPUs. There are very few games now that won't receive performance patches after launch for one GPU/CPU vendor or another. The Chips and Cheese article was obviously not totally comprehensive. I saw some criticisms from developers stating that looking at only some of the longest-running shaders was not enough to give a total view of performance. It would be interesting to see Chips and Cheese do a follow-up and explore whether any of those shaders were optimized or if the gains came from elsewhere.
 
The 4090 is now 22% faster than an overclocked 7900XTX, instead of being 8% slower before.

Lol, yet some armchair developers were claiming that this was a limitation of NVIDIA hardware and that nothing could be done about it.

From Chips and Cheese:

However, there’s really nothing wrong with Nvidia’s performance in this game, as some comments around the internet might suggest. Lower utilization is by design in Nvidia’s architecture.

No, of course there was nothing wrong with NVIDIA’s performance in this game…
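For anyone who wants to sanity-check those figures, here is the back-of-envelope conversion from "was 8% slower, is now 22% faster" into a relative uplift, alongside Daniel Owen's +27%/+5% patch numbers from earlier in the thread. This assumes the quoted percentages are simple ratios of average frame rates, and the two sets of numbers come from different testers and scenes, so treat it as rough arithmetic only:

```python
# Rough arithmetic on the quoted figures, not a benchmark. Assumes the simple
# reading of "X% slower/faster" as a ratio of average frame rates.

def relative_shift(before_ratio, after_ratio):
    """How much one card moved relative to the other between two snapshots."""
    return after_ratio / before_ratio - 1

# "8% slower before" -> ratio 0.92, "22% faster now" -> ratio 1.22
print(f"Implied 4090 swing vs 7900XTX: {relative_shift(0.92, 1.22):.1%}")  # ~32.6%

# Daniel Owen's patch uplifts (different test scene, so only ballpark-comparable):
nvidia_gain, amd_gain = 1.27, 1.05
print(f"Relative swing from +27%/+5% uplifts: {nvidia_gain / amd_gain - 1:.1%}")  # ~21.0%
```

Both point in the same direction, a sizeable relative swing toward the NVIDIA card, even if the exact magnitude differs between test scenes.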
 
I’m not going to name names. The usual suspects were (repeatedly) beating up on the string of UE5 titles that didn’t perform well, and then Starfield came out and they started the same thing with the cherry-picking of screenshots.
I thought some people said GPU performance when Starfield first came out was in general the status quo, and we should not be surprised.
 
No, of course there was nothing wrong with NVIDIA’s performance in this game…
They measured the performance of the hardware against the code; they have no visibility into any software improvements that could occur that would improve hardware performance. I.e., they looked at how the code was being run on Nvidia hardware and they didn't see any major issue, but they couldn't see what could be done on the software side to improve performance for Nvidia; that's something Bethesda would have to do.

And, honestly, imo, I didn't actually think they would do it. I'm happy they did, of course; I've only ever run Nvidia. I just didn't expect them to bother to do it.

I thought some people said GPU performance when Starfield first came out was in general the status quo, and we should not be surprised.
The commentary was about the status quo of GPU performance given the heavy costs of real-time dynamic lighting, which most people were ripping up in favour of baked lighting running at 3x the frame rate, basically saying that these titles were unoptimized garbage because titles from last generation looked better; some even went as far as to point to titles back on the PS3.
 
Do we know how much is down to driver side changes vs game code changes? And is the game still putting out exactly the same image?

It would be interesting to know what was holding the Nvidia cards back at launch. Presumably, with AMD being in-house with Bethesda and the focus on RDNA2 consoles, we're now seeing Nvidia get the same kind of focus that AMD had in the run-up to launch.

I almost never buy PC games at launch.
 
Do we know how much is down to driver side changes vs game code changes? And is the game still putting out exactly the same image?

It would be interesting to know what was holding the Nvidia cards back at launch. Presumably, with AMD being in-house with Bethesda and the focus on RDNA2 consoles, we're now seeing Nvidia get the same kind of focus that AMD had in the run-up to launch.

I almost never buy PC games at launch.

It looks like it's almost entirely from the most recent patch.
 
@DavidGraham Game just required a little more optimization for Nvidia GPUs. There are very few games now that won't receive performance patches after launch for one GPU/CPU vendor or another. The Chips and Cheese article was obviously not totally comprehensive. I saw some criticisms from developers stating that looking at only some of the longest-running shaders was not enough to give a total view of performance. It would be interesting to see Chips and Cheese do a follow-up and explore whether any of those shaders were optimized or if the gains came from elsewhere.

People should cool off on chipsncheese a bit…
 
I'm doing some DLSS comparisons right now and I'm shocked at how bad TLoU's TAA really is. Vegetation looks like it has far fewer polygons. Many textures completely disappear.

FullHD vs. UHD DLSS Performance. Both use the same 1080p base resolution.

Control does not have the best TAA, but compare its TAA with that of TLoU:

With DLSS enabled, the character in Control has smooth edges on the arm and continuous hair, and the textures are better preserved. See the ground, the legible writing on the gray object on the right, or the sign on the left. With DLSS, it looks as if supersampling has been applied. DLSS also flickers significantly less in motion.

DLSS and Ray Reconstruction also help enormously in Cyberpunk 2077 (path tracing).
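For context on the "same 1080p base resolution" pairing above: DLSS Performance renders at roughly half the output resolution per axis, so a 3840x2160 output is reconstructed from a 1920x1080 internal image, which is what makes it comparable to native 1080p TAA. A small sketch using the commonly cited per-axis scale factors (individual games can override these, so treat the numbers as approximate):

```python
# Approximate per-axis render-scale factors for the DLSS 2.x presets; individual
# games can override them, so these are indicative rather than exact.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Performance reconstructs from a 1080p internal image,
# hence the native-1080p-TAA vs UHD-DLSS-Performance comparison in the post.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```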
 
That's not TAA in The Last of Us; that's the mip/LOD setting, which isn't set correctly versus where it is with DLSS. Considering Iron Galaxy, a notoriously mediocre studio, was in charge of the PC port, it's not surprising the settings aren't right.
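On the mip/LOD point: the usual guidance when a game renders below output resolution for an upscaler is to apply a negative texture LOD bias on the order of log2(renderWidth / outputWidth), so texture sampling selects mips appropriate for the output resolution rather than the lower internal one. Skipping or misapplying that bias is one plausible cause of the blurry, "textures disappear" look described above. A quick sketch of the rule of thumb (the exact recommended offset varies between vendor guidelines, so this is indicative only):

```python
import math

def upscaler_mip_bias(render_width, output_width):
    """Rule-of-thumb texture LOD bias when rendering below output resolution:
    log2(render/output), which comes out negative (i.e. sharper mips)."""
    return math.log2(render_width / output_width)

# 1080p internal image presented at 4K -> bias of about -1.0
print(upscaler_mip_bias(1920, 3840))  # -1.0

# Native rendering needs no bias
print(upscaler_mip_bias(3840, 3840))  # 0.0
```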
 