Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Which is fine if you are willing to overlook the very last-gen, low-density world geometry. The lighting is fantastic, no doubt. The design is great. The world detail is sorely lacking, IMO. It's still a great looking game, but it is definitely one that straddles generations, with next-gen lighting combined with unfortunate last-gen geometry.

This really doesn't compute for me. Night City's downtown area is packed with an ungodly amount of geometry, especially for a dystopian city design (with an emphasis on verticality) depicting NC residents living on top of each other in such a condensed environment. And the interior designs are packed with lots of geometry as well.

The outlying areas (desert, wastelands, dumps, etc.) are perfectly fine too.

It's like Ultra preset lighting with low to medium preset geometry.

Don't agree.

So it should run well just because of that. Basically, the last-gen geometry makes that level of RT possible, but for someone like me it also drags down the overall graphical impact significantly.

Aren't the geometry/LOD settings on console quite a bit lower than on the PC version, especially compared to the PC's high/ultra settings?

I'm actually excited to see what CDPR does with UE5 and a much higher world geometry budget.

I wish they stuck to the REDengine.... but that's just me. :cautious:
 
Why is path tracing not supported on AMD?
A 4080 is about 3x faster than a 7900XTX in Cyberpunk Overdrive and other path tracing games; the 7900XTX is slower than a 3080/4070 here. I expect Alan Wake 2 to be the same.

This is a situation where the highest-end AMD GPU is equal to or below the lowest minimum requirement for path tracing. So why bother listing them?

Lords of the Fallen is not a well-performing game, but using Ultra not running well on "most"
So far, most UE5 games have been anything but well-performing.

Which is fine if you are willing to overlook the very last-gen, low-density world geometry.
Cyberpunk and Desordre have ample amounts of geometry; Desordre is even using Nanite. The Portal RTX games have a good amount of geo as well; their poly count has been increased substantially over the original.


It's like Ultra preset lighting with low to medium preset geometry.
Unfortunately, in the UE5 games released so far, while the static geo is excellent, the lighting is often not that impressive and the reflections are mediocre as well. Worse yet, the performance is not great either. In my opinion the tradeoff is worse, at least in the current roster of released UE5 games.
 
I dunno. AW2 looks great but I'm not really wowed by it. Not enough to want to play at 1080p on a 4070 or 540p on a 3070. The cost of the Northlight engine is way too high.

Edit: I wonder how DF will cover this game. Will they err on the side of sanity and call out the specs for being laughably ridiculous? Or will they err on the side of insanity and try to justify these insane specs for what is a corridor/wide-linear shooter? We shall see. If you look at the Steam hardware survey, a majority of people have GPUs worse than the 3070, and a large number have GPUs worse than the 2060. Who exactly is Remedy making this game for? The small number of people who own a GPU better than an RTX 3080? I can't wait to see how bad this will be on console, seeing as they're using either FSR 2 or some form of TSR. What ungodly resolutions are they upscaling from, especially in the newly announced performance mode?
 
I dunno. AW2 looks great but I'm not really wowed by it. Not enough to want to play at 1080p on a 4070 or 540p on a 3070. The cost of the Northlight engine is way too high.

Yeah, needing a 3070 for 540p medium settings is pretty ridiculous. Haven't seen anything in the content released so far to justify that cost.
 
This is so weird. Alan Wake 2 is the first game I looked at that I thought would destroy my PC. I've been expecting my 3080 to (relatively) struggle with it since they first started putting out trailers.

Edit: I look at these visuals and think "holy shit." I guess that's not a shared impression.

 
This is so weird. Alan Wake 2 is the first game I looked at that I thought would destroy my PC. I've been expecting my 3080 to (relatively) struggle with it since they first started putting out trailers.

Edit: I look at these visuals and think "holy shit." I guess that's not a shared impression.


The DLSS 3.5 screens were certainly impressive, but outside of that I haven't been blown away yet, and that video in particular is not really the best advertisement. That's an extremely choppy capture, with quite a lot of specular aliasing (likely due to reconstruction) to boot, and I don't see anything cutting edge in terms of character modelling. I'm sure the game in-person without shitty youtube compression (especially damaging for the visuals in such a dark game) will present far better, but that video specifically doesn't floor me.
 
This is so weird. Alan Wake 2 is the first game I looked at that I thought would destroy my PC. I've been expecting my 3080 to (relatively) struggle with it since they first started putting out trailers.

Yes, but running at 540p is a massive handicap. What specifically about the visuals should be taxing a 3070 at that middling resolution?

The 4070 and 4080 settings are interesting. The 4080 is about 60% faster and has 4GB more VRAM. Recommended settings suggest a 2.25x bump in resolution and an upgrade from medium to high and from medium RT to high RT. That's a big jump for only 60% more horsepower.
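
Some napkin math on that gap (assuming the 2.25x figure means a 1440p-to-4K output bump; the exact presets behind the recommendation are guesswork on my part):

```python
# Napkin math: pixel-count jump vs. GPU uplift for the 4070 -> 4080 recommendation.
# Assumes the 2.25x figure is a 1440p -> 4K output bump; exact presets are guesswork.

res_4070 = 2560 * 1440   # assumed 4070 target output (1440p)
res_4080 = 3840 * 2160   # assumed 4080 target output (4K)

pixel_ratio = res_4080 / res_4070
gpu_uplift = 1.60        # 4080 ~60% faster than 4070 (rough average)

print(f"pixel ratio: {pixel_ratio:.2f}x")                    # 2.25x
print(f"uplift shortfall: {pixel_ratio / gpu_uplift:.2f}x")  # ~1.41x
# On top of the pixel-count jump, the 4080 also has to cover the medium -> high
# (and medium RT -> high RT) settings bump, so either per-pixel cost must scale
# down nicely or the two tiers target different frame rates.
```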
 
This is so weird. Alan Wake 2 is the first game I looked at that I thought would destroy my PC. I've been expecting my 3080 to (relatively) struggle with it since they first started putting out trailers.

Edit: I look at these visuals and think "holy shit." I guess that's not a shared impression.

I dunno, maybe I'm just jaded, but these graphics are just good. The more you aim for realism, the more the little details stick out. Look at 1:28 in your video: she turns her head and somehow gravity stops working on her ponytail. The hair physics are not great. The clothing physics are not great. What's up with the eye animation? Does she not blink? What's up with the animations in general? Then there's a ton of aliasing... The object density reminds me of The Division E3 reveal trailer 10 years ago. It's great that we're finally getting close to the level of visuals we were promised a decade ago. It looks good, but for me the strongest things the game has going for it are its material quality and geometric density. It looks like further iteration on the work done with Control. Does it make me go wow? Not really.
 
Yes, but running at 540p is a massive handicap. What specifically about the visuals should be taxing a 3070 at that middling resolution?

It's more like 720p in terms of actual performance requirement once DLSS is accounted for. But yeah, 720p medium for 60fps on a 2080Ti-level GPU is crazy heavy, and I do agree that without its PT effects I'm struggling to see the justification.

That said, the game still looks great and will clearly run acceptably on a wide range of hardware, so I'm not particularly concerned. And the fact that it's pushing the limits at the higher end is obviously great and what we should be wanting in every release.

Also it's not been mentioned yet but the CPU requirements look to be fairly reasonable, at least at the higher end.
 
60 fps at 4K DLSS Performance is a bit better than what CP2077 clocks on a 4080.

Why is path tracing not supported on AMD?

Doubt it'll be supported on Intel either. The OD mode in Cyberpunk hits the performance of both AMD and Intel cards way more than Nvidia's, and Alan Wake 2 isn't going to be any different. I guess the situation is the same with Portal RTX.

I tested out the Serious Sam PT upgrade earlier this year on a 6800XT and a 4090, and the 4090's performance would be about 3-3.5x that of a 6900XT. Similar figures with heavy RT games like Witcher 3 or Cyberpunk's non-OD mode.
 
Speaking of which, it always makes me incredibly sad about the state of PC game development that Ultra doesn't have system specs requiring the absolute top-end hardware. But then again, most development is console-first, followed by a port to PC.

Regards,
SB
Lowering settings is not a "fix" for a badly performing engine.
Here is a fun fact: during the day, the path tracer in Cyberpunk gets over 90 FPS with DLSS Performance at 4K (HDR picture) on the 4090:


Lords of the Fallen runs at ~110 FPS at 1080p.
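
Back-of-envelope throughput comparison, assuming both numbers come from the same GPU (the 4090) and ignoring DLSS overhead:

```python
# Back-of-envelope: internally rendered pixels per second in each case.
# Assumes both figures are from a 4090; LotF assumed to be native 1080p.

cp2077_internal = 1920 * 1080   # 4K DLSS Performance renders internally at 1080p
cp2077_fps = 90                 # path-traced, daytime scene

lotf_internal = 1920 * 1080     # native 1080p
lotf_fps = 110

print(f"CP2077 PT: {cp2077_internal * cp2077_fps / 1e6:.0f} Mpix/s")  # ~187
print(f"LotF:      {lotf_internal * lotf_fps / 1e6:.0f} Mpix/s")      # ~228
# A full path tracer pushing ~80% of the pixel throughput of a UE5 raster
# title is the point here about per-pixel cost.
```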
 
It's more like 720p in terms of actual performance requirement once DLSS is accounted for. But yeah, 720p medium for 60fps on a 2080Ti-level GPU is crazy heavy, and I do agree that without its PT effects I'm struggling to see the justification.

That said, the game still looks great and will clearly run acceptably on a wide range of hardware, so I'm not particularly concerned. And the fact that it's pushing the limits at the higher end is obviously great and what we should be wanting in every release.

Also it's not been mentioned yet but the CPU requirements look to be fairly reasonable, at least at the higher end.

1080p at DLSS performance is 540p internal.
 
What he means is that DLSS has a cost, which makes 1080p with DLSS Performance closer to 720p in terms of frame rate than to native 540p.

I see. Don’t think we should be doing that. It’s guesswork and adds unnecessary confusion. We don’t say 4K DLSS performance (1080p) is closer to 1440p.
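
For reference, the standard DLSS scale factors (applied per axis) and what they give internally. The helper function is just mine for illustration:

```python
# Internal render resolution per DLSS mode (scale factor applies per axis).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "Performance"))  # (960, 540)   -> "540p"
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> "1080p"
# The internal resolutions themselves are fixed; whether 540p + DLSS overhead
# "costs like" native 720p is the part that's guesswork.
```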
 
I don't think we should be inferring anything about the game's optimization until we see the game clearly for ourselves on our own hardware and better understand the quality/performance on display for each setting.

Sometimes developers completely miss the mark with their recommended specs. It's important to remember these are presets and not necessarily optimized to give the best performance for the quality overall. Sometimes it's one simple aspect (such as VRAM) that leads a developer to recommend a higher-tier GPU than what's truly necessary, when just dropping textures a notch would have sufficed. And we don't know the visual difference between the settings...

...and so on and so forth.

Perhaps in the future, AI algorithms will be able to greatly improve the whole "Recommended Specs / Can your PC run it" aspect of PC releases, generating detailed custom settings for a customer based on their specific PC specs and quality/perf preferences. Imagine being able to look at images of a game before release, toggle settings on and off one by one, and have it give you estimated performance based on your input hardware config.
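
A toy sketch of what such an estimator could look like (every name and number here is hypothetical, none of this is a real API):

```python
# Hypothetical sketch: estimate FPS for a settings combo on a given GPU.
# The cost table would come from profiling/telemetry; these numbers are made up.

SETTING_COST = {  # relative per-frame cost multiplier of each toggle
    ("shadows", "medium"): 1.0, ("shadows", "high"): 1.3,
    ("rt_reflections", "off"): 1.0, ("rt_reflections", "on"): 1.8,
}

GPU_BUDGET = {"RTX 3070": 100.0, "RTX 4080": 260.0}  # arbitrary perf units

def estimate_fps(gpu: str, settings: dict[str, str], base_cost: float = 1.5) -> float:
    cost = base_cost
    for key, value in settings.items():
        cost *= SETTING_COST[(key, value)]
    return GPU_BUDGET[gpu] / cost

print(estimate_fps("RTX 3070", {"shadows": "high", "rt_reflections": "on"}))  # ~28.5
# A real version would be a learned model mapping (hardware, settings) -> frame
# time, trained on telemetry, not a multiplicative toy like this.
```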

Would be cool.
 