Digital Foundry Article Technical Discussion [2023]

UE5 is not scaling well, I've got to say.
The recent Black Myth: Wukong demo also has terrible performance. The demo machine had a 4070 and ran the game at 4K (Nanite and Lumen on, but no HWRT) with DLSS 2 and frame generation. It barely hit 60fps, with frequent dips and large stutters (possibly shader compilation). That means a 4070 couldn't maintain a solid 30fps running the demo at 4K without frame generation, and most likely couldn't maintain 60fps at native 1440p (they seem to be using DLSS Balanced) without HWRT features.

Sure, they still have a year to optimize, but I highly doubt they will.
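Rough back-of-envelope on those numbers (just a sketch: I'm assuming frame generation roughly doubles presented FPS and using Nvidia's published DLSS Balanced ratio; the demo's exact settings are a guess on my part):

```python
# Back-of-envelope for the Wukong demo claim (assumptions, not measured data):
# frame generation roughly doubles presented FPS, and DLSS Balanced renders
# at ~58% of output resolution per axis (Nvidia's published ratio).

presented_fps = 60          # what the 4070 demo machine was showing
framegen_factor = 2.0       # assumed ~2x from frame generation
rendered_fps = presented_fps / framegen_factor
print(f"Rendered (pre-framegen) FPS: ~{rendered_fps:.0f}")   # ~30

output_w, output_h = 3840, 2160   # 4K output
balanced_scale = 0.58             # DLSS Balanced per-axis ratio
internal_w = round(output_w * balanced_scale)
internal_h = round(output_h * balanced_scale)
print(f"Internal resolution at 4K DLSS Balanced: ~{internal_w}x{internal_h}")
# -> roughly 2227x1253, i.e. fewer pixels than native 1440p, which is why
#    a locked 60fps at native 1440p looks out of reach on the same settings.
```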
 
Far more likely it's a simple sharpening difference between the consoles.

Neither the performance profile on PC nor on XBSX even remotely supports the PS5 running at 1080p, which would be 2.25x more pixels than the Series X.
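Quick pixel-count check on that 2.25x figure (just arithmetic; the ~720p Series X internal resolution is the value implied by the ratio in the post, not something I've measured):

```python
# Pixel-count check for the 2.25x figure (the ~720p Series X internal
# resolution is derived from the ratio quoted above, not re-measured).

def pixels(w, h):
    return w * h

ps5_claim = pixels(1920, 1080)   # hypothetical 1080p internal on PS5
series_x  = pixels(1280, 720)    # ~720p internal implied by the 2.25x ratio

print(ps5_claim / series_x)      # 2.25
```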

Also that video, lol... "PS5 is running at 1800p internal".
I saw something in the open surprise comparison video that led me to believe there is extra sharpening on the PS5. I saw it in this video but I could be wrong.
 
I'm pretty sure they said they used FSR2 instead of TSR because TSR had too many artifacts. If that's true, it's a really weird thing to look back on, because what did it look like with TSR?
 
That's comparing native 4K vs "FSR Performance", which would be a max internal resolution of 1080p. The demo is getting in excess of 3x performance with FSR, so the image quality to performance trade-off is fair and comparable to DLSS, no?
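For context, a rough sketch of the internal resolutions the standard FSR 2 quality modes land on at a 4K output (the per-axis ratios are AMD's published ones; Performance mode is where the 1080p figure comes from):

```python
# AMD's published FSR 2 per-axis scale ratios, applied to a 4K output.
output_w, output_h = 3840, 2160

modes = {
    "Quality":           1.5,
    "Balanced":          1.7,
    "Performance":       2.0,
    "Ultra Performance": 3.0,
}

for name, ratio in modes.items():
    w, h = round(output_w / ratio), round(output_h / ratio)
    print(f"{name:>17}: {w}x{h} internal")
# Performance mode lands exactly on 1920x1080, i.e. a quarter of the output
# pixels, which is where most of that "in excess of 3x" headroom comes from.
```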
FSR IQ is absolutely not comparable to DLSS.

I have a 1440p monitor and FSR at any quality level looks terrible in every single game I have tried.


The Series S version looks like when Overwatch first came out and I was playing it on a MacBook with an integrated Intel GPU.
This is a bad-looking game due to all the issues, and on Series S it's shocking they even released it.
 

Yes, there are plenty of unsightly FSR artifacts on all of the platforms. I was referring to artifacts specifically related to image sharpening, as in artifacts in excess of what the Series consoles show that could be attributed to additional sharpening.

FSR IQ is absolutely not comparable to DLSS.

I have a 1440p monitor and FSR at any quality level looks terrible in every single game I have tried

Maybe. My point was about keeping everything in context and like for like. We can't know for sure until we do head-to-head comparisons. But I agree it's a bad-looking game all around, and I mean very bad. No way is Epic proud of this game's association with their brand-new engine.
 
I'm pretty sure they said they used FSR2 instead of TSR because TSR had too many artifacts. If that's true, it's a really weird thing to look back on, because what did it look like with TSR?

Alex said the devs found TSR too costly for the GPU budget. I can't believe TSR would have nearly as many artifacts as FSR. Epic's TSR is fantastic; what they were able to accomplish upscaling the first PS5 demo was magical.
 
I think it's great that AMD is getting this, as anything that levels the playing field between NV and AMD is good for all of us.

However, expecting this to be on par with DLSS 3 is optimistic at best, IMO.

The signs are already there... a 60fps base recommendation, auto-disable on fast mouse movement, no hands-on for the press to assess the latency impact...

As for the driver-level version, on the one hand I expect it to deliver even more questionable results, but the fact that it applies to all games is pretty awesome, since you can afford to cherry-pick the scenarios where it adds value. I hope to see Nvidia implement something similar soon, but it's pretty cool to see AMD ahead of Nvidia in a software feature for a change.
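Rough sketch of why the 60fps base recommendation matters, assuming the interpolation has to hold back roughly one rendered frame before it can present (that's my assumption; nobody outside AMD has measured FSR 3's actual latency yet):

```python
# Why a 60fps base matters for interpolation-based frame generation: the
# generated frame sits between two rendered frames, so the pipeline holds
# back roughly one rendered frame before display.
# (Illustrative assumption only - real pipelines add other costs too.)

for base_fps in (30, 60, 120):
    frame_time_ms = 1000 / base_fps
    added_latency_ms = frame_time_ms     # ~one held-back frame
    print(f"{base_fps} fps base -> ~{added_latency_ms:.1f} ms extra latency")
# A 30fps base adds ~33 ms on top of everything else, which is presumably
# why the guidance is to start from 60fps where it's closer to ~17 ms.
```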
The driver-level version is called AFMF, and it's also the one that auto-disables on fast mouse movement, not FSR3.
 
UE5 is not scaling well, I've got to say.
The recent Black Myth: Wukong demo also has terrible performance. The demo machine had a 4070 and ran the game at 4K (Nanite and Lumen on, but no HWRT) with DLSS 2 and frame generation. It barely hit 60fps, with frequent dips and large stutters (possibly shader compilation). That means a 4070 couldn't maintain a solid 30fps running the demo at 4K without frame generation, and most likely couldn't maintain 60fps at native 1440p (they seem to be using DLSS Balanced) without HWRT features.

Sure, they still have a year to optimize, but I highly doubt they will.
I knew it. Many devs are going to reject HWRT... its impact on visuals is just too minimal for the most part to be worth the bother, and it can reduce performance a lot in some scenes.
 
I knew it. Many devs are going to reject HWRT... its impact on visuals is just too minimal for the most part to be worth the bother, and it can reduce performance a lot in some scenes.
Yeah, this is a big problem. HW RT Lumen is slower, not faster. There's a huge part of a modern Nvidia GPU sitting unused with UE5, yet when you do use those transistors you can get real-time path tracing.
 
Yeah, this is a big problem. HW RT Lumen is slower, not faster. There's a huge part of a modern Nvidia GPU sitting unused with UE5, yet when you do use those transistors you can get real-time path tracing.
Previously, I would not have agreed with that last sentence, but after seeing Alan Wake running with path tracing at 110 FPS on a 4090, I definitely do now. Path tracing is much more accurate, and thanks to the new denoiser the lighting adapts much more quickly with less noise. It really uses the RT cores to their fullest, unlike Lumen. And it's quite performant as well, at least on a 4090 with frame generation. It's pretty impressive that Nvidia pulled off path tracing in such a modern title. We will have to see how it runs on older architectures, but it would be seriously funny if Alan Wake had better performance than Immortals, despite being path traced.

Things might get very interesting soon.
 
Yeah that looks like extra sharpening to me. God there are so many of these hack channels out there now.
lol we could easily make one for B3D. If only people actually cared more about buying and testing titles, instead of just wanting to argue over other people's conclusions.
 
Yeah, this is a big problem. HW RT Lumen is slower, not faster. There's a huge part of a modern Nvidia GPU sitting unused with UE5, yet when you do use those transistors you can get real-time path tracing.

I think HW RT Lumen might be faster than software Lumen on the Epic preset.
 

The decision to use FSR Performance mode and to target 4K may have come late in development - long after the art style and the animations were locked down.

Perhaps it was due to the performance of the alternative modes, perhaps it was some kind of marketing tie-in. I dunno. Epic's own UE-specific TSR would probably have been a better choice for working with UE5, though.

FSR reconstructing at 3x on both axes is going to look awful with fine detail and lots of movement, including large amounts of object rotation.
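Quick sketch of what 3x on both axes means in raw pixel terms (just arithmetic, assuming a 4K output target):

```python
# A 3x reconstruction factor on both axes means each output frame is built
# from only 1/9th as many rendered pixels as the output.

axis_scale = 3
internal_w, internal_h = 3840 // axis_scale, 2160 // axis_scale
print(internal_w, internal_h)                       # 1280 720
print(f"1/{axis_scale ** 2} of the output pixels")  # 1/9
```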
 
lol we could easily make one for B3D. If only people actually cared more about buying and testing titles, instead of just wanting to argue over other people's conclusions.

There's a big difference between fanboy posts on a forum, where the pushback against misinformation is far more immediate, and clickbait video channels that get used as fuel by those very fanboys. Of course there are zealots and imbeciles on here like in any online space, but there's also extensive reporting here by people who have direct hands-on experience with the games simply as gamers, and by several people who are directly involved in the industry. Some of the most popular threads have been about extensive testing of games: Spiderman, Ratchet and Clank, and Last of Us are all very long threads with many contributors and hard data, which are kind of hard to miss. Hell, I think I've run and posted maybe 15+ benchmarks/videos of R&C alone, from both PC and PS5.

Those kinds of contributions will of course be focused on the PC versions, as the tools to analyze them are more readily available to the end user than they are for console titles.
 
Previously, I would not have agreed with that last sentence, but after seeing Alan Wake running with path tracing at 110 FPS on a 4090, I definitely do now. Path tracing is much more accurate, and thanks to the new denoiser the lighting adapts much more quickly with less noise. It really uses the RT cores to their fullest, unlike Lumen. And it's quite performant as well, at least on a 4090 with frame generation. It's pretty impressive that Nvidia pulled off path tracing in such a modern title. We will have to see how it runs on older architectures, but it would be seriously funny if Alan Wake had better performance than Immortals, despite being path traced.

Things might get very interesting soon.
GI, shadows, and reflections are really fast in the path tracing system in Cyberpunk. Just a simple comparison between Immortals and Cyberpunk with the same "scene", DLSS Quality at 3440x1440:
Immortals:
[screenshot]

Cyberpunk (GPU was not overclocked):
[screenshot]

I think HW RT Lumen might be faster than software Lumen on the Epic preset.
Maybe on Nvidia GPUs, but I think on AMD, HW RT Lumen would be much slower. So nobody will really use it...
 