Digital Foundry Article Technical Discussion [2022]

We are finally approaching the first generation of RT games. Hybrid RT games. And a handful of RT-only games; really lucky that 4A jumped on that one. Expecting all their titles going forward to be RT-only as well.
As long as they have a good upscaling solution I suppose it's fine. Exodus came out too early to get FSR 2.1 onto the console versions. Based on the Witcher 3 video, FSR TAAU is much superior to TAA in general, so that's a big upgrade we didn't get to see. And the conversation went into how the game was a bit too soft at lower resolutions.

Even Lumen is basically based on ray tracing. So even when games aren't using hardware RT GI, they technically still will be ray tracing, just in software.
 
1 - Preposterous, UE is clearly intended for PC.
2 - Platform size and install base is slightly misleading -- PC is the only platform where Epic doesn't have to give up 30% of their revenue on each sale. It's strategically an extremely important platform for Fortnite, at least unless they win that lawsuit. It's very unlikely PC is one of their biggest platforms even when you factor that in, but it's still not something they're going to give up on or deprioritize.
 
Tried out Fortnite on my system (i5-12400, RTX 3060) - pretty brutal at any settings. Not sure if every stutter is related to shader stutter though, as I was still getting stuttering after ~20 mins of play in the same spots (such as when your team materializes), but even outside of that there were periodic small stutters just walking around.

Thing is though - DX11 with all Nanite/Lumen features disabled was actually far worse; the massive stutters made it basically unplayable. Neither was a good experience mind you, but I was kind of shocked at how bad DX11 is - was it always like this? I haven't played it in years, but I don't remember it being anywhere near this bad on the far weaker system I had at the time. I also don't know how PC DIY sites, which usually don't take shader caching into account, are getting the benchmark numbers they're getting: you would have to play for a long time to get these shaders cached, and if you're swapping out cards you'd see 1% lows in the 20fps range if my experience is typical. So I don't think we can really make any determinations yet about how the engine as a whole is going to perform on PC; there are just too many wildly different experiences out there right now.
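For anyone wondering why a warm run looks nothing like a fresh install: conceptually it's just a compile-on-miss cache. Here's a minimal sketch of the idea (the names and structure are made up for illustration, this is not UE's or the driver's actual PSO caching code):

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Stand-in for a GPU-ready pipeline/shader object. Purely illustrative --
// not Unreal's or the driver's real types.
struct CompiledPipeline {
    std::string blob;
};

// Hypothetical compile step. In a real engine/driver this can take tens of
// milliseconds per pipeline, which is exactly what shows up as a hitch when
// it happens mid-frame.
CompiledPipeline CompilePipeline(const std::string& shaderKey) {
    return CompiledPipeline{"compiled:" + shaderKey};
}

class PipelineCache {
public:
    // Called the first time a given material/effect combination is drawn.
    // Cache miss => compile right now => visible frame-time spike.
    const CompiledPipeline& Get(const std::string& shaderKey) {
        auto it = cache_.find(shaderKey);
        if (it == cache_.end()) {
            it = cache_.emplace(shaderKey, CompilePipeline(shaderKey)).first;
        }
        return it->second;
    }

    // What an up-front precompilation pass does instead: pay the cost during
    // loading so gameplay only ever sees cache hits. This is also why a
    // benchmark run on a warmed cache tells you little about a fresh install.
    void Warm(const std::vector<std::string>& knownKeys) {
        for (const auto& key : knownKeys) {
            Get(key);
        }
    }

private:
    std::unordered_map<std::string, CompiledPipeline> cache_;
};
```

The real systems (driver shader caches, DX12 pipeline libraries, UE's PSO caching) are obviously far more involved, but the cold-vs-warm distinction is the same, and since driver caches are per-GPU, swapping cards brings the cold case right back.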

Night and day on the PS5 though. Some drops from 60fps, like when gliding into the map, but otherwise a locked 60. At the moment there's nothing comparable to how it struggles on my PC.

It's totally possible to get a perfectly flat frame time graph on PC. I can't speak to shader compilation stutters, as I think by the time I'd played around with my settings enough to get a perfectly flat line (via RTSS), they'd probably have worked themselves out anyway. As (I think it was you) said earlier, shader compilation stutter doesn't really mean much in a game like this where you're playing the same environments over and over. I didn't notice anything intrusive (like Callisto) on my first play attempts, although it could have been hidden beneath a generally unstable frame rate. But now that the frame rate is ironed out, it's as flat as a pancake.

And this is on a GTX 1070 and an HDD... (I forgot Nanite likes SSDs, so I went and installed it to my HDD without thinking).

I've played around with a few settings but for the purposes of this I went with Nanite, Lumen, VSM and Textures on High, everything else medium, no RT (obviously) and 50% resolution scale (not TSR, just TAA) at 3840x1600. That nets me a solid 45 fps aside from the very occasional and minor wobble.

I can get 60fps at those same settings with around 30% resolution scale but then it's unplayably ugly. 50% even without TSR is surprisingly good though. With TSR, especially at Epic, it's gorgeous but that's a pretty heavy additional strain on the GPU putting 45fps out of reach.
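For reference, assuming the resolution scale slider behaves like UE's screen percentage (applied per axis, which is my assumption here), the internal resolutions work out like this:

```cpp
#include <cstdio>
#include <initializer_list>

// Back-of-envelope for the settings above. Assumes the scale is applied
// per axis (like UE's screen percentage); if it were a pixel-count scale
// the numbers would differ.
int main() {
    const int outW = 3840, outH = 1600;
    for (double s : {1.0, 0.5, 0.3}) {
        std::printf("%3.0f%% scale -> %dx%d internal (%.0f%% of the pixels)\n",
                    s * 100,
                    static_cast<int>(outW * s),
                    static_cast<int>(outH * s),
                    s * s * 100);
    }
    // 50% -> 1920x800, i.e. only a quarter of the output pixel count,
    // which is why it's so much cheaper than "half resolution" sounds.
    // 30% -> 1152x480, barely 9% of the pixels -- hence the ugliness.
    return 0;
}
```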

I honestly don't see why you couldn't achieve 60fps on a 3060 at my 45fps settings. Not sure whether HWRT would hinder or help performance here though?

Also, since the PS5 uses dynamic resolution scaling here while the PC doesn't (no idea why), it's going to have a leg up in resolution performance.
 
Are there any examples of games running "slowly" on Nvidia hardware due to using DXR 1.1?
Slower might be the word I should have written instead of slowly. Let's just say that while it works on Nvidia hardware, it's not an Nvidia-optimized code path.
 
The first PS5 UE5 demo showcasing Nanite was never released elsewhere.
Gotta wonder how the PS5 will perform in games with Lumen and Nanite enabled. I'd go for 60fps/120fps even if they had to decrease the resolution a bit. Most people who own the new consoles play on 4K 120Hz TVs, but given the extra smoothness I'd still go with the framerate.

It's the same reason why I still prefer to game on my 1440p 165Hz monitor rather than on my 4K 120Hz TV. :giggle: It's super easy to achieve 165fps at 1440p nowadays in many games, even on a mid-tier GPU. My TV has better IQ, but it can't compete (and of course achieving 4K 120Hz isn't easy even with DLSS). Also, 165Hz gives you some extra AA and clearer motion. 👌 Plus my monitor has decent contrast. Just check this well-known page and compare on your TV or monitor...
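Rough numbers on why the TV target is the harder one (raw pixel throughput is only a crude proxy for GPU cost, since plenty of work is resolution-independent, but it gives the ballpark):

```cpp
#include <cstdio>

// Raw pixels-per-second for each target. Treat this as a ballpark,
// not a benchmark.
int main() {
    const double monitor = 2560.0 * 1440.0 * 165.0;  // 1440p @ 165 Hz
    const double tv      = 3840.0 * 2160.0 * 120.0;  // 4K    @ 120 Hz
    std::printf("1440p165: %.2f Gpix/s\n", monitor / 1e9);          // ~0.61
    std::printf("4K120:    %.2f Gpix/s\n", tv / 1e9);               // ~1.00
    std::printf("4K120 is ~%.1fx the pixel rate\n", tv / monitor);  // ~1.6x
    return 0;
}
```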


It's also worth noting that native 4K + typical AA with no reconstruction techniques like DLSS, XeSS and FSR 2 (on games where those work like a charm) is quite disappointing. 😒 The more native 4K games with "native AA" I play, the more disappointed I am. DLSS 3 and FSR 3 are going to be of huge help, improving not just the framerate but also the AA just by adding extra frames.
 
Also, since the PS5 uses dynamic resolution scaling here while the PC doesn't (no idea why), it's going to have a leg up in resolution performance.
It's nothing to do with resolution, it's the stuttering. I don't doubt you're having a good experience, but in mine, and in every video I've seen (I've looked at five YouTube videos with varying hardware), the stuttering is significant and far from brief.
 
@Cyan Yes, it's a weird thing seeing such natural-looking lighting in games. I saw the video of the Witcher 3 ray tracing, and it's amazing how much proper lighting improves a game that otherwise looks quite dated at this point. I think Fortnite demonstrates it really well. You have really nice shadowing and lighting, and it doesn't have that artificial sharp look because there's a smooth rolloff from bright areas to shadowed areas. Also, the virtualized shadow maps fix one of the biggest visual problems in the game, which is shadow pop-in. I always had shadows off because the performance hit of high shadows sucked, and anything but the max setting had severe shadow pop-in issues that were incredibly distracting. The new virtualized shadow maps seem to stretch to your full draw distance, and I don't see any pop-in, even on Medium.

WRT shadows, I think one of the big things Lumen does is bring "infinite" shadows to software-based shadowing. That was always one of the biggies WRT shadows being REALLY weird in games: at certain times of day they'd just stop after a certain distance when they shouldn't. That, and hopefully spelling the end of AO that just doesn't look right.

It's basically bringing correct lighting, albeit perhaps at lower quality than hardware-accelerated RT, to any hardware capable of running UE5, without the need for RT hardware acceleration.
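For anyone curious what "ray tracing without RT hardware" even means in practice: software Lumen traces against simplified distance-field proxies of the scene rather than the real triangles. The sketch below is just the generic sphere-tracing idea against a toy analytic SDF, nothing to do with Lumen's actual code, but it's the basic trick that lets rays be traced on any GPU (or even the CPU, as here):

```cpp
#include <cmath>
#include <cstdio>

// Minimal sphere-tracing sketch: marching a ray against a signed distance
// field instead of intersecting triangles with RT hardware. Generic textbook
// technique only -- Lumen's software path uses per-mesh/global distance
// fields plus a lot more machinery.
struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Scene SDF: distance to the nearest surface. A single unit sphere at the
// origin stands in for the engine's distance-field representation.
static float SceneSDF(Vec3 p) { return length(p) - 1.0f; }

// March along the ray, stepping by the distance to the nearest surface.
// Returns true on a hit; 'tHit' is the distance along the ray.
static bool SphereTrace(Vec3 origin, Vec3 dir, float maxDist, float& tHit) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxDist; ++i) {
        const float d = SceneSDF(add(origin, mul(dir, t)));
        if (d < 1e-3f) { tHit = t; return true; }  // close enough: hit
        t += d;                                    // safe step: no surface within d
    }
    return false;                                  // ray escaped or ran out of steps
}

int main() {
    float t = 0.0f;
    const bool hit = SphereTrace({0.0f, 0.0f, -3.0f}, {0.0f, 0.0f, 1.0f}, 100.0f, t);
    std::printf("hit=%d t=%.3f\n", hit, t);  // expect a hit at t ~= 2.0
    return 0;
}
```

The trade-off the posts here keep circling is exactly that: the proxies are cheap enough to trace everywhere, but lower fidelity than tracing the real geometry, which is roughly what the hardware RT path buys back.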

And even more importantly, with the higher geometric density and detail of objects that Nanite brings, that improved lighting actually looks so much better, even in software, than RT-based global illumination does in almost all non-Nanite games.

For me, WRT lighting (so not looking at reflections and other things that hardware RT can assist with):
  • Most games pre-hardware-accelerated RT - bad to good.
  • Hardware-accelerated RT - generally good to really good, but inconsistent.
    • Lack of sufficient geometric density and detail means it still doesn't quite look right.
  • Nanite 5.0 with Lumen - impressively good.
    • Really nice, and it looks even better than RT lighting in most other shipping games due to how light interacts more convincingly with objects and terrain that have enough geometric detail.
    • One big problem: foliage, which still doesn't look quite right.
  • Nanite 5.1 with Lumen - WOW, now we're talking.
    • Now that vegetation can also react more realistically to lighting, with increased geometric detail... just wow.
Hardware-assisted RT got me excited for good lighting, but it still wasn't quite there, and the performance definitely wasn't there for me yet. I hoped that at some point a future generation of hardware-accelerated RT cards would be fast enough for me to enable it in all games. Except something else was still missing: world and object geometric detail wasn't high enough to truly make the lighting look right.

Now, Nanite goes a long way towards addressing the detail needed in order for lighting to really shine (pun intended). Sure software Lumen may not be as detailed as hardware accelerated RT can potentially be, but OMG does it look amazing combined with the advances in geometric detail that Nanite brings.

The big question, of course, is still how well it's going to run in shipping games. Fortnite is amazing and all with what we've seen, but it still has a relatively simple graphics style, and its environments aren't as dense as either a fully forested scene or a busy downtown city center.

Considering how well it runs and how incredible it looks on PS5, I'm hoping that Epic can iron out the things that make it run less than well on PC when software Lumen is used.

Regards,
SB
 
As long as they have a good upscaling solution I suppose it's fine. Exodus came out too early to get FSR 2.1 onto the console versions. Based on the Witcher 3 video, FSR TAAU is much superior to TAA in general

The Witcher 3 didn't have TAA; it used FXAA/SMAA. TAA alone is a huge upgrade.
 
TAA has come a long way from Halo Reach on 360 (the first time I recall ever seeing it), but TAA's time is over. TAAU is the future, old man.
 
Witcher 3 had some rudimentary form of TAA that barely did anything.

Yeah, I loaded it up; there's no explicit AA method described, just "AA". If it is TAA, it looks nothing like TAA in modern games (as expected given its age). It does absolutely nothing for subpixel aliasing; I would never have guessed it was a form of TAA at all.
 
Yeah, I loaded it up; there's no explicit AA method described, just "AA". If it is TAA, it looks nothing like TAA in modern games (as expected given its age). It does absolutely nothing for subpixel aliasing; I would never have guessed it was a form of TAA at all.
CDPR were the ones who claimed it was an in-house temporal AA.
 
Alan Wake beat it by four months.
Really? I didn't know Alan Wake used TAA. I hope it wasn't as fricken ghosty as Reach; that crap drove me insane at the time, it was too early. I honestly ended up wishing it used FXAA every time I went back to it. But every AA solution that came out after Reach is dead now, while the temporal approach survives and evolves, which is interesting to think about.
 