Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Going back to the original question though:

The one term that's appropriate for all techniques that aren't simple spatial interpolations is 'image reconstruction', of which 'temporal upscaling' is a subset, no? DLSS2 is even described as such by nVidia: "DLSS 2.0 - Image Reconstruction for Real-time Rendering with Deep Learning"

Yes, both temporal upscaling and frame interpolation are forms of image generation.

Technically, frame interpolation is frame construction as you are creating something entirely new which does not exist. You're adding a made up (interpolated) frame into the stream of actual frames using data from the previous and next frame (for video) or previous frame(s) combined with a prediction of what the next "real" frame might be. How well that works depends on the algorithm used to create the new frame.

Temporal reconstruction generally uses information from previous frames and the current frame to add detail to the current frame, IE - it's reconstructing the current frame using data from previous frames. Adding data to it rather than creating something entirely new. Now some of the added detail might be created but it's generally blended in from previous frames. Basically, you're just constantly adding information to the current frame but you aren't generally making up an entirely new frame.

As such, since temporal reconstruction is generally adding information to the current frame in order to present the final frame it's upscaling the resolution of the final frame. You're quite literally taking the current frame and reconstructing it.
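The accumulation idea described above can be sketched in a few lines. This is a toy illustration, not any engine's actual implementation: real temporal reconstruction also reprojects the history buffer with motion vectors and rejects stale samples, both of which are omitted here.

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current (low-information) frame into an accumulated
    history buffer. Over many frames the history converges toward a
    higher-detail image than any single input frame. `alpha` is the
    weight given to new samples."""
    return (1.0 - alpha) * history + alpha * current

# Toy demo: accumulate noisy samples of a constant image.
rng = np.random.default_rng(0)
truth = np.full((4, 4), 0.5)
history = truth + rng.normal(0, 0.2, truth.shape)      # first noisy frame
for _ in range(200):
    sample = truth + rng.normal(0, 0.2, truth.shape)   # new jittered frame
    history = temporal_accumulate(history, sample)
# history is now much closer to `truth` than any single sample was
```

The per-pixel error of the accumulated buffer ends up far below the noise of any individual frame, which is exactly the "constantly adding information to the current frame" effect described above.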

Frame interpolation can be thought of more accurately as a form of temporal construction, rather than adding detail to the current frame you are creating an entirely new frame (at the same resolution as the previous and next frames) that roughly follows what came before and may not accurately predict what comes in the next real frame (for games or real time video without a delay) in order to create the intermediate frame. Thus you can end up with some VERY weird anomalies during frame interpolation when there is either fast motion or erratic motion or worst case fast and erratic motion.

With motion video you can instead use two real frames (or more; this is one of the reasons TVs doing interpolation add so much latency: at a minimum they must have the data from the next frame before generating the intermediate frame) and create an intermediary frame using information from those two frames. So: display a real frame, read in the next real frame, interpolate (create/construct) an intermediate frame using the previous real frame plus the frame that was just read in, display the intermediate frame, display the real frame that was read in... rinse and repeat for a simple TV-based video interpolation stream. Anomalies related to quick, erratic motion can still be a problem, but can be mitigated by looking far enough ahead in the video stream to better predict an appropriate middle frame. That's much harder to do with a game, where you want a new "intermediate" frame to be generated before the next "real" frame is rendered.
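The TV-style pipeline above can be sketched as follows. Note the deliberately naive midpoint: a plain 50/50 blend rather than motion-compensated interpolation, which is exactly why fast or erratic motion produces artifacts in cheap implementations. All names here are made up for illustration.

```python
import numpy as np

def interpolate_midpoint(prev_frame, next_frame):
    """Naive intermediate frame: a 50/50 blend of the surrounding real
    frames. Real TVs use motion estimation/compensation instead."""
    return 0.5 * (prev_frame.astype(np.float32) + next_frame.astype(np.float32))

def doubled_stream(frames):
    """Yield the display order described above: real frame, then the
    interpolated frame built from it and the *next* real frame (this
    one-frame lookahead is where the added latency comes from)."""
    for prev_f, next_f in zip(frames, frames[1:]):
        yield prev_f                                # display real frame
        yield interpolate_midpoint(prev_f, next_f)  # display constructed frame
    yield frames[-1]                                # last real frame

frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 100, 200)]
out = list(doubled_stream(frames))  # 3 real frames -> 5 displayed frames
```

Three real frames become five displayed frames, with each constructed frame sitting halfway between its neighbours in brightness.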

That's where you can get into things such as asynchronous reprojection from the VR world to reproject things based on camera (head) movement and sort of fill in blanks with solid colors at the periphery.
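A crude 2D sketch of that reprojection idea, assuming a pure horizontal camera turn and a solid fill colour for the revealed periphery (real VR runtimes reproject with the full head rotation and sometimes depth, but the principle of reusing the last rendered frame and padding the uncovered edge is the same):

```python
import numpy as np

def reproject(frame, dx, fill=0):
    """Shift the last rendered frame horizontally by `dx` pixels to
    follow a camera (head) turn, filling the revealed periphery with a
    solid colour - a toy stand-in for asynchronous reprojection."""
    out = np.full_like(frame, fill)
    if dx > 0:
        out[:, dx:] = frame[:, :-dx]
    elif dx < 0:
        out[:, :dx] = frame[:, -dx:]
    else:
        out[:] = frame
    return out

frame = np.arange(16, dtype=np.uint8).reshape(4, 4)
shifted = reproject(frame, 2)  # two revealed columns on the left get filled
```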

Regards,
SB
 
This might be a dumb question since it's way out of my knowledge, but would it be possible to work with the engine's occlusion pass(es) to add an offset that saves data from the edges of occluded geometry (might be flawed if there are transparencies involved with forward rendering engines?) to aid the interpolation/reconstruction process?

edit: Basically, if an object gets occluded, could its motion vector data be used to keep track of it until the geometry of that object approaches the opposite side (if moving linearly in a straight path, for example), gathering data once it reaches a certain offset nearing the disocclusion point and passing that along to the interpolation process to avoid artifacts?
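For what it's worth, the simplest version of that idea is just dead-reckoning: keep advancing the object's last known screen position by its last motion vector while it's hidden. This hypothetical sketch (the function and parameter names are made up for illustration) shows the linear-path case described above:

```python
def track_through_occlusion(position, velocity, frames_occluded):
    """Linearly extrapolate an occluded object's screen-space position
    from its last known position and per-frame motion vector, so an
    interpolator can anticipate where it will reappear (disocclude)."""
    x, y = position
    vx, vy = velocity
    return (x + vx * frames_occluded, y + vy * frames_occluded)

# Object last seen at (10, 20) moving (+5, 0) per frame, hidden for 4 frames:
predicted = track_through_occlusion((10, 20), (5, 0), 4)
```

Of course, this falls apart the moment the object accelerates or turns while hidden, which is presumably why engines prefer to handle disocclusion by rejecting history samples rather than predicting them.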
 
Not familiar with Warhammer but it looks great visually and gameplay wise

The Warhammer "lore" is great, and is a good fit for videogames, whether it's Warhammer FRP or Warhammer 40k. I just remember seeing the 40k rulebook cover as a kid and being totally drawn in by it.

[Image: Warhammer 40k rulebook cover]

Darktide really captures the world well.

There's even a metal band named Bolt Thrower (named after Warhammer FRP ballistas) that used the 40k rulebook artwork as their album cover.
 
Not familiar with Warhammer but it looks great visually and gameplay wise
Same here; when they mention Warhammer I dunno what they are talking about. Good to know that the engine is quite unique. It seems like RT only works with nVidia GPUs.
 
The Warhammer "lore" is great, and is a good fit for videogames, whether it's Warhammer FRP or Warhammer 40k. I just remember seeing the 40k rulebook cover as a kid and being totally drawn in by it.

[Image: Warhammer 40k rulebook cover]

Darktide really captures the world well.

There's even a metal band named Bolt Thrower (named after Warhammer FRP ballistas) that used the 40k rulebook artwork as their album cover.
Lol still have that rulebook sitting in my shelf
 
Not familiar with Warhammer but it looks great visually and gameplay wise

When it comes to Fantasy or Science Fantasy no IP in the world has more lore written about it than Warhammer (fantasy) and most especially Warhammer 40k (Science Fantasy). While D&D might be similar in amount of lore to Warhammer Fantasy, it still pales in comparison to how much has been written for Warhammer 40k.

When it comes to games, it's a rich breeding ground for creativity: it encompasses such a large galactic area, and spans so many thousands of years, that there is a lot of creative freedom for a game to be set within that universe without having to worry too much about narrative consistency.

This is especially true when narrative dissonance (IE - wait, this book said one thing happened but that book said something different happened) is accepted and embraced as part of the lore, because technology and records are spotty at best, and word of mouth is unreliable when trying to recount something that happened, say, 7 thousand years ago, with no accurate written records. So does anyone know for sure that in the history of Warhammer 40k this thing happened at this world? Or did that thing happen at this world? Or are they both wrong and something else happened at this world? Is what was recorded as actual events just rumors of what happened? Much like real history, there's no way to know for sure exactly what happened in most of the world 7 thousand years ago. After all, history is written by the victors. :)

It's a lovely setting of darkness and danger and hypocrisy and corruption and war ... everlasting war.

Regards,
SB
 
It's nothing to do with resolution, it's the stuttering. I don't doubt you have a good experience, but with mine, and every video I've seen (I've looked at 5 yt's with varying hardware), the stuttering is significant and far from brief.

I've taken a while to respond to this as I wanted to look in a bit more depth at the performance I'm getting, and I think I see what you're referring to. Regardless of resolution, there are certainly some frame time spikes.

What I would disagree with though are the significant and far from brief descriptors. And this got me thinking... how do we actually define significant or brief stuttering?

For example, is a single dropped frame (so potentially a stutter of up to 33.3ms at 60fps) considered significant or noticeable? And how does VRR impact the perception of that?

What I noted in my playthrough is that, depending on the area, the game could fairly regularly miss the 25ms frame time target (40fps). So this could have been performance related, but I'm not sure resolution had that big of an impact. The thing is, though, the majority of those misses were below 50ms and on my VRR display were basically invisible. These kinds of stutters could be non-existent for minutes at a time or could happen every few seconds depending on the area/what's going on. But then more occasionally there would be bigger stutters in the 50+, 75+, or even 100ms+ range. It's pretty difficult to test, but I don't think the 50-75ms range was overly noticeable to me, while the larger ones could be if I paid attention. Anything over 100ms was certainly obvious and what I'd traditionally refer to as a stutter, but from a 6 minute playthrough I recorded I think there were only 2 or 3 of those.
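One way to make "significant or brief" less subjective is to bucket a recorded frame-time log by the thresholds above. A minimal sketch (the 25ms target is the 40fps budget; the log values are toy data, not real measurements):

```python
def bucket_stutters(frame_times_ms, target_ms=25.0):
    """Count frame-time target misses by severity, using the same
    ranges discussed above (<50ms, 50-75ms, 75-100ms, 100ms+)."""
    buckets = {"minor (<50ms)": 0, "50-75ms": 0, "75-100ms": 0, ">=100ms": 0}
    for t in frame_times_ms:
        if t <= target_ms:
            continue  # hit the 40fps budget: not a stutter
        if t < 50:
            buckets["minor (<50ms)"] += 1
        elif t < 75:
            buckets["50-75ms"] += 1
        elif t < 100:
            buckets["75-100ms"] += 1
        else:
            buckets[">=100ms"] += 1
    return buckets

log = [25, 25, 42, 25, 60, 25, 110, 25]  # toy frame times in ms
counts = bucket_stutters(log)
```

On a VRR display the "minor" bucket is essentially invisible, so a summary like this distinguishes the misses that matter from the ones that don't.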

So yeah it's definitely not ideal, but for the most part for me the game is extremely smooth, and I'd struggle to consider it a game breaking departure from a fully consistent 40fps. It's not as though the frame pacing is off for example which absolutely would be a night and day difference.
 

Surprised Alex isn't doing this one. Hopefully, Callisto Protocol didn't break him. At least, he still joins.
Great video. Though, Mesh Shaders and the performance difference between HW/SW Lumen were not covered, which left me a bit disappointed. This is technically the first game (aside from that China MMO) that uses Mesh Shaders. So it would have been nice to benchmark the 5700 against the 2060 Super. @Dictator any plans of doing that, maybe in an upcoming optimized settings video?
 
Great video. Though, Mesh Shaders and the performance difference between HW/SW Lumen were not covered, which left me a bit disappointed. This is technically the first game (aside from that China MMO) that uses Mesh Shaders. So it would have been nice to benchmark the 5700 against the 2060 Super. @Dictator any plans of doing that, maybe in an upcoming optimized settings video?
Does PC have the option to use SW Lumen, or does it default to HW?
 

Surprised Alex isn't doing this one. Hopefully, Callisto Protocol didn't break him. At least, he still joins.
Great video John and @Dictator.

I really hope the stutter issues of the engine on PC can one day be fully eradicated, as this video otherwise leaves me fully optimistic about the future of gaming technology
 
Great video. Though, Mesh Shaders and the performance difference between HW/SW Lumen were not covered, which left me a bit disappointed. This is technically the first game (aside from that China MMO) that uses Mesh Shaders. So it would have been nice to benchmark the 5700 against the 2060 Super. @Dictator any plans of doing that, maybe in an upcoming optimized settings video?

It was an excellent video overall, but yeah - I was expecting some brief performance comparisons between HW and SW Lumen on the PC.


Great that @Dictator got to speak to Epic about this. The significant shader stuttering is definitely confirmed here; at least Epic is continuing to work on it.


Surprised Alex isn't doing this one. Hopefully, Callisto Protocol didn't break him. At least, he still joins.

He's in the video in the second half when it talks about PC.
 
I'm most interested in why there is no DRS in the PC version which is clearly a massive help for the console versions given they can drop all the way down to 40% of 4K when needed (actually lower than the minimum allowable fixed factor on PC which is 50%).

DRS is woefully underused on PC imo, and when it is used, its implementation quality is not always great either. Part of the reason is that the console SDKs (as far as I understand it) make it extremely easy to utilize. Another part of the reason, perhaps, is that VRR is just far more common on PC displays; especially when you're above 60fps already, a momentary framedrop of ~10fps due to an alpha-heavy effect for a few frames is generally not a huge deal vs. it happening on a fixed refresh rate display. So implementation is harder and the perceived benefit is lower.

However, what I have seen is that for games that use DLSS with dynamic res, it can work very well - dynamic res has been part of DLSS and FSR for a while now, and it's likely easier for developers to integrate when they're already using that reconstruction tech. Games that have used DLSS + dynamic res seem, in my experience, to work decently; Deathloop and Spiderman certainly behave far more consistently with it (Deathloop doesn't even offer DRS unless you're using DLSS/FSR) than earlier PC games with DRS that don't use temporal reconstruction.

Using DRS with reconstruction seems to give you the best of both worlds - the perceived drop-off in quality when it has to engage is lower simply due to the higher quality of DLSS/FSR to begin with, but you can also have a wider range of performance uplift targets, because the jump from performance to quality modes is more significant yet still relatively difficult to detect, especially in the brief scenes where you would need to engage it. An issue with DRS in general (console games have also had to face this) is either being too aggressive, so the quality of the scene is always affected, or not being aggressive enough, so you still get frame drops. DLSS/FSR with dynamic res seems to address these concerns well, at least from what I've seen.
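At its core a DRS controller is just a feedback loop on frame time. A minimal sketch, assuming the 50%-100% render-scale range mentioned above for PC and a 60fps budget; real implementations filter frame times and predict GPU load rather than reacting to a single frame:

```python
def update_render_scale(scale, frame_time_ms, target_ms=16.7,
                        step=0.05, lo=0.5, hi=1.0):
    """Drop the internal render scale when over budget, creep it back
    up when comfortably under, clamped to [lo, hi]. The 0.9 factor
    adds a dead band so the scale doesn't oscillate at the target."""
    if frame_time_ms > target_ms:
        scale -= step      # over budget: render fewer pixels next frame
    elif frame_time_ms < 0.9 * target_ms:
        scale += step      # comfortably under budget: claw quality back
    return min(hi, max(lo, scale))

scale = 1.0
for ft in [20.0, 22.0, 18.0, 15.0, 14.0]:  # toy frame-time samples in ms
    scale = update_render_scale(scale, ft)
```

The "too aggressive vs. not aggressive enough" trade-off lives in `step` and the dead band: a bigger step reacts faster to drops but makes resolution changes more visible, which is exactly where pairing DRS with DLSS/FSR helps hide the transitions.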

From what I've heard DLSS will be added to Fortnite eventually; maybe then it will take advantage of DRS.
 

Surprised Alex isn't doing this one. Hopefully, Callisto Protocol didn't break him. At least, he still joins.

Lumen is looking really good. The fact that it runs well on consoles is going to be great for adoption. I didn’t realize reflections still sampled the surface cache even with hardware tracing enabled. Wonder what it would take to have reflections sample the proxy mesh directly. Overall hardware tracing doesn’t seem to increase IQ much over software in Fortnite.

Edit: Looks like it’s just flipping a switch.

Ray Lighting Mode - Controls how Lumen Reflection rays are lit when Lumen is using Hardware Ray Tracing. By default, Lumen uses Surface Cache for best performance, but it can be set to Hit Lighting for Reflections for higher quality.
 
Software Lumen is definitely a huge improvement in Fortnite, but it doesn't look better to me than some games with Voxel GI.

It looks pretty good graphically. Now I'm interested in the game, which I wasn't before the video. Star Citizen is also a game that uses POM on surfaces to dramatically increase the level of detail. In many places you can't tell that the details are only 2D. Normal maps definitely can't keep up.

What I don't like so much is that the smoke effects don't always take on the ambient lighting and stand out.
 
Overall hardware tracing doesn’t seem to increase IQ much over software in Fortnite.
Yes, because even mirrors in the game are designed with weird normal maps and roughness to hide the low precision of the SDFs and surface cache, as well as the lack of skinned meshes - not to mention translucent surfaces, such as car windows, which completely lack reflections in UE 5.1 (quite disappointingly so). Imagine how fun it would have been from a gameplay point of view if the game had an area with a labyrinth of distorted mirrors full of players or bots, all with HW RT and shading at hit points.
 