Digital Foundry Article Technical Discussion [2023]

Alex comments on the current situation between IHVs.

The conclusion to Tom's Hardware's review of DLSS 3.5 and Cyberpunk 2077's 2.0 update is very good - it really hammers home the situation right now and I agree with the sentiment there.

"It's a messy situation, and it's not likely to get any better in the near term. If there were some hypothetical universal solution that could provide upscaling, frame generation, ray reconstruction, and whatever Nvidia's engineers come up with next, things might be different."

The review's conclusion points out that NV has the higher quality software and that it IS proprietary. Yes. A big reason why it is proprietary - other than money - is because it DOES require hardware ML acceleration in big quantities. How do we solve that situation?

Do we wait for NV to make their technology open to run on everything regardless of performance? They have no financial incentive to do that, I bet. Do we wait for DirectX to make a meaningful generalised ML acceleration model for games? Microsoft has yet to signal that...

It is increasingly the case that Nvidia is dictating the advance of visual computing on PC because it is doing really great things in software (and in HW), and it IS an issue for maintaining competition due to the leapfrogging that is occurring.

One solution I have heard in the past is that we no longer have DirectX as it is today, but rather vendor APIs (3 GLIDEs). Is that messy? That is sooo 90s. Too much work? All AMD GPUs would then be covered along with desktop, then you have NV and Intel. ehhhh

What about an optional situation where the game dev hands off the game pre-UI, pre-post processing to the driver, and the driver hands it back with DLSS/XeSS, FSR, Frame Gen, denoising, etc being applied? Each IHV then has its hardware and software stack leveraged? IDK
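To make the handoff idea concrete, here's a toy interface sketch. Everything in it (FrameBuffers, VendorUpscaler, render_frame, the nearest-neighbour stand-in) is hypothetical - no real driver exposes anything like this today - and the stand-in scaler is just a placeholder for whatever DLSS/FSR/XeSS would do on real hardware:

```python
# Hypothetical sketch of the "hand the frame to the driver" idea.
# All names here are invented for illustration.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FrameBuffers:
    color: List[List[int]]   # internal-resolution color (toy 2D grid)
    width: int
    height: int
    target_width: int
    target_height: int

# A vendor-supplied callback: takes internal-res buffers, returns a
# target-res frame. Each IHV's hardware/software stack would plug in
# here; this stand-in just does nearest-neighbour scaling.
VendorUpscaler = Callable[[FrameBuffers], List[List[int]]]

def nearest_neighbour_upscale(fb: FrameBuffers) -> List[List[int]]:
    out = []
    for y in range(fb.target_height):
        sy = y * fb.height // fb.target_height
        row = [fb.color[sy][x * fb.width // fb.target_width]
               for x in range(fb.target_width)]
        out.append(row)
    return out

def render_frame(upscaler: VendorUpscaler) -> List[List[int]]:
    # Game renders at internal resolution, pre-UI, pre-post-processing...
    fb = FrameBuffers(color=[[0, 1], [2, 3]], width=2, height=2,
                      target_width=4, target_height=4)
    # ...hands the buffers off to the driver's upscaler...
    upscaled = upscaler(fb)
    # ...and draws UI/post-processing on the returned target-res frame.
    return upscaled
```

The point of the shape is only that the game never needs to know which vendor technique ran inside the callback.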

I think the party to solve it is Microsoft in this case and the DXR model is maybe an insight into how to do it? Yes it is a black box, but it is technically allowing for IHV specific innovation on hardware and software under the hood. Eh? Once again I have no idea.

 
Things like this really bug me...

[attached screenshot showing the artifact]

Does it also do that without DLSS?

Regards,
SB

It's possible, I've seen this effect happen with TAA, but this in particular can sometimes seem to be a hallmark of DLSS:

[attached screenshot]

That overly white aliased specular edge rears its head too often for me in DLSS games, even when compared to FSR in some cases. On the whole minor, but it's a very common artifact.
 
It's possible, I've seen this effect happen with TAA, but this in particular can sometimes seem to be a hallmark of DLSS:

[attached screenshot]

That overly white aliased specular edge rears its head too often for me in DLSS games, even when compared to FSR in some cases. On the whole minor, but it's a very common artifact.

Unfortunately for me, things like this just stand out like a sore thumb and I'm constantly looking at them the moment they pop on screen.

Just like in RL as soon as I step into a building/house I'll immediately start noticing all the imperfections in the trim, moulding, baseboards, door framing, light fixtures, etc. :p Good builders/carpenters are very good at hiding any imperfections so I can mostly ignore them after I see them, bad ones aren't and my eyes are constantly drawn to them.

That's a large reason I generally dislike DLSS/FSR/XeSS. I keep trying them to see if they've gotten to the point where I consider them an improvement (less distracting than without) and so far, it's mostly been nope. This doesn't give me much hope that DLSS 3.5 is going to get it to the point I find it useable. :( Be nice if I didn't really notice them like a lot of people who seem to not notice them.

Regards,
SB
 
Unfortunately for me, things like this just stand out like a sore thumb and I'm constantly looking at them the moment they pop on screen.

Just like in RL as soon as I step into a building/house I'll immediately start noticing all the imperfections in the trim, moulding, baseboards, door framing, light fixtures, etc. :p Good builders/carpenters are very good at hiding any imperfections so I can mostly ignore them after I see them, bad ones aren't and my eyes are constantly drawn to them.

That's a large reason I generally dislike DLSS/FSR/XeSS. I keep trying them to see if they've gotten to the point where I consider them an improvement (less distracting than without) and so far, it's mostly been nope. This doesn't give me much hope that DLSS 3.5 is going to get it to the point I find it useable. :(

Regards,
SB

Games have lots of imperfections even when not upscaling so I’m not sure how you enjoy playing anything 😀
 
Games have lots of imperfections even when not upscaling so I’m not sure how you enjoy playing anything 😀

Eventually the hope is that things become less distracting to the point where I'm not always noticing them. Shadows, for instance, I didn't start leaving on until like 2-3 years ago because they were so weird and distracting. Same with AO. AO was always so wrong that I just disabled it, with the benefit that the game then also ran better. DoF, Motion Blur, etc. All distracting and unrealistic enough that I'd just immediately disable them.

It's one of the reasons I stayed with a Voodoo 5500 and returned a GeForce 3 that I got to replace it back in 2001. The V5500 with its RGSSAA was just better at removing some of the artifacts that were so distracting on the GeForce 3 with its OGSSAA.

Regards,
SB
 
Nvidia are undoubtedly bounding ahead on software features right now, but I also think there's potential for more open standard features to catch up and become 'good enough' to where the advantages of Nvidia's solutions aren't something people can't live without.

There was kind of a silent transition with most of Nvidia's PhysX tech to where it stopped being anything that couldn't be done just as well or even better with new, open tech features. I'm not saying I quite expect that same situation, but it does show that the tech situation does change over time. And it's not something that AMD necessarily needs to provide all on their own, either. Game/engine developers are just as important in terms of introducing new and/or better ways of doing things themselves.
 
Nvidia are undoubtedly bounding ahead on software features right now, but I also think there's potential for more open standard features to catch up and become 'good enough' to where the advantages of Nvidia's solutions aren't something people can't live without.
This is clearly the way forward, and it is nothing new. Hardware, software and platform vendors have always collectively come up with prototypes and visions of the future to align on with varying levels of confidence; I've been in many of these seats myself over my career.

Sometimes it even makes sense to go all the way to vendor extensions (i.e. DLSS and similar) when suitable alignment doesn't happen and there's a need for broader R&D. Nanite used vendor extensions for 64-bit atomics for years before they were available in DirectX. Conservative raster likewise had extensions before it was standardized.

But let's not lose sight of the goal here: in the end the only way this makes sense for multiplatform (and that includes multi-IHV on PC!) games is for us to eventually get back to different platforms producing the same pixels. This has been a hallmark of graphics APIs since we moved off of Glide because at some level, that is the entire purpose of portable graphics APIs. Obviously standardization takes time - especially when consoles are involved - but if this is meant to be a core part of future renderers it's not really optional. If ML is truly an important part of the future, the hardware must be generally and efficiently accessible to everyone, not just hardware vendors. You can make arguments about it still being too early for that, but I don't think you can really make good faith arguments about the long term.

The end state of the current trajectory is NVIDIA should just make Omniverse their own game engine, and people who are interested in targeting only NVIDIA PCs can use that, write all their shaders in CUDA, etc. :p Obviously this is not a world consumers really want to live in, but that's the point: they are trying to have it both ways by applying pressure on game engines to support proprietary, non-portable extensions. They are effectively trying to carve out a console model on PC. NVIDIA is a for-profit company and so their motivations are clear, but consumers and the tech press should know better than to perpetuate their marketing. If we all truly want to see this sort of tech in more games in the future, pressure needs to be applied onto IHVs and platforms themselves to sort out a good interface to let us express these sorts of algorithms, *not* to punch holes through standard APIs to allow arbitrary black boxes.

I think most people agree that ML and associated hardware is going to be useful for a variety of rendering tasks, but until it is exposed in the graphics APIs it remains more of an R&D prototype than a realistic core technology. We don't have to wait to get this started... Intel already has comparable ML hardware to NVIDIA's. Where's the industry pressure to expose even those two vendors' ML hardware in a portable way for rendering? Why are people giving the IHVs/platforms a free pass on this? Why do we get whining about DLSS support in any game that dares not have it, but I've yet to hear people ask these simple questions of IHVs and OSVs?
 
Nvidia/Intel have proven ML hardware's usefulness. It's on AMD to adopt ML hardware similar to Nvidia/Intel's so API standardization can happen, since nothing is standardized "until consoles say so". Until then, all we will get as gamers are R&D prototypes funded by IHVs from time to time.
 
Nvidia/Intel have proven ML hardware's usefulness. It's on AMD to adopt ML hardware similar to Nvidia/Intel's so API standardization can happen, since nothing is standardized "until consoles say so". Until then, all we will get as gamers are R&D prototypes funded by IHVs from time to time.

The issue is D3D and Vulkan. They really need a way to incorporate ML models into rendering with high performance. As far as I know, DirectML just couldn't run a model like DLSS anywhere near the same speed that Nvidia can achieve running it "natively" on the GPU. If you could run these models through DirectML, it would probably be easier to swap them in and out. I'm not particularly a fan of vendor lock-in, but I do understand that Nvidia is pouring huge money into DLSS. It's not the same as a couple of programmers coming up with an algorithm and sharing it online. I would be curious to know how much money Nvidia has spent developing DLSS and I'd expect it's a lot. Intel is sharing their model, but it's also lower quality on competitor hardware.
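As a back-of-envelope illustration of why inference speed is the deciding factor (the millisecond figures below are made up for the example, not measured DLSS or DirectML numbers):

```python
# Upscaling only "pays for itself" if rendering at a lower internal
# resolution plus the model's inference cost beats native rendering.
# All ms costs here are illustrative assumptions.
def upscaling_pays_off(native_render_ms: float,
                       internal_render_ms: float,
                       inference_ms: float) -> bool:
    """True if upscaled rendering is cheaper than native rendering."""
    return internal_render_ms + inference_ms < native_render_ms

# Say native 4K costs 16 ms/frame and internal 1440p costs 8 ms/frame:
print(upscaling_pays_off(16.0, 8.0, 1.5))  # fast hardware path: worth it
print(upscaling_pays_off(16.0, 8.0, 9.0))  # slow generic path: it isn't
```

The same model can be a win or a loss purely depending on how fast the API lets it run, which is the whole argument for first-class ML hardware access in D3D/Vulkan.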

Microsoft and Khronos really need to step in and provide ways to make things easier.

Edit: Epic is not pay-walling the source for Nanite and Lumen.
 
I did some testing on this myself.

In Dying Light at a 4K target, DLSS and FSR 2 are within 3-4 frames of each other when the quality settings are matched like for like.

But FSR2 ghosts like a mofo on the swaying grass, whereas DLSS doesn't.

I turned DLSS down to performance mode and still got a better looking image (to my eyes) than FSR2 in quality mode.

If the frame rates are matched DLSS will give you superior image quality.

If image quality is matched DLSS will give you superior performance.
PC Gamer are saying that using XeSS on AMD GPUs in Cyberpunk gives a better visual tradeoff.

 
Things like this really bug me...

[attached screenshot]


Does it also do that without DLSS?
Yeah, happens with TAA as well. DLAA eliminates this, and DLSS Quality minimises it a great deal.


There was kind of a silent transition with most of Nvidia's PhysX tech to where it stopped being anything that couldn't be done just as well or even better with new, open tech features.
GPU PhysX never got entirely replaced; even a decade later, cloth simulation in games is still not as advanced as GPU PhysX (APEX), and fluid/water motion is still not as advanced as PhysX. Even particles/debris are not as dense and as reactive to the environment as PhysX. The industry replaced PhysX with a downgraded version.
 
Nvidia/Intel have proven ML hardware's usefulness. It's on AMD to adopt ML hardware similar to Nvidia/Intel's so API standardization can happen, since nothing is standardized "until consoles say so". Until then, all we will get as gamers are R&D prototypes funded by IHVs from time to time.
Yea, that's fair. AMD probably doesn't need to catch up to Nvidia in terms of software so much as they just need to enable the same possibilities via hardware, and then let the software ecosystem naturally work together to meet up and enable similar possibilities across the spectrum.
 
Yeah, happens with TAA as well. DLAA eliminates this, and DLSS Quality minimises it a great deal.



GPU PhysX never got entirely replaced; even a decade later, cloth simulation in games is still not as advanced as GPU PhysX (APEX), and fluid/water motion is still not as advanced as PhysX. Even particles/debris are not as dense and as reactive to the environment as PhysX. The industry replaced PhysX with a downgraded version.
Eh, it's definitely been replaced overall. The alternatives are generally as good or better nowadays, and any areas where they aren't likely isn't because PhysX was doing something they couldn't, but more a deliberate choice of performance optimization versus benefit.

Go look at the particle effects in Returnal (UE4), for instance. A developer who really wants to do a PhysX-like implementation could. But they're usually just more interested in tailoring their tech to what fits their vision, not just what is cool.
 
What about an optional situation where the game dev hands off the game pre-UI, pre-post processing to the driver, and the driver hands it back with DLSS/XeSS, FSR, Frame Gen, denoising, etc being applied? Each IHV then has its hardware and software stack leveraged? IDK

I think the party to solve it is Microsoft in this case and the DXR model is maybe an insight into how to do it? Yes it is a black box, but it is technically allowing for IHV specific innovation on hardware and software under the hood. Eh? Once again I have no idea.
I think you're right, and I don't know the right answer either.
 
They are generally as good or better nowadays
That's not true. I have yet to see games that enable interactive smoke/fog as in the Batman/Metro series (smoke that can be pushed around and dispersed with forces), nor have I seen any game that offers the same fluid simulation as Borderlands 2 and Borderlands: The Pre-Sequel, with fluids reacting to the player, to bullets and to explosions, and I have yet to see any game offer the same complex clothing simulation as Mafia 2.

Fallout 4 offered the last look at PhysX particles in its full glory, with particle debris being extremely dense and reactive to both the player and the environment. Borderlands 2 and The Pre-Sequel both had unmatched particle density and reactivity too.
 
In my testing for CP, the lighting upgrades from RR are way more pronounced with proper HDR displays than in SDR. To the point where I'd want to know the monitor used for the review and how the HDR was set up.

Avoid anything below Quality for DLSS when using reconstruction if below 4K. At 4K, Balanced is fine. Nothing ever below that though.

FG goal should be 60fps or above. Input lag and motion artefacts become pronounced below this.
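For reference, the advice above can be put in rough numbers. The per-axis render scales are the commonly documented values for DLSS's quality modes, and the latency model is a simplification: interpolation-based frame generation has to buffer one rendered frame before it can interpolate, so it adds at least one base frame-time of latency.

```python
# Commonly documented DLSS per-axis render scale factors.
DLSS_SCALE = {
    "quality": 2 / 3,          # 4K -> 2560x1440 internal
    "balanced": 0.58,          # 4K -> ~2227x1253 internal
    "performance": 1 / 2,      # 4K -> 1920x1080 internal
    "ultra_performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple:
    """Internal render resolution for a given output resolution and mode."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

def min_added_fg_latency_ms(base_fps: float) -> float:
    """Minimum latency added by buffering one base frame for interpolation."""
    return 1000.0 / base_fps

print(internal_resolution(3840, 2160, "performance"))
print(round(min_added_fg_latency_ms(60), 1))  # ms added at 60 fps base
print(round(min_added_fg_latency_ms(30), 1))  # ms added at 30 fps base
```

The latency side is why the 60fps floor matters: at a 30fps base the buffered frame alone adds twice as much delay, on top of any artefacts from interpolating across larger frame-to-frame differences.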
 
Nvidia/Intel have proven ML hardware's usefulness. It's on AMD to adopt ML hardware similar to Nvidia/Intel so API standardization can happen, since nothing is standardized "until consoles say so".. Until then, all we will get as gamers are R&D protypes funded by IHVs from time to time.
Why would they (Intel/Nvidia) care what AMD has to say ? It's as Andrew said, you can't have multiplatform development anymore if participants want to silo themselves off from each other ...

Nvidia should go and carve out their own industry centered around themselves by making their own APIs, turning Omniverse into a full-featured game engine, and start buying out their closest game developers like CDProjekt, Remedy, and presumably 4A Games as well while they're at it, if they so badly want to standardize their proprietary technology without any obstruction ...
 