Digital Foundry Article Technical Discussion [2025]

I have a 4K OLED, DLSS Perf has never run on my PC.
Neither has FSR.

DLAA has run a lot.
DLSS Quality has run a lot.
RT too.
HDR even more so.

But DLSS Perf/FSR...nope.

Also, that's what made me turn off the DF 5090 video.
I don't care about 4 x FG DLSS Perf
I do care about 4 x FG DLSS Quality
I would like to know if FG and DLAA can work together.
I do not care about FSR one single bit.

So any review doing DLSS Perf/FSR numbers is not targeting me...an upcoming 5090 buyer 🤷‍♂️
Missed the demographic, I'd say.

There are lots of esports gamers that buy 4090s to set their games to low graphics on 1440p to get 480 fps. Tons of different use cases for 4090. Just find a reviewer that caters to your specific goals.
 
There are lots of esports gamers that buy 4090s to set their games to low graphics on 1440p to get 480 fps. Tons of different use cases for 4090. Just find a reviewer that caters to your specific goals.
They all seem to follow "the script" -> 4K with 1x/2x/4 x FG DLSS Perf.
Useless for me.
Useless for the e-gamer at 1080p/1440p.

Oh well.
 
Why does the 5090 still use a 4 nm process? Price shouldn't be a problem for a GPU this expensive.

When can we have a 3 nm GPU, or even a 2 nm GPU?
 
Why does the 5090 still use a 4 nm process? Price shouldn't be a problem for a GPU this expensive.

When can we have a 3 nm GPU, or even a 2 nm GPU?
Normally the entire architecture is on a single process. Nvidia will likely transition to 3nm with the Rubin architecture (RTX 60 series).
 
Why does the 5090 still use a 4 nm process? Price shouldn't be a problem for a GPU this expensive.

When can we have a 3 nm GPU, or even a 2 nm GPU?
Because 4nm maximizes Nvidia’s margins while also allowing them to sell an increased number of GPUs in the future.
 
I have a 4K OLED, DLSS Perf has never run on my PC.
Neither has FSR.

DLAA has run a lot.
DLSS Quality has run a lot.
RT too.
HDR even more so.

But DLSS Perf/FSR...nope.

Also, that's what made me turn off the DF 5090 video.
I don't care about 4 x FG DLSS Perf
I do care about 4 x FG DLSS Quality
I would like to know if FG and DLAA can work together.
I do not care about FSR one single bit.

So any review doing DLSS Perf/FSR numbers is not targeting me...an upcoming 5090 buyer 🤷‍♂️
Missed the demographic, I'd say.
If you like the best of the best, have you ever thought about getting the fastest 1080p monitor you can find with great image quality, using some kind of SSAA -via a feature in your GPU settings, or in games like RE- and getting max framerate? Best of both worlds.
 
If you like the best of the best, have you ever thought about getting the fastest 1080p monitor you can find with great image quality, using some kind of SSAA -via a feature in your GPU settings, or in games like RE- and getting max framerate? Best of both worlds.
1080p vs 4K OLED...nope.
I aim for GPU limited, not CPU limited.
 
The new DLSS transformer model brings massive improvements over the DLSS CNN model.

People in general seem very happy with this tech, going by Reddit and so on...

On a different note (dunno where to put this, but since DF always reports any stuttering in games...), there's a fix to avoid micro-stutters.

Found it in a very recent video in another language. It works for any GPU. The key is to disable MPO (a Windows thing).

How to disable MPO and avoid micro-stutters or sudden halts:

nVidia has a Windows registry file for this -it should work in W11 too-.
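If memory serves, that file just merges a single DWORD value under the DWM key -something along these lines (treat the exact value as my recollection, not gospel; re-enable MPO by deleting the value or merging nVidia's restore file, and reboot after either change):

    Windows Registry Editor Version 5.00

    ; Disables multi-plane overlay (MPO) composition in DWM
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
    "OverlayTestMode"=dword:00000005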


 
I am VERY skeptical that disabling multi-plane overlay is a good idea.

Multiplane overlay (MPO) support is a WDDM feature that allows the graphics hardware to compose multiple layers of content into a single image that it can then display on a screen. It's essentially a hardware-accelerated method of compositing different "planes" of content - where a plane can be a video, the desktop, an application window, etc. - without having to involve the CPU or use up other system resources to do the blending in software.


The MPO feature is available starting in Windows 8.1 (WDDM 1.3). This article describes how to implement this capability in your driver.
 
I’m a little confused by some of the issues Alex attributes to ray reconstruction. Isn’t ray reconstruction essentially a denoiser for path-traced surface detail? How is it related to ghosting of moving objects like the NPC’s head in Cyberpunk? That seems like an upscaling issue.
 
At 2400 euros, the raw performance of the RTX 5090 has increased by 30% compared to the RTX 4090, along with a significant increase in power consumption.
Raw performance is increasing more slowly, and therefore new methods such as AI must be used and more AI units must be integrated into graphics cards.

I tested the new DLSS 4 and now DLSS Performance looks significantly better than native UHD. DLSS Ultra Performance looks fine and it is now usable in many more games. Long vertical lines such as on buildings are smoothed out better than in native UHD. I would never guess that the base resolution is 720p. Still there are some noticeable quality losses in certain scenarios.
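(For context, DLSS Performance at UHD renders internally at half resolution per axis, i.e. 1920x1080, while Ultra Performance drops to roughly a third per axis, i.e. about 1280x720 - which is where that 720p base figure comes from.)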

When I look at the conventional performance increase of this Nvidia generation, I think it is realistic to say that 2027 consoles will have significantly less conventional performance than an RTX 4090. You have to rely on machine learning because the price for more raw performance increases enormously. Smaller process nodes will become considerably more expensive.

Thankfully, Nvidia recognised early on that conventional hardware and raster graphics would quickly lead to a dead end. It's good that they started the transition early.
 
I’m a little confused by some of the issues Alex attributes to ray reconstruction. Isn’t ray reconstruction essentially a denoiser for path-traced surface detail? How is it related to ghosting of moving objects like the NPC’s head in Cyberpunk? That seems like an upscaling issue.
It is doing both at the same time - the ghosting on the head only occurs with CNN RR, not with CNN SR.
 
Maybe. Or maybe they wanted to secure the AI market, put in AI hardware to sell GPUs to datacentres, and have found useful ways to use it in gaming since.

Well, unless someone else proves them wrong, we have no data points to indicate they made the wrong bet.

It is doing both at the same time - the ghosting on the head only occurs with CNN RR, not with CNN SR.

Ok that’s weird. I must not understand what ray reconstruction actually does.
 
I am VERY skeptical that disabling multi-plane overlay is a good idea.

Tech conspiracists are so annoying. Always coming up with random tweaks they take on as religion. The worst are the latency bros. The number of settings they change for placebo, ending up completely screwing up their experience, is downright impressive. Remember, it takes no credentials to have a YT channel.
Haven't tried it yet, tbh. I don't usually have stutters -except in some advanced UE5 games at times, or in, say, Resident Evil 2 Remake, where the base framerate suddenly drops by 4-5 frames when a cutscene is about to start, though with FG it's like nothing changed-.

I like to try stuff just because I am poor. And out of curiosity. Bad experiences are very important to learn from in life. I remember sharing news about lite, debloated Windows 11 installers, W11 debloating apps, etc. etc., then using them and basically bricking my PC. 😁

The last time this happened was less than a month ago. So I banned them; I'll never use them again. This gives you experience to share with others and helps them avoid the same mistakes, and if you have to fix a friend's or neighbour's PC, being used to fiddling with this stuff makes it easier for you.

nVidia has files to both enable and disable MPO. I didn't try it 'cos it's not a big issue for me, but since you can enable it with a couple of clicks, same as disabling it, with a file provided by nVidia... it should be easy.
 
At 2400 euros, the raw performance of the RTX 5090 has increased by 30% compared to the RTX 4090, along with a significant increase in power consumption.
Raw performance is increasing more slowly, and therefore new methods such as AI must be used and more AI units must be integrated into graphics cards.

I tested the new DLSS 4 and now DLSS Performance looks significantly better than native UHD. DLSS Ultra Performance looks fine and it is now usable in many more games. Long vertical lines such as on buildings are smoothed out better than in native UHD. I would never guess that the base resolution is 720p. Still there are some noticeable quality losses in certain scenarios.

When I look at the conventional performance increase of this Nvidia generation, I think it is realistic to say that 2027 consoles will have significantly less conventional performance than an RTX 4090. You have to rely on machine learning because the price for more raw performance increases enormously. Smaller process nodes will become considerably more expensive.

Thankfully, Nvidia recognised early on that conventional hardware and raster graphics would quickly lead to a dead end. It's good that they started the transition early.
Great post. I am sure every one of us here would want an nVidia 10090, raster-performance-wise, and run games internally at 1000fps without any help, everything pure and native. But the hardware isn't improving at that pace.

nVidia could just do that and go overkill to achieve a great raster experience, but you'd have to own a transformer -no pun intended- the size of your entire room.

If you are lucky you have a nuclear power plant near your home. 🙂

It depends on many factors, but more raster power means more power consumption nowadays. That means that if you want the best of the best, the components are going to be more expensive. That means having something like a 1600W PSU and not worrying about your electricity bill.

For people who are preoccupied with the electricity bill, efficiency is important. RTX 5070 and 5060 sound very appealing.

It's what nVidia said -5070 = 4090 for $550-. They know their stuff; their GPUs have the best power efficiency, followed by Intel.
 