Digital Foundry Article Technical Discussion [2025]

@DavidGraham Now I'm curious about RTX Mega Geometry on older GPUs, because they lack the RT core functions that do geometry cluster intersection and cluster decompression. I wonder if they have some kind of compute fallback, or what exactly is going on there. I also wonder whether Mega Geometry will be a setting you can toggle in-game, because I feel like there could be a hit on the 40 series and older that users wouldn't want.
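
A compute fallback, if it exists, would presumably sit behind a capability check plus the in-game toggle you're describing. Here's a minimal sketch of that pattern, with every name in it (GpuCaps, has_cluster_rt_cores, the path strings) invented for illustration, not taken from any real driver or NVAPI interface:

```python
from dataclasses import dataclass

@dataclass
class GpuCaps:
    # Hypothetical flag: does the RT core support cluster
    # intersection/decompression in hardware (Blackwell-style)?
    has_cluster_rt_cores: bool

def pick_mega_geometry_path(caps: GpuCaps, user_enabled: bool) -> str:
    """Choose a render path for a hypothetical Mega Geometry toggle."""
    if not user_enabled:
        return "off"              # classic BLAS/TLAS path
    if caps.has_cluster_rt_cores:
        return "hardware"         # clusters handled by the RT cores
    return "compute_fallback"     # emulate cluster handling in compute

# A 40-series card without the new RT-core features would land here:
print(pick_mega_geometry_path(GpuCaps(has_cluster_rt_cores=False), user_enabled=True))
```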
 
We are getting both ... RTX Mega Geometry to accelerate fps on all RTX GPUs, and also a new Ultra Quality mode for ray tracing, adding fully ray traced refractions and fully ray traced transparent reflections. It will also improve the quality of the fully ray traced indirect lighting.

I wish Alan Wake 2 was actually worthy of all this support. It's mediocre in so many aspects. Can we get some upgrades for something like Elden Ring or Monster Hunter Wilds?
 
Digital Foundry's deep dive on everything related to Blackwell.

0:01:08 Blackwell architecture: Blackwell SM design
0:03:39 Updated Tensor Core and RT Core
0:09:37 AI Management Processor and Max Q power management
0:15:13 Display Engine and video encode/decode
0:18:32 Hardware specs: RTX 5080 and 5090
0:29:16 RTX 5070 and 5070 Ti
0:34:23 RTX 50 Series laptops
0:40:14 RTX Software: Neural Shaders
0:44:24 SER 2.0
0:47:06 RTX Mega Geometry
0:50:15 RTX Hair
0:53:13 Neural Radiance Cache
0:55:58 RTX Skin
0:58:01 DLSS 4: Super Resolution, Ray Reconstruction, Frame Generation
1:06:18 Generative AI demos
1:10:03 Wrap up discussion: RTX 50 series price and performance
1:17:36 Nvidia’s software package and features
1:22:38 How should we review graphics hardware?

I wish they had talked about all the controversy surrounding the RTX 5000 launch, especially the AMD fans who seem to think FG is a bad thing when it can easily be one of the best things to happen to displays (typical displays, handheld-like ones, etc.) and GPUs in years, and there is a trend of pushing the "fake frames" agenda.
 
Cyan, with all respect, just chill a bit on this topic. Maybe people on other forums are having brand arguments or arguments about whether FG is a generally useful technology or not, but that is not the discussion that has been had here (which has focused primarily on various marketing claims and comparisons). We're not going to bring those external discussions into this thread.
 
I like the small section about reviewing. I hope the different review sites give it a lot of thought. You almost need to take an approach like RTINGS and score or evaluate a product across multiple use cases, like Alex's car analogy:


If you have a 4K 120Hz monitor, multi-frame gen is not going to be useful for you. For a 1440p 480Hz monitor, it's far more interesting. I think monitors are actually a good comparison, because monitors cater to very different use cases and reviews have to reflect that. Some monitors are really just for esports. Some are for photo editing. Some are for video/HDR and some can't do HDR at all. You may or may not need hardware calibration. You might have a different viewing environment, so panel technology becomes a factor.

Really interested in seeing how Digital Foundry approaches this. They don't really give scores, which I like, but a bullet-point style pros and cons of use cases could be useful. I'm also not suggesting that all sites need to cover all use cases. I think Digital Foundry will always favour cutting-edge graphics, not esports settings, and they don't review things in terms of how good they are for using Blender or something. They have a specific focus that's valuable.
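
As a toy illustration of that RTINGS-style idea: score once per use case, then weight the same scores differently per reader profile. All categories and numbers below are made up:

```python
# Per-use-case scores for one hypothetical GPU (invented numbers).
scores = {"esports": 9.0, "cutting_edge_rt": 6.5, "content_creation": 7.0}

# Different readers weight the same use cases differently.
profiles = {
    "competitive_player":   {"esports": 0.8, "cutting_edge_rt": 0.1, "content_creation": 0.1},
    "single_player_rt_fan": {"esports": 0.1, "cutting_edge_rt": 0.8, "content_creation": 0.1},
}

for profile, weights in profiles.items():
    overall = sum(scores[use] * w for use, w in weights.items())
    print(f"{profile}: {overall:.2f}")  # same card, different verdicts
```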

Nice breakdown of all the features here. Really excited to see whether the Alan Wake 2 update is part of launch reviews or whether coverage comes soon after. I'm curious to see if we get a performance improvement, or much better ray tracing quality at similar performance, etc. That's basically the game I'm thinking of getting a 50 series for.
New monitors were shown at CES 2025, some of them 750Hz and 800Hz models. 1000Hz is near. That could be getting close to human vision: 1Hz for every frame in a second.

Or maybe our vision has to be measured in nanoseconds, femtoseconds...
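
For a sense of scale, the per-frame time budget at the refresh rates being discussed is plain arithmetic:

```python
# Frame duration in milliseconds at each refresh rate.
for hz in (165, 360, 400, 750, 800, 1000):
    print(f"{hz:>5} Hz -> {1000 / hz:.3f} ms per frame")
# Even at 1000 Hz a frame still lasts a full millisecond, so
# nanoseconds and femtoseconds are many orders of magnitude away.
```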

The Nvidia announcement of FGx4 becoming the norm, and the use of LS, totally shifted my interest in new hardware. Instead of getting a hybrid console/PC handheld, the next thing I'm getting is a 360Hz or 400Hz monitor.

After using FG for every game, my 165Hz monitor feels limited to me. Playing everything so smoothly makes you become a Shifty person. 🙂
 
Different GPUs have always rendered things a little differently. In the early days, there were huge differences depending on the hardware, or even the API used. It's a lot more standardized now, but there are still differences. Plus, driver options can change the way a game presents anyway. I think the people bringing up artist intention are those who have made up their minds about AI and are fishing for reasons to hate it.

Just to clarify, I'm asking about what lies beyond the current situation, and sidestepping the hardware-comparison aspect that is currently dominating the discourse. This also isn't the first time these thoughts have crossed my mind.

When it was brought up in the video, I believe it was in the context of neural faces and the importance of any of these techniques, now and in the future, in preserving the artist's intent. But what I'm asking is whether that is actually important. I don't want this to devolve into a broader social debate, but let's just use a more concrete example with faces. We know there's been some controversy over the approach to character faces in the last few years, but how would hypothetically having more agency on the client side over this affect game design in general? And would it be for better or worse?

After all, on PC that is what modding really is, and the customer base is largely receptive to it. Albeit we have some developers who do push back against how much they want others to be able to modify their games (whether that be for vision or profit motives, such as DLC).
 
In Oliver and Alex's CES video (and the Black State short on the clips channel) you can see 4x FG at ~30ms latency in Black State. I'm not sure if that's with Reflex 2 as well (I think it probably was), so let's see if I can phrase this without triggering people: that latency at the smoothness and clarity of 380Hz would feel really good, in my experience.
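
Rough numbers behind that, taking the ~30ms and 380Hz figures from the demo at face value (both are just read off the video, not measured):

```python
base_fps = 95                    # implied internally rendered framerate
fg_factor = 4                    # one rendered frame + three generated
output_hz = base_fps * fg_factor # -> 380, matching the monitor rate
base_frame_ms = 1000 / base_fps  # ~10.5 ms between *rendered* frames
print(f"{output_hz} Hz output, {base_frame_ms:.1f} ms per rendered frame")
# Latency tracks the rendered framerate (plus Reflex etc.),
# not the 380 Hz presentation rate.
```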
 
In the CES thread, @Dictator's comments to one of the questions were telling. People focus on hardware specs, but Nvidia is so far ahead on the software stack that it's crushing everyone. And it wasn't just his opinion. That seemed to be the industry consensus from all the people he and Ollie spoke with.

And it's not just a bunch of PoCs typical of trade shows. Most of it will be material and in use by the end of this month! That's unheard of.

There was a separate question regarding Unreal and Nvidia. Personally, from the outside it very much seems like Nvidia and Epic don't have a lot of cohesion. If you take MS, they seem willing to take good ideas and implement them into the DX standard, which is great to see. When it comes to Unreal, however, it seems Epic and Nvidia have parallel and competing paths to get something done.

When a company has 90% market share you'd think there's natural harmony, but it seems not. With all the components Nvidia is putting together, how long before it naturally evolves into a competing product? Or are there enough individual components that other companies can make a basic underlying engine and stack these bits on top for a full engine?

I wouldn’t mind seeing it.
 
I wish Alan Wake 2 was actually worthy of all this support. It's mediocre in so many aspects. Can we get some upgrades for something like Elden Ring or Monster Hunter Wilds?
Agree with you about AW2. But ER and MH:W? ER has a very dated graphics engine to begin with, so it won't look good with its lighting. I'd rather have Dragon's Dogma 2 over MH:W since that's their latest engine, and in PT mode it looks great (minus the noise).
 
With all the components Nvidia is putting together, how long before it naturally evolves into a competing product? Or are there enough individual components that other companies can make a basic underlying engine and stack these bits on top for a full engine?

I wouldn’t mind seeing it.
That's a very interesting prophecy, and one that I could see happening. Imagine Nvidia just making their own graphics engine team to release an open-source graphics engine that companies can use to make their games. Indie developers would eat that up. That would shake the industry. Hmm...
 
Agree with you about AW2. But ER and MH:W? ER has a very dated graphics engine to begin with, so it won't look good with its lighting. I'd rather have Dragon's Dogma 2 over MH:W since that's their latest engine, and in PT mode it looks great (minus the noise).
I picked those specifically because of their last-gen tech. Those would be the biggest upgrades.
 
In the CES thread, @Dictator's comments to one of the questions were telling. People focus on hardware specs, but Nvidia is so far ahead on the software stack that it's crushing everyone. And it wasn't just his opinion. That seemed to be the industry consensus from all the people he and Ollie spoke with.
Nvidia being 5 years ahead in research is great, but gamers are rightfully only concerned with what products will actually deliver for them today and in the very immediate future. And most of the exciting future stuff Alex and Oli were talking about largely isn't going to be relevant for a good while, and not before even better products are available. So until then, yes, hardware specs still matter a ton.
 
That's a very interesting prophecy, and one that I could see happening. Imagine Nvidia just making their own graphics engine team to release an open-source graphics engine that companies can use to make their games. Indie developers would eat that up. That would shake the industry. Hmm...
Tools are some of the largest blockers to engine adoption.

That may change if games are going smaller in scope, but those open-world titles require a lot of tooling.

An important consideration for Nvidia if they want to release an engine for game development.
 
Tools are some of the largest blockers to engine adoption.

That may change if games are going smaller in scope, but those open-world titles require a lot of tooling.

An important consideration for Nvidia if they want to release an engine for game development.
Agreed, but they have the money to bring in developers to make tools for their team.

If they wanted to, they could easily reproduce the entire company at Epic with all of its employees.
 
Nvidia being 5 years ahead in research is great, but gamers are rightfully only concerned with what products will actually deliver for them today and in the very immediate future. And most of the exciting future stuff Alex and Oli were talking about largely isn't going to be relevant for a good while, and not before even better products are available. So until then, yes, hardware specs still matter a ton.
Are you saying there will be better products of the same tech available by other companies?
 
I picked those specifically because of their last-gen tech. Those would be the biggest upgrades.
I assumed the reason AW2 would be a good candidate is that it uses the latest hardware features. If they don't change the overall lighting in ER, it would be a wasted effort IMO.
 
I'm only part way through the DF Direct, and they brought up the issue of preserving the artist's/developer's intention with the introduction of AI into the pipeline. This certainly isn't the first time it's been brought up, but I do find this issue interesting from the perspective of whether it is actually for the better or worse.

Especially from a PC perspective, modding has always been rather popular and accepted. If anything, developers that intentionally lock their games out from modding (outside of MP integrity) are generally looked down upon. With that said, hypothetically, if we get to such a future, is it better or worse if there is this increased malleability on the client side in terms of the visual output of the game?
Adding AI to the pipeline in the form of RTX Faces and similar generative tech doesn't make games easier to mod. It just takes control away from the artists/developers and gives it to a black box pre-trained ML model. DLSS enhances the input provided by the developer-coded traditional pipeline. Neural Materials is trained on developer-provided ground truth. But RTX Faces and other generative tech replace the artist/developer's vision with its own generated visuals, or at least that's what it appears to do. Hopefully RTX Faces allows developers to fine-tune the output somehow.

We might see a neural rendering equivalent of ReShade/SpecialK that allows gamers to neurally mod their games, or neural filters that don't require the game to be modified. But that's a separate issue from generative techniques in the rendering pipeline taking control away from developers.
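
If the tooling ever exposed it, one simple way to hand some control back to artists would be a per-feature blend weight between the authored asset and the model's output. Purely hypothetical; this reflects no real RTX Faces API:

```python
def blend_face(authored: float, generated: float, artist_weight: float) -> float:
    """Lerp between an authored value and the generated one.
    artist_weight = 1.0 preserves the artist's intent exactly;
    0.0 hands the result entirely to the model."""
    artist_weight = max(0.0, min(1.0, artist_weight))
    return artist_weight * authored + (1.0 - artist_weight) * generated

print(blend_face(authored=0.8, generated=0.3, artist_weight=0.75))  # ≈ 0.675
```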
 
When it comes to unreal however, it seems epic and nvidia have parallel and competing paths to get something done.
I don't think the paths are competing, just sometimes the near-term priorities are different... and wouldn't it be obvious why that is?

Epic is making a game engine that scales across a large amount of hardware for many different uses. Obviously one would like to make it look as good as possible on all platforms, but it's hard to justify prioritizing techniques that only work on a sliver of the user base, especially if they impact the content pipeline (*at all*).

NVIDIA on the other hand is primarily interested in marketing their newest, high end GPUs. They are interested in game engines as far as they are a path to that end... they have no real interest in the non-graphical considerations of game engines, and relatively little interest in scaling to other platforms, or even down their own SKU stack. This isn't a criticism any more than I would broadly criticize Epic's priorities - both are a result of the economics and should not really be a surprise to anyone. Remember I've done about equal time now at both IHVs and ISVs in my career.

That said, I don't think the paths are really that different in the longer term. A couple months ago (some) people would have said "Epic is pushing for virtualized geometry with Nanite *instead* of raytracing and the two have opposing goals"... would you say that today? Or would you perhaps note that NVIDIA has had the same goal, but just needed some time to cook "Mega Geometry" (guys... you can't tell me no one has made any connections/nods with the very name of it...). Or perhaps note that Lumen hardware RT is now the preferred path and MegaLights (which fundamentally relies on raytracing) exists as an experimental feature, but also needed/needs time to cook?

Each company just has somewhat different near term priorities. There are always more good ideas than time and from a research perspective we really want different people exploring different avenues anyways as that's how we expand the state of the art more quickly. As various ideas prove out or don't (there's always a trail of graphical corpses for both Epic and NVIDIA too!) they gain industry traction. This is how it has always worked.
 