Digital Foundry Article Technical Discussion [2023]

Probably the best example I've seen to date, thanks! I've played a half dozen hours or so and have to admit the difference has not been super obvious to me in a lot of cases. When I see some nice lighting and toggle back to "ultra RT" (whatever the non PT one is called) it often looks very much the same, which is a testament to the original mode I think. I'll have to keep an eye out for this location and similar ones to check.

Shadows from all light sources in the scene also help to reduce light leaking, thanks to RTXDI.
https://ibb.co/xfPbqsM
https://ibb.co/kH66pW6
These are the cases that most people show, and while it's definitely a nice improvement, it's also a bit frustrating: for the budget they're spending on the overdrive mode, you could absolutely add RT shadows to most/all of those other light sources too. From a user perspective, though, it's probably the most consistently obvious visual improvement I see.
 
One additional thing to think about WRT Starfield on PC is that users will be able to tweak settings that are unchangeable on console. That can greatly increase or decrease CPU or GPU load.

Not everyone needs or wants to run every setting at max.
Many of the graphics settings are accessible via Creation Kit's modding interface even on consoles. If you look at graphics mods for Skyrim and Fallout 4, you'll find a 60fps mod for the Xbox One build of Fallout 4 and a bunch of mods that tweak settings for things like draw distance, level of vegetation and so on.

I have a 4090 winging its way to me, so I expect 4K/60fps will be my performance floor, but I'm hopeful that 4K/120fps is achievable.
 
Probably the best example I've seen to date, thanks! I've played a half dozen hours or so and have to admit the difference has not been super obvious to me in a lot of cases. When I see some nice lighting and toggle back to "ultra RT" (whatever the non PT one is called) it often looks very much the same, which is a testament to the original mode I think. I'll have to keep an eye out for this location and similar ones to check.


These are the cases that most people show, and while it's definitely a nice improvement, it's also a bit frustrating: for the budget they're spending on the overdrive mode, you could absolutely add RT shadows to most/all of those other light sources too. From a user perspective, though, it's probably the most consistently obvious visual improvement I see.
A lot of gamers don't even know in which direction light should scatter. The first image proves that.
 
Probably the best example I've seen to date, thanks! I've played a half dozen hours or so and have to admit the difference has not been super obvious to me in a lot of cases.
Thanks. The difference is not obvious for larger scale scenes, for which the sparse probes really do their best by capturing the diffuse lighting and general look (and where all the small shadows and correct lighting aren't that visible). But, as with high-quality assets, resolution does matter. Once you go from the macro level to the details, this is where the per-pixel lighting starts to shine as probes can no longer capture the correct look for a huge variety of possible lighting conditions and their resolution would no longer be sufficient for diffuse shadows and other details that make lighting realistic. The same principle applies to high-poly assets or textures. You don't need 200K poly meshes for general large-scale views to look good. Similarly, you don't need 4K textures everywhere, as most of the screen will maintain a 1-to-1 texel to pixel density with lower resolution textures, unless you come very close to objects.
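
To make the resolution point concrete, here's a rough C++ sketch of how a shading point might fetch indirect light from a sparse probe grid. Everything below (grid size, spacing, names) is made up for illustration and isn't taken from the game's renderer; the point is just that the trilinear blend averages away anything smaller than the probe spacing.

```cpp
#include <array>

struct Irradiance { float r, g, b; };

struct ProbeGrid {
    static constexpr int N = 32;   // probes per axis (assumed)
    float spacing = 2.0f;          // metres between probes (assumed)
    std::array<Irradiance, N * N * N> probes{};

    const Irradiance& at(int x, int y, int z) const {
        return probes[(z * N + y) * N + x];
    }

    // Trilinearly interpolate stored irradiance at a world position.
    // Anything smaller than `spacing` (a thin pipe's shadow, a small alcove)
    // gets blended away by this lookup.
    Irradiance sample(float wx, float wy, float wz) const {
        auto clampi = [](int v, int lo, int hi) { return v < lo ? lo : (v > hi ? hi : v); };

        float px = wx / spacing, py = wy / spacing, pz = wz / spacing;
        int x0 = clampi((int)px, 0, N - 2);
        int y0 = clampi((int)py, 0, N - 2);
        int z0 = clampi((int)pz, 0, N - 2);
        float fx = px - x0, fy = py - y0, fz = pz - z0;

        auto lerp = [](const Irradiance& a, const Irradiance& b, float t) {
            return Irradiance{ a.r + (b.r - a.r) * t,
                               a.g + (b.g - a.g) * t,
                               a.b + (b.b - a.b) * t };
        };

        Irradiance c00 = lerp(at(x0, y0,     z0),     at(x0 + 1, y0,     z0),     fx);
        Irradiance c10 = lerp(at(x0, y0 + 1, z0),     at(x0 + 1, y0 + 1, z0),     fx);
        Irradiance c01 = lerp(at(x0, y0,     z0 + 1), at(x0 + 1, y0,     z0 + 1), fx);
        Irradiance c11 = lerp(at(x0, y0 + 1, z0 + 1), at(x0 + 1, y0 + 1, z0 + 1), fx);
        return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz);
    }
};
```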

a bit frustrating: for the budget they're spending on the overdrive mode, you could absolutely add RT shadows to most/all of those other light sources too
You can, but I guess the advantage of the overdrive mode is less noise and better quality. I'm not sure whether non-ReSTIR RT shadows can handle that many light sources well from an image-quality point of view, as that was the main idea behind ReSTIR: improving the signal-to-noise ratio.
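
For anyone wondering what "improving the signal-to-noise ratio" means in practice, here's a minimal C++ sketch of the weighted reservoir sampling idea at the heart of ReSTIR: stream over many candidate lights, keep one with probability proportional to its estimated contribution, and only shadow-trace the survivor. All names and structures below are illustrative, not taken from any shipping implementation.

```cpp
#include <random>
#include <vector>

struct Light {
    float intensity;   // simplified: a single scalar stands in for the light's contribution
};

struct Reservoir {
    int   lightIndex = -1;  // currently selected light
    float wSum       = 0.f; // running sum of candidate weights
    int   m          = 0;   // number of candidates seen
};

// Stream over candidate lights and keep exactly one, with probability
// proportional to its (unshadowed) contribution to the shading point.
Reservoir sampleLights(const std::vector<Light>& lights,
                       int candidates, std::mt19937& rng)
{
    std::uniform_int_distribution<int>    pick(0, (int)lights.size() - 1);
    std::uniform_real_distribution<float> u01(0.f, 1.f);

    Reservoir r;
    for (int i = 0; i < candidates; ++i) {
        int   idx = pick(rng);
        float w   = lights[idx].intensity;  // proxy for the light's contribution here
        r.wSum += w;
        r.m    += 1;
        if (u01(rng) * r.wSum < w)          // replace with probability w / wSum
            r.lightIndex = idx;
    }
    // The real technique then traces a single shadow ray for the survivor and
    // reuses reservoirs across neighbouring pixels and previous frames, which
    // is where most of the noise reduction comes from.
    return r;
}
```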

A lot of gamers don't even know in which direction light should scatter. The first image proves that.
Can you elaborate on this, please?
 
Can you elaborate on this, please?
In this picture:

Cyberpunk-2077-Screenshot-2023-06-16-21-34-56-04.jpg


You can't tell where the light is coming from in the alley, so many people will not notice that it's wrong. That's the limit of GI light probes, as they are computed over all directions (i.e. the full sphere).

In this picture:

Cyberpunk-2077-Screenshot-2023-06-16-21-35-27-93.jpg


It is clear that the sun is coming from the right-hand side of the alley, so the light scatters onto the left wall (take note of the shadows that the pipes are casting on it), while the right wall is completely blocked from the sun since it sits on the same side.
 
In this picture:

Cyberpunk-2077-Screenshot-2023-06-16-21-34-56-04.jpg


You can't tell where the light is coming from in the alley, so many people will not notice that it's wrong. That's the limit of GI light probes, as they are computed over all directions (i.e. the full sphere).

In this picture:

Cyberpunk-2077-Screenshot-2023-06-16-21-35-27-93.jpg


It is clear that the sun is coming from the right-hand side of the alley, so the light scatters onto the left wall (take note of the shadows that the pipes are casting on it), while the right wall is completely blocked from the sun since it sits on the same side.

But if you look at the shadows on the building in the background, the light is obviously coming from a light source (the sun) that is above and slightly to the left, as the shadows fall down and to the right. The shadows on that background building show the sun in the same above-and-slightly-to-the-left position in both the RT and non-RT shots.

So both shots show an inconsistency between the outdoor lighting direction and the lighting direction in that shaded alley. One explanation that would make it consistent is a strong light bounce from somewhere off-screen to the right, like a white wall or something.

Regards,
SB
 
But if you look at the shadows on the building in the background, the light is obviously coming from a light source (the sun) that is above and slightly to the left, as the shadows fall down and to the right. The shadows on that background building show the sun in the same above-and-slightly-to-the-left position in both the RT and non-RT shots.
Direct lighting is always correct, even when GI light probes are used to add indirect light to the material. Where GI light probes break down is when the object is in shadow; that's when you can't tell where the light is coming from. With PT you can, because the bounce-light directions remain accurate while the object is in shadow.
 
So both shots show an inconsistency between the outdoor lighting direction and the lighting direction in that shaded alley. One explanation that would make it consistent is a strong light bounce from somewhere off-screen to the right, like a white wall or something.
For GI, you sample a random direction in a hemisphere centered around the surface normal. The original light source's location doesn't matter much (GI rays bounce twice in random directions in this case); what's crucial is which part of the scene is brighter. As we can see, the outdoor part of the scene is brighter, so when rays travel from the darker part near the camera into the brightly lit one, they pick up that brighter lighting at their hit points, which adds light back to the darker part of the scene. If a ray hits a pipe or something along the way (in the darker part of the scene), there will be a shadow.
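
Here's roughly what that hemisphere sampling looks like in code, as a toy cosine-weighted C++ sketch (all names are mine, not from any real renderer). Note that the ray direction depends only on the surface normal and two random numbers, so whichever bright region the rays happen to reach is what lights the shadowed wall, regardless of where the sun actually is.

```cpp
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Cosine-weighted sample on the hemisphere around the surface normal `n`:
// the direction depends only on the normal and two random numbers, not on
// where any light source actually is.
Vec3 sampleHemisphere(const Vec3& n, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.f, 1.f);
    float u1 = u01(rng), u2 = u01(rng);

    // Sample in a local frame where z is the normal.
    float r   = std::sqrt(u1);
    float phi = 2.f * 3.14159265f * u2;
    Vec3  local { r * std::cos(phi), r * std::sin(phi), std::sqrt(1.f - u1) };

    // Build an orthonormal basis around the normal and rotate the sample into it.
    Vec3 up = (std::fabs(n.z) < 0.999f) ? Vec3{0.f, 0.f, 1.f} : Vec3{1.f, 0.f, 0.f};
    Vec3 t  = normalize(cross(up, n));
    Vec3 b  = cross(n, t);

    return normalize(Vec3{
        t.x * local.x + b.x * local.y + n.x * local.z,
        t.y * local.x + b.y * local.y + n.y * local.z,
        t.z * local.x + b.z * local.y + n.z * local.z });
}
```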
 
This week, John discusses the reaction to the Final Fantasy 16 reviews, the focus on the 720p-1080p performance mode and whether the game was actually targeting 30fps all along. Meanwhile, the team are impressed by the latest Nintendo Direct, Xbox raises prices, while the lack of DLSS in AMD sponsored PC titles once again comes to the fore. Also: to celebrate our 117th edition, the team share their favourite Halo memories.
 
For GI, you sample a random direction in a hemisphere centered around the surface normal. The original light source's location doesn't matter much (GI rays bounce twice in random directions in this case); what's crucial is which part of the scene is brighter. As we can see, the outdoor part of the scene is brighter, so when rays travel from the darker part near the camera into the brightly lit one, they pick up that brighter lighting at their hit points, which adds light back to the darker part of the scene. If a ray hits a pipe or something along the way (in the darker part of the scene), there will be a shadow.

Yes, I was pointing out that the "position of the sun" in his post was incorrect, as the sun is obviously above and to the left. "Position of the brightest light source affecting an object" would be better, or something that references the actual source of the light hitting the object.

Regards,
SB
 
Talking about the fight between probes and per-pixel RT, it seems the future of real-time GI has split in two. One path is super-high-quality per-pixel RT (compared to other GI methods), but it comes at a super high cost even with the help of ReSTIR and DLSS. Then we have the screen-space probe solutions, led by Lumen in Unreal Engine 5. It's more practical, I'd say, while still providing way better GI detail than conventional world-space probes. AMD seems to be following this path and has released its own solution, GI-1.0, which looks very inspired by Lumen, though I haven't seen any games use it. Ambient occlusion will probably be an outdated term in the future ;)
 
Regarding Q3, I'm not sure what the questioner means by "automatic", but I believe this is possible on the software side.
Something vaguely similar was accomplished in Ghost Recon Advanced Warfighter on Xbox 360. Their goal had nothing to do with VRR, obviously, but it achieved the same thing; it was sort of like dynamic framerate scaling, as opposed to dynamic resolution scaling. DF made a video about it. It wasn't fine-grained like the approach discussed in this DF weekly, it only flipped between two states, but with direct intent I'm sure a developer could implement a system like what they're discussing.
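
Purely as an illustration of that two-state flip (this is not how GRAW actually implemented it, which I don't know), a sketch of such a system could just watch recent frame times and move the frame cap between 60 and 30 with a bit of hysteresis:

```cpp
#include <deque>
#include <numeric>

class FrameRateGovernor {
public:
    // Returns the frame cap (in fps) to use for the next frame.
    int update(double lastFrameMs) {
        history_.push_back(lastFrameMs);
        if (history_.size() > 30) history_.pop_front();

        double avg = std::accumulate(history_.begin(), history_.end(), 0.0)
                   / history_.size();

        // Drop to 30 if we can't hold 16.6 ms; only climb back to 60 once
        // there is clear headroom, so the cap doesn't flip every frame.
        if (cap_ == 60 && avg > 16.6)      cap_ = 30;
        else if (cap_ == 30 && avg < 13.0) cap_ = 60;
        return cap_;
    }

private:
    std::deque<double> history_;  // recent frame times in milliseconds
    int cap_ = 60;                // current frame cap
};
```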
 
has not been super obvious to me in a lot of cases
Talking about the fight between probes and per-pixel RT, it seems the future of real-time GI has split in two. One path is super-high-quality per-pixel RT (compared to other GI methods), but it comes at a super high cost even with the help of ReSTIR and DLSS. Then we have the screen-space probe solutions, led by Lumen in Unreal Engine 5. It's more practical, I'd say, while still providing way better GI detail than conventional world-space probes. AMD seems to be following this path and has released its own solution, GI-1.0, which looks very inspired by Lumen, though I haven't seen any games use it. Ambient occlusion will probably be an outdated term in the future ;)

Lumen uses both world space and screen space caches. It even has multiple world space caches (probe grids and surface cards).
 
Lumen uses both world space and screen space caches. It even has multiple world space caches (probe grids and surface cards).
That's right, but the world-space probes mostly exist as a fallback for far rays and volumetrics. I also believe that in the Matrix demo they more or less swap to an HLOD solution for far-field RT. Still, the most important part is the screen-space probes.

Also, I wouldn't call the surface cache a world-space cache. It's there to provide radiance, and you could replace it with other techniques without changing the whole workflow (e.g. the good old lighting evaluation at the RT hit point). I'd even say it's more object-space because of the way Lumen accesses it.
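
Just to illustrate the fallback order we're describing, here's a rough C++ sketch based on my reading of the public Lumen presentations rather than the engine source; every type and function name below is made up.

```cpp
// Every type and function below is hypothetical; placeholder values are
// returned just so the sketch compiles.
struct Radiance { float r, g, b; };

enum class HitKind { ScreenTraceHit, NearFieldHit, FarFieldHit, Miss };

struct Hit {
    HitKind kind;
    float   distance;
};

Radiance lookupSurfaceCache(const Hit&) { return {0.5f, 0.5f, 0.5f}; } // radiance cached on mesh "cards"
Radiance lookupWorldProbes(const Hit&)  { return {0.2f, 0.2f, 0.2f}; } // sparse world-space probe grid
Radiance sampleSkylight()               { return {0.4f, 0.6f, 0.9f}; } // sky fallback on a miss

// Rough fallback order for shading a GI ray: nearby hits read the surface
// cache, distant hits and volumetrics fall back to world-space probes.
Radiance shadeGIRay(const Hit& hit)
{
    switch (hit.kind) {
    case HitKind::ScreenTraceHit:
    case HitKind::NearFieldHit:
        return lookupSurfaceCache(hit);
    case HitKind::FarFieldHit:
        return lookupWorldProbes(hit);
    case HitKind::Miss:
    default:
        return sampleSkylight();
    }
}
```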
 
But if you look at the shadows on the building in the background, the light is obviously coming from a light source (the sun) that is above and slightly to the left, as the shadows fall down and to the right. The shadows on that background building show the sun in the same above-and-slightly-to-the-left position in both the RT and non-RT shots.

So both shots show an inconsistency between the outdoor lighting direction and the lighting direction in that shaded alley. One explanation that would make it consistent is a strong light bounce from somewhere off-screen to the right, like a white wall or something.

Regards,
SB

You can't readily tell if it's right or wrong; there is not enough information to know. We don't know what's beyond the alley to the right. It may be a wall or open space that's being hit with a lot of direct sunlight, which then bounces and lights the left side of the alley, while a wall on the left gets no direct sunlight, so there is less indirect light to hit the alley wall on the right.

Discerning correct from incorrect lighting is not something your brain can easily do, especially when there are a number of light sources and plenty of materials for the light to bounce off.
 