Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
I think part of the issue is that people on here are "experts"; the average gamer won't be able to say when RT is on or off, or quantifiably point out when/why RT is better in a picture.
And on here the discussion is about whether the reflection in the eyeballs of a character is correct or not. Something you only see if you stop and zoom in.
As for the above picture, I instinctively gravitate toward the OFF picture, but when I see that it is OFF and the other is ON and start to look closer, I think that RT is more correct, but at the same time I feel it is a bland picture. I want artistic lighting, as in my TV shows/movies, it seems.
I think that’s just decades of training speaking there.
I believe in artistic lighting for cutscenes, but gameplay should be realistic. You're going to get better and more varied results out of the emotions just due to the dynamism.
Give players back control of their games!
 
Alex from DF has said on Twitter that his video for A Plague Tale: Requiem should be done shortly.

Really interested to see what his recommended settings are as I'm more CPU limited than GPU limited.

Game looks stunning though, and I can't wait for the ray tracing upgrade. I'm hoping they go with RTGI over all the other options, as there really aren't enough reflective surfaces in the game to justify adding ray-traced reflections over everything else, in my opinion.

They've really gone to town with the use of parallax occlusion mapping (POM), and I can't think of another game that uses it anywhere near as much as this.
 

I think part of the issue is that people on here are "experts"; the average gamer won't be able to say when RT is on or off, or quantifiably point out when/why RT is better in a picture.
And on here the discussion is about whether the reflection in the eyeballs of a character is correct or not. Something you only see if you stop and zoom in.
As for the above picture, I instinctively gravitate toward the OFF picture, but when I see that it is OFF and the other is ON and start to look closer, I think that RT is more correct, but at the same time I feel it is a bland picture. I want artistic lighting, as in my TV shows/movies, it seems.
Both artistic lighting and RT try to recreate reality in most cases, and RT usually has the advantage there. This picture in particular reminds me of the countless hours I spent playing Oblivion and how bad the shading and lighting looked in the characters' nostrils (most games have that, but I spent more hours playing Oblivion).

I saved the picture about two years ago. Back when I played Oblivion intensively I knew a couple of details about RT, but I couldn't have cared less about it given the hardware at the time. But darn those fake nostrils typical of games...

It was one of my pet peeves in videogames. Seeing that corrected made me chuckle.
 
It's raster + ray tracing that matters now; it's a good thing your hardware is performant at both. Looking at a 3060/Ti vs a 6600 XT, for example (not far from each other in price brackets here at least), the 3060 Ti is just as capable in raw raster but also packs more RT punch (plus ML performance).
It seems the discussion revolves around raster vs RT, which isn't really what's going on for this generation at least. It's about what RT can offer on top of the raw raster capabilities of a GPU.
 
I think that’s just decades of training speaking there.
I believe in artistic lighting for cutscenes, but gameplay should be realistic.
I don't agree. I am playing a game; I want larger-than-life stuff, including lighting. And if you look at TV/movies, they do not strive for realistic lighting, I believe.

Also, I do not think artistic lighting strives to be realistic; it tries to convey a mood/feeling.
 
I think part of the issue is that people on here are "experts"; the average gamer won't be able to say when RT is on or off, or quantifiably point out when/why RT is better in a picture.
And on here the discussion is about whether the reflection in the eyeballs of a character is correct or not. Something you only see if you stop and zoom in.
As for the above picture, I instinctively gravitate toward the OFF picture, but when I see that it is OFF and the other is ON and start to look closer, I think that RT is more correct, but at the same time I feel it is a bland picture. I want artistic lighting, as in my TV shows/movies, it seems.
But you can have artistic lighting without it looking fake. In fact, precisely on TV, fake or incorrect lighting sticks out and ruins scenes completely, especially when it's post-production lighting. RT can help make artistic lighting look correct and ground every element in the scene a bit more in reality.
 
I think part of the issue is that people on here are "experts"; the average gamer won't be able to say when RT is on or off, or quantifiably point out when/why RT is better in a picture.
And on here the discussion is about whether the reflection in the eyeballs of a character is correct or not. Something you only see if you stop and zoom in.

You don't need to be able to point out what RT is doing in the scene for it to give you the overall impression of a more realistic/better-looking scene. No one expects the average gamer to be able to eyeball the specific differences between RT on and off, but they should at least be able to say "this one looks better/more realistic than that one", because your brain instinctively knows what realistic lighting and shadowing look like.

As for the above picture, I instinctively gravitate toward the OFF picture, but when I see that it is OFF and the other is ON and start to look closer, I think that RT is more correct, but at the same time I feel it is a bland picture. I want artistic lighting, as in my TV shows/movies, it seems.

That picture isn't helped by how washed out the face is in the RT scene - a good example of how you can't always just turn on RT without adjusting the lighting to work well with it. Nevertheless, the RT picture is clearly more realistic than the non-RT one. I could point out the shadows in the nose, on the eyes or under the hair, but I think that's unnecessary simply because the RT shot gives the overall impression of additional realism. This would be even more stark the more realistic the base model is (this one isn't exactly great, which makes the less accurate lighting less noticeable).

The fact is that if someone walked up to me in real life with their face lit up like the non-RT shot, I'd sh*t my pants.

I don't agree. I am playing a game; I want larger-than-life stuff, including lighting. And if you look at TV/movies, they do not strive for realistic lighting, I believe.

Also, I do not think artistic lighting strives to be realistic; it tries to convey a mood/feeling.

I think you're confusing dramatic lighting and realistic lighting. Live-action films always have realistic lighting, because they're set in the real world where light obeys the laws of physics, but that doesn't preclude them from also having dramatic lighting. I do agree with you that we also want that in games, and sometimes RT can indeed be less dramatic than traditional techniques, but IMO it generally looks better anyway just because it looks more like real life. In any case, the resolution to that is simply more developer effort rather than reverting to older, less accurate lighting models.
 
I don't agree. I am playing a game; I want larger-than-life stuff, including lighting. And if you look at TV/movies, they do not strive for realistic lighting, I believe.

Also, I do not think artistic lighting strives to be realistic; it tries to convey a mood/feeling.
You are correct: film, TV, and photography never strive for realistic lighting. That is all lighting from behind a lens. It's a subjective discussion, I'm sure, but if we can't cross the uncanny valley, world design, and therefore light design, will forever be entirely curated; we will never get lighting the way we interpret it with our own eyes. At some point things have to be as they are. You can still have fantastic lighting via normal means - it doesn't look as good as added lighting - but gameplay will be forever limited because developers are afraid they can't design levels or puzzles with light. They won't have destructible tunnels where light from outside can beam in. All of this stays forever scripted because they can't script the light around dynamic destruction or the dynamic addition of items.

Gameplay is so limited today because light in itself is a big factor. Removing curated light and letting light bounce and mix lets all sorts of lighting interactions combine properly, creating effects that curated experiences could never achieve.

There are pros and cons, but if I have full control over the game, I think letting light behave the way it's supposed to is going to serve a better purpose than constantly god-lighting avatars and such. The reason Driveclub got so much praise is that it threw as much of that away as possible.
 
Depending on the game realistic lighting is either good or not so good. Artistic lighting is either good or not so good. The best is when they are correctly mixed for the type of game you are playing.

In a tunnel (like a mineshaft, crawlspace, cave, etc.) without a LOT of light sources it is impossible to see anything. I'm talking real life here.

In a game that could be a benefit for a horror game but a drawback for a more action oriented game, especially if it also doubles as a competitive shooter. In a post apocalyptic or war setting that means you'll just never be able to see anything. Something like the first 2 Metro games would be borderline unplayable with realistic lighting. You wouldn't be able to see enemies until they'd already shot you. And that's even with a flashlight.

And that's not even getting into how the human visual system adjusts to varying degrees of light. If you wanted a "realistically" lit system where the user doesn't feel like something is wrong, you'd also want to make sure the gamer is blinded for potentially up to a minute when exiting a dark or dimly lit tunnel into bright sunshine. The same applies outside at night: if you are sitting next to a campfire, you won't be able to see diddly squat more than a few meters away in the darkness, and it'll take your eyes, again, potentially a minute or more to adjust and be able to see anything once that campfire is extinguished. Lots of experience here from when I was younger and had better night vision, out camping and hunting and spelunking. At my age it can take well over a minute for my eyes to readjust to changes in lighting at night. If you don't also have those systems in place, then many people will continue to think that lighting in games is unrealistic regardless of how "accurate" RT lighting could be.
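The adaptation behaviour described above is commonly simulated in engines as auto-exposure: an "adapted" luminance value chases the scene's measured luminance exponentially, with a much slower time constant when going dark than when going bright. A minimal sketch, with hypothetical function names and time constants chosen only to echo the "up to a minute" figure:

```python
import math

def adapt_luminance(adapted, scene, dt, tau_bright=1.0, tau_dark=30.0):
    """One step of exponential eye adaptation toward the scene luminance.

    Adapting to darkness (scene dimmer than the eye's current state) uses
    a much slower time constant than adapting to brightness, so stepping
    from a campfire into the dark leaves you blind for a while.
    """
    tau = tau_bright if scene > adapted else tau_dark
    return scene + (adapted - scene) * math.exp(-dt / tau)
```

A tonemapper would then divide the frame's luminance by the adapted value each frame; after a simulated minute in the dark the player still hasn't fully adjusted, matching the behaviour described above.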

And then, if there's any sort of character interaction, without artistic facial lighting in dark settings you won't be able to see much WRT facial animations, which are incredibly important in conveying the mood of a scene. This is one point where Metro: Exodus Enhanced Edition fails hard compared to the non-enhanced version. Sometimes you just can't see a character's face well enough during character interaction. Is it more realistic? Yes. Does it make for engaging gameplay or a cinematic experience? Nope.

So, if people want games to be enjoyable, full on realistic lighting such that their brain and visual system thinks it's actually real isn't what they want. You want artistic lighting that behaves realistically (current games have artistic lighting that doesn't behave realistically). For movies that means sometimes exotic and complex lighting setups with light barriers combined with incredibly bright light sources. For games, it's going to require artists to be able to place some key light sources that are invisible to the player, potentially in mid air, with limited light reach and/or limited bounces to accomplish similar things.
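As a rough illustration of that last idea, such an artist-placed light might carry exactly the constraints listed: invisible to the player, possibly floating in mid-air, with a hard reach limit and a bounce budget. A sketch with entirely hypothetical names, not any engine's actual API:

```python
from dataclasses import dataclass

@dataclass
class ArtistKeyLight:
    """Hypothetical artist-placed fill light: invisible to the player,
    possibly mid-air, with limited reach and a bounce budget so it can
    shape a face without leaking into the rest of the RT-lit scene."""
    position: tuple            # world-space; may be in mid-air
    intensity: float
    max_distance: float        # contributes nothing beyond this radius
    max_bounces: int = 0       # 0 = direct lighting only
    visible_in_reflections: bool = False

    def contribution(self, distance):
        """Inverse-square falloff, clamped to zero past max_distance."""
        if distance >= self.max_distance:
            return 0.0
        return self.intensity / (1.0 + distance * distance)
```

The hard distance clamp is the artistic-control part: physically, light never cuts off abruptly, but limiting reach keeps the key light from contaminating the realistic lighting elsewhere in the scene.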

Regards,
SB
 
You are correct: film, TV, and photography never strive for realistic lighting.

I think there are differences in how "realistic lighting" is being defined here. I'm defining it in terms of how the light behaves, regardless of what the source of the light is: how it creates shadows, how it bounces and reflects off surfaces, how it takes on the colour of the surfaces it interacts with, etc. RT can do all of those things accurately/realistically, whereas traditional lighting methods can struggle with that.

Even though films add additional off-camera light sources to add drama to the scene, the light in the scene still behaves realistically (because it's real!). We can certainly mimic that with RT-based lighting systems, though.
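The "takes on the colour of surfaces" behaviour can be sketched in a couple of lines: in a path tracer, the throughput carried along a ray is multiplied by each surface's albedo, so white light bounced off a red wall arrives tinted red. A minimal illustration (not any renderer's actual code):

```python
def bounce_throughput(light_color, albedos):
    """Colour reaching the eye after the light reflects off each surface
    in turn, picking up that surface's tint (RGB albedo) at every bounce."""
    r, g, b = light_color
    for ar, ag, ab in albedos:
        r, g, b = r * ar, g * ag, b * ab
    return (r, g, b)
```

Traditional rasterised lighting has to fake this colour bleeding (e.g. with baked probes), whereas RT gets it by construction from exactly this multiplication.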
 

Great to see something a little off the beaten path for DF, albeit this is their second Mac video. Some interesting results, especially that Metal FX's 'quality' setting is actually working from 1080p, so it's starting from the same native base as DLSS/FSR performance (!). The 1080p Metal Quality results are particularly impressive, considering it's working from a 540p base res.

On the underwhelming side though, the M1 Ultra can't maintain 60fps native maxed at 4K with this, which is surprising considering its grunt; we are talking about a GPU with 800GB/sec of bandwidth here. It seems to be underperforming a 3060 Ti. The stutter struggle is not just for PC, apparently. 😟 Edit: Whoops, missed that he was testing low-power mode. Still, I would have expected better for the M1 Ultra.

The ghosting can also be prominent - maybe there's a lack of motion vectors in the game, but why wouldn't this also affect TAA if that was the case?

I would have also liked to see more M1 Max/Ultra comparisons in the same scenes as I'm interested to see how the M1 Ultra scales in games, considering it's arguably more of a 'real' chiplet design with two core shader chiplets stuck together.

Overall though, good to see a modern reconstruction method actually employed in a Resident Evil game for a change. Metal FX Quality, especially considering its performance-mode-equivalent starting res vs DLSS/FSR, is very impressive in that it looks better than 4K native in many scenes, and it's doing this while also inheriting the PC version's broken TAA, which it still has to work from.

Man, the PC really needs to get on this reconstruction tech bandwagon someday! 😉
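A quick sanity check of the internal resolutions implied above, assuming the 2x-per-axis scale factor described (helper name hypothetical):

```python
def base_resolution(out_w, out_h, scale_per_axis=2):
    """Internal render resolution implied by a per-axis upscale factor:
    2x per axis means one quarter of the output pixels are rendered."""
    return out_w // scale_per_axis, out_h // scale_per_axis
```

So a 4K output in this mode starts from 1080p (the same base as DLSS/FSR performance mode), and a 1080p output starts from 540p, as noted above.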
 

Great to see something a little off the beaten path for DF, albeit this is their second Mac video. Some interesting results, especially that Metal FX's 'quality' setting is actually working from 1080p, so it's starting from the same native base as DLSS/FSR performance (!). The 1080p Metal Quality results are particularly impressive, considering it's working from a 540p base res.

On the underwhelming side though, the M1 Ultra can't maintain 60fps native maxed at 4K with this, which is surprising considering its grunt; we are talking about a GPU with 800GB/sec of bandwidth here. It seems to be underperforming a 3060 Ti. The stutter struggle is not just for PC, apparently. 😟 Edit: Whoops, missed that he was testing low-power mode. Still, I would have expected better for the M1 Ultra.

The ghosting can also be prominent - maybe there's a lack of motion vectors in the game, but why wouldn't this also affect TAA if that was the case?

I would have also liked to see more M1 Max/Ultra comparisons in the same scenes as I'm interested to see how the M1 Ultra scales in games, considering it's arguably more of a 'real' chiplet design with two core shader chiplets stuck together.

Overall though, good to see a modern reconstruction method actually employed in a Resident Evil game for a change. Metal FX Quality, especially considering its performance-mode-equivalent starting res vs DLSS/FSR, is very impressive in that it looks better than 4K native in many scenes, and it's doing this while also inheriting the PC version's broken TAA, which it still has to work from.

Man, the PC really needs to get on this reconstruction tech bandwagon someday! 😉

I didn't watch the video too closely as I was a passenger in a car at the time, but are you sure the performance comparison at the end was in low-power mode? I know that was set earlier on to compare the performance uplift from upscaling, but I didn't think it was used when the later comparison to the 3080M was done.

Also, on the reconstruction, I could be mistaken, but didn't he say that quality mode was giving a performance uplift more in line with DLSS/FSR 2.0 quality modes? If so, it seems it's doing a lot more work under the hood to reconstruct a quality image, which is eating up frame time. It's actually quite interesting that, when starting from the same input res (1/4), it can produce such different end results depending on the level of work performed between the performance and quality modes.
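That narrowing uplift follows from a back-of-envelope frame-time model (the numbers below are made up for illustration): total frame time is the render time at the base resolution plus the reconstruction pass's fixed cost, so a heavier upscaler eats into the gain even at the same input resolution.

```python
def fps_with_upscaler(base_render_ms, upscale_ms):
    """FPS when rendering at the base resolution, then paying a fixed
    per-frame cost for the reconstruction pass."""
    return 1000.0 / (base_render_ms + upscale_ms)
```

With the same 1/4-res input, a cheap performance-style pass keeps most of the raw speedup, while an expensive quality-style pass trades some of it back for image quality.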
 
I didn't watch the video too closely as I was a passenger in a car at the time, but are you sure the performance comparison at the end was in low-power mode? I know that was set earlier on to compare the performance uplift from upscaling, but I didn't think it was used when the later comparison to the 3080M was done.

I was kinda live-blogging it through edits as I watched. My initial comment that I struck out was based on skipping forward to see performance but hitting those low power numbers first and not realizing the restriction at the time. But as I added later, the performance for an M1 Ultra is still not great even at full power. I certainly would have expected it to hold 4K 60 without RT considering the bulk of GPU power there, so you're right: really not far off from my initial impressions based on the reduced power, regardless. I'm also wondering about the power draw from a 3080M laptop vs the Studio running it.

This video was more of a look at Metal FX upscaling overall, but as mentioned, I really would have liked to see more Max/Ultra performance comparisons. Very curious to see how that M1 Ultra's 2x GPU scales with a native API.

Jesus though, this video also just further highlights that Capcom really has some QA issues, or rather just doesn't give a shit. There is no way a game, especially one with a far more restricted GPU target than the PC market, should ship with a complete lack of a shader pre-compile stage, or without any asynchronous method to compile shaders during gameplay. Naming Metal FX's performance mode as such when it's doing a completely different form of upscaling is stupid on Apple's part, sure, but I also doubt it's actually supposed to look like that; I think Capcom is turning off TAA entirely when it's chosen.
 
but my initial comment that I struck out was based on skipping forward to see performance but hitting those low power numbers first and not realizing the restriction at the time
The performance comparisons vs the PC are done using high power; low-power mode was only used to simulate slower M1 GPUs to measure the performance scaling of the MetalFX upscaler.

I'm also wondering about the power draw from a 3080M laptop vs the Studio running it.
The 3080M in the laptop is the 140 W variety. The highest available 3080M is rated at 150 W.
 
The 3080M in the laptop is the 140 W variety. The highest available 3080M is rated at 150 W.

I can look up the max TDP; I was more interested in what the Studio was actually pulling during the same test. The Mac Studio has a max TDP of 215 W, but I doubt Village would stress every part of the M1 outside of the GPU cores, just as it won't for the PC laptop.
 
Unfortunately, as far as ray tracing is concerned, console gamers will always weigh performance against the ray tracing improvement.

If ray tracing doesn't meaningfully transform a scene, they'll take the hit to lighting accuracy for higher performance.

By the time the PS6 comes out this will most likely no longer be an issue. But with the limitations of RDNA2 on console, it's just a factor in how people look at these things for now.

AMD is increasing their RT performance all the time though, so that's good.
 
Unfortunately, as far as ray tracing is concerned, console gamers will always weigh performance against the ray tracing improvement.

If ray tracing doesn't meaningfully transform a scene, they'll take the hit to lighting accuracy for higher performance.

By the time the PS6 comes out this will most likely no longer be an issue. But with the limitations of RDNA2 on console, it's just a factor in how people look at these things for now.

AMD is increasing their RT performance all the time though, so that's good.
But would they? Console gamers have played 30fps games for nearly everything since the Xbox 360.
 
But would they? Console gamers have played 30fps games for nearly everything since the Xbox 360.
I mean, you're partially right (depending on the genre), but it's more complicated than that.

I'm a console pleb; we have been playing 30fps (or below) games regularly since the 5th gen and the true advent of 3D graphics. But on the other hand, people have become finicky about this stuff.

Now that 60fps-and-above modes have become popular, with cross-gen overhead easily allowing for them, a large cross-section of people are unwilling to go back.

I unfortunately predicted this before the gen started, which is why I've been against PC-like options in console games to control these things, and why I'd hoped devs wouldn't jump so easily on the 60fps train across the board just because of the increase in CPU power. But because they have, the expectation has been set and the dreaded fps snob has been reborn on console, which is insufferable.

Devs will now be unfairly castigated by certain people for going to 30fps to get more out of these machines; it's already been happening.

They will always expect devs to start development at 60fps on console and build effects on top for a 30fps mode, as opposed to building a core 30fps experience that truly pushes the hardware and couldn't be taken to 60.
 