Digital Foundry Article Technical Discussion [2022]

I mean, you're partially right (depending on the genre), but it's more complicated than that.

I'm a console pleb; we have been playing 30 fps (or below) games regularly since the 5th gen and the true advent of 3D graphics. But on the other hand, people have become finicky about this stuff.

Now that 60 fps and above modes have become popular, since cross-gen overhead easily allows for them, a large cross-section of people are unwilling to go back.

Which I unfortunately predicted before this gen started, which is why I have been against PC-like options in console games to control these things, and had hoped devs would not jump on the 60 fps train across the board so easily just because of the increase in CPU power. But because they have, the expectation has been set and the dreaded fps snob has been reborn on console, which is insufferable.

Devs will now be unfairly castigated by certain people for going to 30 fps to get more out of these machines; it's already been happening.

They will always expect devs to start development at 60 fps on console and build effects on top for a 30 fps mode, as opposed to building a core 30 fps experience that truly pushes the hardware and could not be taken to 60.

Unreal Engine's solution is great: the 30 fps mode uses hardware ray tracing and the 60 fps mode uses software Lumen. I suppose it will be the solution for tons of engines. For example Snowdrop: they were supposed to release games on Stadia, so they have a hardware-RT and a software solution too; same for CryEngine, Frostbite and so on.

It means no sacrifice of dynamic GI for 60 fps, just no HW-RT on consoles.
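
For what it's worth, that per-mode switch can be close to a one-liner. A minimal sketch in UE5-style C++ (the console variable is real UE5; the ApplyGIMode wrapper is my hypothetical naming):

    #include "HAL/IConsoleManager.h"

    // Hypothetical helper: flip Lumen between hardware and software tracing
    // depending on which performance mode the player picked.
    static void ApplyGIMode(bool bQualityMode)
    {
        IConsoleVariable* HwRT = IConsoleManager::Get()
            .FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing"));
        if (HwRT)
        {
            // 1 = hardware ray tracing for the 30 fps quality mode,
            // 0 = software tracing against distance fields for the 60 fps mode.
            HwRT->Set(bQualityMode ? 1 : 0);
        }
    }

Either way dynamic GI stays on; only the tracing backend changes.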
 
I think there are differences in how "realistic lighting" is being defined here. I'm defining it in terms of how the light behaves regardless of what the source of the light is, i.e. how it creates shadows, how it bounces and reflects off surfaces, how it takes on the colour of the surfaces it interacts with, and so on. RT can do all of those things accurately/realistically, whereas traditional lighting methods can struggle with that.

Even though films add additional off-camera light sources to add drama to the scene, the light in the scene still behaves realistically (because it's real!). We can certainly mimic that with RT-based lighting systems, though.
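
To make the "takes on the colour of the surfaces" point concrete, here's a toy standalone C++ sketch (the albedo values are made up) of the multiply every ray/path tracer effectively does at a diffuse bounce:

    #include <cstdio>

    struct Vec3 { float r, g, b; };

    // Component-wise multiply: the light energy a path carries gets tinted
    // by each surface it bounces off. This is where colour bleeding comes from.
    Vec3 Mul(Vec3 a, Vec3 b) { return { a.r * b.r, a.g * b.g, a.b * b.b }; }

    int main() {
        Vec3 throughput = { 1.f, 1.f, 1.f };      // white light leaves the source
        Vec3 redWall    = { 0.9f, 0.2f, 0.2f };   // made-up albedos
        Vec3 whiteFloor = { 0.8f, 0.8f, 0.8f };
        throughput = Mul(throughput, redWall);    // first bounce: tinted red
        throughput = Mul(throughput, whiteFloor); // second bounce keeps the tint
        std::printf("%.2f %.2f %.2f\n", throughput.r, throughput.g, throughput.b);
        return 0;
    }

Baked or probe-based lighting can approximate this; RT does it per ray against the live scene.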

Yes, our definitions are different. I don't care if the lighting follows the rules of physics as long as the end product looks how it should look. I.e. when I walk around every day, what I see is realistic, aka boring :)
Which means, again, that all the reflectors, extra back light, colouring etc. in TV and movies are not realistic according to my definition.
So, continuing with my definition, artistic lighting is when you add, block, or change how things look compared to "boring" light.

So when I talk about lighting in games, I want the same outcome as we have in TV/movies, not the same as when I stroll down to my local store. Whether that is done by RT or by sacrificing owls to the gods of light, I don't really care :)

There was another comment about the picture I quoted: that the RT part was washed out and that we need to tune the settings to get the best out of it on screen. But how often do average people do that, or even know that they have to?
So again, I would think that at this point in time most average people don't see that big a jump in wow factor with RT.

Is it called confirmation bias? You know the RT lighting is correct, you look for it, you find it, and then yes, it's great!!
 
I mean, you're partially right (depending on the genre), but it's more complicated than that.

I'm a console pleb; we have been playing 30 fps (or below) games regularly since the 5th gen and the true advent of 3D graphics. But on the other hand, people have become finicky about this stuff.

Now that 60 fps and above modes have become popular, since cross-gen overhead easily allows for them, a large cross-section of people are unwilling to go back.

Not sure I agree on this; people on here and probably some other forums talk about it. But if you look at the population that plays games, I sincerely doubt most of them are even aware of 30 vs 60 fps. They might feel it when they play, but most probably cannot pinpoint why one game feels better than another, or whether 30 vs 60 fps is the reason.
 
Unfortunately, as far as ray tracing is concerned, console gamers will always weigh performance against the ray-tracing improvement.

If ray tracing doesn't meaningfully transform a scene, they will take the hit to lighting accuracy for higher performance.

By the time the PS6 comes out, this will most likely no longer be an issue. But with the limitations of RDNA2 in the consoles, it's just a factor in how people look at these things for now.

AMD is increasing their RT performance all the time though, so that's good.

But they wouldn't... there are varying classes of console gamer who together make up the console market.

And the largest of these groups is the casual gamers and children who just flat out don't give a shit about frame rate or how a game looks on their console.

They simply play the game at whatever default resolution/frame rate it starts with, including with ray tracing on by default.

The die-hard tech heads likely also have a gaming PC where they can get 60 fps.
 
Well, I am not impressed by FSR 2.1. Even according to DF there are still some cases where patch 1.6 looks better than 1.61, and in other areas it's clearly no better, with the AA still failing to do any convincing job.
 
Well, I am not impressed by FSR 2.1. Even according to DF there are still some cases where patch 1.6 looks better than 1.61, and in other areas it's clearly no better, with the AA still failing to do any convincing job.

In the vast majority of scenes it's appreciably better. There is significantly less pixel popping in many scenes; perhaps not quite in the 'impressive' category (I'd still say Forbidden West's TAA/checkerboarding improvements in later patches were a bigger uplift), but it's an improvement.

CP2077's TAA solution is truly awful though; it looks like SMAA at points, so really nowhere to go but up, I guess. Also bear in mind that most reconstruction comparison videos on PC are done with 4K as the output resolution, whereas with CP2077 on the consoles the final resolution is much lower. As such, it may not be the ideal game for isolating FSR 2.1's improvements over existing TAA/checkerboard solutions. Scorn is one game that outputs at a native 4K using FSR, and it looks very good, albeit we really need a game that shipped with a TAAU/checkerboard solution and then got an FSR patch to compare properly.
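
To put rough numbers on the resolution point (the per-axis factors are AMD's published FSR 2 ones; the console output figure is purely illustrative):

    #include <cstdio>

    int main() {
        // FSR 2 per-axis scale factors: Quality 1.5x, Balanced 1.7x,
        // Performance 2.0x, Ultra Performance 3.0x.
        const float scale = 2.0f;                 // Performance mode
        const int pcW  = 3840, pcH  = 2160;       // typical PC comparison video
        const int conW = 2560, conH = 1440;       // illustrative console output
        std::printf("PC internal:      %dx%d\n", int(pcW / scale),  int(pcH / scale));
        std::printf("console internal: %dx%d\n", int(conW / scale), int(conH / scale));
        // 1920x1080 vs 1280x720 of real input detail: the lower the input,
        // the harder reconstruction has to work and the sooner it breaks down.
        return 0;
    }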

One thing that wasn't entirely clear though: was motion blur + DOF re-enabled for the motion tests? They're turned off for the static screenshot comparisons, which is fine, but in my experience DLSS can exhibit the most artifacts when it encounters these effects in motion in some games. Anywhere you have a lower-res effect overlaid on high-res scene geometry is where reconstruction can sometimes struggle, so hopefully those effects were re-enabled for most of the motion comparisons.
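
As a hedged toy sketch (not any particular upscaler's real code) of that failure mode: temporal reconstruction blends reprojected history with the current frame, weighted by how much the history can be trusted, and a lower-res overlay with no matching motion vectors tanks that trust:

    struct Color { float r, g, b; };

    Color Lerp(Color a, Color b, float t) {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    // confidence ~1: depth and motion vectors agree, history is trusted.
    // confidence ~0: a mismatched overlay (motion blur, DOF) broke the match,
    // history is discarded and output snaps to the raw low-res sample,
    // which shows up as shimmer or pixel popping in motion.
    Color Resolve(Color current, Color history, float confidence) {
        float historyWeight = 0.9f * confidence;  // 0.9 is a typical-ish cap
        return Lerp(current, history, historyWeight);
    }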
 
I mean, you're partially right (depending on the genre), but it's more complicated than that.

I'm a console pleb; we have been playing 30 fps (or below) games regularly since the 5th gen and the true advent of 3D graphics. But on the other hand, people have become finicky about this stuff.

Now that 60 fps and above modes have become popular, since cross-gen overhead easily allows for them, a large cross-section of people are unwilling to go back.

Which I unfortunately predicted before this gen started, which is why I have been against PC-like options in console games to control these things, and had hoped devs would not jump on the 60 fps train across the board so easily just because of the increase in CPU power. But because they have, the expectation has been set and the dreaded fps snob has been reborn on console, which is insufferable.

Devs will now be unfairly castigated by certain people for going to 30 fps to get more out of these machines; it's already been happening.

They will always expect devs to start development at 60 fps on console and build effects on top for a 30 fps mode, as opposed to building a core 30 fps experience that truly pushes the hardware and could not be taken to 60.

I find this take on things rather interesting.

Developers give users a better way to experience a game, or at least a choice in how they want to experience it. Many users choose the alternative because it's a better way to play and leads to a far more enjoyable gaming experience for them. And many users still choose to play at 30 FPS for their own reasons.

Some users' interpretation: better and more pleasing ways to play games (for many, if not all, players) is bad. :D Choice is bad. :D

Many people on console (obviously not all) have been wanting and asking for 60 FPS games for a long, LONG time now. They played 30 FPS games because they had no choice, and obviously, now that they can play at 60 FPS, why in the world would they want to go back to 30? :) Prettier screenshots? No, that's not going to do it.

Regards,
SB
 
I play on an Xbox Series X and most of my games run at 60 fps; the few that run at 30 fps are solved by my high-end TV's interpolation, which is good for single-player TPS games. 30 fps? No way.
In this generation many more games run at 60 fps than in the previous one, and with high graphics settings too. In my opinion, with the introduction of techniques like FSR 2 and its variants, 60/120 fps will become even more common on console. And this will be especially true for games on engines built around modern DirectStorage-style I/O, such as Turn 10's and The Coalition's, plus Frostbite and idTech. I am a high-tech gamer and I feel the Series X fully meets my very high expectations.
 
Not sure I agree on this; people on here and probably some other forums talk about it. But if you look at the population that plays games, I sincerely doubt most of them are even aware of 30 vs 60 fps. They might feel it when they play, but most probably cannot pinpoint why one game feels better than another, or whether 30 vs 60 fps is the reason.

We have discussed this to death and I think the forum consensus is against you here :)
 
I play on an Xbox Series X and most of my games run at 60 fps; the few that run at 30 fps are solved by my high-end TV's interpolation, which is good for single-player TPS games. 30 fps? No way.

Aside from the blurriness of interpolation, don't you hate the input lag? The TV has to hold back at least one real frame to synthesize the in-between ones, so at 30 fps that's an extra ~33 ms before its processing time even counts.
 
Well, I am not impressed by FSR 2.1. Even according to DF there are still some cases where patch 1.6 looks better than 1.61, and in other areas it's clearly no better, with the AA still failing to do any convincing job.
I am surprised by this reaction. Based on Tom's testing, FSR 2 is a net gain in visual clarity versus non-FSR 2 for basically no performance cost. Seems like a success to me?

You mention the AA hasn't improved, but he says FSR brings a direct improvement to the AA, and he doesn't say anything about FSR being inferior outside of a single edge case with a barbed fence.
 
Maybe he means in comparison to the usual console AAA/AA games, which have their own implementations, perhaps TAAU. Those are usually just as good, if not better.
 
I find this take on things rather interesting.

Developers give users a better way to experience a game, or at least a choice in how they want to experience it. Many users choose the alternative because it's a better way to play and leads to a far more enjoyable gaming experience for them. And many users still choose to play at 30 FPS for their own reasons.

Some users' interpretation: better and more pleasing ways to play games (for many, if not all, players) is bad. :D Choice is bad. :D

Many people on console (obviously not all) have been wanting and asking for 60 FPS games for a long, LONG time now. They played 30 FPS games because they had no choice, and obviously, now that they can play at 60 FPS, why in the world would they want to go back to 30? :) Prettier screenshots? No, that's not going to do it.

Regards,
SB
You're just proving what I'm saying. There are a lot of people who are now going to whine about devs prioritizing 30 fps to push their engines, while at the same time crying that next-gen games don't seem "next gen enough" if they prioritize 60 and have to sacrifice precious resources to get there.

Just the idea that prioritizing a lower framerate only means "prettier screenshots" is one of the worst ignorant myths of the casual console mindset out there.

When optimizing games for console, devs should just focus on what they want to do and stop trying to please people who can't be pleased.

I don't mind devs prioritizing 60 fps or 30 fps. But it's just common sense that things have to be sacrificed in a closed-box architecture to get there.

Games in the future which have a 60 fps option will inevitably have been planned to run at 60 from the start, with the secondary 30 fps mode simply adding certain effects on top. That's fine, but then people can't complain about it being an additive rather than a transformative experience, since the game is still inherently built for 60 fps. Adding RT shadows, reflections or lighting will always be an option, but their impact is totally subjective and not an example of a game truly challenging the hardware.

I guess my point with all this is that people need to be realistic with their expectations and stop being mad at the reality of fixed hardware.

With something like the Switch I can understand, because it was already a gen behind and is now two gens behind. But with the PS5 and Xbox Series? Nah...
 
Another impressive showing for FSR 2.1 on consoles.
Not bad at all, especially the huge improvement in image clarity. There are areas where FSR 2.1 can't do much though, 'cos there is some banding and stuff. How does it compare to DLSS?

I wish they'd add XeSS, 'cos I can't try DLSS. If they added XeSS I'd certainly buy the game.
 
Unfortunately, as far as ray tracing is concerned, console gamers will always weigh performance against the ray-tracing improvement.

If ray tracing doesn't meaningfully transform a scene, they will take the hit to lighting accuracy for higher performance.

By the time the PS6 comes out, this will most likely no longer be an issue. But with the limitations of RDNA2 in the consoles, it's just a factor in how people look at these things for now.

AMD is increasing their RT performance all the time though, so that's good.
Almost 100% of console games run at 60 fps on the PS5/XSX nowadays. Different people, different ways. It's a reason to disable RT for many, which I understand. I prefer to play Shadow of the Tomb Raider with RT on, but when I don't want to hit the GPU's standard max power consumption (190 W) I disable RT and resort to Ultra shadows. It's not the same, but I spent like 60% of the game playing without RT 'cos of that.

It's not the case with Shadow of the Tomb Raider, but one of my favourite features of RT is that it gets rid of one of the effects I dislike the most in videogames: SSAO. It's a costly effect, and while it improves the lighting at times, many scenes look unnatural because of it. I especially dislike characters getting a deep shadowed halo around them when they are at a certain distance from a wall or some other surface. I can't stand it.

SSAO imho is not even an artistic decision, it's an approximation. RT in that sense does "real-life SSAO" justice. Even artistically: there is the fully ray-traced port of Super Mario 64, and people can judge for themselves whether RT detracts from the original art style or not. :)
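
For anyone curious what doing "real-life SSAO" justice means mechanically, a hedged toy sketch: instead of guessing occlusion from whatever happens to be in the depth buffer, RT AO asks the actual scene with short rays (TraceOcclusion here is a hypothetical stand-in for a real ray query):

    #include <cstdlib>
    #include <functional>

    struct Vec3 { float x, y, z; };

    // Fraction of short hemisphere rays that hit nearby geometry:
    // 0 = fully open, 1 = fully occluded. Unlike SSAO, occluders don't
    // have to be visible on screen, so no dark halo around characters.
    float RayTracedAO(Vec3 origin, Vec3 normal, int numRays, float maxDist,
                      const std::function<bool(Vec3, Vec3, float)>& TraceOcclusion) {
        int hits = 0;
        for (int i = 0; i < numRays; ++i) {
            // Crude random jitter around the normal; real renderers use
            // proper cosine-weighted hemisphere samples.
            auto r = [] { return rand() / float(RAND_MAX) - 0.5f; };
            Vec3 dir = { normal.x + r(), normal.y + r(), normal.z + r() };
            if (TraceOcclusion(origin, dir, maxDist))
                ++hits;
        }
        return hits / float(numRays);
    }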
 
We have discussed this to death and I think the forum consensus is against you here :)

I missed that conclusion then, but I am not sure the forum experts' consensus really matches the daily life of the normal gamer :)
 