Digital Foundry Article Technical Discussion [2023]

Alan Wake Remastered uses MSAA.
It's technically the old engine, but it's still a new(ish) game (came out one month before FH5).
Quantum Break also uses it, but it's not as recent.
We may very well see AW2 stick with forward rendering as well, then.
 
What changed, other than deferred rendering becoming the preferred method?

It's not set in stone yet. Forward+ is still around.
Higher triangle counts and smaller triangle sizes, more complex material shading, and denser foliage. FH5 is a shimmery mess with only MSAA.
 
Higher triangle counts and smaller triangle sizes, more complex material shading, and denser foliage. FH5 is a shimmery mess with only MSAA.
This is correct for a typical forward rendering pipeline with a hardware MSAA resolve.
But MSAA in a deferred renderer can have a vastly different implementation compared to MSAA in a forward path.
In RDR2, for example (on PC of course), they render the G-buffer with MSAA on (let's assume 4x here). Then, in the lighting pass, for each pixel they load the G-buffer of all 4 subsamples and compare the differences among their normals/specular/diffuse (or whatever metric they pick) to determine whether that pixel could introduce aliasing and thus requires supersampling. If so, they run the lighting computation on all 4 subsamples and average them out; otherwise they only run the lighting pass once, on the first subsample. Basically they resolve the MSAA manually using whatever scheme they want. (I can't guarantee the above process is 100% correct, but at least this is what I can see from RenderDoc captures.)
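To make that concrete, here's a rough sketch of that adaptive resolve idea written as plain C++ rather than a real shader. Everything here is illustrative: the struct layout, the 0.99 normal-similarity threshold, and the toy lighting function are my own stand-ins, not anything taken from RDR2's actual implementation. The point is just the branch: only pixels whose subsamples actually disagree pay for shading 4x.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One G-buffer subsample; a real G-buffer would also carry depth, specular, roughness, etc.
struct GBufferSample {
    Vec3 normal;
    Vec3 albedo;
};

// Stand-in for the lighting pass applied to a single subsample (simple N.L diffuse).
static Vec3 shade(const GBufferSample& s, const Vec3& lightDir) {
    float ndotl = std::fmax(0.0f, dot(s.normal, lightDir));
    return { s.albedo.x * ndotl, s.albedo.y * ndotl, s.albedo.z * ndotl };
}

// Per-pixel adaptive resolve: shade all 4 subsamples only where they diverge.
static Vec3 resolvePixel(const std::array<GBufferSample, 4>& sub, const Vec3& lightDir) {
    // Heuristic "edge" test: do any subsample normals disagree noticeably with subsample 0?
    // (The 0.99 threshold is arbitrary; a real renderer might also compare depth/albedo/specular.)
    bool divergent = false;
    for (int i = 1; i < 4; ++i) {
        if (dot(sub[i].normal, sub[0].normal) < 0.99f) { divergent = true; break; }
    }

    if (!divergent) {
        // Interior pixel: run the lighting once, on the first subsample only.
        return shade(sub[0], lightDir);
    }

    // Edge pixel: supersample the lighting and average the 4 results.
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (const auto& s : sub) {
        Vec3 c = shade(s, lightDir);
        sum.x += c.x; sum.y += c.y; sum.z += c.z;
    }
    return { sum.x / 4.0f, sum.y / 4.0f, sum.z / 4.0f };
}

int main() {
    const Vec3 light{0.0f, 1.0f, 0.0f};
    // A pixel straddling a geometry edge: two subsamples see one surface, two see another.
    const std::array<GBufferSample, 4> edgePixel{{
        {{0.0f, 1.0f, 0.0f}, {1.0f, 0.0f, 0.0f}},
        {{0.0f, 1.0f, 0.0f}, {1.0f, 0.0f, 0.0f}},
        {{1.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f}},
        {{1.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f}},
    }};
    const Vec3 c = resolvePixel(edgePixel, light);
    std::printf("resolved colour: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}
```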

I personally think the terminology is getting too misleading right now. There are a lot of different techniques/implementations lumped under the same term, which doesn't really specify anything... At the same time, game and tech companies keep inventing and throwing around new terms.
 
I miss MSAA, like honestly, you don't appreciate how much TAA blurs modern games until you play an older game with MSAA.

It's super heavy and rarely supported, but my go-to AA option when it's compatible is 4x MSAA + 4x TrSSAA, as it looks mind-blowingly good at 4K.
 
I think DF found 1296p + FSR2 from the gameplay shown running on XSX.
This was Alex around 8 minutes 15 seconds into DF's video, where he says "I counted one specific shot..". Maybe @Dictator can clarify whether this was typical or atypical of what they took from the video.

I'm just not seeing where the massive CPU increase is at the moment. It could be a bunch of stuff: maybe they've retuned Havok to eliminate the renowned "Bethesda physics", or maybe they've finally separated physics and other world simulation from rendering, which caused utter weirdness in earlier games at higher frame rates. Fallout 4 on XSX and Skyrim on PS5/XSX cope well with cheese-wheel thieves. People were stealing and piling up game objects in crazy numbers long before the sandwich pirate!
 
This was Alex around 8 minutes 15 seconds into DF's video, where he says "I counted one specific shot..". Maybe @Dictator can clarify whether this was typical or atypical of what they took from the video.

I'm just not seeing where the massive CPU increase is at the moment. It could be a bunch of stuff: maybe they've retuned Havok to eliminate the renowned "Bethesda physics", or maybe they've finally separated physics and other world simulation from rendering, which caused utter weirdness in earlier games at higher frame rates. Fallout 4 on XSX and Skyrim on PS5/XSX cope well with cheese-wheel thieves. People were stealing and piling up game objects in crazy numbers long before the sandwich pirate!
Even so. Starfield is either a 1296p + FSR2 game or a game with DRS. It's just not a native 4K game the way Skyrim is on PS5 / XSX. That era is over. Seems to me the game is GPU bound with demanding lighting effects considering the scale of the world. We'll know more with the PC version.

And sure, considering those consoles' CPUs are something like 4 times the power of Jaguar, I don't see the Zen 2 CPU being the problem the way it was in Fallout 4 (which runs at 30fps on Jaguar CPUs). Actually, I remember the main framerate problem of Fallout 4 on XB1 was due to I/O streaming when switching guns, a problem almost totally resolved by using an SSD. So maybe they still have I/O problems on XSX with the larger amount of data being transferred?
 
Even so. Starfield is either a 1296p + FSR2 game or a game with DRS. It's just not a native 4K game the way Skyrim is on PS5 / XSX. That era is over. Seems to me the game is GPU bound with demanding lighting effects considering the scale of the world. We'll know more with the PC version.
Starfield is certainly a big step up over Fallout 4 and Skyrim on PS5/XSX. I am interested to see what improvements they bring to Fallout 4 with the current-gen updates when they land this year.

I'll be playing this one on PC; I cannot go back to 30fps for first-person shooty games. I've tried revisiting games from last gen that don't have 60fps updates and noped out. But my original point was that Starfield is a game that demonstrates why a mid-gen hardware upgrade may appeal to some.
 
Even so. Starfield is either a 1296p + FSR2 game or a game with DRS. It's just not a native 4K game the way Skyrim is on PS5 / XSX. That era is over. Seems to me the game is GPU bound with demanding lighting effects considering the scale of the world. We'll know more with the PC version.

And sure, considering those consoles' CPUs are something like 4 times the power of Jaguar, I don't see the Zen 2 CPU being the problem the way it was in Fallout 4 (which runs at 30fps on Jaguar CPUs). Actually, I remember the main framerate problem of Fallout 4 on XB1 was due to I/O streaming when switching guns, a problem almost totally resolved by using an SSD. So maybe they still have I/O problems on XSX with the larger amount of data being transferred?

I do wonder if Bethesda implemented VRS on the Series systems to gain/maintain reasonable performance?
 
I'll be playing this one on PC; I cannot go back to 30fps for first-person shooty games. I've tried revisiting games from last gen that don't have 60fps updates and noped out.

For testing purposes, I actually locked Star Citizen at 30fps, just to see how space exploration felt at such a framerate. Needless to say, I just couldn't do it. That being said, maybe some 30fps gaming a day or so before the game's launch would help Series owners with the transition.
 
I'll be playing this one on PC; I cannot go back to 30fps for first-person shooty games. I've tried revisiting games from last gen that don't have 60fps updates and noped out. But my original point was that Starfield is a game that demonstrates why a mid-gen hardware upgrade may appeal to some.

That's not to say your PC will even be able to do 60fps; I'm not confident my CPU will be up to it.
 
That's not to say your PC will even be able to do 60fps; I'm not confident my CPU will be up to it.

If the PC version launches in poor shape, especially with middling framerates, Starfield will get slaughtered on Steam and everywhere else PC gamers can leave comments. So, for Bethesda's sake, I hope the PC edition performance isn't anemic.
 
If the PC version launches in poor shape, especially with middling framerates, Starfield will get slaughtered on Steam and everywhere else PC gamers can leave comments. So, for Bethesda's sake, I hope the PC edition performance isn't anemic.

It might not even be because of that; there's not a single CPU on PC that offers a 2x increase in single-threaded performance over what the Series X has (a highly tuned 13900K will get close, but that's it).

So if the game can drop to 30fps due to a CPU bottleneck on the Series X, you won't lock it to 60fps on any PC.
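Purely as a sketch of the arithmetic behind that claim (and assuming the worst case: a frame that's entirely bound by one thread and scales linearly with single-thread speed, which real games rarely are), something like:

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers: console CPU frame time at 30fps vs. the 60fps target.
    const double consoleFrameMs = 1000.0 / 30.0;  // ~33.3 ms of CPU work per frame
    const double targetFrameMs  = 1000.0 / 60.0;  // ~16.7 ms budget for 60fps
    const double requiredUplift = consoleFrameMs / targetFrameMs;
    std::printf("required single-thread uplift: %.1fx\n", requiredUplift);  // prints 2.0x
    return 0;
}
```

In practice some of that work spreads across threads, so the real requirement would be lower; that's just the pessimistic reading of the quote above.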
 
It might not even be because of that; there's not a single CPU on PC that offers a 2x increase in single-threaded performance over what the Series X has (a highly tuned 13900K will get close, but that's it).

So if the game can drop to 30fps due to a CPU bottleneck on the Series X, you won't lock it to 60fps on any PC.
It completely depends on what type of calculations it's doing.
 
It might not even be because of that; there's not a single CPU on PC that offers a 2x increase in single-threaded performance over what the Series X has (a highly tuned 13900K will get close, but that's it).

So if the game can drop to 30fps due to a CPU bottleneck on the Series X, you won't lock it to 60fps on any PC.

Um, Star Citizen has many CPU-intensive activities/processes to handle, and it runs extremely well on my system. Anyhow, if the game runs with middling performance on PC, you can expect the PC community to hose Starfield's reviews.
 
Star Citizen is not Starfield, and Alex at DF showed how badly Star Citizen can tank, well below 60fps.

I have not experienced any bad performance (sub-60fps) with Star Citizen, having owned multiple high-end cards throughout certain periods of its development. What Alex/DF experience isn't always indicative of others' experiences.
 
I have not experienced any bad performance (sub-60fps) with Star Citizen, having owned multiple high-end cards throughout certain periods of its development. What Alex/DF experience isn't always indicative of others' experiences.

I'm not talking about GPUs, and neither was Alex.

I'm talking about CPUs on PC possibly not being enough to even get Starfield to 60fps.

Alex's video showed Star Citizen's frame rate go from an easy 60fps to completely tanking when entering huge cities, due to the game suddenly becoming massively CPU-limited.
 
I'm not talking about GPUs, and neither was Alex.

I'm talking about my overall system performance with a proper GPU, CPU included. Hell, there are plenty of YouTube videos showing great 60fps performance, and even 100fps+ gameplay. But anything is possible, so I wouldn't necessarily doubt that certain sections can potentially bring on sub-60fps framerates.
 