Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

He also claims the PS5 runs on average 15 to 20% faster than his 2070. He doesn't match settings, doesn't show anything, just states this piece of info. I like how he also says that his aging CPU wasn't a bottleneck here, probably due to many people pointing out that his Unreal 5 videos had his PC numbers much lower than they should be. People with similar or weaker rigs than his offered him pics and videos with much better framerates. He mainly glossed over those.

He also says, when his own footage shows a 10% lead for the Series X with RT, that it's a margin-of-error difference. :)))
Hardware Unboxed released a few tests about processor evolution in the last week. From that data, we can also assume that his CPU is a bottleneck. Maybe not in all situations, but in enough of them to cripple the average frame rate.

Really don't know why he still uses that CPU, as he also has another PC available for testing with a newer CPU. Just one CPU generation newer makes a lot of difference in certain scenarios and would make comparisons more comparable.
 
Someone made a comparison of the PS5's "native 4K", as NXGamer calls it, and PC. Yea, there's no way in hell it's native. It's definitely being reconstructed, which means his entire comparison is flawed.

PS5 60fps mode:
5xN11jr.jpg


PC

gDQ68la.jpg



You can obviously tell they're not running at anywhere close to the same internal resolution.
I think the PS5 has chromatic aberration while the PC doesn't.
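For anyone wanting to check this kind of claim themselves, "pixel counting" usually works by measuring the staircase steps on aliased, near-horizontal edges in a screenshot: each step spans roughly (output resolution / internal resolution) output pixels. A rough sketch of that arithmetic (the numbers below are illustrative, not measurements from this game):

```python
# Rough sketch of the "pixel counting" idea used to estimate internal
# resolution from a screenshot: on an aliased, nearly-horizontal edge,
# each staircase step spans about (output_res / internal_res) pixels,
# so the average step length reveals the internal render width.

def estimate_internal_width(output_width: int, step_lengths: list[int]) -> int:
    """Estimate internal render width from measured staircase step lengths."""
    avg_step = sum(step_lengths) / len(step_lengths)
    return round(output_width / avg_step)

# e.g. a 3840-wide screenshot whose edge steps are ~2 pixels long
# suggests a ~1920-pixel internal width, i.e. not native 4K
print(estimate_internal_width(3840, [2, 2, 2, 2]))  # → 1920
```

Reconstruction complicates this, which is part of why these games are hard to count: the reconstructed image has native-width edges in many places, so you have to hunt for the spots where the pattern breaks down.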
 
He also claims the PS5 runs on average 15 to 20% faster than his 2070. He doesn't match settings, doesn't show anything, just states this piece of info.

Yes, this was also my biggest issue with the video. He briefly stated (basically glossed over it) earlier on that the consoles use lower resolution RT than the PC, but also stated they are otherwise running at the PC's max RT settings (roughness cutoff, reflection distance). So what RT setting is he using on the PC for this comparison? Has he just whacked it up to maximum and called it a day? It wouldn't be the first time.

The fact is that a 2070 shouldn't be 15% slower than a PS5 in a game with heavy RT. If that's the result you're getting, then your analysis is probably wrong. Assuming it's simply down to an RDNA2-optimised RT implementation that brings the RT performance of RDNA2 up to Turing levels (something for which there is no precedent) is just lazy.
 

He says it's not, but I'm pretty sure the consoles are using the reconstruction "interlaced" mode on PS5/XSX... the only difference being that they have a better TAA solution which actually covers it up, unlike PC, where TAA isn't nearly as robust.

I've gone back and forth between RE3 and RE2 on my PS5 and PC with and without interlaced mode; the PS5 is absolutely using reconstruction in both of those titles in every mode. RE7 at least looks to be native, however (which would stand to reason given the solid native 4K performance of my 3060).

I can sympathize to some extent with the difficulty in detecting this: the games are notorious for heavy post-processing, and the evolved checkerboarding/interlaced mode is indeed very good! However, you can see striated patterns and breakup in hair during movement, as well as faint RGB outlines on fine edges at points (not CA; native does not exhibit these with CA on), and the image is also pretty clearly less sharp than native.

It's one thing to just look at it running by itself and conclude that it's native, but if you have a PC with a 4K screen that you can flick to, it's uh... somewhat less understandable.

(sorry for gamma differences)

PS5, non-RT 4k interlaced mode:

sbmvIpJ.jpg


PC, 4K native (equivalent PS5 graphics settings):

ONzPS8l.jpg


PC, 4K interlaced (equivalent PS5 graphics settings):

OY8olZu.jpg


I mean, come on, switch back and forth between the PS5 and native PC and it should be, well, clear.

The PS5 has the best TAA, no doubt; the Series X (from shots I've seen) is more similar to the PC in that it misses some specular edges, but I don't have one in front of me to truly judge. The PC's interlaced mode appears even sharper than the PS5's, so in some scenes it can even look superior, but in motion I prefer the PS5's even if it's softer; it's just a little more stable.

The nature of the heavy post-processing in this game obscures what might be more evident in other games. There's no doubt the end result is very good and close to native, but it's not native, which, if not crucial to final image quality, is at least relevant if you're doing performance comparisons between platforms.

Edit: Another area where this really should have stood out to NXGamer is the police station in RE2 without RT enabled: those awful SSR artifacts are magnified when using reconstruction; the pixels are much 'chunkier':

PS5, no RT:

YmCOB0M.jpg


PC, no RT, 4k native:

ThyryGq.jpg


PC, no RT, 4k Interlaced

H82McRQ.jpg


Oddly enough, even in interlaced mode the PC version seems to have slightly less breakup in those reflections, and they stabilize if you stare at them for a bit; the PS5's never stop popping. Minor quibble though, as you're not going to be staring at the floor for seconds at a time in this game; the SSR is shit no matter what platform you're on.


Of note: the PC version of RE2 in particular (and perhaps RE3, but there it could be down to just TAA/sharpening differences, as this version looks sharper than before with native too) has likely incorporated Re:Village's interlaced mode. While not as 'perfect' as the PS5's implementation, it indeed shows massive improvement compared to how it was before:

RE2 4K Interlaced PC, Pre-RT Patch (zoomed in):

1BmtG1n.png


RE2 4K Interlaced PC, Post-RT Patch (zoomed in):

QoVfZ1s.png




He also claims the PS5 runs on average 15 to 20% faster than his 2070. He doesn't match settings, doesn't show anything, just states this piece of info. I like how he also says that his aging CPU wasn't a bottleneck here, probably due to many people pointing out that his Unreal 5 videos had his PC numbers much lower than they should be. People with similar or weaker rigs than his offered him pics and videos with much better framerates. He mainly glossed over those.

He also says, when his own footage shows a 10% lead for the Series X with RT, that it's a margin-of-error difference. :)))

The 2070 might indeed be slower than the consoles in interlaced mode with matched settings. On my 3060 in 4K interlaced mode with equivalent settings in RE3, at least, while the vast majority is 60+ fps, I do get brief drops into the 50s in heavy alpha scenes, and a 3060 is very similar to an overclocked 2070 in performance. So not entirely unreasonable, if that's what he's using.

However, if he's comparing it to max settings on the PC, that would be incorrect. All 3 games (even RE7) do not use the PC's max shadow settings, which can have a large impact on performance. They are also using lower volumetric lights, likely not HBAO, and, in the case of RE7, lower bloom (albeit that could be just a design choice). Running these games fully maxed out at native 4K basically requires something in the range of a 3070/3080 to stay completely locked at 60. Simply extrapolating native performance from a 2070 should give you a fair bit of pause when concluding they're native; it would be the most well-optimized console release ever, by a significant margin. Perhaps possible, I mean, Death Stranding kind of gets into that territory, but again, just flipping back and forth between PC native and the PS5 screenshots should have put this to bed rather quickly.
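The back-of-envelope arithmetic behind that skepticism: native 4K shades every output pixel, while checkerboard/interlaced 4K shades roughly half of them per frame, so a GPU that just holds 60fps reconstructed would need roughly double the shading throughput to hold 60fps native (a simplification that ignores fixed per-frame costs and reconstruction overhead):

```python
# Back-of-envelope sketch: shading work per frame for native 4K vs a
# checkerboard/interlaced mode that shades only half the pixels per frame.
# This ignores fixed costs (geometry, post-processing, reconstruction
# overhead), so real-world scaling is somewhat less than 2x.

def shaded_pixels(width: int, height: int, checkerboard: bool = False) -> int:
    """Pixels freshly shaded per frame for a given output resolution."""
    total = width * height
    return total // 2 if checkerboard else total

native = shaded_pixels(3840, 2160)                     # 8,294,400 pixels/frame
reconstructed = shaded_pixels(3840, 2160, True)        # 4,147,200 pixels/frame
print(native / reconstructed)  # → 2.0
```

Which is why a 2070 plausibly matching the consoles in interlaced mode says nothing about it matching a hypothetical native-4K console mode.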
 
I've gone back and forth between RE3 and RE2 on my PS5 and PC with and without interlaced mode; the PS5 is absolutely using reconstruction in both of those titles in every mode. RE7 at least looks to be native, however (which would stand to reason given the solid native 4K performance of my 3060).

I can sympathize to some extent with the difficulty in detecting this: the games are notorious for heavy post-processing, and the evolved checkerboarding/interlaced mode is indeed very good! However, you can see striated patterns and breakup in hair during movement, as well as faint RGB outlines on fine edges at points (not CA; native does not exhibit these with CA on), and the image is also pretty clearly less sharp than native.

It's one thing to just look at it running by itself and conclude that it's native, but if you have a PC with a 4K screen that you can flick to, it's uh... somewhat less understandable.

(sorry for gamma differences)

PS5, non-RT 4k interlaced mode:

PC, 4K native (equivalent PS5 graphics settings):

PC, 4K interlaced (equivalent PS5 graphics settings):
I mean, come on, switch back and forth between the PS5 and native PC and it should be, well, clear.

The PS5 has the best TAA, no doubt; the Series X (from shots I've seen) is more similar to the PC in that it misses some specular edges, but I don't have one in front of me to truly judge. The PC's interlaced mode appears even sharper than the PS5's, so in some scenes it can even look superior, but in motion I prefer the PS5's even if it's softer; it's just a little more stable.

Of note: the PC version of RE2 in particular (and perhaps RE3, but there it could be down to just TAA/sharpening differences, as this version looks sharper than before with native too) has likely incorporated Re:Village's interlaced mode. While not as 'perfect' as the PS5's implementation, it indeed shows massive improvement compared to how it was before:

RE2 4K Interlaced PC, Pre-RT Patch (zoomed in):



RE2 4K Interlaced PC, Post-RT Patch (zoomed in):



The nature of the heavy post-processing in this game obscures what might be more evident in other games. There's no doubt the end result is very good and close to native, but it's not native, which, if not crucial to final image quality, is at least relevant if you're doing performance comparisons between platforms.




The 2070 might indeed be slower than the consoles in interlaced mode with matched settings. On my 3060 in 4K interlaced mode with equivalent settings in RE3, at least, while the vast majority is 60+ fps, I do get brief drops into the 50s in heavy alpha scenes, and a 3060 is very similar to an overclocked 2070 in performance. So not entirely unreasonable, if that's what he's using.

However, if he's comparing it to max settings on the PC, that would be incorrect. All 3 games (even RE7) do not use the PC's max shadow settings, which can have a large impact on performance. They are also using lower volumetric lights, likely not HBAO, and, in the case of RE7, lower bloom (albeit that could be just a design choice). Running these games fully maxed out at native 4K basically requires something in the range of a 3070/3080 to stay completely locked at 60. Simply extrapolating native performance from a 2070 should give you a fair bit of pause when concluding they're native; it would be the most well-optimized console release ever, by a significant margin. Perhaps possible, I mean, Death Stranding kind of gets into that territory, but again, just flipping back and forth between PC native and the PS5 screenshots should have put this to bed rather quickly.
Yep. I mean, I found it easy to tell even in his video as he was doing the comparison. It does a great job of covering it up, but certain aspects of the image always give it away.

So essentially his whole performance comparison between the PS5 and PC is completely wrong. Not to mention we can assume he's using the PC version maxed out... Certain settings, like shadows and volumetric lighting, can have little to no visual effect in most scenes and yet drastically reduce performance. I usually set my shadows to High and volumetric lighting to Medium. You get a big performance bump from adjusting those two settings and barely lose anything in visual quality.

I sincerely doubt the PS5 is using either of those settings at an equivalent to the PC's max.
 
Yep. I mean, I found it easy to tell even in his video as he was doing the comparison. It does a great job of covering it up, but certain aspects of the image always give it away.

So essentially his whole performance comparison between the PS5 and PC is completely wrong. Not to mention we can assume he's using the PC version maxed out... Certain settings, like shadows and volumetric lighting, can have little to no visual effect in most scenes and yet drastically reduce performance. I usually set my shadows to High and volumetric lighting to Medium. You get a big performance bump from adjusting those two settings and barely lose anything in visual quality.

I sincerely doubt the PS5 is using either of those settings at an equivalent to the PC's max.

They're not. Max shadows can definitely have an impact in some scenes. It's not that commonplace, but there are moments where there's banding in shadows on High vs Max.

RE7, PS5. Pay attention to the wall on the right:

sXrqOYy.jpg


PC, native 4K, Shadows Max:

cd70ogv.jpg
 
They're not. Max shadows can definitely have an impact in some scenes. It's not that commonplace, but there are moments where there's banding in shadows on High vs Max.

RE7, PS5. Pay attention to the wall on the right:

PC, native 4K, Shadows Max:
Yea, you're right. I just tested it on my PC and see the same thing. In RE2 I didn't really notice it outside of a few areas where it was obvious, as it is in your shots here, but it's definitely there.

So his whole comparison is completely wrong and inadmissible.
 
VGTech put up their analysis on their YouTube channel, and they say the same thing: that it uses a form of reconstruction.


Resident Evil 7 on PS5 and the Xbox Series consoles has two graphical settings: High Frame Rate Mode and Ray Tracing. The High Frame Rate Mode and Ray Tracing can't be enabled at the same time, so there are three different modes available. PS5 and Xbox Series X in all modes render at a resolution of 3840x2160 using what appears to be a form of checkerboard rendering. Xbox Series S in all modes renders at a resolution of 2560x1440 using what appears to be a form of checkerboard rendering.

VGTech are right up there with Digital Foundry when it comes to pixel counting, IMO, so for me it's pretty conclusive here.
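For context on what VGTech is describing, checkerboard rendering shades only half the output pixels each frame in an alternating checker pattern and fills the other half from the previous frame. A toy illustration of the idea (a deliberate simplification; RE Engine's actual reconstruction also reprojects and filters the reused pixels):

```python
# Toy illustration of checkerboard rendering: each frame, only pixels
# whose (x + y + frame) parity matches are freshly shaded; the other
# half are carried over from the previous frame. Real implementations
# reproject the history with motion vectors rather than copying it.

def checkerboard_frame(shade, prev, width, height, frame):
    """Build one output frame, shading only half the pixels."""
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y + frame) % 2 == 0:
                out[y][x] = shade(x, y)   # freshly shaded this frame
            else:
                out[y][x] = prev[y][x]    # reused from the previous frame
    return out
```

Since only half the pixels are shaded per frame, the GPU cost sits much closer to half-resolution rendering than to native, while the alternating pattern lets a good TAA pass reassemble something very close to a full-resolution image, which is exactly why it's hard to pixel count.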
 
The 2070 might be slower than the consoles in interlaced mode with matched settings perhaps, on my 3060 in 4k interlaced mode with equivalent settings in RE3 at least, while the vast majority is 60+ fps, I do get the brief drop into the 50's in heavy alpha scenes, and a 3060 is very similar to an overclocked 2070 in performance. So not entirely unreasonable if that's what he's using.

Are you talking about RT or non-RT modes? Because with RT on, the PS5 drops heavily under 60fps quite often according to NXG's video (into the 30s at some points). So if the 3060 is holding to it most of the time and only dropping to the 50s when it doesn't, then that looks like significantly better performance than the PS5.

However, if he's comparing it to max settings on the PC, that would be incorrect. All 3 games (even RE7) do not use the PC's max shadow settings, which can have a large impact on performance.

In fairness, he does state he dropped the PC's shadow settings to Medium for his comparison (but doesn't mention any other setting). He also states he *thinks* the PS5 is running at the PC's max shadow settings... but I'd take that with a healthy pinch of salt.
 
The fact is that a 2070 shouldn't be 15% slower than a PS5 in a game with heavy RT. If that's the result you're getting, then your analysis is probably wrong. Assuming it's simply down to an RDNA2-optimised RT implementation that brings the RT performance of RDNA2 up to Turing levels (something for which there is no precedent) is just lazy.


Especially considering that the PS5 is generally around RTX 2070/2070 Super performance in normal raster, and below it in ray tracing performance.
 
VGTech put up their analysis on their YouTube channel, and they say the same thing: that it uses a form of reconstruction.




VGTech are right up there with Digital Foundry when it comes to pixel counting, IMO, so for me it's pretty conclusive here.

Pretty surprised by this; RE7 doesn't really need reconstruction to hit 4K/60, especially at those settings. If it's true, it could perhaps be due to the HFR and/or RT mode, which would probably need it, and they just don't bother switching to native for the non-RT mode. I generally trust VGTech as well, but this is one of those examples where just a framerate chart over a video is not enough; a detailed analysis, and perhaps some voiceover explaining how this conclusion was reached, would help.

It's a truly excellent implementation though, if so. Native 4K on my PC does look a little 'sharper', but I haven't seen one instance (so far) where I could pick it out, albeit you don't have the ability to zoom into moving hair at a moment's notice like you do with third-person RE2/RE3. But getting as close as I can to find lines/meshes, I just don't see any breakup; with RE2/3 there are some telltale patterns, like with neon signs/hair showing some reconstruction, but for the life of me I can't pick them out in RE7. Interlaced mode on the PC is better than before, but in some instances, like with Mia's hair, it breaks down hard, whereas on the PS5 I don't see any breakup at all.
 
Pretty surprised by this; RE7 doesn't really need reconstruction to hit 4K/60, especially at those settings. If it's true, it could perhaps be due to the HFR and/or RT mode, which would probably need it, and they just don't bother switching to native for the HFR mode. I generally trust VGTech as well, but this is one of those examples where just a framerate chart over a video is not enough; a detailed analysis, and perhaps some voiceover explaining how this conclusion was reached, would help.

It's a truly excellent implementation though, if so. Native 4K on my PC does look a little 'sharper', but I haven't seen one instance (so far) where I could pick it out, albeit you don't have the ability to zoom into moving hair at a moment's notice like you do with third-person RE2/RE3. But getting as close as I can to find lines/meshes, I just don't see any breakup; with RE2/3 there are some telltale patterns, like with neon signs/hair showing some reconstruction, but for the life of me I can't pick them out in RE7. Interlaced mode on the PC is better than before, but in some instances, like with Mia's hair, it breaks down hard, whereas on the PS5 I don't see any breakup at all.
VGTech always breaks it down in the description and provides the exact images they used to pixel count, inviting others to check their homework, so to speak.

Stats:

Images pixel counted:
 
Are you talking about RT or non RT modes? Because with RT on the PS5 drops heavily under 60fps quite often according to NXG's video (into the 30's at some points). So if the 3060 is holding to it most of the time and only dropping to the 50's when it doesn't then that looks like significantly better performance than the PS5

Non-RT, interlaced. With RT/interlaced at 4K, RT on High, in the police station in RE2, walking around the first area gives me 53-60fps. Judging from how it 'feels' on the PS5, this is slightly higher performance. Medium RT bumps it up a few but nowhere near a sustained 60.

In fairness he does state he dropped the PC's shadow settings to medium for his comparison (but doesn't mention any other setting). He also states he *thinks* the PS5 is running at the PC's max shadow settings.... but I'd take that with a healthy pinch of salt.

Ah, ok, missed that. However, the PS5 is definitely not using the PC's max shadow settings in any of the 3 titles; I've gone back and forth, and there's definitely the banding you get with High and lower in all 3 on the PS5. At least in RE7, Medium gives quite a bit more obvious gradients in that hallway, so I don't think it's as low as Medium though.
 
VGTech always breaks it down in the description and provides the exact images they used to pixel count, inviting others to check their homework, so to speak.

Stats:

Images pixel counted:
Thanks. Ok, went back to RE7. It's tough, but I think I might see evidence of it here in Mia's hair, with some slightly higher breakup.

Edit: Ok, went back and took PNGs from the PS5. The initial image I posted from the PS5 was from a recording, which was just too compressed to compare to the PC shots, which are from Nvidia ShadowPlay (ShadowPlay has very few compression artifacts, so the PC is at a slight disadvantage here, but only very slight). Spamming the screenshot button on the PS5 is why these aren't aligned as perfectly as they could be.

PS5 (no RT):

VqNXIy3.png


PC, 4K native (no RT)

ZrBk1dY.png


(Try to ignore the shadow dithering, as I think the PC's shadows were on Very High for this as opposed to the PS5's setting of High. In those scenes my PC's GPU was at 95%+ and I was just using SSAO, so in certain spots this might not be the cakewalk to render at native that I first thought.)

It could be down to TAA differences; that's hard to rule out, as if this is reconstructed there's not one of these 3 titles I can compare like for like. At least going by the interlaced comparisons, the PC implementation, while perhaps 'sharper' at times, definitely leads to more breakup/shimmering than what I see on the PS5, especially in hair at certain distances. If this were due to just a difference in TAA and the res was the same, I would not expect to see more breakup on the PS5, considering its stronger TAA.

For comparison, interlaced on the PC. Most of the detail in interlaced mode is also hard to tell apart from native 4K, like on the PS5, and even here it looks decent, but obviously the hair is more aliased:

Lm4uyhs.png


However, when Mia leans back, her hair really falls apart, despite the rest of the image looking fine. The PS5 and native don't do this:


eCF86J3.png


Vs the PS5 (from video, so that's even adding some compression):

F6lyhHP.png


So the PS5 implementation works really well, but yeah, there indeed might be reconstruction. Surprising, but I guess Capcom felt it wasn't worth the bother to switch rendering modes between the 3 settings. RE2/3 are more evident when compared to a native image, imo.
 
The RE games on console use the interlacing/checkerboarding, hence the pattern of pixels in reflections.
The difference between versions is that TAA on PC kinda does not accumulate at all... I mentioned that in my Village review. Pretty sure TAA on PC is not working correctly.
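For reference, TAA "accumulating" means blending each new frame into a running history buffer so that jitter and checkerboard gaps average out over time; if that blend is broken, the interlaced pattern stays visible. A minimal sketch of the accumulation step (the blend factor is illustrative; real TAA also reprojects the history with motion vectors and clamps it to avoid ghosting):

```python
# Minimal sketch of temporal accumulation as used by TAA: each frame is
# blended into a history buffer with a small weight, so alternating
# checkerboard samples average out instead of flickering. The alpha
# value is illustrative, not from any shipping implementation.

def taa_accumulate(history: float, current: float, alpha: float = 0.1) -> float:
    """Exponential blend of the current sample into the history buffer."""
    return (1.0 - alpha) * history + alpha * current

# A sample that alternates between 0 and 1 every frame (like a pixel
# that's only shaded on every other checkerboard frame) converges toward
# its mean instead of flickering:
h = 0.0
for frame in range(100):
    h = taa_accumulate(h, frame % 2)
# h now hovers near 0.5 rather than jumping between 0 and 1
```

If the accumulation is broken (effectively alpha near 1), the output just tracks the raw alternating samples, which would match the visible checkerboard patterning on PC that's being described here.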
 
The RE games on console use the interlacing/checkerboarding, hence the pattern of pixels in reflections.
The difference between versions is that TAA on PC kinda does not accumulate at all... I mentioned that in my Village review. Pretty sure TAA on PC is not working correctly.

Yeah, this looks to be a complete swap-in from Re:Village, and it inherits all the flaws of that game's PC implementation as well.
 