Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

NXGamer is an ex game dev with years of experience, so you can put aside your concerns about his abilities. He noticed artifacts, but he thinks it's just not CBR but rather some other performance tweak/reconstruction being used.

Yeah, the only way John or Alex could tell if RE8 is potentially using CBR was certain artifacting around the fireplace flame in the Maiden demo (see below), which could be lower quality renders/alphas giving the impression of CBR usage. We'll know the answer in two weeks anyhow. As for NX Gamer, he is great at his work and craft, but people can make mistakes. It doesn't make them less capable of doing their job, it just makes them human.

I think the most interesting part of the final game release will be the RT mode deltas or fps differences between XBSX and PS5 in given scenes. Sort of the new 'Control'-style RT benchmark with unlocked framerates, but this time with actual gameplay.

 
Yeah, the only way John or Alex could tell if RE8 is potentially using CBR was certain artifacting around the fireplace flame in the Maiden demo (see below), which could be lower quality renders/alphas giving the impression of CBR usage.

Yeah, I mean, I could definitely be wrong (I'm watching this on a 1080p 14" screen), but this doesn't look like CBR to me -- it looks like the distortion is on a lower-res buffer, or the TAA doing a bad job with the distortion (because it doesn't produce motion vectors, maybe?). Especially since this game uses (relatively low sample count) RT, I'd expect a serious doubling up of artifacts on those flickering RT reflections if it was CBR.

Like, a shot like this:
(timestamped, where the woman surrounded by flies attacks) -- extremely fast motion, nothing is similar from frame to frame, dozens of objects on screen, lots of transparency... there's no way CBR would ever reconstruct this perfectly. Is there anything noticeably up in 4K?
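
To make the "doubling up" point concrete: the core of checkerboard rendering is that only half the pixels get shaded each frame, and the other half is filled in by reprojecting the previous frame along motion vectors. A toy sketch of that fill step (plain NumPy, purely illustrative, nothing to do with Capcom's actual implementation):

```python
import numpy as np

def checkerboard_fill(curr, prev, motion, parity):
    """Toy checkerboard reconstruction for a grayscale frame.

    curr   : (H, W) frame where only the 'parity' half of the checkerboard
             was shaded this frame.
    prev   : (H, W) fully reconstructed previous frame.
    motion : (H, W, 2) per-pixel (dy, dx) motion vectors in pixels.
    parity : 0 or 1, which checkerboard half was shaded this frame.
    """
    h, w = curr.shape
    ys, xs = np.mgrid[0:h, 0:w]
    missing = ((ys + xs) % 2) != parity  # pixels not shaded this frame

    # Fetch each missing pixel from wherever it was last frame.
    src_y = np.clip(ys - np.rint(motion[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(xs - np.rint(motion[..., 1]).astype(int), 0, w - 1)

    out = curr.copy()
    out[missing] = prev[src_y[missing], src_x[missing]]
    return out

# Distortion effects and transparencies (the fireplace flame, the flies) often
# don't write motion vectors, so 'motion' is ~0 there: the missing half gets
# filled from the wrong place and lags a frame behind, which is the serrated,
# doubled-up look people associate with CBR.
```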
 
Yeah, I mean, I could definitely be wrong (I'm watching this on a 1080p 14" screen), but this doesn't look like CBR to me -- it looks like the distortion is on a lower-res buffer, or the TAA doing a bad job with the distortion (because it doesn't produce motion vectors, maybe?). Especially since this game uses (relatively low sample count) RT, I'd expect a serious doubling up of artifacts on those flickering RT reflections if it was CBR.

Like, a shot like this:
(timestamped, where the woman surrounded by flies attacks) -- extremely fast motion, nothing is similar from frame to frame, dozens of objects on screen, lots of transparency... there's no way CBR would ever reconstruct this perfectly. Is there anything noticeably up in 4K?

Yeah, this looks too good to be CBR, but you never know. If it is, Capcom needs to share its techniques with DF or NX in a one-on-one interview.
 
Yeah, the only way John or Alex could tell if RE8 is potentially using CBR was certain artifacting around the fireplace flame in the Maiden demo (see below), which could be lower quality renders/alphas giving the impression of CBR usage. We'll know the answer in two weeks anyhow. As for NX Gamer, he is great at his work and craft, but people can make mistakes. It doesn't make them less capable of doing their job, it just makes them human.
Sure, in the end it's just guessing/hints.
 
Yeah, this looks too good to be CBR, but you never know. If it is, Capcom needs to share its techniques with DF or NX in a one-on-one interview.
On ResetEra, Liabe Brave lists RE2 and RE3 as being 1620t on Pro, which means temporal accumulation, so reconstruction, but not CBR. I don't remember where it comes from (DF, NXGamer, or his own tests), but that could be it: some kind of exotic reconstruction method only used by Capcom.


https://www.resetera.com/threads/all-games-with-ps4-pro-enhancements.3101/
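
For context, the "t" in that list generally means temporal injection/accumulation: render each frame at the lower resolution with a small sub-pixel camera jitter, then blend it into a persistent full-resolution history buffer so detail builds up over a few frames. A stripped-down sketch of the accumulation step (my own illustration of the general idea, not Capcom's method; the resolutions are just examples):

```python
import numpy as np

def accumulate(history, lowres_frame, blend=0.1):
    """Blend a freshly rendered low-res frame into a full-res history buffer.

    history      : (2160, 3840) accumulated output image
    lowres_frame : (1620, 2880) frame rendered this tick, jittered sub-pixel
    blend        : fraction of the new frame blended in per tick
    """
    H, W = history.shape
    h, w = lowres_frame.shape

    # Naive nearest-neighbour upsample to output resolution. A real resolver
    # places each jittered sample precisely and rejects stale history on
    # disocclusion or fast motion, which is where the ghosting trade-offs live.
    up = lowres_frame[np.ix_(np.arange(H) * h // H, np.arange(W) * w // W)]

    return (1.0 - blend) * history + blend * up
```

The practical difference for artifact hunting: plain temporal accumulation tends to fail as softness and ghosting, while checkerboarding tends to fail as a serrated quad pattern, which is exactly what people are looking for in the Maiden demo.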
 
On ResetEra, Liabe Brave lists RE2 and RE3 as being 1620t on Pro, which means temporal accumulation, so reconstruction, but not CBR. I don't remember where it comes from (DF, NXGamer, or his own tests), but that could be it: some kind of exotic reconstruction method only used by Capcom.


https://www.resetera.com/threads/all-games-with-ps4-pro-enhancements.3101/

Could be. However, Resident Evil Village should be more performant (or offer more performance headroom, perhaps enough for native 4K) than RE2/3 because it doesn't require the additional animation, collision, physics, and tracking of the main character that third-person or over-the-shoulder games do. Mind you, I'm not saying RE8 isn't using reconstruction, just offering another possible explanation alongside NX Gamer's conclusion.
 
What, exactly, would the CPU be doing? Calculating extremely complex rigs, I guess? High-end, CPU-only cloth sim? Incredibly unlikely, especially when we see very obvious, expensive rendering changes that explain the framerate drop. Not you in particular, but people in this forum throw out 'CPU limited' a lot without any justification or explanation for why the CPU would be under load.
Well, I don't know personally what the CPU is doing, but adding extra actors to a scene can increase CPU usage. And I would agree that people throw out "CPU limited" without any explanation, just like there are claims that entire games aren't CPU limited because some scenes run at X framerate.
 
I can't say for sure as I don't understand Japanese, but this video from Capcom seems to mention their use of Checkerboard Rendering in RE7 on PS4 Pro and Xbox One X.
Great find! I translated it using Yandex. So it seems they indeed use some kind of CBR on Pro, XBX, and surprisingly XB1. The Pro CBR seems to be different from the one used on the Xbox machines. This could explain the difference in sharpness, as the game is sharper on Pro. In both cases (XB1 and XBX), DF thought an AA difference explained the comparatively blurrier image on the Xbox versions, so it's great to have some explanation from the devs.

 
Great find! I translated it using Yandex. So it seems they indeed use some kind of CBR on Pro, XBX, and surprisingly XB1. The Pro CBR seems to be different from the one used on the Xbox machines. This could explain the difference in sharpness, as the game is sharper on Pro. In both cases (XB1 and XBX), DF thought an AA difference explained the comparatively blurrier image on the Xbox versions, so it's great to have some explanation from the devs.


Thanks for the translation. IIRC the base Xbox One originally used interlaced rendering and then this was changed to checkerboard rendering at the same time that the Xbox One X update was released.
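
For anyone wondering what actually changed there: interlaced rendering shades alternating columns (or rows) each frame, while checkerboarding shades a checker pattern (in practice usually 2x2 pixel quads), which tends to hold up better on diagonal edges and in motion. A quick way to visualise the two sample masks (illustrative only, going off the recollection above):

```python
import numpy as np

def interlaced_mask(h, w, frame):
    """Shade every other column, swapping which columns each frame."""
    return np.broadcast_to((np.arange(w) % 2) == (frame % 2), (h, w))

def checkerboard_mask(h, w, frame, quad=2):
    """Shade half the screen in a checker pattern of quad x quad blocks,
    swapping which half is shaded each frame."""
    ys, xs = np.mgrid[0:h, 0:w]
    return ((ys // quad + xs // quad) % 2) == (frame % 2)

if __name__ == "__main__":
    print(interlaced_mask(4, 8, 0).astype(int))
    print(checkerboard_mask(8, 16, 0).astype(int))
```

Both approaches shade half the pixels per frame and lean on the previous frame to fill the rest; the difference is mostly in how the missing pixels are distributed and how gracefully the reconstruction fails.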
 
This could explain the difference in sharpness, as the game is sharper on Pro. In both cases (XB1 and XBX), DF thought an AA difference explained the comparatively blurrier image on the Xbox versions, so it's great to have some explanation from the devs.

I don't remember the XBX version being blurrier (the image was slightly soft, but not blurry); I actually own RE7 on Pro, XBX, and PC. From what I can remember, RE7's image on Pro was noisier and muddier compared to XBX. Overall, for me anyhow, it was PC, then XBX, then Pro.

https://www.eurogamer.net/articles/...one-x-offers-a-big-leap-over-standard-console
If it's 1800p on Pro against the 2160p on Xbox One X, there is a catch. Anti-aliasing differs between the two, and that seemingly has a bigger impact than either machine's base resolution. Running side-by-side comparisons really is a fascinating litmus test here and in motion each console has pros and cons. Xbox One X, for example, offers the cleanest image on console by far and PS4 Pro does produce more shimmer and visual noise, especially in busy areas like the opening forest. The visual output may suggest that Microsoft's machine is running equivalent to the FXAA+TAA mode that's available on PC. It's extremely thorough, but judging by the ghosting artefacts behind moving objects, PS4 Pro may also have a temporal component.

The downside to Xbox One X? Well, despite the higher base pixel count, the image as a whole is a touch softer than PS4 Pro's. This may be down to varying anti-aliasing solutions. There's no doubt Xbox One X gives a more pristine result overall, and shimmer is reduced - but perhaps its anti-aliasing approach is a double-edged sword here. Regardless of the suggested 1800p vs 2160p contest, it all boils down to a choice between X's clean but softer image, or the sharp but visually noisier output on PS4 Pro.

The article also highlights the difficulty of guessing Capcom's rendering solution (although Capcom states it's CBR). Either way, Capcom needs to share some of this black magic with other developers whose CBR methods are complete trash.
In our original analysis of the PS4 Pro edition, Capcom told us that the game rendered at 2240x1260, with additional lighting enhancements. Since then we've heard reports of a different number being used on Pro's 4K output mode. YouTube channel VG Tech in particular dissected the most recent patch 1.06 on PS4 Pro, finding a 3200x1800 resolution with what they believe to be checkerboard rendering.

We've approached Capcom for clarification and will update should a response arrive, but those stats hold up in our tests, which also suggest 1800p rendering - though there's a big question mark over the use of checkerboard rendering. The fact is that the stippling artefacts you'd associate with this technique are very well hidden behind the game's waves of post effects. In motion, the signs just aren't there; but it would explain how such a jump from 1260p is possible, if true. Checkerboarding or not, the good news is PS4 Pro's results are visibly upgraded next to a regular PS4's 1080p, and it's a good mode to have available if you own a 4K TV.

Using similar tests, Xbox One X appears to have an even higher target, with pixel counts resolving at the full 3840x2160. This 2160p is again obscured by a thick post-process pipeline, but there's no question that the raw frame-buffer is higher than Pro's. And much like Sony's console, there is the possibility Capcom is using checkerboarding or some kind of temporal reprojection technique to hit that number, but the evidence of it in motion is minimal. Again, it's really to the developer's credit that any wire-work to get these resolutions this high is so effectively disguised.
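
The checkerboard guess at least makes the numbers add up: 3200x1800 with half the pixels shaded per frame is about 2.88 million shaded samples, roughly the same shading cost as the native 2240x1260 (about 2.82 million pixels) Capcom originally quoted, so the jump from 1260p to 1800p would be nearly free. And for anyone curious how these pixel counts are arrived at in the first place: an image upscaled from R rows to 2160 only contains about R distinct rows, so you count steps along a clean, high-contrast edge. A toy version of that count, assuming an un-post-processed, nearest-neighbour upscale (real counts are done by hand on carefully chosen edges):

```python
import numpy as np

def count_distinct_rows(frame):
    """Toy pixel count on a (2160, W) crop of a high-contrast region.

    With a nearest-neighbour upscale from R native rows to 2160 output rows,
    consecutive output rows repeat, so the number of rows that differ from
    their neighbour approximates the native vertical resolution R.
    """
    changed = np.any(np.diff(frame.astype(int), axis=0) != 0, axis=1)
    return 1 + int(np.count_nonzero(changed))

# A 1800p frame scaled to 2160p would report roughly 1800 here. TAA, film
# grain, depth of field and chromatic aberration are what make the real-world
# version of this so much harder, hence the hedging in the article above.
```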
 

- All versions run in backwards compatibility mode, meaning the settings are limited to those of the last-gen versions.
- PS5 runs at 1080p resolution; Xbox Series S and Series X run at 792p and 1440p respectively.
- The framerate remains stable at 60 FPS.
- Anisotropic filtering is higher on the Xbox versions.
- Loading times are faster on the Xbox versions.
- Draw distance is greater on the Xbox versions.
- Shadows are superior on PS5 and Series X compared to Series S.

Something is wrong with the PS5 load times: 14.6s vs. 5.6s on XSS/XSX.
 