I spent a few hours with the game on PC, and it seems there is no DDGI after all, contrary to what I initially assumed based on the PS5 demo I had tried before and on GameGPU results showing the same RT time at 1080p and 4K.
Per-pixel AO/GI is clearly in use, because noise and low-resolution artifacts are plainly visible in the PC version (I didn't notice them in the PS5 demo; the demo probably had better denoising).
Based on game benchmarks, we know that RT performance in RE8 is not tied to screen resolution: RT takes the same amount of time at 1080p and 4K.
Scaling like that can be explained either by DDGI, which doesn't produce noise in screen space and has a fixed cost at all resolutions, or by tracing at 1/4 resolution at 4K and at full resolution at 1080p. The problem is that with the current denoiser you can easily spot noise even at 4K, and this could have been fixed simply by raising the RT resolution to 1/2 res with checkerboarding, or to full res (which should be trivial for the devs, and probably even possible via config hacks).
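To make the two hypotheses concrete, here is a rough back-of-the-envelope sketch (all numbers are assumed, e.g. one AO/GI ray per traced pixel) of how many rays per frame each one implies. Only the second hypothesis matches the flat RT times in the benchmarks while still leaving screen-space noise:

```python
# Rough sketch: rays traced per frame under two hypotheses (all numbers assumed).
# Hypothesis A: RT runs at the output resolution, so 4K traces ~4x the rays of 1080p.
# Hypothesis B: RT is clamped to a fixed internal resolution (e.g. 1920x1080),
#               so the traced-ray count (and thus RT time) is the same at 1080p and 4K.

RAYS_PER_PIXEL = 1  # assumed: one AO/GI ray per traced pixel

def rays_output_res(output_res):
    # Hypothesis A: tracing resolution follows the output resolution
    w, h = output_res
    return w * h * RAYS_PER_PIXEL

def rays_fixed_internal_res(output_res, internal_res=(1920, 1080)):
    # Hypothesis B: tracing resolution is clamped, cost independent of output_res
    w, h = (min(a, b) for a, b in zip(output_res, internal_res))
    return w * h * RAYS_PER_PIXEL

for res in [(1920, 1080), (3840, 2160)]:
    print(res, rays_output_res(res), rays_fixed_internal_res(res))
# Output res:          ~2.07M rays at 1080p vs ~8.29M at 4K -> RT time should ~quadruple (not observed).
# Fixed internal res:  ~2.07M rays at both -> flat RT time, matching the GameGPU results.
```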
In Metro EE you can easily spot noise too with 1/4 res RT (the "Normal" preset, iirc), but at least there are 1/2 res checkerboarded and full res RT options for higher-end GPUs, which greatly reduce the noise. Those options do increase the 3090's performance lead over the 6900 XT from 39% to 52%, which makes perfect sense, since raising the RT resolution increases the RT portion of the frame.
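A toy frame-time model shows why that widening lead is expected; all the frame-time splits and speedup factors below are invented purely to illustrate the mechanism, not real Metro EE measurements:

```python
# Toy frame-time model (all numbers made up): assume the 3090 traces rays ~1.6x faster
# than the 6900 XT while the raster portion of the frame is much closer between them.
# As the RT share of the frame grows with RT resolution, the overall lead grows too.

def frame_time(raster_ms, rt_ms_full_res, rt_res_fraction, rt_speedup=1.0):
    """rt_res_fraction: 0.25 = quarter-res tracing, 1.0 = full-res tracing."""
    return raster_ms + (rt_ms_full_res * rt_res_fraction) / rt_speedup

for rt_frac in (0.25, 0.5, 1.0):
    t_6900 = frame_time(raster_ms=9.0, rt_ms_full_res=14.0, rt_res_fraction=rt_frac)
    t_3090 = frame_time(raster_ms=7.0, rt_ms_full_res=14.0, rt_res_fraction=rt_frac,
                        rt_speedup=1.6)
    print(f"RT at {rt_frac:.2f}x res: 3090 lead = {t_6900 / t_3090 - 1:.0%}")
# Prints a lead that grows as the RT fraction rises (~36% -> ~46% with these made-up
# numbers), the same direction as the 39% -> 52% swing seen in Metro EE benchmarks.
```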
So what are the reasons for capping the RT resolution at 1080p and leaving the noise in RE8 without providing higher quality options?