Digital Foundry Article Technical Discussion Archive [2015]

What's wrong with analyzing and reporting on this apparent fubar? People wouldn't have known otherwise, and folks would have been curious about a Resident Evil entry anyway.

Agreed. The point of DF is supposed to be helping people make purchase decisions. A poorly performing port of a title that will most likely see reasonable sales as a discount release is exactly the sort of thing people might be curious about.
 
Are you getting framerate drops with a 970??
Not sure if it's a frame drop, but I notice the character animation looks super choppy (like 30fps) for a few seconds, then becomes smooth again. There's also some inconsistency in the input lag when aiming with the handgun. So yes? I'm using Nvidia adaptive vsync, so it shouldn't lock the frame rate.
 
It's a lot of speculation, but if the game is bottlenecked by single-thread performance, then X1 could have an advantage over PS4 because of the marginal clock speed improvement, assuming that the PS4 version is using GNMX, which is more CPU intensive. The only other thing I could see is a bandwidth issue, where for some reason the ROPs are bandwidth starved, like if they have a memory contention issue that's plummeting the effective memory bandwidth. That seems unlikely, but the X1 having some dedicated GPU memory with ESRAM may alleviate that bottleneck. That would hinge on the assumption that fog and foliage are fillrate limited. I don't know what the performance implications of the flashlight are.

Either way, I wouldn't read anything into it in terms of general system performance, because it's not the most highly polished, optimized AAA game out there. These would just be a couple of explanations for this specific game.
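To put rough numbers on the fillrate angle, here's a back-of-envelope sketch; the overdraw count, pixel format, and target framerate are all my own assumptions, nothing measured from the game:

Code:
// Rough estimate of ROP blend traffic from transparent overdraw
// (fog/foliage). All figures here are assumptions for illustration.
#include <cstdio>

int main() {
    const double pixels = 1920.0 * 1080.0;
    const double bytes_per_blend = 8.0;  // RGBA8: 4-byte read + 4-byte write per blended pixel
    const double overdraw_layers = 6.0;  // assumed layers of fog/foliage transparency
    const double fps = 60.0;

    const double gbps = pixels * bytes_per_blend * overdraw_layers * fps / 1e9;
    std::printf("Blend traffic: ~%.1f GB/s\n", gbps);  // ~6.0 GB/s with these numbers
    // Modest next to PS4's ~176 GB/s nominal, but it shares the bus with
    // texturing, geometry, and the CPU -- which is where contention could
    // turn it into a real bottleneck.
    return 0;
}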
 
I think 16xAF combined with GNMX makes the most sense in light of previous discussions, assuming that the game isn't running some kind of weird 3DS emulator with upscaling support.
 
I think 16xAF combined with GNMX makes the most sense in light of previous discussions, assuming that the game isn't running some kind of weird 3DS emulator with upscaling support.

But why would 16xAF impact performance on PS4 and not X1? Same GPU family, so in theory you'd think it would follow the same performance curve, if not worse on X1 because it's just a weaker GPU. If there's an issue with 16xAF, it has to be a bandwidth issue, exacerbated by foliage and fog. GNMX I can see being an issue on the CPU side, or maybe the increased CPU overhead also leads to the CPU hitting memory more often, leading to contention and bandwidth issues. All speculation, of course.
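For a sense of scale on the AF theory, here's a worst-case texel-count comparison. The figures are textbook assumptions rather than anything profiled, and real texture caches absorb most of this traffic:

Code:
// Worst-case texel traffic of 16x anisotropic filtering vs. trilinear.
// Illustrative only; texture caches hide most of this in practice.
#include <cstdio>

int main() {
    const int trilinear_texels = 8;   // 4 bilinear taps on each of 2 mip levels
    const int aniso_ratio = 16;       // up to 16 trilinear probes along the anisotropy axis
    const int bytes_per_texel = 4;    // uncompressed RGBA8 worst case

    const int aniso_texels = trilinear_texels * aniso_ratio;
    std::printf("trilinear: %d texels (%d B/fetch), 16x AF worst case: %d texels (%d B/fetch)\n",
                trilinear_texels, trilinear_texels * bytes_per_texel,
                aniso_texels, aniso_texels * bytes_per_texel);
    // At grazing angles on big surfaces (ground, fog planes, foliage) cache
    // miss rates climb, which is how AF could become a bandwidth problem.
    return 0;
}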
 
That's why my initial thought would be the GNMX implementation of AF somehow having a bandwidth issue, or something like that.
 
There are some weird issues with the PC version, mostly stutter and huge frame dips within certain areas. I haven't seen this type of performance impact since Watch Dogs (PC Edition). It's not ACU bad, but noticeable and breaks the immersion at times.

I found this suggested solution (cheap fix) below, but haven't tried it yet. But yes, it seems the PC edition is buggered too...
Stutters and freezes:

For all of those who are having stutter issues and such, go to your settings page, set HDR precision to LOW, disable both V-Sync and Antialiasing, and set your framerate to variable. This will help fix the stuttering issues. If not, you need to update your driver.
 
#notallPCs ?

John mentioned he was using an i5/GTX 780 setup, so maybe he just never came across the issue.

I'm using both my beast setups (R9 295X2 and Titan Z), and I still get these weird stuttering/framerate issues during certain sections. What's weird is that it sometimes doesn't repeat within the same section. Once I have some more time with the game, maybe I'll post some performance shots.
 
I'm using both my beast setups (R9 295X2 and Titan Z), and I still get these weird stuttering/framerate issues during certain sections. What's weird is that it sometimes doesn't repeat within the same section. Once I have some more time with the game, maybe I'll post some performance shots.
And we all know how stable multi-GPU configs are with all games...
 
Yeah... probably going to have to hope for/wait on Nvidia, if anything. The budget for this game seems rather awful (but I suppose the writing was on the wall with the episodic format).
 
There are some weird issues with the PC version, mostly stutter and huge frame dips within certain areas. I haven't seen this type of performance impact since Watch Dogs (PC Edition). It's not ACU bad, but noticeable and breaks the immersion at times.

For the record, I haven't had any issues at all in ACU since I used RivaTuner to limit the frame rate to 35fps. That's with vsync on and graphics maxed, except textures and shadows on their second-highest (Nvidia) settings, at 1080p with FXAA.

The game is perfectly smooth and consistent on my modest 670.
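For anyone wondering why a cap helps: a limiter like RivaTuner just enforces a minimum frame time, so capping below your worst dip trades peak framerate for even pacing. A minimal sketch of the idea, with present_frame() as a hypothetical stand-in for the render/swap call:

Code:
// Minimal frame limiter sketch: enforce a fixed per-frame budget so
// frame pacing stays even. present_frame() is a hypothetical stand-in.
#include <chrono>
#include <thread>

void present_frame() { /* render + swap would happen here */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(1'000'000 / 35);  // 35 fps cap

    auto next = clock::now() + budget;
    for (int frame = 0; frame < 1000; ++frame) {
        present_frame();
        std::this_thread::sleep_until(next);  // coarse; real limiters spin the last ~1 ms
        next += budget;
    }
    return 0;
}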
 
For the record, I haven't had any issues at all in ACU since I used RivaTuner to limit the frame rate to 35fps. That's with vsync on and graphics maxed, except textures and shadows on their second-highest (Nvidia) settings, at 1080p with FXAA.

The game is perfectly smooth and consistent on my modest 670.

I was talking about before all the patches... especially the performance issues that plagued AMD GPUs around ACU's launch. That being said, hopefully Capcom will patch/resolve the performance issues that RER2 (PC edition) is exhibiting.
 
I don't really understand why this game could be CPU limited, but what else can it be?
Well, this is a little crazy as a theory, but the PS4 should trump the Xbox One because it has more powerful hardware on the GPU side. Although... I wonder... could it have something to do with the fact that 16xAF impacts framerate on the PS4? (SDK bug?) It's just a theory, because nothing else makes sense to me.

On a different note, related to Full RGB/Limited RGB and the trouble Digital Foundry has had with it on the Xbox One at times: a NeoGAF user took a couple of screenshots showing the difference between Full RGB and Limited RGB, as captured from the framebuffer by the screen capture feature.

Full RGB: [screenshot: pool_fulls8uxb.png]

Limited RGB: [screenshot: pool_limitedxduu9.png]
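For context on what those captures differ by: limited range squeezes the signal into 16-235, and a console/display mismatch either crushes or washes out the image. A minimal sketch of the limited-to-full expansion, assuming 8-bit RGB per channel:

Code:
// Limited-range (16-235) to full-range (0-255) expansion, per channel.
// Minimal sketch assuming 8-bit RGB; real pipelines take more care with
// rounding and out-of-range "superblack"/"superwhite" codes.
#include <algorithm>
#include <cstdint>
#include <cstdio>

std::uint8_t limited_to_full(std::uint8_t v) {
    const int full = (static_cast<int>(v) - 16) * 255 / (235 - 16);  // map 16..235 -> 0..255
    return static_cast<std::uint8_t>(std::clamp(full, 0, 255));     // clamp illegal codes
}

int main() {
    std::printf("16 -> %d, 125 -> %d, 235 -> %d\n",
                limited_to_full(16), limited_to_full(125), limited_to_full(235));
    return 0;
}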
 
I don't really understand why this game could be CPU limited, but what else can it be?
There could be an easy explanation: the game could be using all CPU cores on XB1 but only a few of them on PS4.

Doesn't the XB1 hypervisor automatically allocate CPU resources, all of them if necessary?
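If that theory holds, the mechanism could be as mundane as a hardcoded worker count. A hedged sketch of what querying the platform looks like; do_job_slice() is hypothetical per-frame work, not anything from the game:

Code:
// Sketch: spreading per-frame work across whatever cores the platform
// exposes. A port that hardcodes a low worker count instead of querying
// would leave one console underutilized. do_job_slice() is hypothetical.
#include <cstdio>
#include <thread>
#include <vector>

void do_job_slice(unsigned slice, unsigned total) {
    std::printf("working on slice %u of %u\n", slice, total);
}

int main() {
    unsigned workers = std::thread::hardware_concurrency();  // game-visible cores
    if (workers == 0) workers = 4;  // the query may return 0; fall back

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back(do_job_slice, i, workers);
    for (auto& t : pool) t.join();
    return 0;
}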
 
Maybe the CPU is having to spoonfeed the GPU data in this old-ass, DX9-targeted engine, and so the CPU and the CPU bus are a significant bottleneck for the graphics cores.
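That pattern is easy to picture: a DX9-era renderer re-submits state per object, so CPU cost scales with draw count. A hypothetical before/after sketch, where set_material() and draw_mesh() stand in for driver calls:

Code:
// "CPU spoonfeeding the GPU": per-object state churn vs. state filtering.
// set_material()/draw_mesh() are hypothetical stand-ins for driver calls
// that each burn CPU time per invocation.
struct Object { int material; int mesh; };

void set_material(int) { /* expensive per-call driver work */ }
void draw_mesh(int)    { /* small draw submission */ }

// Naive DX9-era loop: redundant state sets on every object.
void render_naive(const Object* objs, int n) {
    for (int i = 0; i < n; ++i) {
        set_material(objs[i].material);
        draw_mesh(objs[i].mesh);
    }
}

// Same scene with state filtering: the driver-heavy call only fires on an
// actual material change, so CPU cost stops scaling with draw count.
void render_filtered(const Object* objs, int n) {
    int last = -1;
    for (int i = 0; i < n; ++i) {
        if (objs[i].material != last) {
            set_material(objs[i].material);
            last = objs[i].material;
        }
        draw_mesh(objs[i].mesh);
    }
}

int main() {
    Object scene[3] = { {0, 0}, {0, 1}, {1, 2} };
    render_naive(scene, 3);     // three set_material calls
    render_filtered(scene, 3);  // only two
    return 0;
}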
 
It seems very possible that the PC engine was very reliant on single-threaded performance, and maybe that carried over to these console ports.
 
It seems very possible that the PC engine was very reliant on single-threaded performance, and maybe that carried over to these console ports.

If it is the RE6 engine being used again, then I can attest to its poor single-thread performance. And I mean it runs REALLY badly on my old PC, often dropping to 20fps and sometimes lower, when even humdingers like Crysis 2 and 3 run pretty well.

I have a slowish Phenom II X4 handicapped with DDR2 RAM.
 