Man from Atlantis
Veteran
According to PCGW, only 21:9 and 32:9 support no black bars. I haven't tried 32:9 yet on my screen, but at 4K, 2800x1600, 1080p, etc., the game has black bars.
How about 32:10?
32:10 sits in between 21:9 and 32:9, so unless Ninja Theory used spaghetti hardcoding to manually specify the black bar behavior for every single aspect ratio they were aware of and forgot about 32:10, it shouldn't have black bars either.
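Napkin math, for anyone curious (decimal ratios only; whether the game actually gates the letterboxing on a ratio range is an assumption):

# Where 32:10 falls among the ratios PCGW lists (Python)
ratios = {"16:9": 16 / 9, "21:9 (3440x1440)": 3440 / 1440,
          "32:10": 32 / 10, "32:9": 32 / 9}
for name, ratio in ratios.items():
    print(f"{name:17} -> {ratio:.2f}:1")
# 32:10 (3.20:1) lands between 21:9 (~2.39:1) and 32:9 (3.56:1)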
The ROG Ally has 8.3 teraflops. I had forgotten, and this video by Olie reminds me how crazy that is, tbh.
Up next: Series S and Steam Deck.
Very respectable version of the game on Series, though at a lower resolution and with some settings reduced. The loss of Lumen RT reflections is a pity IMO, as at times it compromises the visual consistency beyond just things looking softer and noisier.
On Steam Deck, DP4a XeSS is once again better than FSR for a similar performance cost. The game is playable on Steam Deck, but it needs to be cut back to the point where it looks pretty rough.
Predictably, there's a point below which UE5 doesn't scale so well in terms of final output image.
Is it weird that I vastly prefer the Hellblade 2 Series X visuals over whatever the maxed-out PC looks like?
The animations, especially the running, look superior on Series X, as the motion blur smooths out the animation and gives it a more natural look. I prefer 4K 60 fps in other titles, but on Series X it just looks more consistent IMO. Does anyone understand?
You are talking about the soap opera effect. Sometimes I can see it in cutscenes, but during gameplay I have never felt it.
The ROG Ally has 8.3 teraflops. I had forgotten, and this video by Olie reminds me how crazy that is, tbh.
Dual issue very rarely gets used, and the Ally only clocks around 2 GHz, so it's more like a 3 TF RDNA 2 part.
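For anyone wondering where both numbers come from, it's just the dual-issue multiplier and the clock (the 768 shader ALUs and the exact clocks are assumptions based on the Z1 Extreme's public specs):

# Rough FLOPS math for the Ally's GPU (Python)
shaders = 768            # assumed: 12 RDNA 3 CUs x 64 lanes
fma_ops = 2              # multiply + add per FMA
marketing = shaders * fma_ops * 2 * 2.7    # dual issue at a 2.7 GHz boost
realistic = shaders * fma_ops * 2.0        # single issue at ~2 GHz sustained
print(f"{marketing / 1000:.1f} TF vs {realistic / 1000:.1f} TF")  # ~8.3 vs ~3.1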
You are talking about the soap opera effect. Sometimes I can see it in cutscenes, but during gameplay I have never felt it.
The running animation at 60 fps looks fake, as you see the run cycle repeating itself, whereas at 30 fps the blur hides it and your mind fills in the gaps. Also, at 4K with the highest resolution you can see that, for example, the lava splashes could be 'texture planes'; they don't even interact with the lava when they fall down. On the Xbox Series X settings, however, they are less 'separated' from the lava, so you assume they merge when they splash down into the pool of lava, since you don't see them as clearly.
The running animation at 60 fps looks fake, as you see the run cycle repeating itself...
You can lock the PC version to 30 fps or 24 fps if you care. Also, if you find the fidelity too high due to the resolution, you can adjust the resolution down or downsample to Blu-ray quality.
There are some other examples as well, but it is not simply 'too smooth', as I prefer 60 fps in most titles; it is just that the animation is probably keyframed at 30 or 24 and they interpolated from there.
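A toy sketch of what that would look like (pure assumption; that the animation is keyframed at 30 Hz is the guess above, not something confirmed):

# Sampling a hypothetical 30 Hz pose curve at 60 fps with linear interpolation (Python)
keys_30hz = [0.0, 1.0, 0.5, 1.0, 0.0]   # made-up pose values, one per 1/30 s

def sample(t_seconds):
    x = t_seconds * 30                   # position in 30 Hz key space
    i = min(int(x), len(keys_30hz) - 2)
    frac = x - i
    return keys_30hz[i] * (1 - frac) + keys_30hz[i + 1] * frac

for frame in range(8):                   # eight 60 fps frames
    print(f"frame {frame}: {sample(frame / 60):.2f}")
# Every other 60 fps frame is an in-between no animator authored, which is
# one plausible reason a run cycle could read as "interpolated".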
You can lock the PC version to 30 fps or 24 fps if you care...
I don't think he meant that the Xbox version is superior, just that he likes the 30 fps on the console more than a PC running at 60. There is nothing wrong with the PC version regarding its visual makeup as far as I know.
Is it weird that I vastly prefer the Hellblade 2 Series X visuals over whatever the maxed-out PC looks like?
A poster with the username XboxKing doesn't sound very impartial.
Kudos to @Dictator for pointing out the heavy PCIe bandwidth utilization and the resulting frame-rate/frame-time drops, which @yamaci17 also pointed out in the Ratchet and Clank thread. I've noticed the same in that game after reaching Torren IV, and it's a pretty bad user experience. Considering that Avatar handles it much better and that Ghosts is just a PS4 port, this seems to be an issue with how Nixxes handles memory usage in these ports in general.
Kudos to @Dictator for pointing out the heavy PCIe bandwidth utilization and the resulting frame-rate/frame-time drops...
Yeah, I definitely noticed this when moving my 2080 Ti from my PCIe 3.0 motherboard to my 4.0 one. The frame rates are much more stable on the 4.0 board; there are weird and random massive performance drops with PCIe 3.0. I don't understand why, because it's x16, which should have more than enough bandwidth to handle any game two times over.
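For reference, the theoretical ceilings back up the "should be enough" intuition (rough usable-payload numbers after link encoding overhead):

# Theoretical PCIe x16 bandwidth (Python)
lanes = 16
gb_per_s_per_lane = {"3.0": 0.985, "4.0": 1.969}
for gen, per_lane in gb_per_s_per_lane.items():
    print(f"PCIe {gen} x16: ~{lanes * per_lane:.1f} GB/s")
# ~15.8 GB/s vs ~31.5 GB/s. Plenty on paper, but bursty streaming can still
# stall frames on latency even when average bandwidth is far below the cap.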
That is definitely a crucial Achilles' heel in these Nixxes ports, and I hope Alex will forward his findings to Nixxes.
Thanks for making PC gaming better, Alex; the suggestions you've made in your video would already help a lot.
A minor upgrade from the PS5 version, which was itself a minor upgrade from the PS4 version. The PS4 is pretty much running almost all high settings lmao.