davis.anthony
Veteran
Checkerboarding is like interlacing and interlacing has never been good for a stable image.
Never played Days Gone on PS4 Pro, did you?
A very good video with lots of details. Unfortunately, we can't say the same about the game. Those animations in particular don't look mo-capped. It looks really bad. 720p internal res for this poor-looking game? This game just looks so bad.
00:00:00 Introduction
00:00:43 The Good - Frame Rates and PC UX
00:04:05 The Good - Sky and Water Simulation
00:05:22 The Bad - Water Reflections
00:08:13 The Bad - Animations, Textures, and LOD
00:11:12 The Bad - Lighting, Particles and Post-Processing
00:13:00 The Ugly - Xbox Series X/S and PS5 Performance Mode Image Quality
00:16:25 The Ugly - Quality Mode Controls
00:17:50 The Incomprehensible - AAAA Game Design and Bugs
00:21:34 Video avast!
Any game using FSR2 looks bad, and worse than any good-looking previous-gen game. Alex on FSR2 in Skull & Bones: "A very good video with lots of details. Unfortunately, we can't say the same about the game. Those animations in particular don't look mo-capped. It looks really bad. 720p internal res for this poor-looking game? This game just looks so bad."
For me, any game using FSR2 should be labelled as such. With FSR2, reflections and transparencies are rendered at the lower native resolution and they are terrible to look at. It's incredibly distracting and immersion-breaking; worse than playing PS3 or PS4 games, which were at least consistent. It is hurting more than helping, given how many issues it has right now.
It's bizarre that they wanted to build a game around the piracy gameplay of Black Flag but excluded some of the major things that made Black Flag fun to play. Who thought that was a good idea?
"Agreed. The inability to actually board the enemy ship and fight the crew is mind-boggling. That was easily one of the best bits of Black Flag."
We could also do that in AC Odyssey. Super fun indeed.
Never played Days Gone on PS4 Pro, did you?
"Yeah, it's great in DG. But it's also rendering starting from a native 1920x2160."
Hard disagree here. I think Guerrilla with Decima basically solved the reconstruction problem on consoles for sharp, clean 60fps games with Horizon Forbidden West on PS5 (native res being only half of the 1800p CBR target, so roughly native 1270p, which is pretty decent for alphas). Every other developer should look at what they have done there. With FSR2 they are reconstructing from too low a resolution, and the cost is super high because they target too high a final output. Look at what that means here for Skull & Bones (the modes are quoted further down).
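For reference, a quick back-of-the-envelope sketch of those pixel counts (the resolutions are the figures quoted in these comments: 1800p CBR for Horizon Forbidden West's performance mode, 2160p CBR for Days Gone on PS4 Pro, so treat them as assumptions):

# Rough pixel counts for checkerboard rendering (CBR), which shades
# half of the output grid each frame.
def cbr_per_frame_pixels(width, height):
    return width * height // 2

def equivalent_16_9_height(pixels):
    # Height of a 16:9 frame containing the same number of pixels.
    return int((pixels * 9 / 16) ** 0.5)

hfw = cbr_per_frame_pixels(3200, 1800)   # 2,880,000 shaded pixels per frame
print(equivalent_16_9_height(hfw))       # ~1272, i.e. the "native ~1270p" figure

dg = cbr_per_frame_pixels(3840, 2160)    # 4,147,200 pixels, i.e. the 1920x2160 half-grid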
Checkerboarding is unfortunately unlikely to be a saviour for FSR2's image woes in today's console games, though, because the inherent problem is the load modern engines put on this hardware (at least when going for 60fps): it forces the native resolution drastically lower than in the past-gen games that employed checkerboarding (which often targeted 30fps to boot). Checkerboarding in these games would very likely look quite poor too, although I'd still be very curious to see the differences, particularly with things like occlusion issues and low-res buffer treatment.
The scaling of this game is also very weird. FSR2 has a cost, yes, but going from 720p native to 4K native "only" halves your frame rate?! I appreciate them aiming for a very stable 60fps, but I'm wondering whether it really needs to be 720p for that.
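A rough way to see why that scaling looks odd: model frame time as a fixed, resolution-independent part plus a per-pixel part. Under that (admittedly crude) assumption, roughly 9x the pixels for only 2x the frame time implies most of the 720p frame is spent on resolution-independent work:

# Crude frame-time model: t = fixed + per_pixel * pixel_count.
# Assumption taken from the comment above: 4K native costs about
# twice the 720p-native frame time.
px_720p = 1280 * 720       # 921,600
px_4k = 3840 * 2160        # 8,294,400, roughly 9x more pixels

per_pixel = 1.0                              # arbitrary cost unit
fixed = per_pixel * (px_4k - 2 * px_720p)    # solves fixed + k*px_4k == 2*(fixed + k*px_720p)
t_720p = fixed + per_pixel * px_720p
print(f"resolution-independent share of the 720p frame: {fixed / t_720p:.0%}")  # ~88%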
Something is bloody wrong here. If they can do native 4K at 30fps with TAA (like Horizon FW), then they could do 1800p CBR at 60fps (rough throughput numbers in the sketch after the mode list below); the cost of FSR2 must be higher than CBR's, and the native resolution is too low, creating all those artefacts. You don't polish a 720p turd. Also, CBR has other advantages compared to FSR2:
- Quality: Native 4K 30FPS with TAA
- Performance: 720p 60FPS upscaled to 1440p
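Putting rough numbers on the claim above (the mode figures are the two quoted just above; the 1800p CBR option is hypothetical, and this counts only shaded pixels, ignoring the reconstruction pass and fixed per-frame costs):

# Shaded pixels per second for each option.
def pixels_per_second(width, height, fps, shaded_fraction=1.0):
    return int(width * height * shaded_fraction * fps)

quality_4k30 = pixels_per_second(3840, 2160, 30)        # ~249 Mpx/s, native 4K with TAA
perf_720p60 = pixels_per_second(1280, 720, 60)          # ~55 Mpx/s shaded, then FSR2 to 1440p
cbr_1800p60 = pixels_per_second(3200, 1800, 60, 0.5)    # ~173 Mpx/s, half-grid CBR

By raw shading throughput alone, an 1800p60 CBR mode would still ask for less than the existing native 4K30 mode, which is the point being made.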
But with checkerboard rendering, the distance between rows and columns in the rendered image and the resolved image stays the same, which means that most edges (being typically either mostly horizontal or mostly vertical) can't 'fall through the cracks' as easily, and therefore won't be missed completely one frame and visible again the next.
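To make the "falling through the cracks" point concrete, here is a small illustration of my own (not anything from the video) comparing which output rows and columns receive samples in a single frame. A checkerboard pattern covers every row and every column, while a 50%-per-axis grid (roughly what a performance-mode FSR2 frame renders, before jitter and accumulation) leaves entire rows and columns without any sample that frame:

import numpy as np

W, H = 16, 8   # tiny output grid, just for illustration
ys, xs = np.mgrid[0:H, 0:W]

cbr_mask = (xs + ys) % 2 == 0                 # checkerboard: half the pixels, every row/column hit
half_mask = (xs % 2 == 0) & (ys % 2 == 0)     # half res per axis: only even rows/columns hit

def empty_rows_cols(mask):
    return int((mask.sum(axis=1) == 0).sum()), int((mask.sum(axis=0) == 0).sum())

print("checkerboard, empty rows/cols:", empty_rows_cols(cbr_mask))    # (0, 0)
print("half-res grid, empty rows/cols:", empty_rows_cols(half_mask))  # (4, 8)

So a thin, axis-aligned edge always lands on some samples with checkerboarding, whereas on the sparser grid it can miss every sample in a given frame and has to be recovered temporally, which is where the shimmer comes from.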
Horizon Forbidden West's upgraded performance mode is now the best way to play
"Never played Days Gone on PS4 Pro, did you?"
I know checkerboarding from enough games to see that TSR and DLSS are vastly superior.
"That's an over-exaggerated claim; what looks better depends on the quality of the implementation."
In Death Stranding at least, DLSS easily beats checkerboard rendering. DLSS is simply a better upscaling method and really shouldn't lose under any circumstances.
"It trashes the 6700 in that scene because of the 6700's clearly broken RT implementation there."
I wouldn't say broken, since it's optimized for RDNA2 on the PS5, which should translate to nice performance on the RDNA2 PC side as well. A more reasonable explanation is the limited memory bandwidth on the RX 6700 compared to PS5.
"Yes, the 6700 has the Infinity Cache, but particularly given its limited capacity compared to the high-end parts, its utility at higher resolutions is going to be quite limited."
Exactly, and in ray tracing as well.
"Professional GPU benchmarks have been done on PC for decades, and the first rule is obviously to use similar CPUs..."
Once you become GPU limited, using a faster CPU won't get you any more fps. I used an i7 3770K for years and recently upgraded to a 7800X3D; in heavy RT titles I didn't gain a single fps, because I was limited by my GPU at all times. Digital Foundry made sure to test only GPU-limited scenes on the PS5.
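That is essentially the standard mental model: per-frame cost is dominated by whichever of the CPU or GPU takes longer, so once the GPU is the bottleneck a faster CPU doesn't move the frame rate at all. A toy illustration with made-up numbers:

# Toy model: frame time is whichever processor takes longer that frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 25.0                              # hypothetical heavy RT scene, GPU-bound
print(fps(cpu_ms=12.0, gpu_ms=gpu_ms))     # older CPU:  40 fps
print(fps(cpu_ms=4.0, gpu_ms=gpu_ms))      # 7800X3D:    still 40 fps
print(fps(cpu_ms=30.0, gpu_ms=gpu_ms))     # only a CPU slower than the GPU changes the result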
"I can find several 6-core, 12-thread CPUs that would perform worse with the RX 6700 than the same GPU paired with a 7800X3D."
No you can't; in the GPU-limited scenes used by Digital Foundry, you won't find a single difference between a 3600X and a 7800X3D.
"Better on paper and in a perfect world, but we live in the real world."
I think Battlefield 5 still has DLSS 1, no?
I recently bought Battlefield 5 as it was on sale and DLSS in the game is atrocious.
I wouldn't say broken, since it's optimized for RDNA2 on the PS5, which should translate to nice performance on the RDNA2 PC side as well. A more reasonable explanation is the limited memory bandwidth on the RX 6700 compared to PS5.
"I'd expect there are different implementations between console and PC, with the PC implementation favouring Nvidia due to the sponsorship deal. I don't know the specifics for CP2077, but I know there are games that use inline RT on consoles and switch to DXR 1.0 on the PC side. Avatar is another one that Richard mentions in the video as using a custom RT implementation for consoles. I do agree that memory bandwidth, and perhaps VRAM capacity, may also be hobbling the 6700 at those settings, though."
Doubt it's VRAM, because the 2070S with its 8GB does much better.
"Exactly, and in ray tracing as well."
They can only be sure for the PC version, not the console with its underclocked laptop CPU. From 60 to 80fps, I'd say as a general rule we can assume most games will be CPU limited on consoles, especially since many games use DRS, which reduces the risk of being GPU limited.
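For what it's worth, that is also why DRS pushes consoles toward CPU limits: the scaler keeps cutting resolution until the GPU portion fits the frame budget, so when the frame still misses the budget it's usually CPU work that's responsible. A minimal sketch of that feedback loop (a simplified proportional controller, not any particular engine's implementation, and it assumes GPU time scales linearly with pixel count):

BUDGET_MS = 16.7            # 60fps target
gpu_ms_at_full_res = 24.0   # hypothetical GPU cost at 100% resolution scale
scale = 1.0                 # fraction of the full pixel count

for frame in range(6):
    gpu_ms = gpu_ms_at_full_res * scale
    # Nudge the resolution scale toward the budget each frame.
    scale = max(0.25, min(1.0, scale * BUDGET_MS / gpu_ms))
    print(f"frame {frame}: gpu {gpu_ms:5.1f} ms, next scale {scale:.2f}")
# Once gpu_ms settles near 16.7ms, any frame-time spikes over budget are
# coming from the CPU side, which DRS cannot do anything about.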
DF picks the areas in which the PS5 suddenly falls below 30/60fps while also being GPU limited on the RX 6700, thus ensuring they are testing areas with actual GPU limitations across all platforms. They don't randomly pick areas; they synchronize the areas across both PC and PS5.
"This is why you need to use a modest CPU on PC as well, to see which games start to be CPU limited with such a weaker CPU!"
The code on PC could be more CPU limited due to API differences between platforms; generally speaking, DX12 games are more CPU limited on PC than their PS5 counterparts, so using identical CPUs would actually harm the PC GPU results. Nevertheless, the point is moot, considering what I said above.
That's an over-exaggerated claim; what looks better depends on the quality of the implementation.
"I recently bought Battlefield 5 as it was on sale and DLSS in the game is atrocious."
Against DLSS1, sure, CB wins. But against DLSS2? Of course not. BF5 has the very first DLSS1 implementation.
"I'm sorry, but no, some games have horrendous ghosting with DLSS2, and to a level that I've not seen in any CB games I've played."
TAA has horrendous ghosting as well; often the two are tied (when TAA ghosts hard in a game, DLSS ghosts as well, it's an issue with temporal solutions in general), but ghosting with DLSS is often significantly lower. However, we are debating the reconstruction of fine detail and image components here, and DLSS2 is without question the superior solution for that.