It's not a terrible review, but he really needs to be more scientific with his settings comparisons. For example, at 1:50 he shows the PC dipping below 60fps and uses this as an example of why the PC CPU is unable to keep up with the PS5 CPU. However, he doesn't clarify the settings being used. On screen it says "Fidelity match", which could mean either settings equivalent to the PS5's Fidelity mode (which runs at 30fps on PS5) or what he considers a match to the PS5's Performance RT mode. In any case, it's not clear the PC is dropping below 60 there due to the CPU rather than the GPU. Because he's running at 4K DRS, and the PC's DRS works differently to the PS5's, it's quite possible the PC is running much closer to actual 4K at those points where the frame rate drops while the PS5 is sitting around 1440p, and hence the bottleneck could actually be on the GPU. It would have been better to lock the resolution at 1440p on the PC to remove that possibility.
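Just to put rough numbers on why that matters (the actual internal DRS resolutions on either machine aren't disclosed in the video, so this is purely illustrative): native 4K is roughly 2.25x the pixels of 1440p, which is easily enough of a workload swing to turn a comfortable 60fps into a GPU-limited dip.

```python
# Rough pixel-count comparison; illustrative only, since the video doesn't
# report the internal DRS resolutions on either machine.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 px
qhd_1440p = pixels(2560, 1440)   # 3,686,400 px

print(f"4K vs 1440p pixel ratio: {native_4k / qhd_1440p:.2f}x")  # ~2.25x
```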
At 2:30 he does another direct side-by-side comparison to the PC, but this time bizarrely labels the PC as running at 1440p DRS with FSR 2.1 while the PS5 is running at 4K DRS (using IGTI). It's a bit all over the place. As a side note, he also states here that the PCIe bandwidth is being "flooded" with 18GB/s at times, which is in fact not all that much compared to the ~32GB/s (each way) limit of a PCIe 4.0 x16 graphics interface.
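For anyone who wants to sanity-check that limit, here's the back-of-the-envelope maths for a PCIe 4.0 x16 link (16 GT/s per lane with 128b/130b encoding):

```python
# PCIe 4.0 x16 bandwidth per direction, back of the envelope.
transfer_rate_gt_s = 16      # GT/s per lane for PCIe 4.0
encoding = 128 / 130         # 128b/130b line encoding overhead
lanes = 16

gb_s_per_lane = transfer_rate_gt_s * encoding / 8   # bytes, not bits
total_gb_s = gb_s_per_lane * lanes

print(f"~{total_gb_s:.1f} GB/s each way")            # ~31.5 GB/s
print(f"18 GB/s uses ~{18 / total_gb_s:.0%} of it")  # ~57%
```

So 18GB/s is heavy traffic, but it's still well short of actually saturating the bus.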
At 8:00 he starts to talk about the disadvantages of the RT shadows mode, which @Dictator also picked up on; however, we've since had a patch which improves the RT shadows. It's unclear whether this is one of the improved aspects though. I think that patch landed on the same day as the NXG video, so just unlucky timing there.
The biggest issue by far for me was the section on the IGTI implementation, where I think @davis.anthony is right on the money.
The PS5 in Fidelity mode runs at 30fps and so should have little problem hitting native 4K, while the PC (I'm going to assume the 6800) is clearly targeting 60fps and not hitting it, thus DRS is going to be reducing the resolution, likely quite aggressively. I mean, looking at those image comparisons it should be blindingly obvious that the PC isn't running at the same internal resolution. That level of difference (it was frickin' huge) could not possibly be down to a less optimally implemented upscaling solution; it was far bigger than FSR1 Performance vs DLSS Quality, for example! I mean, why the hell did he run this test with DRS on at all? It clearly has the ability to significantly skew the results.
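To give a sense of scale for that FSR1 Performance vs DLSS Quality reference (using the standard published scale factors of 0.50 and ~0.667 per axis at a 4K output; the actual internal resolutions in the video are unknown, so this is purely illustrative):

```python
# Internal render resolutions behind a 4K output for two common upscaler
# presets; illustrative only, not what either machine was actually doing.
OUTPUT_W, OUTPUT_H = 3840, 2160

def internal_res(scale: float) -> tuple[int, int]:
    return round(OUTPUT_W * scale), round(OUTPUT_H * scale)

presets = {
    "FSR1 Performance (0.50x per axis)": 0.50,
    "DLSS Quality (~0.667x per axis)": 2 / 3,
}

for name, scale in presets.items():
    w, h = internal_res(scale)
    print(f"{name}: {w}x{h} ({w * h / 1e6:.1f} MP)")
# FSR1 Performance: 1920x1080 (2.1 MP)
# DLSS Quality:     2560x1440 (3.7 MP)
```

Even that pairing is only about 1.8x apart in internal pixel count, so a visibly bigger gap than that points at wildly different DRS resolutions between the two machines rather than at the quality of the upscalers themselves.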