Which makes you wonder whether they actually are, even more so tbh. When you need to make a video to explain why your benchmark results differ from those of most other outlets, that in itself is a problem.
A problem which, surprisingly, comes down to viewers/readers not considering that different test scenes produce different results. Hardware Unboxed does show their test sequences during the reviews, and many other sites do too. What more are they supposed to do?
Many other outlets could equally be criticized for not putting a huge headline on their reviews saying the results may differ because different scenes were used.
And that is why stating what the tests are is one of the key aspects of testing. It's basic stuff in any kind of testing, and it's also where we can possibly speak of a de facto standard: you describe/show what you did and what tools you used, so that if anyone finds the results questionable, they can repeat the test to verify them.
And that's the big negative I constantly see in the rants against HUB. Sure, accuse them of having a test suite that favours AMD; that can be a real discussion. But the constant accusations of sloppy tests and wrong results should stop unless people can actually demonstrate them.
At least some of the newer posts here discuss how HUB could arrive at this or that conclusion given what they said or showed earlier, rather than just attacking the tests as faulty.
To be honest though, I have never put much weight on the various reviewers' own conclusions. I focus on the actual test results and draw my own conclusions, using a mix of several different sites.