So, instead of showing how well nVidia and AMD can handle ray tracing at 1080p, HBU is doing low-end CPU tests with >$800 graphics cards in games which have broken DX12 paths on nVidia hardware...
They could have used Call of Duty: Black Ops. It runs at >200 FPS on a 3090...
I'm pretty sure neither the 5700 XT nor the 5600 XT ever sold anywhere near $800, let alone over it; the 3070 might be over it in the current situation, though.
You're actually suggesting an NVIDIA-sponsored title, with RT and all, has a broken DX12 path on NVIDIA hardware? You really think they'd let that past QA at NVIDIA?
Do we really have users here who simply can't ever see any fault in their favorite companies, or wth is going on?
So NVIDIA does better than AMD at maximum settings/4K... when I play games I'm mostly GPU limited... aka doing it the right way.
This isn't about what some company is doing better than some other company at whatever settings.
This is about one company's video cards, across at least two generations, showing higher CPU usage on low-level APIs compared to another company's video cards from two generations.
Shouldn't be that hard a concept to grasp, even when it doesn't apply to you. Not everyone is running a high-end CPU and GPU.
Also, I'm pretty sure that, for example, an RTX 2060 won't do much better than a 5600 XT at 4K max settings, so maybe use, you know, a specific model instead of a company?
If I remember correctly, AMD is sub-30 FPS in WDL at 4K/DXR/Ultra settings, while the 3090 is 60 FPS with DLSS Quality. /shrugs
Apples to oranges is always fun.