The "cherry-picking contest" is what HUB did back when they've claimed this bs with higher driver overhead. This has been disproven time and time again by their own data. What I've posted is just the latest from that.
Really? How was this disproven? Let's go back to your own video, shall we...
So apparently Nvidia doesn't have "a driver overhead problem due to s/w scheduling" (or whatever B.S. Steve said back when he benched five games and made wide-reaching claims).
The video clearly shows the opposite of what you are saying. Let's lay out some basics first...
1) Your framerate is primarily capped by either the CPU or the GPU, provided there are no software-imposed limits like frame rate caps.
2) The game engine is either more CPU-heavy or more GPU-heavy at any given point in time.
3) The higher your resolution, the more likely the GPU becomes the bottleneck.
4) The lower your resolution, the more likely the CPU becomes the bottleneck.
5) Since the CPU's workload does not depend on resolution, all else being equal it is the GPU's driver (its CPU-side overhead) that determines the maximum number of frames the CPU can deliver (see the sketch after this list).
6) If the game's CPU load is light enough not to limit the GPU in any way, then the GPU's raw power will show at all resolutions, including 1080p and 1440p.
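To make points 1-5 concrete, here is a minimal sketch of the bottleneck model, assuming delivered FPS is simply the slower of the CPU side and the GPU side. All frame times, card profiles, and the function name are hypothetical; they illustrate the reasoning, not measured data from the video.

```python
# Hypothetical frame-time model: delivered FPS is capped by whichever side
# (CPU work per frame, including driver submission, or GPU render time) is slower.
def delivered_fps(game_cpu_ms, driver_cpu_ms, gpu_ms):
    cpu_fps = 1000.0 / (game_cpu_ms + driver_cpu_ms)  # what the CPU side can feed
    gpu_fps = 1000.0 / gpu_ms                          # what the GPU can render
    return min(cpu_fps, gpu_fps)

# Two made-up cards: A has the faster GPU but heavier CPU-side driver work,
# B has the slower GPU but lighter driver work (all numbers invented).
for res, gpu_ms_a, gpu_ms_b in [("1080p", 3.0, 3.5), ("4K", 9.0, 12.0)]:
    fps_a = delivered_fps(game_cpu_ms=4.0, driver_cpu_ms=2.0, gpu_ms=gpu_ms_a)
    fps_b = delivered_fps(game_cpu_ms=4.0, driver_cpu_ms=1.0, gpu_ms=gpu_ms_b)
    print(f"{res}: card A {fps_a:.0f} fps, card B {fps_b:.0f} fps")
# 1080p: card A 167 fps, card B 200 fps  -> CPU/driver-bound, B wins
# 4K:    card A 111 fps, card B 83 fps   -> GPU-bound, A wins
```

That is the whole argument in miniature: the card with the heavier CPU-side driver only shows its GPU advantage once the GPU itself becomes the limit.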
Hogwarts Legacy: the 7900XTX is faster than the 4090 at 1080p and 1440p, while the 4090 is over 25% faster at 4K. Same RT performance at 1080p, but the 7900XTX falls behind at 1440p and 4K. -> Clearly the 4090 is putting more load on the CPU than the 7900XTX.
Spider-Man Remastered: the 7900XTX is faster than the 4090 at 1080p, about equal at 1440p but with much better 1% lows, while the 4090 is over 35% faster at 4K. Generally equal in RT except at 4K. -> Clearly the 4090 is putting more load on the CPU than the 7900XTX.
Far Cry 6: about equal at 1080p but with much better 1% lows for the 7900XTX, the 7900XTX faster at 1440p, and the 7900XTX only slightly slower at 4K. -> Clearly the 4090 is putting more load on the CPU than the 7900XTX, limiting its speed even at 4K.
Hitman 3: 4090 faster at all resolutions -> Low CPU usage game.
Horizon Zero Dawn: 4090 faster at all resolutions -> Low CPU usage game.
Rainbow Six Siege (DX11): 4090 faster at all resolutions -> It's quite well known that Nvidia had the lower driver overhead in DX11; this flipped with DX12 in most cases.
The rest are all pretty much variants of the above. The console ports generally feed the 4090 very well at 1080p, i.e. no CPU bottleneck at all. In something like CoD MWII, the 4090 is clearly more CPU-limited than the 7900XTX, just like in Hogwarts Legacy. It's also quite obvious that the 4090 is CPU-bottlenecked whenever the 7800X3D gives it more performance than the i9 does (a rough version of that check is sketched below).
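For completeness, the "is it CPU-bottlenecked?" checks used in the breakdown above can be written as rough heuristics. The helper names and the 5% tolerance are my own illustrative assumptions, not anything taken from the video or its data.

```python
# Rough, illustrative heuristics matching the reasoning above (thresholds are arbitrary).
def looks_cpu_bound(fps_low_res, fps_high_res, tolerance=0.05):
    """If dropping the resolution barely raises FPS, the GPU wasn't the
    limit, so the CPU side (game plus driver overhead) is the cap."""
    return fps_low_res <= fps_high_res * (1 + tolerance)

def faster_cpu_helps(fps_with_i9, fps_with_7800x3d, tolerance=0.05):
    """If swapping in a faster CPU (e.g. the 7800X3D vs. the i9) lifts FPS at
    the same resolution and settings, the card was CPU-bottlenecked there."""
    return fps_with_7800x3d > fps_with_i9 * (1 + tolerance)
```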
So tell me...
Is the 7900XTX just as strong as the 4090
or
does the RTX 4090 have driver overhead?
You can't have it both ways.