davis.anthony
I would have loved that the mounting evidence of bad DX12 games on both sides is enough to actually make you realise the situation, but apparently that's not enough.
So we end this conversation here, instead of you bullshitting all over logic, forum and the whole place.
I'm not the one branding devs as 'lazy' with no tangible evidence to back that up. Stop acting like a child because your Nvidia feelings are getting hurt.
But if the games and 'lazy devs' are the problem, how have Nvidia managed as much as a 24% performance increase in DX12 games with the 522.25 driver?
To quote Nvidia themselves:
Our DirectX 12 optimizations apply to GeForce RTX graphics cards and laptops, though improvements will vary based on your specific system setup, and the game settings used. In our testing, performance increases were found in a wide variety of DirectX 12 games, across all resolutions:
- Assassin’s Creed Valhalla: up to 24% (1080p)
- Battlefield 2042: up to 7% (1080p)
- Borderlands 3: up to 8% (1080p)
- Call of Duty: Vanguard: up to 12% (4K)
- Control: up to 6% (4K)
- Cyberpunk 2077: up to 20% (1080p)
- F1 22: up to 17% (4K)
- Far Cry 6: up to 5% (1440p)
- Forza Horizon 5: up to 8% (1080p)
- Horizon Zero Dawn: Complete Edition: up to 8% (4K)
- Red Dead Redemption 2: up to 7% (1080p)
- Shadow of the Tomb Raider: up to 5% (1080p)
- Tom Clancy’s The Division 2: up to 5% (1080p)
- Watch Dogs: Legion: up to 9% (1440p)
You don't get those sorts of performance increases from a driver update unless there was a problem with the driver in the first place. The 24% increase in AC: Valhalla, 20% in CP2077 and 17% in F1 22 show there was a monstrous problem with Nvidia's drivers in DX12.
So it was Nvidia and their software after all.
There goes your whole argument.