Actually no, they didn't. There was back and forth, with NVIDIA saying it's the devs'/publisher's code so they don't care, and the devs/publisher saying it's NVIDIA's code so they can't touch it.

There were two incidents in the past regarding this. The first was the release of Batman: Arkham Asylum (sponsored by NVIDIA) in 2009: the game locked MSAA to NVIDIA GPUs only, so AMD cards couldn't access that feature. AMD complained about the move in a blog post and declared their GPUs perfectly capable of running the new MSAA. Following that, there was a huge uproar on the internet, enough that NVIDIA and the developer caved in and allowed MSAA to run on AMD cards with no problems.
The logic behind this was a bit unusual, though. NVIDIA came forth and stated that they had helped develop the MSAA implementation in the game: the game used UE3 with deferred rendering, which is incompatible with standard MSAA, and the developer couldn't get traditional MSAA working on their own, so NVIDIA stepped in and helped them build it, and the developer then locked it to NVIDIA GPUs. The developer reiterated this narrative as well, stating that they had approached AMD on the matter before the game's launch, but AMD didn't care enough to get MSAA working on their hardware.
Only the GOTY version could run it on AMD; the original couldn't, not even when patched to the latest version. The only way to run it on AMD in the original release was fooling the game into thinking you had an NVIDIA card.
Also, AMD wasn't quiet; they did offer them code to do MSAA. The problem was, it was the same as what NVIDIA had already given, minus the vendor lock, because it was supposedly the standard way of doing it in deferred UE3.
The GOTY edition can be considered a separate game release, which apparently allowed the change.
(Also, you forgot the part where all cards were forced to run some rendering steps only required for said MSAA support, even when MSAA wasn't or couldn't be used.)
It's a shame I can't find those old comparisons, since I'm almost willing to put my head on a plate claiming that every single rendering mode in Assassin's Creed except DX10 looked the same in those supposed "glitches", yet only DX10.1 was removed.

The second incident happened in 2008, with the release of Assassin's Creed. The game was sponsored by NVIDIA and supported DX10, but Ubisoft then added DX10.1 support in a later patch. At that time, DX10.1 was a rare occurrence, supported only on AMD's HD 3000 series GPUs.
With DX10.1 the game ran faster on AMD GPUs, but Ubisoft then stepped in and abruptly removed DX10.1 in a subsequent patch. There was another uproar on the internet, and people pointed fingers at NVIDIA, claiming it had pressured Ubisoft to remove DX10.1. Ubisoft denied the allegations, and so did NVIDIA. It was later revealed that the DX10.1 path altered the game's image quality and dropped some post-processing effects, which is why it ran faster on AMD GPUs; this was reproduced in testing by independent media. Ubisoft removed DX10.1 but never bothered to fix it or add it back later. The entire ordeal was transparent from start to finish though, with NVIDIA and Ubisoft responding directly to the press and denying any kind of deal, and AMD never made any accusations or complaints about the matter.
In fact, in both incidents NVIDIA and the developers came forth and made official statements clarifying their positions, unlike the radio silence we have now despite the massive uproar.