nVidia's tessellation advantage was mitigated by AMD "optimizing" their drivers to cap the maximum tessellation level at roughly a quarter of what nVidia uses. There's a case to be made that the optimization is valid because there's little to no visual difference, but as is often the case with PC games, once you push resolutions beyond what was available at the time, you start seeing things you might not have noticed at lower resolutions. I'm curious what some of those games look like at 8K with AMD's altered tessellation on and off. It's a clever trick, but it's also a false equivalence, because the two cards aren't doing the same amount of work.
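To make the "cap" concrete, here's a minimal sketch in C of what such a driver override conceptually does. The function names and the 16x figure are my own illustrative assumptions, not actual AMD driver code: whatever tessellation factor the game requests gets clamped to a user-selected maximum before the tessellator ever sees it.

```c
#include <stdio.h>

/* Hypothetical driver-side clamp: the game's requested tessellation
 * factor is silently limited to the cap set in the driver panel. */
static float clamp_tess_factor(float requested, float driver_cap)
{
    return requested > driver_cap ? driver_cap : requested;
}

int main(void)
{
    float game_requested = 64.0f;  /* e.g. a title asking for 64x tessellation */
    float driver_cap     = 16.0f;  /* illustrative cap, roughly a quarter */

    printf("game asked for %.0fx, driver submits %.0fx\n",
           game_requested,
           clamp_tess_factor(game_requested, driver_cap));
    return 0;
}
```

The point is that the GPU with the cap enabled is simply tessellating far fewer triangles, which is why comparing frame rates across the two setups isn't apples to apples.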
Something similar happened historically with 3Dfx's mipmap dithering feature. It dithered the transitions between mipmaps when bilinear filtering was enabled, giving you less noticeable steps between mip levels without the performance penalty of trilinear. That was fine for plenty of games when I had an older Voodoo card, but when I upgraded to a Voodoo3, it supported much higher resolutions than before. Things weren't what I'd call playable at those resolutions, but I did mess around with some games at settings that were extreme for the time. I don't remember the Voodoo3's maximum 3D resolution, but I do remember my monitor could handle it. It might have been 1600x1200, or maybe just 1280x1024, but it was much higher than before, and one thing I noticed was that the mipmap dithering was much more... identifiable, I think, is the right word. It still looked better than plain bilinear, but it was also clearly not as good as true trilinear, and there was a telltale pattern where the mip transitions were. At lower resolutions, though, trilinear and dithered bilinear had looked nearly identical.
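For anyone who never saw it in action, here's a rough C sketch of the idea as I understand it. This is my own reconstruction, not 3Dfx's actual hardware logic, and the 2x2 pattern plus the flat per-mip "texture" are just illustrative. Trilinear blends the two nearest mip levels by the fractional LOD; mip dithering instead picks a single level per pixel and lets a screen-space dither pattern decide which one, so the blend only emerges as an average over neighbouring pixels, and that per-pixel pattern is what shows up at the transition bands.

```c
#include <math.h>
#include <stdio.h>

/* Stand-in for a real texture fetch: each mip level just returns a flat
 * brightness so the mip transition is easy to see in the output. */
static float sample_bilinear(int mip, float u, float v)
{
    (void)u; (void)v;
    return 1.0f / (float)(1 << mip);   /* higher mips are darker here */
}

/* 2x2 ordered-dither thresholds, indexed by screen pixel position */
static const float dither2x2[2][2] = {
    { 0.00f, 0.50f },
    { 0.75f, 0.25f },
};

/* True trilinear: blend the two nearest mip levels by the fractional LOD. */
static float trilinear(float lod, float u, float v)
{
    int   base = (int)floorf(lod);
    float frac = lod - (float)base;
    return sample_bilinear(base,     u, v) * (1.0f - frac)
         + sample_bilinear(base + 1, u, v) * frac;
}

/* Dithered bilinear: pick a single mip level per pixel; the screen-space
 * dither pattern decides which of the two nearest levels it gets. */
static float dithered_bilinear(float lod, float u, float v, int px, int py)
{
    int   base = (int)floorf(lod);
    float frac = lod - (float)base;
    int   mip  = (frac > dither2x2[py & 1][px & 1]) ? base + 1 : base;
    return sample_bilinear(mip, u, v);
}

int main(void)
{
    float lod = 1.3f;  /* 30% of the way between mip 1 and mip 2 */
    for (int py = 0; py < 2; py++)
        for (int px = 0; px < 2; px++)
            printf("pixel (%d,%d): trilinear %.3f, dithered %.3f\n",
                   px, py,
                   trilinear(lod, 0.5f, 0.5f),
                   dithered_bilinear(lod, 0.5f, 0.5f, px, py));
    return 0;
}
```

You can see why it gets you most of the look of trilinear for the cost of a single bilinear fetch per pixel, and also why the approximation leaves a visible texture of its own right where the mip levels change.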
AMD never forced the tessellation optimization; it was manually selectable in the drivers, and benchmarkers never enabled it for their comparisons.