I'm not going to argue over the definition of "bug." It's obviously undesirable, as I defined in the first post. If you think that example is desirable, your system of values is incompatible with a logical discussion.
In the example, they did it correctly for some textures, but not others. Even more evidence.
If you know anything about digital signal processing, say from the audio domain, you know that content above the Nyquist frequency folds back as mirror images AKA "aliases" of the signal. The same problem occurs in video; it's just that graphics people are generally forced not to care for performance reasons, or they hang their hats on anti-aliasing compromises, even though most AA techniques aren't really sound in DSP terms.
Like saying you prefer car wheels spinning the wrong direction in video.
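To make the Nyquist point concrete, here's a minimal sketch (my own illustration, with made-up numbers, not anything from the game): sample a 70 Hz tone at 100 Hz and the energy lands at 30 Hz, a frequency that was never in the signal. Spatial detail rendered without a proper low-pass filter folds back the same way.

```python
# Minimal aliasing sketch: a tone above Nyquist folds back to a lower frequency.
import numpy as np

fs = 100.0                    # sample rate (Hz) -- Nyquist is fs/2 = 50 Hz
t = np.arange(0, 1, 1 / fs)   # one second of samples

f_true = 70.0                 # signal at 70 Hz, above Nyquist
x = np.sin(2 * np.pi * f_true * t)

# The FFT peak shows where the energy actually lands after sampling.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)
print("peak at", freqs[np.argmax(spectrum)], "Hz")   # ~30 Hz, i.e. fs - 70
```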
So when anyone mentions sacrificing "sharpness," you have to realize there's a good chance the "sharpness" you're trying to preserve is incorrect in terms of DSP. It should have been filtered away but never was, so although you may subjectively "prefer" the sharpness and the resultant shimmer, the signal you're preferring is technically invalid.
As an extreme example, I might consider 480p upscaled to 1080p with standard bilinear as preferable to "sharpness." Obviously I'd much rather preserve as much detail as possible by rendering at a higher resolution (again "sample rate" if thinking about audio), but not at the expense of shimmering 2D elements.
A little "blur" is probably more correct.
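Here's a rough sketch of what I mean (my own toy example, not the game's actual pipeline): downsample a fine checker pattern 4x, once by naive point sampling and once with a low-pass "blur" first. The naive version collapses to solid black, and panning it by two pixels flips it to solid white; that frame-to-frame flip is exactly the shimmer. The pre-filtered version settles near the true mid-gray average.

```python
# Toy demonstration: decimation with vs. without a low-pass filter first.
import numpy as np
from scipy.ndimage import gaussian_filter

size, factor = 256, 4
y, x = np.mgrid[0:size, 0:size]
img = ((x // 2 + y // 2) % 2).astype(float)        # fine 2px checker: near-Nyquist detail

naive = img[::factor, ::factor]                    # point-sample decimation, no filtering
blurred = gaussian_filter(img, sigma=factor / 2)   # low-pass roughly matched to the new rate
correct = blurred[::factor, ::factor]

print("naive min/max:   ", naive.min(), naive.max())              # 0.0 / 0.0 -> solid black
shifted = np.roll(img, 2, axis=1)[::factor, ::factor]             # pan the scene 2 pixels
print("naive after pan: ", shifted.min(), shifted.max())          # 1.0 / 1.0 -> solid white (flicker)
print("filtered min/max:", round(correct.min(), 3), round(correct.max(), 3))  # ~0.5 / ~0.5
```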
The Gloomhaven example, if I remember correctly, still exhibits the shimmer even when rendering at 480p, which is further evidence the developers screwed up AKA made a mistake AKA it's a bug.
If some of you are developers and that offends you somehow, good. Maybe you'll improve your skills and not do it next time. At least try to use 2 pixels for your skin thickness.
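If the 2-pixel thing sounds arbitrary, here's a quick 1-D sketch (my own numbers, purely illustrative) of why it helps: the peak brightness of a 1px-wide feature under bilinear sampling dips to 0.5 at a half-texel offset, so it visibly pulses as the view pans; a 2px-wide feature stays at full brightness regardless of sub-pixel position.

```python
# 1px vs 2px feature under bilinear (linear, in 1-D) sampling at sub-pixel offsets.
import numpy as np

def bilinear_peak(width_px: int, offset: float) -> float:
    """Brightest sample of a `width_px`-wide white feature when the sample grid is panned by `offset` texels."""
    tex = np.zeros(8)
    tex[3:3 + width_px] = 1.0
    positions = np.arange(8) + offset
    i = np.floor(positions).astype(int) % 8
    frac = positions - np.floor(positions)
    samples = (1 - frac) * tex[i] + frac * tex[(i + 1) % 8]   # linear interpolation between texels
    return samples.max()

for offset in (0.0, 0.25, 0.5):
    print(f"offset {offset}: 1px peak={bilinear_peak(1, offset):.2f}, "
          f"2px peak={bilinear_peak(2, offset):.2f}")
# The 1px feature dims to 0.50 at a half-texel offset (flicker while panning);
# the 2px feature stays at 1.00.
```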