Gamma correction doesn't alter the amount of popping and aliasing; it just alters the distribution of the color ramp for the in-between shades. The total number of distinct shades and the EER should be the same.
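A minimal sketch of what I mean (the gamma value is just an assumed 2.2 for illustration): encoding remaps where each step of an 8-bit ramp lands perceptually, but the ramp still has exactly as many distinct steps, so the popping/aliasing behaviour doesn't change, only the apparent falloff does.

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double gamma = 2.2; /* assumed display gamma, for illustration only */

    /* Walk an 8-bit shade ramp: same number of steps before and after
     * gamma encoding, only their distribution along the ramp changes. */
    for (int i = 0; i <= 255; i += 51) {
        double linear  = i / 255.0;                 /* linear in-between shade */
        double encoded = pow(linear, 1.0 / gamma);  /* gamma-corrected shade   */
        printf("%3d -> linear %.3f, gamma-corrected %.3f\n", i, linear, encoded);
    }
    return 0;
}
```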
The fourth situation I analyze differently. *Some* of the bushes look like they have gradually been faded out at different distances, but some of the others look like they've been culled completely, and frankly, I don't find the draw distance at which these billboard sprites are being faded or dropped very compelling.
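To make the distinction I'm drawing concrete, here's a hypothetical sketch (not the game's actual code) of the two behaviours the screenshots seem to mix: a gradual alpha fade between two distances versus a hard cull at a single cutoff.

```c
/* Gradual fade: fully opaque up to fade_start, then blended out
 * until fade_end -- the bush dissolves instead of popping. */
float billboard_alpha(float dist, float fade_start, float fade_end)
{
    if (dist <= fade_start) return 1.0f;   /* fully visible */
    if (dist >= fade_end)   return 0.0f;   /* fully faded   */
    return 1.0f - (dist - fade_start) / (fade_end - fade_start);
}

/* Hard cull: the sprite is simply dropped past one cutoff distance,
 * which is what produces the abrupt pop-in/pop-out. */
int billboard_visible_hard_cull(float dist, float cutoff)
{
    return dist < cutoff;
}
```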
Is this really an artifact of the ATI drivers? I just can't imagine the drivers doing this except by app-detection and some really wonky hack. It must be the game engine somehow using a different rendering path/setting for ATI.
Also, the idea that non-2.2, non-Trinitron gamma curves are "crappy" is the wrong way to look at it. CRT display technology from the 1980s is no longer the gold standard. These days we have CRT, LCD, DLP, OLED, PDP, and upcoming SED/FED. Trying to shoehorn these displays into (imho INFERIOR) IQ parameters would be wrong. Better would be for future graphics cards to be architected to deal with the gamut and gamma curve of the underlying display dynamically, rather than assuming a fixed one.
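Something along these lines is what I'm picturing; this is only a sketch of the idea, not any real driver API, and the query function is hypothetical (a real implementation would pull the transfer curve from the display's EDID or color profile rather than hardcoding a number).

```c
#include <math.h>

/* Hypothetical query of the attached display's response, reduced here
 * to a single gamma exponent purely for illustration. */
static double query_display_gamma(void)
{
    return 2.4; /* e.g. a panel whose native response is not the classic 2.2 */
}

/* Encode a linear framebuffer value for whatever display happens to be
 * attached, instead of baking a fixed 2.2 curve into the pipeline. */
double encode_for_display(double linear)
{
    double g = query_display_gamma();
    return pow(linear, 1.0 / g);
}
```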