That's the main cause of the problem, actually, because it forces upscaling. For many gamers in search of the ideal image quality, and hence better graphics, 1080p isn't a checkbox in itself: it's the native resolution of their monitors. In Europe, for instance, we played our last-gen games on 768p TVs, which meant upscaling, sometimes double upscaling for games that didn't even render at 720p.
And for me, upscaling gives the worst image quality you can get from a game; it makes otherwise great graphics look comparatively bad.
Take BF4 on PS4/XB1. For all its rendering techniques and everything it displays on screen, the whole thing is wrecked by the imposed upscaling from 720p/900p to 1080p, which completely negates the effects and pyrotechnics it shows. And many people who play their PS4 games at 1080p immediately notice the blurriness of BF4 at 900p, hence less impressive graphics than other native games.
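To put rough numbers on why these upscales blur, here's a little sketch (my own illustration, not anything official; the 600p render height is just an example of a sub-720p game): none of these ratios is an integer, so the scaler has to blend neighbouring source pixels for almost every output pixel. Only integer ratios (like a clean 2x) map source pixels straight onto the panel grid.

```python
# Rough sketch: upscale ratios for the cases mentioned above.
# The 600p figure is just an illustrative sub-720p render height.

def describe(render: int, display: int) -> str:
    f = display / render
    # A non-integer ratio means output pixels can't map 1:1 (or 1:n)
    # onto source pixels, so the scaler interpolates neighbours -> blur.
    kind = "integer, clean" if f.is_integer() else "non-integer, interpolated (blurry)"
    return f"x{f:.3f} per axis ({kind})"

print("BF4 on XB1, 720p -> 1080p: ", describe(720, 1080))
print("BF4 on PS4, 900p -> 1080p: ", describe(900, 1080))
# Double upscaling on old gen: sub-720p render -> 720p output -> 768p TV panel
print("600p render -> 720p output:", describe(600, 720))
print("720p output -> 768p panel: ", describe(720, 768))
```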
What many PC gamers do when they configure their games (and what I did in the early years of LCD screens, without really understanding at first the superiority of native resolution like I do now) is first select the native resolution of their monitor, because they know that anything non-native will look awful, and that better effects + upscaled resolution is worse than fewer effects + native resolution.
I agree that nowadays this full-HD quest is a checkbox list for clickbait articles, but it wasn't that at all at first; it was simply the quest for the best image quality, which directly helps immersion.
It's the developers who should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, polygon counts, etc. will be completely negated/wrecked by any upscaling (and by blurry post effects, for that matter). Once you play most of your games at native resolution, it's hard to go back to upscaled games IMO, and I think many devs still haven't really understood the importance of native resolution in this era of Internet communication and social media.