And for me upscaling is the worst image quality you can get from a game; it makes otherwise great graphics look comparatively bad.
That's hyperbolic. Native 1080p with no AA - texture shimmer, shader aliasing and moiré-type patterns - is clearly going to be worse overall IQ to most people than a slightly lower res with the aliasing issues nicely solved. Native 1080p adds fidelity, but that's only part of the overall quality of the image we're seeing. Another very important part is framerate, which even affects 2D resolution perception (detail can be perceived at lower spatial resolutions when supplied at higher temporal resolutions).
Take BF4 on PS4/XB1. For all the rendering techniques and stuff it displays on screen, everything it does is wrecked... It's the developers who should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, the number of polygons pushed etc. will be completely negated/wrecked by any upscaling.
Again with the hyperbole! Look at The Tomorrow Children. It looks almost photoreal. It looks great. That game at 720p, if it were necessary, would still look better than a lot of other games at 1080p. Rendering at a lower resolution doesn't wreck the art style and rendition; it just removes some fidelity. One can even scientifically take your comment apart. Tessellation is not affected by resolution unless you're tessellating to the pixel level. The number of polygons pushed isn't affected at all. Shaders aren't affected unless they are producing high-frequency detail. Ergo they can't all be completely negated/wrecked by upscaling. Upscaling can damage image fidelity - nothing else. A subsurface shader that makes skin look realistic is going to produce a realistically skinned face in a blurred upscaled image just as it will in a native one.
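To make that frequency point concrete, here's a quick toy sketch I knocked up (pure NumPy, my own made-up signals, not from any engine): fake a half-res render with a 2x2 box filter, nearest-neighbour upscale it back to "native", and see how much of a smooth shading gradient survives versus pixel-level detail.

```python
# Toy sketch: half-res render approximated by a 2x2 box filter,
# then a nearest-neighbour upscale back to full size.
import numpy as np

def downscale_2x(img):
    # 2x2 box average, standing in for rendering at half resolution
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upscale_2x(img):
    # nearest-neighbour upscale, like a basic display scaler
    return img.repeat(2, axis=0).repeat(2, axis=1)

size = 256
x = np.linspace(0.0, 1.0, size)
smooth = np.outer(x, x)                                     # soft shading gradient
checker = (np.indices((size, size)).sum(axis=0) % 2) * 1.0  # one-pixel checkerboard

for name, img in (("smooth shading", smooth), ("pixel-level detail", checker)):
    err = np.abs(upscale_2x(downscale_2x(img)) - img).mean()
    print(f"{name}: mean error after half-res + upscale = {err:.3f}")

# The smooth gradient comes back almost unchanged (tiny error), while the
# one-pixel checkerboard collapses to flat grey (error around 0.5): upscaling
# only loses the high-frequency content, it doesn't negate the shading itself.
```

The soft gradient (the kind of thing a subsurface or lighting shader produces) is nearly identical after the round trip, while only the pixel-level detail gets destroyed, which is exactly the "fidelity, nothing else" distinction above.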
What many PC gamers do when configuring their games (and what I did in the early years of LCD screens with my PC, without at first really understanding the superiority of native resolution the way I do now) is first select their monitor's native resolution, because they know that anything non-native will look awful and that better effects + an upscaled resolution is worse than fewer effects + native resolution.
In the days of LCDs with nearest-neighbour upscaling, sure. But nowadays that's not an issue, and I doubt you have any stats to support your view that most PC gamers pick native res first and then tweak everything else. Personally I played Awesomenauts at 720p instead of my native 1680x1050 because it ran smoother (although still not PS3 smooth). I expect PC gamers run the full gamut, from preferring resolution to preferring framerate to preferring eye-candy.
Hence it's ridiculous to favour one aspect over the others as a technical requirement. If you go with higher resolution at the cost of framerate, you'll please some people and offend others. If you choose 60 fps instead of 30 fps, you'll upset people who prefer better rendering detail, and if you favour 30 fps instead of 60, you'll upset those who prefer higher framerates. If you go with 1080p60, you'll upset those who want photorealism and would prefer a lower resolution and framerate, like TV/movies, so games can look real. It's impossible to please everyone, and ridiculous to try to prioritise according to some scientific analysis, so it should be left to the devs to make those choices.
Considering One started it by analyzing Halo 3 promotional material before its release, I'm pretty sure it was before 2008.
You're right, that thread is a continuation of an older thread.