... talking about a clear labeling scheme where the samples per pixel, etc. that the art team deems reasonable in terms of performance to hit the game's artistic vision are labeled as "console" or "original" settings. It just so happens that these "original" or "console" settings generally turn out to be the visual/performance sweet spots on nearly every GPU out there
...
Why is the "console" setting of lower quality not available on PC?
I think I already covered this with pjbliverpool now, but just to reiterate: I certainly agree that *individual* graphical settings on PC should cover the range of whatever is used on the consoles, and ideally go both higher and lower where appropriate. In most cases it's also fine to label one of these options as "console" or "recommended" or similar, if that makes sense.
The subtlety I'm pointing out is that, due to a variety of factors, it's common for entire systems to perform differently enough between console and PC that the "overall best choice across all those settings" is different. Ex. post-processing might be cheaper on console while shadow depth rendering might be cheaper on PC, so in terms of bang for the buck it would generally be better to bias towards higher-quality post-processing on the console side, but higher-resolution shadows on the PC side. While it's more convenient to evaluate these settings in isolation, in the end it is a zero-sum game where the options compete for a fixed performance budget.
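To make the zero-sum part concrete, here's a toy sketch. None of this is real engine code and every cost number is invented; it just shows how a fixed frame budget plus different relative costs per platform leads to a different "best" overall split:

```cpp
// Toy illustration only: with a fixed frame-time budget, the best quality
// split differs per platform purely because relative costs differ.
// All names and numbers are made up for the example.
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

struct Upgrade {
    std::string name;     // e.g. "post-processing +1 tier"
    float visualValue;    // subjective benefit, arbitrary units
    float costConsoleMs;  // hypothetical GPU cost on console
    float costPcMs;       // hypothetical GPU cost on a midrange PC GPU
};

// Greedily spend a fixed budget on the upgrades with the best value per millisecond.
static void PickUpgrades(const char* platform, std::vector<Upgrade> upgrades,
                         bool onConsole, float budgetMs) {
    auto cost = [&](const Upgrade& u) { return onConsole ? u.costConsoleMs : u.costPcMs; };
    std::sort(upgrades.begin(), upgrades.end(), [&](const Upgrade& a, const Upgrade& b) {
        return a.visualValue / cost(a) > b.visualValue / cost(b);
    });
    std::printf("%s (budget %.1f ms):\n", platform, budgetMs);
    for (const auto& u : upgrades) {
        if (cost(u) <= budgetMs) {
            budgetMs -= cost(u);
            std::printf("  take %-26s (%.1f ms)\n", u.name.c_str(), cost(u));
        }
    }
}

int main() {
    // Mirrors the example above: post-processing relatively cheaper on console,
    // higher-resolution shadows relatively cheaper on PC.
    std::vector<Upgrade> upgrades = {
        {"post-processing +1 tier",   1.0f, 0.6f, 1.2f},
        {"shadow resolution +1 tier", 1.0f, 1.3f, 0.7f},
        {"GI quality +1 tier",        1.0f, 1.0f, 1.0f},
    };
    PickUpgrades("Console", upgrades, /*onConsole=*/true, 2.0f);
    PickUpgrades("PC", upgrades, /*onConsole=*/false, 2.0f);
}
```

With those made-up numbers, the console run spends its budget on post-processing and GI while the PC run picks shadows and GI, even though both are making the locally "right" call for the same budget.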
There are of course even times when this happens within one subsystem. For instance, it *tends* to be that virtual shadow map depth rendering can be made relatively faster on console, while shadow projection is relatively faster on PC. (These tradeoffs change over time as implementations change, of course; I'm just picking a current example.) We could of course separate out two scalability settings for these different aspects of shadow rendering, but in this case the difference isn't so stark that it's worth the mental overload and confusion. Obviously there's always a balance to be struck between something like exposing the developer console and config files and all that entails (which some games do!) and grouping things into settings buckets.
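As a rough sketch of what that bucketing looks like (hypothetical knob names and values, not actual scalability settings from any engine), a single user-facing "Shadows" setting can still map to a different internal split per platform:

```cpp
// Hedged sketch, not real engine code: one user-facing "Shadows" bucket that
// expands into two internal knobs, biased differently per platform because
// (per the example above) depth rendering is relatively cheaper on console
// and projection is relatively cheaper on PC. Values are invented.
#include <cstdio>

struct ShadowParams {
    int depthResolution;    // hypothetical shadow depth rendering knob
    int projectionSamples;  // hypothetical shadow projection/filtering knob
};

// Map a single user-facing bucket (0..3) to the internal knobs.
ShadowParams ResolveShadowBucket(int bucket, bool isConsole) {
    static const ShadowParams consoleTable[4] = {{1024, 2}, {2048, 2}, {4096, 4}, {8192, 4}};
    static const ShadowParams pcTable[4]      = {{1024, 2}, {1024, 4}, {2048, 8}, {4096, 8}};
    return isConsole ? consoleTable[bucket] : pcTable[bucket];
}

int main() {
    for (int b = 0; b < 4; ++b) {
        ShadowParams c = ResolveShadowBucket(b, true);
        ShadowParams p = ResolveShadowBucket(b, false);
        std::printf("Shadows bucket %d: console %d res / %d samples, PC %d res / %d samples\n",
                    b, c.depthResolution, c.projectionSamples,
                    p.depthResolution, p.projectionSamples);
    }
}
```

The user only ever sees one slider; the per-platform judgment lives in the split behind it.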
Unfortunately, much of the time the settings buckets that make conceptual sense (ex. "shadows", "effects", "GI", etc.) are not well aligned with the places that are actually expensive in the engine, which leads to the very common case of settings that make almost no performance difference on a given system. Further complicating this are settings that primarily impact VRAM usage: they make almost no difference to performance until you fall off a cliff entirely, and that cliff depends not on whatever setting you last changed, but on the combined set of settings. Some games try to give an estimate of the VRAM use of various combinations of settings; this is certainly nicer than nothing, but still quite approximate. And of course on PC, users may be running Chrome on a second monitor and eating half your VRAM anyways.
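Roughly what those in-game VRAM estimates boil down to; every number below (including the "other apps" overhead) is invented purely for illustration:

```cpp
// Minimal sketch of the VRAM-estimate idea: individual settings barely move
// frame time until the *combined* footprint exceeds what's actually free,
// at which point performance falls off a cliff. All numbers are made up.
#include <cstdio>
#include <map>
#include <string>

int main() {
    // Hypothetical per-system VRAM estimates (MB) at the currently selected quality levels.
    std::map<std::string, int> estimateMb = {
        {"texture pool", 4500},
        {"shadow maps", 900},
        {"post-processing targets", 600},
        {"streaming buffers", 1200},
    };
    const int totalVramMb = 8192;  // advertised VRAM on the card
    const int otherAppsMb = 1500;  // e.g. a browser on a second monitor

    int used = otherAppsMb;
    for (const auto& kv : estimateMb) {
        std::printf("  %-26s %5d MB\n", kv.first.c_str(), kv.second);
        used += kv.second;
    }
    std::printf("Estimated total: %d MB of %d MB\n", used, totalVramMb);
    if (used > totalVramMb) {
        std::printf("Over budget: expect heavy paging, regardless of which setting was changed last.\n");
    }
}
```

No single line item is the problem; it's the sum that pushes you over, and the estimate is still approximate because actual residency depends on the driver and whatever else is running.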
Now as enthusiasts we might say "just expose everything and we'll figure out the best settings", but at some level the argument about labeling console settings admits the reality that the majority of users set an (overall!) preset and move on, if they even change settings at all. Thus, while it's fun for us power users to tweak things, and certainly nice to provide some knobs so that folks like Alex and the IHVs can come up with their own optimized recommendations, those overall preset settings and how they trade off against one another are still of primary importance, I think.