It's one thing to scale the visuals, a concept that extends back to the 1980s, but it's another to make fundamental decisions about the lowest common specification that can run your game at all. Developers have had to do this on PC for a long time, and many devs/publishers have had to make decisions like setting 6 GB or 8 GB of RAM as the minimum when a third of Steam users have 4 GB or less, or dropping support for dual core processors because the game just will not run.
This is very different to scaling the visuals and is the problem I am talking about. Assassin's Creed Unity would not have worked on 360/PS3 because those machines just could not render the several hundred NPCs on screen needed for an immersive revolutionary Paris. How long before the need to support Xbox One holds back developers from exploiting Scorpio?
Given that the CPUs in the consoles are relatively underpowered, the GPU is often the only place to put certain processing tasks. This is not the case for most PCs.
And again, the X360/PS3 is irrelevant in this discussion of a rolling generation. Developers wouldn't be required to support them, although they'd have the option to if they wanted. They'd need to support the hypothetical console that released after the X360/PS3. While it wouldn't have 8 GB of RAM, it would likely have come with 2-4 GB. The current gen consoles released with over double the amount of RAM available on high end GPUs at the time (the 7970 had 3 GB) and 4-8x the memory of the GPUs they were most directly comparable to (the 7770 had 1 GB and the 7870 had 2 GB). GPUs at the time the hypothetical console would have launched had between 256 MB and 1 GB of memory, although 1 GB was very rare.
Speaking of memory, you have to include system memory in your calculations. More than 83% of PCs on Steam have more than 4 GB of system RAM, more than 98% of users have 1 GB or more of VRAM, and more than 55% have 2 GB or more of VRAM.
At 1080p console settings, games rarely need more than 2 GB of VRAM, and with tweaks to graphics settings they will happily run with only 1 GB. At 720-900p (what the hypothetical pre-current gen console would likely run at), there'd be no problem supporting a current generation title on that console with respect to memory requirements.
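To make that resolution argument a bit more concrete, here's a rough back-of-the-envelope sketch (in Python) of how a 1080p VRAM footprint shrinks at 900p/720p if you assume the resolution-dependent buffers (render targets, post-processing) scale with pixel count while asset memory stays roughly fixed. The 2 GB starting figure and the 40% resolution-dependent share are illustrative assumptions on my part, not measurements from any particular title.

```python
# Back-of-the-envelope estimate only; the split between resolution-dependent
# buffers and fixed asset memory is an assumption, not measured data.

def scaled_vram_gb(vram_at_1080p_gb, res_dependent_share, target_pixels):
    """Estimate VRAM needed at a lower resolution from a 1080p figure."""
    pixels_1080p = 1920 * 1080
    buffers = vram_at_1080p_gb * res_dependent_share * (target_pixels / pixels_1080p)
    assets = vram_at_1080p_gb * (1.0 - res_dependent_share)
    return buffers + assets

for name, (w, h) in {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}.items():
    est = scaled_vram_gb(2.0, 0.4, w * h)
    print(f"{name}: ~{est:.2f} GB")
```

Even with most of the footprint treated as fixed asset memory, the estimate lands in roughly the 1.5-1.8 GB range at 720-900p, which is the kind of budget a hypothetical 2-4 GB console could accommodate.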
AC: Unity is also perhaps the worst (or best) poster child for this, as it is by far the worst optimized and worst performing title (along with the first Watch Dogs) released so far this generation on PC. Users had problems getting consistent frametimes, much less consistent framerates, and that's with some users running dual Titan Xs and i7 CPUs in the 4+ GHz range. Even the minimum requirements list a CPU that is far more powerful than the one in the consoles. AC: Syndicate actually reduced the minimum requirements.
We can also look at another example: a current generation title that takes advantage of technology advancements available in the current gen console GPUs, advances that weren't even available on PC until a few years after the release of the consoles due to graphics API limitations on PC. Doom 3 is incredibly well optimized, and as a result it'll even run on a Radeon 9550 (a 2004 era GPU).
The biggest issue would have been how reliant modern games are on actual hardware differences when scaling to lower quality levels. Doom shows a case where a developer can take advantage of new technology while still scaling down extremely far. AC: Unity obviously shows the opposite. However, would that still be true if they were required to support that hypothetical console? Would they instead have put more time into optimizing their game and engine? Or would they have just not bothered?
That's a hard question to answer, as consoles are the primary target of development, and there's no use in supporting older consoles when there is no compatibility between generations. The effect is visible in PC ports, which developers spend the minimum amount of time on: the GPUs targeted are mostly in the same generation (GCN1) as the console GPUs, so there's not much we can infer from them.
However, what about a console title that, while it had an X360/PS3 release, had an update that did not support the X360/PS3? GTA V. The updated engine cannot run on the previous generation of consoles; it's just not possible. This was, however, a release that was initially developed for console and later ported to PC. I use it as a good case in point: it's a large open world and one of the better looking titles of this generation. It's also a title that would have been required to support the hypothetical console (while something like Horizon Zero Dawn would only have had to support the PS4). And when you look at the minimum requirements, it supports the 4870 (a 2008 era GPU).
Or how about Battlefield 4? It features large levels and great graphics, and needs a 3870 at the low end. It would be a great example of a developer taking advantage of new technology while also optionally supporting two generations into the past. It's also a good example of a launch title, one where you won't have had a lot of time to optimize heavily for the advances in GPU technology.
Battlefield 1, later in the generation, shows a much better grasp of the technology, and on the PlayStation side it wouldn't have had to support the PS3. Would supporting a hypothetical Xbox console launched between the X360 and XBO have completely precluded the development of the game? Probably not, as it runs just fine on a 5870 (a 2009 GPU) at low settings despite the minimum recommended specs listing a 7850.
When looking at exclusives, I'd argue that developers who actively take advantage of new technologies would follow the Doom, GTA V, and Battlefield 4 and 1 examples: they'll want to take advantage of the newest features and will find a way to make it work on the older console. Multiplatform developers will have varying degrees of desire or capability to do this, pretty much exactly as they do now.
Also note that titles released around this time frame wouldn't necessarily be required to support that hypothetical console anymore. I.e., on the PlayStation side, the only consoles that would have required support since late last year would be the PS4 and PS4 Pro, while on the Xbox side the requirement wouldn't drop until later this year.
At least to me, it shows that if a developer is willing, they can take full advantage of a new generation (inasmuch as any developer can in the first few years of that generation) while still ensuring compatibility with the previous generation.
BTW - an interesting link.
https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=https://www.computerbase.de/2017-01/radeon-hd-7970-290x-fury-x-vergleich/2/#abschnitt_benchmarks_von_skyrim_bis_battlefield_1&edit-text=
A lot of current generation titles will happily run on a 2009 era GPU. The benchmark scores there aren't meaningful, however, due to the settings and resolutions used. Those titles may or may not run on a 2008 era GPU (Radeon 4xxx) due to API and GPU driver limitations, a limitation that wouldn't apply to a console the way it does in the PC landscape.
Regards,
SB