And this is the biggest barrier to rapid adoption of new consoles in a generationless ecosystem. Can you imagine if the Xbox One and PS4 had been generational advances over the 360 and PS3, with those older consoles having to be supported up to the release of the PS4 Pro and Scorpio? Imagine how games released since late 2013 would have turned out. No Witcher 3; that would have been PC only. Horizon Zero Dawn would have been very different. No first-person mode in GTA V.
Sure, they'd look better, but fundamentally games would have to work on 512 MB machines because manufacturers mandate that games must run on both tiers. Only now (PS4 Pro) or later this year (Scorpio) could larger games be possible. Every game released only on this generation of consoles would have been massively pared back in terms of features, and only now, years after release, would games actually be able to make use of the hardware.
The problem with that analogy is that the PS3/X360 generation was ~7-8 years. What if it had only been 3-4 years? The "next generation" comes out and they have to be supported for another 3-4 years. And so PS3/X360 would have gotten game support for 6-8 years. A new "generation" would have come out at roughly the same time as PS4/XBO with the same performance.
Witcher 3 would have ended up the same regardless of which path you take. Horizon Zero Dawn would have ended up exactly the same. GTA V would have looked the same on the PS3/X360 generation (it'd still have to be supported), better on the generation after it, and the same as on PS4/XBO on the generation after that one.
To expand: TLOU would have looked the same on PS3, but likely would have looked better on the generation after, and the same as on PS4 with the current generation.
This all, of course, assumes that each console had been using hardware that could have advanced rapidly, which means a switch to x86 (ARM was far too slow at the time) and PC GPU tech (at least the PC GPU part had already happened) such that you could have rapid (3-4 year) iterations in hardware.
Each individual jump in quality along the chain of PS3/X360 -> next "generation" -> current generation would be smaller than the single jump between PS3/X360 and the current generation. But the cumulative jump from PS3/X360 to the end of that chain would have been the same.
Speaking of the Steam survey, we can look at it in terms of how this would work in 3-4 year "chunks", each representing the introduction of a new console "generation".
Unfortunately, it's tough to do that with CPUs, and we can't just look at core count to determine whether a CPU has been upgraded, as 2-core CPUs are still sold.
We can, however, look at GPUs. To make this simpler, we'll bucket GPUs by release date into 3-4 year chunks starting with 2005 (X360 release), attempting to match console launches (a rough sketch of the bucketing follows the list below). I'm going to regret doing this, as it's going to be a bit time consuming.
GPU series (mobile variants not listed, but included in the percentages when possible) and rough percentages in the current Steam survey, grouped by release window:
- 2005 - 2009: unknown%*
  - AMD: unknown%* - X700, X800, X1k, X2k, X3k, X4k, X5k
  - NVidia: unknown%* - 7xxx, 8xxx, 9xxx, 1xx, 2xx
- 2010 - mid 2013: >33.17%
  - AMD: >3.05% - 5xxx, 6xxx, 7xxx
  - NVidia: >30.12% - 3xx, 4xx, 5xx, 6xx, 7xx
- late 2013 - 2017: >45.63%
  - AMD: >12.48%
  - NVidia: >33.15%
* - "Unknown" means those series share a combined pool of 12.7%. A percentage can't be shown because some members are only listed under the DX9/10/11 buckets and are too small to register in the overall share.
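For anyone who wants to redo the tallying, here's a minimal sketch in Python of the bucketing described above. To be clear, the card names, share percentages, and release years below are made-up placeholders (the real survey is just a web page, so the table has to be filled in by hand), and the windows use whole-year cutoffs rather than the mid/late-2013 split used above.

```python
# Sketch of the bucketing exercise above. All data here is illustrative
# placeholder data, NOT real Steam Hardware Survey figures.

# Hypothetical (gpu_name, steam_share_percent, release_year) entries.
SURVEY = [
    ("GeForce GTX 780", 0.45, 2013),
    ("GeForce GTX 980", 1.50, 2014),
    ("Radeon HD 7970", 0.60, 2012),
    ("GeForce GTX 580", 0.30, 2010),
]

# 3-4 year windows roughly matching console launches, as in the list above.
# Simplification: whole-year cutoffs, so "late 2013" cards land in the
# middle window here unless you track release months as well.
WINDOWS = [
    ("2005 - 2009", 2005, 2009),
    ("2010 - 2013", 2010, 2013),
    ("2014 - 2017", 2014, 2017),
]

def bucket_shares(survey):
    """Sum survey share per release window; cards outside every window are skipped."""
    totals = {label: 0.0 for label, _, _ in WINDOWS}
    for name, share, year in survey:
        for label, start, end in WINDOWS:
            if start <= year <= end:
                totals[label] += share
                break
    return totals

if __name__ == "__main__":
    for label, total in bucket_shares(SURVEY).items():
        # ">" because cards hidden inside the DX9/10/11 buckets can't be counted.
        print(f"{label}: >{total:.2f}%")
```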
So, for the PC analogue, virtually everyone who was gaming on Steam when the PS3/X360 launched has upgraded their graphics card in some way, whether by buying a new computer or by replacing the graphics card itself.
Diving deeper into the statistics provided by the Steam survey, we also see that PC gamers who are analogous to console gamers (e.g. "core" gamers, represented by part of the midrange GPUs and up) upgrade far more frequently. For example, while there are plenty of NVidia 7xx and 6xx cards in the survey, the GTX 780 sits at just 0.45% (nearly every owner has upgraded to a 980, 980 Ti, or 1080, at almost 4.5% combined), and virtually all owners of the GTX 680, 580, and 480 have upgraded. The AMD side is similar up until you hit the Rx generation, as there is no viable upgrade for AMD in the performance and enthusiast segments.
So, if the PC is to be the role model for rolling generations of hardware, what does it suggest about whether people would buy in?
- PS3/X360 owners would be a negligible number.
- The generation after would represent a sizeable but still smaller share of owners.
- The current generation would hold the majority of owners.
Or to think of it another way. Each console would enjoy support for a reasonable amount of time (6-8 years). Most people would move to the newest "generation" of consoles. When looking at PC users that are analogous to console users, that demographic shifts dramatically in favor of people upgrading to the latest generation of consoles.
In fact, looking at Steam, PC users upgrade their hardware far more often than console users do, despite "rolling generations": X360/PS3-era hardware is less represented on Steam than X360/PS3 consoles are in the console space. I.e., not nearly as many people (as a percentage) upgraded from X360/PS3 to the current generation. And that includes casual gamers buying budget cards that can't match console graphics.
"But," I hear people say, "game developers won't optimize for the latest generation of consoles." What?
Let's use a recent example. Crysis 2 was far beyond the capabilities of all but high-end graphics cards of its day, let alone the consoles of the time, but that didn't prevent the developer from making it and then scaling back the graphics to run on much older PC hardware as well as consoles. And that was complicated by the fact that they also had to port the engine to two vastly different architectures.
If the console hardware had remained architecturally similar (as with PC), supporting it would have required significantly less effort, much like supporting prior generations of PC hardware.
PC is considered by most developers and publishers to be secondary to the console market, yet even then you see some developers rush to take advantage of new advances in technology. Imagine if it were considered the primary market.
"Yes, but you still have to support older hardware, and that hampers the ability to take advantage of newer hardware." OK, but then what excuse do console developers have for doing the same thing? Yakuza 0, while a great game, doesn't exactly do much to take advantage of modern tech (the character models are quite good, though). Berserk and the Band of the Hawk doesn't do much either, with a very dated look. There are plenty of cases where developers on consoles don't take advantage of the latest rendering technology. Same as on PC.
But just like Crysis (or, for a more modern example, Star Citizen), PC developers analogous to a studio like ND have been known to push the technology as far as it can go while at the same time supporting older generations of hardware.
In other words, games for the PS4 Pro and Project Scorpio would have to support at least one generation prior, but that doesn't mean you can't push the hardware and still have those games run on previous-generation hardware at lower graphical settings.
Regards,
SB