But couldn't someone make that same argument with the launch games for the Xbox 3 and PS4 right now? They've been in development for 2 years, no? They could overshoot their targets too. Same with the Vita when it launched. Same with the Wii U and 3DS. I mean, devs have been working with target hardware for a long time. Sure, you won't squeeze the most out of a new system, but you can make a conservative estimate.
Absolutely, yes!
Look at what was said about X360 and PS3 games when the consoles launched. Look at the complaints about how they weren't much better than the previous gen.
But here's the catch. The X360 and PS3 were around for longer than 2 years. People who have been console gamers for a while knew to expect game graphics to increase in quality as the generation went on. Look at GoW on the PS2 as a prime example of that. Compare Halo 3 to Halo Reach.
Compare COD 3's graphics to COD: BO2's graphics, especially the PS3 versions.
Now imagine if you just cut that all off. And every game that is ever launched from now on is basically like the first 1-2 years of the X360 and PS3.
In other words, no learning the best programming practices for each console generation, and no optimizing development and programming practices as the generation goes on for more efficient graphics rendering.
Imagine that from now on, games never efficiently target the console they run on. Games will still get better as time goes on, yes, but you end up with the following.
Buy a console, 2-year lifespan. Games are rough and not totally optimized, similar to the PS3/X360's first 2 years.
New console, 2-year lifespan. Games are not totally optimized, similar to the PS3/X360's first 2 years. Games now target the new console, so they are never optimized for the previous console, which cannot run them well.
New console, 2-year lifespan. The chain snowballs, and basically you're in PC gamer territory, where you need to update your hardware to be able to play the newest PC-optimized (not console-ported) games at high quality.
Or, as someone mentioned before, do developers only target the first console in that chain, since it offers the biggest chance to recoup their investment, so any new console sees absolutely no benefit? Or do we go with a PC model of trying to address all the consoles, potentially introducing bugs, and then have people complain that too much time was spent optimizing for the low-end console, or too much time was spent trying to optimize for the higher-spec'd console? I see the console industry entering its death throes if it attempts to do this.
Compare that to right now.
Buy a console. First 2 years just like above.
Years 3-4, games are starting to be optimized; they look better and perform better, just like above. The difference? The customer doesn't have to buy a new console to play them well.
Years 5-6, games reach high efficiency in graphics rendering and programming. Same increases as above, except you can still play them on a console purchased 5-6 years ago.
Years 7-8, a game here or there may be able to exploit some more efficiencies, but in general games have plateaued. Time for a new console.
And just look at iOS and PC every time there's a new generation of hardware.
On iOS, Apple fixes bugs, introduces new features, etc. that break existing applications, and then expects the developers to fix things. Users have to hope that if an app they use gets broken, the developer is still around and still interested in fixing it.
On PC, it's similar. A new graphics card comes out with new drivers that break compatibility with older games. Or an updated OS breaks compatibility. Or some other component does. But on the PC, instead of the developer having to fix things while the user crosses their fingers, the hardware manufacturers or Microsoft end up fixing or making workarounds for the broken compatibility. Or they work with the software developer to try to fix any incompatibility, assuming the game isn't old enough that the developer is no longer interested.
What's that going to be like on the consoles?
Regards,
SB