*spin-off* Importance of Backward Compatibility Discussion

Aah, you appear to be under the impression that the console manufacturer has infinite resources and can easily absorb all the design, manufacturing and test costs of adding a new SKU with what would have to be significant hardware differences and huge software differences. I too would like to live in your world; unfortunately, I live in the real world, where the Xbox team didn't have enough resources even for the stuff they _did_ want to ship on one SKU.
Well, there was a PS3 with PS2 hardware and one without. The work has been done, the cost has been spent and the software maintained. The world doesn't end with MS :D

It's just that Europe never had the chance to choose (and even then I wouldn't be sure I'd take a BC model over one that generates less heat and noise). To me, having 2 SKUs would be the only way to gauge the "worth" of BC.
 
And let's also not forget that BC doesn't fully exist in mobile. Buy a new handset and all your old handset peripherals, like the dock and car jack, stop working because the connector has been changed. People have to buy new stuff to support their upgrades all the time.

I think incompatibility is easier to relate to as a customer on a hardware basis, when you can see that the part just doesn't physically fit anymore. Even so, headsets have been compatible across various devices for a long time now (the technology and unified interface is called Bluetooth, and in other cases simply [micro|mini]-USB). The exception might be the iPhone, where the proprietary interfaces are a hassle, but also a very lucrative market for those that produce accessories. For a long time though, at least the iPhone 3G through 4S (if I'm not mistaken) were pretty much compatible from a hardware point of view. It only changed with the iPhone 5, and I would expect them to continue this.

I don't think there's much analogy between PC/mobile and consoles. Consoles are a dedicated box where owners are used to leaving the old behind.

Why has Microsoft, all these years, stuck with legacy components throughout all their OSs (since pretty much Windows 95/98) to remain compatible with most old software? In order to keep their monopoly and their influence - because if they had started from scratch every single time they launched a newer version of Windows, they'd have lost a lot of customers or made themselves vulnerable to competitors gaining a foothold in their market. Backwards/forwards compatibility ensured on a grand scale that it was easy for their customers to upgrade to the new OS while still being able to use their old software. Continuity is the key word here.

Continuity that I think will become more and more important in console games as well. As my image further up shows, games are reaching some sort of diminishing returns. Now, I don't mean that on a pixel level (there are enough technically minded people here to argue otherwise), but in a broader sense: game development costs are rising, and at some point visuals might still be improving, but not on the scale we saw when we went from the pixelated, shimmering mess we used to call games to relatively high-quality rendered frames. When the difference between console generations becomes smaller, people will want good reasons to upgrade to newer, more expensive hardware. And with every single console launch without backwards compatibility, you are vulnerable to losing existing market share if you can't bind your customers to your platform. Backwards compatibility does this to some extent, and so does tying them to an ecosystem.

Take the Android/iPhone/Windows ecosystems; as an Android user, I have invested in upwards of 20 programs that I purchased and use on my phone. Continuing with an Android phone means I get to keep using those 20 programs even when I buy a new phone. I have already gone through this process 3 times - from an LG Optimus Speed 2x, to a Samsung Galaxy S3 and now an HTC One M8. That's 3 different OS revisions (Gingerbread [2.3] -> ICS [4.0.4] -> Jelly Bean [4.2.2]). All the programs, even down to the silly Angry Birds, work.

Why would I switch to, let's say, an identical phone running Windows if I would have to repurchase all my programs?

I might consider it for different reasons, let's say if the phone was better or the other OS suddenly did something I have always wanted, but I will always weigh it against the existing investments I have made. At some point it will be crucial for console makers to go the same route, or else they will risk losing customers not only to each other (Sony to Microsoft or vice versa), but potentially to other rising markets as well (smartphones, tablets).
 
I must admit that I really don't care about backwards compatibility on consoles... but I like the trend of getting remastered titles (as long as they are not full price). Improved graphics are what pulls people to new consoles... so why would I want to play an ol' stinky rusty PS360 game on my shiny X4?
 
Why has Microsoft all these years stuck with legacy components throughout all their OSs (since pretty much Windows95/98) to remain compatible with most of old software?
The cyclical arguments have already been made. In short, if you've spent hundreds of dollars on PC software you still use (Office, Photoshop, a video editor), you want to keep access to that software. On consoles it's different, because most gamers move on from old games and don't look back - the software investment is disposable. A revisit for nostalgia is nice, but by and large everyone is playing the latest, greatest games, while still using their 5-year-old copies of Word and Photoshop, which do everything they want and don't warrant a significant repurchase of the latest version.
 
I'd argue that I, like probably most other mainstream gamers out there, have spent more money on games than most average consumers have spent on PC software. I'd argue that over the life of a console, an average gamer ends up with probably up to 10 games, and quite a few more if you include last generation's smaller digital investments.

Even if gamers move on to better games (well, "better" may be the wrong word; let's say more immersive) and upgrade for that reason, I'd still argue that an investment is an investment. It might not be relevant after 20 years, but it sure is when you're in the process of upgrading. Also, as I pointed out earlier, the way newer games distinguish themselves from the games they're replacing is getting smaller and smaller. That adds value to your older investments.

It's a bit like with movies; simply because we've already seen a movie we bought earlier doesn't mean we wouldn't go back and watch it again sometime, even if newer movies make older movies seem ever more dated. The higher the quality becomes, though, the less we will notice the difference between old and new. In that sense, I find it increasingly difficult to tell a movie from ~2004 apart from one made today, but it's quite a bit easier to tell a movie from the early '90s apart from anything post-2000. I see a similar trend happening in games as well. The difference is getting smaller and smaller (for the average non-technically-minded casual, anyway).

Anyway, even if you are correct that the two markets (PC and console) aren't comparable to that level, it doesn't negate the other points I raised on why continuity will become an even more important factor looking into the future:

Phil said:
Continuity that I think will become more and more important in console games as well. As my image further up shows, games are reaching some sort of diminishing returns. Now, I don't mean that on a pixel level (there are enough technically minded people here to argue otherwise), but in a broader sense: game development costs are rising, and at some point visuals might still be improving, but not on the scale we saw when we went from the pixelated, shimmering mess we used to call games to relatively high-quality rendered frames. When the difference between console generations becomes smaller, people will want good reasons to upgrade to newer, more expensive hardware. And with every single console launch without backwards compatibility, you are vulnerable to losing existing market share if you can't bind your customers to your platform. Backwards compatibility does this to some extent, and so does tying them to an ecosystem.

Take the Android/iPhone/Windows ecosystems; as an Android user, I have invested in upwards of 20 programs that I purchased and use on my phone. Continuing with an Android phone means I get to keep using those 20 programs even when I buy a new phone. I have already gone through this process 3 times - from an LG Optimus Speed 2x, to a Samsung Galaxy S3 and now an HTC One M8. That's 3 different OS revisions (Gingerbread [2.3] -> ICS [4.0.4] -> Jelly Bean [4.2.2]). All the programs, even down to the silly Angry Birds, work.

Why would I switch to, let's say, an identical phone running Windows if I would have to repurchase all my programs?

I might consider it for different reasons, let's say if the phone was better or the other OS suddenly did something I have always wanted, but I will always weigh it against the existing investments I have made. At some point it will be crucial for console makers to go the same route, or else they will risk losing customers not only to each other (Sony to Microsoft or vice versa), but potentially to other rising markets as well (smartphones, tablets).
 
To be fair, Sony have already experimented with this. There are games you can buy with one fee for PS3, PS4 and Vita, and three or so games I bought on PSN in the past, like Flower, got a PS4 update that I didn't have to pay for again but could immediately download on PS4. So I think it's definitely something they are working on.

At the same time, five years is a long time for a phone. I don't know how many of the apps I bought 4 years ago (when I had my 3GS) I still use today (although there are one or two). My iPod touch 3G 8GB lost OS support at 4.x, and the 3GS at 6.x... but at least the apps generally live on. Not that many good apps actually die from lack of support - it hasn't happened to me at all yet.
 
That's true. I'd also add that backwards compatibility isn't important on a per-game basis. It becomes important as a whole when I look back at what I invested in each console I bought. The same applies to my Android devices as well - it's not the single piece of software that matters, but the sum of the entire investment.

I may not have used my PS3 (which came with partial backwards compatibility) to replay many PS2 games at the time (as I pointed out, PS3 games were so far ahead visually and, more importantly, from an image-quality point of view), but as a PS4 owner now, I can say that I have actually hooked my PS3 up quite a few times to get some play time out of games I haven't completed yet, or to revisit games that will remain classics for a long time (Wipeout HD among others; I've also hooked it up to play some Uncharted). The difference between PS3 and PS4 games is there, but it isn't so large that I wouldn't gladly revisit some older games every now and then.

My PS3 is so noisy though - and having to hook up multiple consoles to my TV (projector) is tiresome and messy - for the same reason a PC user replacing his old PC wouldn't want to keep it around just to play legacy software.
 
I must admit that I really don't care about backwards compatibility on consoles... but I like the trend of getting remastered titles (as long as they are not full price). Improved graphics are what pulls people to new consoles... so why would I want to play an ol' stinky rusty PS360 game on my shiny X4?

Because I've already purchased the digital version of the 360 game & they're making me rebuy the remastered version without any kind of credit or discount. I want to take my digital collection forward with me. I can't afford to buy the new system without selling the older one.

Tommy McClain
 
On consoles it's different because most gamers move on from old games and don't look back - the software investment is disposable.

I'm not sure that's true. Gamers are forced to abandon their games because they have no choice. It's certainly not disposable to me.

Tommy McClain
 
Because I've already purchased the digital version of the 360 game & they're making me rebuy the remastered version without any kind of credit or discount. I want to take my digital collection forward with me. I can't afford to buy the new system without selling the older one.

Tommy McClain

You can probably only get $100-$150 at most for 360 hardware. Maybe unless Gamestop runs a special or something.

So that argument is kind of suspect, imo. You can afford $300 but not $400/$500? Were you going to buy the XB1 with Kinect? Because that's the same extra $100 you'd save by selling a 360. Remember, you specifically said you won't sell the games (and you can sell them either way).

I do wish I could sell my 360 + games to help finance a One, not because I need to, but just because of a weird rationalization, I guess. Somehow my brain likes it better as "I spent a net $200 on this" than "I spent $400 on it". But I'll keep my 360 so I can compare graphics with the One, and also for certain games.

I find it a bit odd how only Xbox gamers seem to care about BC. I don't see PlayStation gamers waging vendettas over BC like this, but maybe I'm wrong. At least not that I've noticed. Suspicious.
 
I'm not sure that's true. Gamers are forced to abandon since they have no choice. It's not disposable to me for sure.

Tommy McClain
This has been discussed on this board before. Some people like their old games, but most don't care - and that's on a board of fairly hardcore gamers. The old arguments revisited: I've lost my Spectrum catalogue, my Master System, a load of DOS games, all my Amiga stuff, PS1 stuff, PS2 games. Don't care. There are a few I'd revisit, and plenty I tried revisiting only to find them dull and tired now. I'd much rather play new, better games. The argument then becomes one of scale - what proportion of gamers value what portion of their past games to what degree, and factoring that into the cost of including BC in a new system. Establishing the existence of people who like their old catalogue and people who don't care doesn't help quantify a value for BC.

Also, one can determine numerically that BC isn't that big a deal. People have a finite number of gameplay hours. If new games are selling as well as ever, then the ratio of new games being played versus old ones must be significant, as there's less time left to actually spend playing old games. Unless there's another time consumer being displaced - so that people play x hours a week of new games and y hours a week of old games while spending y fewer hours a week on something else they used to do (watching TV?) - you literally can't squeeze old games in alongside new games. He says, trying to find some new angle on a topic that really has had pretty much everything said about it already! ;)
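The zero-sum time-budget argument above can be sketched numerically. All the numbers here are made up purely for illustration; `WEEKLY_GAMING_HOURS` is an assumed figure, not data from the thread:

```python
# Illustrative sketch of the fixed time-budget argument.
# All numbers are hypothetical, chosen only to make the trade-off visible.

WEEKLY_GAMING_HOURS = 10.0  # assumed finite weekly budget per gamer


def hours_left_for_new_games(hours_on_old_games: float) -> float:
    """Every hour spent replaying the back catalogue comes straight out of
    the fixed budget available for new releases."""
    return max(0.0, WEEKLY_GAMING_HOURS - hours_on_old_games)


# Unless some other activity (TV?) is displaced, heavy BC use eats
# directly into new-game playtime:
for old in (0.0, 2.0, 5.0, 10.0):
    print(f"{old:>4} h on old games -> {hours_left_for_new_games(old)} h for new games")
```

The point of the sketch is simply that the budget is fixed: hours spent on the back catalogue and hours spent on new releases can only trade against each other, unless a third activity shrinks instead.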
 
You may not care; others might. In other words, whether someone cares might very well depend on the level of investment they're about to lose by migrating to newer hardware. Also, there's a difference between losing the investment in games that are 20 years old and look outright horrible on modern TVs, and losing investments made since last generation, where there is a clear trend of people going digital and being tied to a login account that continues to exist in the future.

There are two arguments here;

- people that want B/C because they get to keep their existing investments
- that B/C makes sense from a market point of view as you tie your customers to their investment and your brand and eco-system

You can't ignore what makes Android/iTunes a strong ecosystem and then disregard it in the console space on the premise that console gamers are different. They're not.
 
That's true. I'd also add that backwards compatibility isn't important on a per-game basis. It becomes important as a whole when I look back at what I invested in each console I bought. The same applies to my Android devices as well - it's not the single piece of software that matters, but the sum of the entire investment.

I may not have used my PS3 (which came with partial backwards compatibility) to replay many PS2 games at the time (as I pointed out, PS3 games were so far ahead visually and, more importantly, from an image-quality point of view), but as a PS4 owner now, I can say that I have actually hooked my PS3 up quite a few times to get some play time out of games I haven't completed yet, or to revisit games that will remain classics for a long time (Wipeout HD among others; I've also hooked it up to play some Uncharted). The difference between PS3 and PS4 games is there, but it isn't so large that I wouldn't gladly revisit some older games every now and then.

My PS3 is so noisy though - and having to hook up multiple consoles to my TV (projector) is tiresome and messy - for the same reason a PC user replacing his old PC wouldn't want to keep it around just to play legacy software.

Get a receiver with multiple HDMI inputs?

I have had one since before the PS3, I think, but I'm looking at getting a new one that supports HDMI-CEC, as my new TV has it and so far I've preferred to use that instead. Currently I use a PS4, a PS3 and the TV's built-in cable receiver, so I can do everything with the TV's remote, including controlling the PS3 and the PS4, although the latter still has a few niggles (Netflix, for instance, works with HDMI-CEC on PS3 but not on PS4).

What I dislike most is that, especially in music games, I notice the audio through the TV has some pretty bad lag, whereas with the HDMI in on the receiver it was almost zero.
 
B/C is important but hampered by existing agreements and old technical approaches.

I feel that with PS4, Sony may have already planned for h/w upgrade paths. In other words, PS5/6/7... may have better B/C chances than previous Playstations.

If they push for 60 fps or higher, then they are less likely to be able to rely on PS Now on a worldwide basis.
 
You may not care, others might.
Which is why personal anecdotes don't really help quantify the value of BC.

There are two arguments here;

- people that want B/C because they get to keep their existing investments
- that B/C makes sense from a market point of view as you tie your customers to their investment and your brand and eco-system

You can't ignore what makes Android/iTunes a strong ecosystem and then disregard it in the console space on the premise that console gamers are different. They're not.
I haven't ignored it, and I'm pretty sure further back in this thread I pointed out the same. The ecosystem adds value. As a console designer, you have to weigh the benefits of the ecosystem against the costs of implementing it. It's the understanding of all the pieces together, and not the isolation of any particular factor, that is essential to making the right choices. I'm not looking at PS4 and thinking Sony made the wrong choice. If PS5 isn't BC with PS4, I'll be scratching my head, but then if Sony manage something amazing that justifies the transition (a whole new tech paradigm, perhaps), console gamers will forgive them.

I'm also unconvinced by your argument that diminishing returns mean older games are retaining their value. GTAV on PS4 is WAY better than on PS3, and worth an upgrade to many. But most importantly, if gamers are happy to keep playing old games going forward because they don't look too bad, that means less reason to buy new games, which in turn would mean the end of the gaming industry as we know it. Unlike an old movie you can put on for 2 hours, or an old CD you can listen to in the background while doing something else, playing an old game can mean dedicating ten-plus, even tens of, hours, leaving less time for new games. I think the limits of the time resource mean that what gaming time people have will be spent mostly on the latest, greatest thing. Why fire up Ryse on your XB4k when Epic's Roam is better, a new experience, and what everyone else is playing? Or play GT6 when GT8 is better in every way*? Like I say, if people do spend their time playing old games, Roam and GT8 et al. are going to see worse and worse sales until the business collapses. Unless, I suppose, core gaming becomes as ubiquitous as watching TV and the market grows to 400 million. Then the old libraries may be revisited by new gamers.

* Clearly GT is one of those franchises that will eventually hit a wall, after which there'll be no point buying a new version, although by then it'll maybe have become an online service with content. That's actually the direction gaming should probably be going, and a lot of this discussion is probably moot, referring to a legacy model of fixed hardware and long hardware iterations.
 
Well, there was a PS3 with PS2 hardware and one without. The work has been done, the cost has been spent and the software maintained. The world doesn't end with MS :D

It's just that Europe never had the chance to choose (and even then I wouldn't be sure I'd take a BC model over one that generates less heat and noise). To me, having 2 SKUs would be the only way to gauge the "worth" of BC.
There was no 2-SKU system for the PS3. The PS3 launched with a single hardware SKU (ignoring hard drive size), and then later _removed_ functionality in a new SKU. Removing functionality, especially BC functionality, is significantly cheaper than adding functionality. They stopped manufacturing the BC model as soon as they introduced the non-BC model.

Now, the partial BC solution they launched in the EU must have cost them a pretty penny, but they obviously considered the removal of the components a bigger win than the cost of testing all the titles and developing the emulation layer. Adding BC is the opposite: you don't have any cost wins. It's higher development cost _and_ higher component cost.
 
There was no 2-SKU system for the PS3. The PS3 launched with a single hardware SKU (ignoring hard drive size), and then later _removed_ functionality in a new SKU.
I recollect there were two SKUs in the beginning (one had a black Blu-ray faceplate, one silver), but the hardware differentiators were the memory card reader slots and the number of USB ports. One had 4x USB ports and a bunch of card reader options, and the other had 2x USB ports and no card readers.

By the time the PlayStation 3 made it to Europe, Sony had already yanked one of the chips required for backward compatibility and replaced it with a software layer. The original Japan/US launch included full(ish) hardware compatibility. And shortly after the European launch, all the B/C went.

I can't recall any time, except for natural stock rundowns at retail, when Sony offered both a PS3 with B/C and a PS3 without B/C.
 
There was no 2-SKU system for the PS3. The PS3 launched with a single hardware SKU (ignoring hard drive size), and then later _removed_ functionality in a new SKU. Removing functionality, especially BC functionality, is significantly cheaper than adding functionality. They stopped manufacturing the BC model as soon as they introduced the non-BC model.

Now, the partial BC solution they launched in the EU must have cost them a pretty penny, but they obviously considered the removal of the components a bigger win than the cost of testing all the titles and developing the emulation layer. Adding BC is the opposite: you don't have any cost wins. It's higher development cost _and_ higher component cost.

I think the chromeless version also lacked wireless. Damn, time goes fast.
 