*spin-off* Importance of Backward Compatibility Discussion

Fully consistent emulation heavily depends on a massive advantage in straight-line speed over the architecture being emulated.
Matching or exceeding throughput is also a requirement, but that has proven much easier to achieve as transistor budgets have grown.

There's no avoiding that Xenon and Cell were physically clocked at 3.2 GHz, and the serial component of the workload can't be consistently emulated at speed. If the parallelism isn't there, or the older architecture didn't fall on its face running the code, there's no getting past the physical reality that a lower-clocked new chip takes longer for each step.

That's without taking into account that at least some emulated steps will have to be mapped to multiple clocks on the host architecture.

To get closer, the hardware design has to shift to match the architecture it is emulating. The best example of emulating a different architecture is the hardware cracking of x86 instructions into internal micro-ops, which pushes the emulation steps into the pipeline and forces most of the activity back to clock-cycle granularity.
It still doesn't cover all scenarios if the clock is slower, and in this case it is much slower.
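
To make the per-step overhead concrete, here is a minimal sketch of what a pure software interpreter does for every guest instruction. The mini-ISA, the encodings, and the names are hypothetical, invented only for illustration; the point is that each emulated step pays a fetch/decode/dispatch tax before any real work happens.

Code:
#include <stdint.h>

/* Hypothetical mini-ISA, purely for illustration. */
enum { OP_ADD, OP_LOAD, OP_BRANCH, OP_HALT };

typedef struct {
    uint32_t regs[32];
    uint32_t pc;
    uint8_t  *mem;
} GuestCpu;

void run(GuestCpu *cpu)
{
    for (;;) {
        /* Every guest instruction pays this overhead... */
        uint32_t insn = *(uint32_t *)(cpu->mem + cpu->pc);  /* fetch  */
        uint32_t op = insn >> 26;                           /* decode */
        uint32_t rd = (insn >> 21) & 31;
        uint32_t ra = (insn >> 16) & 31;
        uint32_t rb = (insn >> 11) & 31;

        switch (op) {                                       /* dispatch */
        case OP_ADD:  /* ...before one host add does the real work */
            cpu->regs[rd] = cpu->regs[ra] + cpu->regs[rb];
            cpu->pc += 4;
            break;
        case OP_LOAD:
            cpu->regs[rd] = *(uint32_t *)(cpu->mem + cpu->regs[ra]);
            cpu->pc += 4;
            break;
        case OP_BRANCH:
            cpu->pc = insn & 0x03FFFFFCu;
            break;
        case OP_HALT:
            return;
        }
    }
}

If that tax averages, say, ten host instructions per emulated instruction (an illustrative figure, not a measurement), the host needs roughly ten times the guest's serial throughput just to hold a 3.2 GHz pace; Jaguar cores in the 1.6-1.75 GHz range start that race well behind.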
 
I am no fan of BC especially in this day and age and under these circumstances.

I feel reinvesting in hardware I already own just to unclutter my entertainment center is a wasteful endeavor.

Does it make sense that you can stream PS3 gameplay to a PSP or Vita through Remote Play, but you can't do it through your PS4 (Sony hasn't readily described such functionality)?

Does it make sense that both these companies are talking about cloud streaming games over hundreds of miles from huge data centers, when there has been practically no talk of using my purchased hardware and cloud streaming from 50 feet away, using that little data center I call my closet? Can't we use that Ethernet port for something other than reducing multiplayer lag?

My biggest disappointment of the next-gen consoles is that home streaming has taken a back seat to cloud computing. Media and game extenders are what I was ultimately hoping for this gen: the ability to game in every room of my house without the need to purchase additional consoles to facilitate this desire. What makes it worse is that there are companies doing this today. Why can't I buy a couple of Xbox- or PS-branded OnLive-like boxes and set them up in my living room, son's room, and bedroom, streaming games and other media from my PS3, 360, future PS4, and future XB1 sitting in some corner in a cabinet with a modem?

It's my opinion the BC discussion this gen should have been reduced to "I want BC in my new console because I don't want a PS3 or 360 in my house".

Make compatibility with current hardware a more feasible solution and practically no one would be complaining about the lack of BC. How do Sony and MS expect to dominate our living rooms with hardware that's meant to last 8-10 years when they lack the foresight to provide solutions that everyone else is going to offer as standard features in the next few years?
 
...It's my opinion the BC discussion this gen should have been reduced to "I want BC in my new console because I don't want a PS3 or 360 in my house"...

Who knew xb360 and ps3 were built to last forever?

And of course, everyone loves having double the number of controllers in the living room.

c'mon man.

I'd be all for dropping BC if it meant vastly superior hardware in the box. But as is, the new CPUs are more lateral moves than anything else, thus breaking BC for not much gain.

For Sony, it's a shame, as Cell was a rather forward-looking design that scaled REALLY well, but I understand that some devs never liked the architecture and found it too time-consuming. That being the case, drop in an x86 and lean on GPGPU moving forward.

Makes sense.

For MS though, how many devs complained that the xb360 was a huge pain to dev for? The weak spot was obviously the lack of OoOE, but that can be mitigated with a larger cache. Or stick an OoOE PPC core in there and work out the quirks.

Either way, we are talking about a loss (no BC), with not much gain (performance) which was completely avoidable.
 
...Fully consistent emulation heavily depends on a massive advantage in straight-line speed over the architecture being emulated... It still doesn't cover all scenarios if the clock is slower, and in this case it is much slower...


Thanks for going more in depth with the issues involved with BC.

I'd have to say that the engineers at MS had to be aware of these issues in choosing Jaguar, right? They also had to have a pretty solid understanding of what performance to expect out of Jaguar, too.

Either that, or like many things at MS lately, it seems one hand didn't know what the other hand was doing.
 
I wouldn't consider it a form of BC if you have to pay for it. It's just a game-streaming service with an old library.

I have no insight here, but I very much doubt it will be free, and I doubt that a PSN+ subscription will cover the costs.
 
...I'd have to say that the engineers at MS had to be aware of these issues in choosing Jaguar, right?...

The engineers would have known this going in.
That's why having an emulator was probably skipped in favor of testing for recompilation, which can get a lot closer than being forced into a software loop running a software loop.
There are issues with that as well, particularly if the algorithms and data structures don't match the hardware. Totally getting past that would require a rewrite of the code, which is the same as not having BC.

It's not a matter of perfection, either. The hardest code to emulate well would be performance-critical, highly optimized code. A lot of code isn't important enough to worry about being slower, and less demanding titles could get away with it.

Having a BC library that has many, but not all, titles is an option.
From the sound of it, whatever threshold was set for "good enough" wasn't met, but that wouldn't be proven without going out there and testing existing titles.
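
To contrast with the interpreter loop sketched earlier in the thread: recompilation does the fetch/decode/dispatch work once, offline, and emits straight native code per guest block. The guest block, address, and names below are hypothetical, a sketch of the idea rather than anything MS actually built.

Code:
#include <stdint.h>

typedef struct {
    uint32_t regs[32];
    uint8_t  *mem;
} GuestCpu;

/* Hypothetical output of a static recompiler for the guest block
       add r3, r3, r4
       lwz r5, 0(r3)
       blr
   At runtime this is just a handful of host operations, with no
   per-instruction fetch/decode/dispatch loop left to pay for. */
static void block_0x82001000(GuestCpu *cpu)
{
    cpu->regs[3] += cpu->regs[4];                           /* add */
    cpu->regs[5] = *(uint32_t *)(cpu->mem + cpu->regs[3]);  /* lwz */
    /* blr: control returns to a dispatcher that picks the next block */
}

The catch is the one raised above: the translated code still inherits algorithms and data layouts tuned for the original hardware, which is why recompilation narrows the gap without erasing it.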
 
...Who knew xb360 and ps3 were built to last forever? And of course, everyone loves having double the number of controllers in the living room...

Your old PS2 probably lasted longer than the inclusion of hardware BC in the PS3. LOL. Plus, the PS2 was manufactured quite a while after the launch of the PS3, and used consoles are readily available on the internet.

And if you are streaming, the only reason you would need multiple controllers for multiple devices from the same manufacturer is because some bonehead designed it that way. Do you think Gaikai BC will require PS3 controllers?
 
70 million+ Xbox 360 owners is not incentive? Wow!

Tommy McClain

How many of them will actually play their old games? If I'm to believe the GAF crowd, most of them sold their games anyway, so they don't have any. The question becomes whether the feature is worth the cost. I'd just as soon they not waste engineering and silicon dollars on a feature I will never use, but how much would you be willing to pay for BC? Would you give up 15% performance for it? Or pay an extra $50? Which of those strategies gets MS the most customers?
 
...Your old PS2 probably lasted longer than the inclusion of hardware BC in the PS3... Do you think Gaikai BC will require PS3 controllers?...

Part of it is access to the old library; the other (bigger) part is cross-generational play luring other gamers into the new experience.

Without either of these, switching platforms becomes an easier proposition.
 
...how much would you be willing to pay for BC? Would you give up 15% performance for it?...

How about 0%?

How about dropping in a quad (or more) Xcpu (12 cores, 24 threads)? Couple that with either larger caches or a single large 64-128 MB EDRAM pool to mitigate in-order execution stalls. Voilà! BC and higher performance.
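
As a rough sanity check on the EDRAM idea, a toy model of effective CPI shows why stall cycles dominate an in-order core. Every number below is an assumption picked for illustration, not a Xenon measurement.

Code:
#include <stdio.h>

/* Toy model: an in-order core eats the full miss penalty on every
   stall, so effective CPI ~= base CPI + miss rate * miss latency.
   All figures are illustrative assumptions. */
int main(void)
{
    double base_cpi   = 1.0;    /* assumed CPI when hitting in cache   */
    double miss_rate  = 0.02;   /* assumed misses per instruction      */
    double dram_cost  = 500.0;  /* assumed penalty to DRAM, in cycles  */
    double edram_cost = 40.0;   /* assumed penalty to a big EDRAM pool */

    printf("DRAM-bound CPI:  %.1f\n", base_cpi + miss_rate * dram_cost);
    printf("EDRAM-bound CPI: %.1f\n", base_cpi + miss_rate * edram_cost);
    /* ~11.0 vs ~1.8: a large on-package pool attacks exactly the stall
       component that hurts an in-order design the most. */
    return 0;
}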

But that avoids the real question. It isn’t how much the customer is willing to pay for this expected feature (at this point, it is expected).

The question is: how much is MS willing to lose for not including it?
 
...How about 0%?... The question is: how much is MS willing to lose for not including it?...

So you're willing to give up nothing for a feature. If it isn't very important to you, why would any designer make sacrifices in the design to include a feature you value at nothing? You have to give up something to add BC; it comes down to money or performance. Is BC worth more than the cost? MS and Sony obviously don't think so.

It is not expected. BC was fairly limited on the PS3 and 360 and mostly killed off during the generation, and judging from the record preorders of both of these BC-free consoles, it doesn't seem to be a major impediment.
 
So you're willing to give up nothing for a feature.

Nothing needed to be "given up".

MS already owned the license for Xcpu. They had/have the talent in-house to push that design beyond the 3-core, 6-thread design currently in the xb360.

Switching from this architecture to x86 gave us not much of a performance bump (any?), but it broke BC, and I'm quite sure MS had to pay AMD for the design in the Xbone.

So the question is: how much did it cost to switch from PPC to x86?
 
...It is not expected. BC was fairly limited...

Not expected here (after the specs leaked showing a weak x86 core), but the general public doesn't get why the new box can't play the old games. They've been living in a world where the new gadget works with the stuff from the old gadget.

It's been this way for years now with ipad, iphone, android, etc.

It's expected.

Ask around outside the tech world: the general public doesn't get why the new box won't work with the old content. All they know is, "That's stupid, why would MS do that?"

Followed by a blank stare as I/you get into details of architectural differences in the hardware which make it impossible.
 
Switching from this architecture to x86 gave us not much of a performance bump (any?), but it broke BC and I'm quite sure MS had to pay AMD for the design in xbone.

Sure it did; the Xenon cores were pretty lame when they were released and are decrepit by any modern standard.
Jaguar is much more consistent in performance, and it has a wealth of resources and fewer glass jaws than Xenon.
That there are certain classes of code, like vector code that leverages Xenon's FPU perfectly, or certain kinds of low-ILP pointer chasing that don't get gutted by everything else that's terrible in Xenon, doesn't mean Jaguar isn't a vast improvement.
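
For anyone wondering what "low-ILP pointer chasing" looks like in practice, a hypothetical example (not from any shipped title): each load depends on the one before it, so the loop is bound by load-to-use latency rather than core width, and a cache miss stalls any core; OoOE and prefetchers only soften the blow.

Code:
#include <stddef.h>

/* Low-ILP pointer chasing: the traversal is a chain of dependent
   loads, so no amount of issue width helps. Hypothetical code. */
struct Node {
    struct Node *next;
    int          payload;
};

int sum_list(const struct Node *n)
{
    int total = 0;
    while (n != NULL) {
        total += n->payload;  /* cheap */
        n = n->next;          /* the next load can't start until this
                                 one returns; a miss here stalls the
                                 whole loop on any microarchitecture */
    }
    return total;
}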
 
...Jaguar is much more consistent in performance, and it has a wealth of resources and fewer glass jaws than Xenon...

Higher efficiency for sure.
OoOE is a win.
Larger caches are a win.
Prefetch is a win.
Branch prediction is a win.

But the overall performance gain isn't an order of magnitude over xcpu.

Many of these wins could be found in other PPC designs from IBM, but even without that, the existing design could have been leveraged by adding a larger cache (EDRAM) and more cores.
 
...MS already owned the license for Xcpu... So the question is: how much did it cost to switch from PPC to x86?...

They own the license for Xenon and Xenos. It doesn't mean they are free to create derivatives of those designs without AMD's or IBM's permission or compensation.
 