*spin-off* Importance of Backward Compatibility Discussion

Why not just DL one VM and not hundreds of MB with each game?
I think this is inherent in XBO. The game OS requires that the game contains its own package entirely. So each game essentially ships with its own drivers, whatever the SDK calls were, the game itself, etc. I don't think they can get away with having a universal player. That doesn't exist for current-gen games on Xbox, from my understanding.

Upsides are that each package can be customized; downsides are that each package is unique and the extra data comes at a cost to the user.
 
Why not just DL one VM and not hundreds of MB with each game?
The game seems to need to be in XB1 format to feed to the 360 emulator, as it were. Do the 360 games show up on the dashboard, or do you have to boot into 360 mode to see available 360 games?
 
As far as I remember, Sony were positioned to be one of two (?) blue diode suppliers and one of very few BRD manufacturers. They stood to do very well from the format, and its inclusion in PS3 was possibly the key to securing some, or even a lot, of the content providers. This going by the format war and watching which way the likes of WB were going to swing. We all know plenty of folk bought a PS3 for its Blu-ray potential, to the point that some even argue that's the only reason a lot of the early machines were bought.

Sony were, I'm sure, among those able to provide blue-violet lasers operating at the wavelengths suitable for Blu-ray players, but I seem to recall Sony had considerable and expensive production issues. You know what they say about the best-laid plans of mice and men. In any case, profits from new media formats have always been in the media and not in the hardware, so Sony surely weren't naive enough to think they could price key components highly for a format they wanted to see widely adopted.

Why? MS have described it as just a repackaging. The emulator runs on top of the XB1 system and needs to integrate with the system for things like game capture and services, hence it makes sense to structure the 360 game as an XB1 app. This also explains why two-disc titles aren't supported yet: the disc image file only supports a single disc, and they haven't emulated dual discs in the package.

I agree the emulator has to run on the Xbox One system and needs to integrate with the system for things like game capture and services, but this is the purpose of an emulator: to provide a bridge between the emulated environment and the host environment of the system the emulator is running on.

I know about the lack of support for multi-disc games, which could have been avoided with a conventional emulator approach, you know, like the one used by just about every computer/console emulator written in the last 25 years. It reads the data off the disc in the drive; when you need to swap the disc, you swap the disc. It works almost like magic! :yes:

They probably could have created an emulator that boots in 360 mode, but that'd lack convenient usability. And given emulation likely isn't 100%, QA is still going to be needed to limit the list of available titles to those that work. So having a system where a game has to be repackaged in an XB1 format and is tested in the process makes sense. I see little reason to think MS are lying about this and that they are having to actively tweak every title to run.

Inserting your 360 disc and just having it work like other emulators lacks conventional usability? But inserting your disc, having to download a repackaged version of the same game on the disc you just inserted, then using that grants conventional usability? Well I have to say that makes zero sense to me. As for the QA issue, surely the obvious solution is for the emulator to have a list of "known good" game disc IDs which it will run, throwing up a warning for anything not yet tested. This is what most emulators with patchy support do, an approach popularised by MAME from its early days.
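For what it's worth, the whitelist idea is trivial to implement. A minimal sketch in Python (the disc IDs and the exact behaviour are entirely made up for illustration):

```python
# Hypothetical "known good" compatibility list, in the style popularised
# by MAME: run verified titles normally, warn on anything untested.
KNOWN_GOOD = {"XE123-456", "XE789-012"}   # invented, verified disc IDs

def launch_status(disc_id: str) -> str:
    """Decide how the emulator should treat a given disc."""
    if disc_id in KNOWN_GOOD:
        return "run"              # tested title, launch normally
    return "warn-untested"        # launch (or block) with a warning

print(launch_status("XE123-456"))   # run
print(launch_status("XE999-999"))   # warn-untested
```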

Also, I'd prefer you not suggest I'm indicating that Microsoft are lying. What I said was "I am not convinced this is pure emulation in the traditional machine emulator sense", which is true because on the Giant Bomb podcast Phil Spencer confirmed they need to tweak the emulator on a per-title basis. That's not something you typically need to do with an emulator, other than at a crude level like MAME, where the directory of supported games also indicates what chips, RAM and controller configurations need to be emulated for each game. At last count, MAME emulated more than 20,000 distinct configurations of hardware.

I would be interested in learning how this tweaking is done: in particular, how much do they play each game to determine the tweaks, to what degree may one emulator differ from another, and what can they tweak and by how much?

While it's undoubtedly preferable to no b/c on PS4, I do feel that Microsoft could have engineered a smarter solution than the current implementation. I wonder if they just rushed it for the E3 announcement and the final solution will be better.
 
They all behave like normal Xbox One games. There is NO X360 mode.
As expected. So as Phil Spencer describes, they appear to the XB1 system as XB1 games. Thus they need to be in an XB1 package. I've no idea what that entails, but it clearly requires a whole load of data. Perhaps the data is encrypted differently, or something? The bloat is curious, but I don't see it as reason to think the PR is lying about the implementation.
 
Inserting your 360 disc and just having it work like other emulators lacks conventional usability? But inserting your disc, having to download a repackaged version of the same game on the disc you just inserted, then using that grants conventional usability? Well I have to say that makes zero sense to me.
Switch on the XB1. See 360 games alongside XB1 games. Launch a 360 game. Take a screenshot. Get an invite to an XB1 game. Swap over to the XB1 game seamlessly. When finished, swap back to the 360 game. Chat across Live with XB Live users on XB1 while playing a 360 game.

You lose all of that with a dumb fixed emulation solution.
I would be interested in learning how this tweaking is done: in particular, how much do they play each game to determine the tweaks, to what degree may one emulator differ from another, and what can they tweak and by how much?
What's the difference between a per-game emulator tweak and a per-game emulator shim?
 
I think this is inherent in XBO. The game OS requires that the game contains its own package entirely. So each game essentially ships with its own drivers, whatever the SDK calls were, the game itself, etc. I don't think they can get away with having a universal player. That doesn't exist for current-gen games on Xbox, from my understanding.

No, but you only need one package to provide the emulated 360 package environment, because the 360 didn't have a concept of each game package including a virtualised OS. It may be a security limitation, where game OS code cannot access data that isn't signed for the application, because this could be literally any 360 game disc in existence. Having said that, this would be a fairly arbitrary security limitation; I don't know what kind of vulnerability it would be designed to protect against.

Microsoft / Phil Spencer have had ample opportunities to explain why they went with this implementation, so I'll be surprised if they feel inclined to share any more. Whatever its limitations, it beats Sony's lack of any b/c!
 
Switch on the XB1. See 360 games alongside XB1 games. Launch a 360 game. Take a screenshot. Get an invite to an XB1 game. Swap over to the XB1 game seamlessly. When finished, swap back to the 360 game. Chat across Live with XB Live users on XB1 while playing a 360 game. You lose all of that with a dumb fixed emulation solution.
And with the current implementation you trade other things, like not playing multi-disc games and having to keep 360 games installed on the HDD rather than just running them from disc. Choices, choices, choices.

What's the difference between a per-game emulator tweak and a per-game emulator shim?

Since Microsoft haven't gone into detail, it's hard to say. It could be as simple as a core emulator with options to tweak aspects of the emulation, or it could be completely bespoke builds of the emulator for certain games using certain hardware features. We use both approaches to hardware emulation where I work. The first is preferable for flexibility and quick reconfiguration, but if your host isn't sufficiently fast relative to your target emulation hardware (which is probably the case here) then you may want to compile a binary that emulates just specific things very, very well (or fast).
 
If there is a difference between a shim and a tweak, neither of which is a technical term for emulation AFAIK, then you have a decent argument. The understanding for most of us would come from taking 'shim' to mean per-title adjustments to the emulator, as on 360. Spencer specifically calls out that they are doing things differently to the 360, where some games needed tweaks to the core emulator. I recall the description being something along the lines of 'levels': simpler games running on the lowest level ran fine on the emulator without direct intervention, but more complex titles needed specific adjustment. Perhaps if someone had details of exactly what XB emulation on 360 was, it'd help narrow down the possible differences?
 
Shim is a term in virtualisation and emulation: a layer within the emulated environment that manages API redirection and translation. E.g. you're running your 360 game code on Xbox One, the player gets an achievement, and the game calls the standard 360 OS function Achievement(), but rather than passing that to the emulated 360 system, the shim redirects it to Xbox One's equivalent. Maybe the Xbox One API has more arguments, so the shim layer takes care of this as well; the 360 game uses the Xbox One API, even though it's different, without any comprehension of what's going on. All the 360 game knows is it called a 360 API and everything is good.

An extensive shim layer is something you want if your emulated system uses a different instruction set to your host system, like the 360's PowerPC and the Xbox One's x86. You can rewrite the most common functions of the emulated environment in native code in the host environment, then redirect to them, saving the overhead of emulating those CPU calls in non-native instructions. :yep2:
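As a toy sketch of that redirection idea, in Python for brevity (the function names, the "current_player" value and the extra 'rarity' argument are all invented; neither console's real API is public):

```python
# Hypothetical shim: the game calls a "360-style" API, and the shim
# translates the arguments and forwards the call to a "host" equivalent
# implemented natively, instead of emulating the original code path.

unlocked = []   # stand-in for host-side achievement state

def host_unlock_achievement(user_id: str, achievement_id: int, rarity: str):
    # Stand-in for the host (Xbox One-style) API, which here takes an
    # extra 'rarity' argument the old API never had.
    unlocked.append((user_id, achievement_id, rarity))

def shim_achievement(achievement_id: int):
    # The game calls the old single-argument API; the shim fills in the
    # arguments the new API needs and redirects the call. The game has
    # no idea the host API is different.
    host_unlock_achievement(user_id="current_player",
                            achievement_id=achievement_id,
                            rarity="unknown")

shim_achievement(42)
print(unlocked)   # [('current_player', 42, 'unknown')]
```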

I speculated on tweaks above: it could just be a setting to change the emulator's behaviour, or it could be a bespoke binary version of the emulator. Unless Microsoft tell us, we'll never know.
 
I took his descriptions to mean that on the X360 they needed to customize the shim (code layer) pretty much on a per-title basis, whereas on the XO the emulator code doesn't need changing, but there are parameters, say in the manifest, that can be changed on a per-title basis. Hence it scales to being able to do a lot of games more easily and quickly.
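If that reading is right, the difference could be sketched as one generic emulator whose behaviour is keyed off per-title data rather than per-title code. All the titles, field names and values below are invented for illustration:

```python
# Hypothetical per-title manifest scheme: a single emulator binary with
# generic defaults, overridden by data shipped with each repackaged game.
DEFAULTS = {"gpu_sync": "loose", "audio_latency_ms": 40, "threads": 6}

manifests = {
    "ExampleRacer":   {},                        # runs fine on defaults
    "ExampleShooter": {"gpu_sync": "strict"},    # needs tighter GPU sync
}

def emulator_config(title: str) -> dict:
    """Merge a title's manifest overrides onto the generic defaults."""
    cfg = dict(DEFAULTS)
    cfg.update(manifests.get(title, {}))
    return cfg

print(emulator_config("ExampleShooter")["gpu_sync"])   # strict
print(emulator_config("ExampleRacer")["gpu_sync"])     # loose
```

The point of the sketch is the scaling argument: adding support for another title means adding a data entry, not shipping another build of the emulator.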
 
I guess that depends on a person's expectations. The initial list of games that worked properly was short. It included GTA Vice City but not San Andreas, although that was added in time, along with many of the PS2's signature games like Ico, Jak & Daxter, Ratchet & Clank, Sly Cooper, etc.

True enough. I remember Jak & Daxter not working, though it was kind of a no-brainer, being one of the games that went to extreme lengths to exploit performance gains by using the EE&GS in unique exotic ways. From this point of view, it's of course frustrating, because these games required some form of patching, which if I remember correctly, at least made it more or less playable. I think one of the main problems with BC in regards to PS3 is that they never could really emulate the GS - a basic rasterizer with huge bandwidth. Even with a much more complex GPU with dedicated hardware, it's difficult, if not next to impossible to emulate it. How well that in the end can be achieved, depends on how the software was written for it.

Anyway, backwards compatibility is an interesting discussion. I admit I'm a huge advocate of backwards compatibility... yet I've never really used it. So why am I an advocate of this feature? Because I'm viewing it predominantly from a business/marketability point of view. And when I look at my own usage of it (or lack thereof), I'm starting to realize that this feature is becoming more desirable as generations move on.

Perhaps it's all to do with nostalgia, or perhaps it's because games are reaching some sort of diminishing returns. There might still be huge improvements to be gained in the complexity of games and how realistic, plausible and authentic they appear, yet game mechanics have not progressed that much since last generation. I think some of the biggest improvements found in the PS4 and Xbox One over the previous generation of consoles are not complexity, but image quality, perhaps resolution, and a much better multiplayer/online service.

This time around, I find myself wanting to play a lot more PS3 games than I ever wanted to play PS2 games on my older PS3, because PS2 games were hard to look at once we started getting HD content and the market moved away from CRTs to fixed-resolution LCD displays. But now? I recently played GTA5 on the PS4 with my sister (she still owns a PS3, on which she owns and completed GTA5) and I asked her if she felt the graphics were better in the remaster. She said not really, and if so, she couldn't pinpoint what. Yes, the remaster is better: more complexity, better draw distance, denser traffic, much more consistent framerate, better IQ. It's perhaps not the best game to showcase how much PS4 graphics have progressed since the PS3 days, but it isn't that bad either; there are worse-looking games that have been programmed on PS4 from scratch.
And IQ differences aren't that easy to quantify, especially when you move away from techies like us who post here, analyzing screenshots and marvelling over graphics at the subpixel level. To people who just play games for the enjoyment of it, there isn't a huge difference between PS3 and PS4.

So essentially, I think backwards compatibility is going to increase in relevance, because hardware vendors can't afford to lose customers going into a new iteration of hardware. Microsoft is the one 'learning' this the hard way right now, because they are seeing lots of X360 users upgrade to a PS4 (essentially losing marketshare in the US, their biggest market). Sony couldn't care less right now, because with or without backwards compatibility their market is growing. Next time around, I'm sure Microsoft will be thinking hard about what they can do so this doesn't happen again; they'll be thinking about how well they can tie their customers to their platform.

They'll do that by

1.) strengthening their product synergy (tying Xbox to Windows and vice versa)
2.) increasing the bond to their ecosystem (Xbox Live), essentially the platform that stays alive regardless of what hardware is accessing/running it
3.) making past purchases live on on newer hardware through B/C, so that people see less point in switching when they've already invested much into a given platform

Either that, or eventually the market might change into something more akin to the PC market: constantly evolving hardware, rather than all-new hardware each generation that calls for new software, etc.

And all that because of a 'feature' that isn't really used that much...
 
So when people say Sony had a vested interest in Blu-ray, I say sure they did, naturally. Sony were heavily vested in Blu-ray, but not invested in financial terms, because there was never the expectation Blu-ray would be more popular than DVD; their hope was it would be as popular as DVD, and even that fell far short, because we live in a world where Netflix quality is good enough for most.

The potential financial gains are a separate issue and seemingly not the only reason for Sony's participation in Blu-ray. Imo their level of, err, vestedness, regardless of its nature, was clearly so high that a digital-only PS3 (even one version) was simply out of the question. You seem to downplay its inclusion in PS3 as just a way for Sony to be visible with Blu-ray, but they went to great lengths to include it. Its inclusion in PS3 alone shows their large commitment to it. PS3 was late and too expensive, and this was mainly due to Blu-ray. The X360 showed that DVD was pretty much enough for games that generation, yet Sony had to have Blu-ray in. Their commitment to it was very large. It's a valid argument that it perhaps didn't make sense, but the commitment was real. Also, like you implied earlier, Sony isn't just one of two dozen participants in it. I'm guessing that as far as royalties go they are in the top 3 and that the "patents don't cancel each other". In the end the financial figures aren't life-changing for them, but I'm guessing they see some easy income streams from it.
 
The potential financial gains are a separate issue and seemingly not the only reason for Sony's participation in Blu-ray. Imo their level of, err, vestedness, regardless of its nature, was clearly so high that a digital-only PS3 (even one version) was simply out of the question. You seem to downplay its inclusion in PS3 as just a way for Sony to be visible with Blu-ray, but they went to great lengths to include it. Its inclusion in PS3 alone shows their large commitment to it. PS3 was late and too expensive, and this was mainly due to Blu-ray. The X360 showed that DVD was pretty much enough for games that generation, yet Sony had to have Blu-ray in. Their commitment to it was very large. It's a valid argument that it perhaps didn't make sense, but the commitment was real. Also, like you implied earlier, Sony isn't just one of two dozen participants in it. I'm guessing that as far as royalties go they are in the top 3 and that the "patents don't cancel each other". In the end the financial figures aren't life-changing for them, but I'm guessing they see some easy income streams from it.

Yep, it makes no sense. BR wasn't overly important to Sony, yet by including it they gave MS a head-start and a big price advantage... so again I say, why include BR if it didn't mean that much to them? It makes zero sense.
 
And as I pointed out, there were many options for removing costs that Sony chose to ignore.

Sony did not choose to 'ignore' any options. They chose the options that made sense to them at the time. Losing things like the piano-black casing would diminish the 'premium' image they cultivated for PS3, and losing the Blu-ray drive would hurt their optical media plans (plus DD-only was not viable/sellable at the time).

Let's take this particular line of discussion back to the source:

I firmly believe Sony had a good idea of how little people played PS2 games on PS3 and dropped their b/c efforts for this reason alone.

They did not. They dropped BC along with other hardware features because their manufacturing cost was unsustainable and they deemed these things to be low-priority sales drivers.


My original comment was that Microsoft's current solution was similar to Sony's in terms of its limitations.

The similarities between their limitations are that you have to download a package to run a BC title, and that the BETA emulator currently runs a limited number of titles.

Taken out of context many disparate systems have similarities. Looking at the bigger picture, MS and Sony's solutions are very different in terms of design, goals, feature sets and implementation.

MS's BC solution is sustainable for the future. With little development effort and no hardware development cost it will carry over to new platforms, with full integration into those platforms' OS and software suites. It isn't just a patch to smooth over generational transitions. It isn't going to be removed in hardware revisions. This is your back catalog, physical and digital, potentially carrying over for the foreseeable future.

You are using an outdated BC design from a different era of console software and infrastructure in your arguments against current BC efforts, effectively saying 'it didn't pan out for Sony so it won't pan out for MS'.

So as I said the comparisons are pointless. Financially, architecturally, and from a usability/convenience point of view the solutions are very, very different.
 
As expected. So as Phil Spencer describes, they appear to the XB1 system as XB1 games. Thus they need to be in an XB1 package. I've no idea what that entails, but it clearly requires a whole load of data. Perhaps the data is encrypted differently, or something? The bloat is curious, but I don't see it as reason to think the PR is lying about the implementation.

The PS Now beta implementation is able to present any game available within PS Now on the PS4's dashboard so that it appears just like any other PS4 game or app. Effectively it's a shortcut to a game within the environment, and selecting such a game just bypasses the entire PS Now UI and starts it. It's completely transparent, and this approach could work just as well within a single emulator application environment. Every game you install just appears on the dash and, like any other game that you need the disc to play, you get prompted to insert the disc.

I think the most likely reason for Microsoft's current approach is not that it's undesirable to have it work like this, but that it's necessary. I.e. emulating three multi-threaded 3.2GHz PowerPC cores on seven 1.6GHz Jaguar cores in realtime is impressive even using modern emulation techniques. I think, at least for now, Microsoft are unable to offer a single emulator that just runs all games, and the tweaking process they refer to is critical for performance.
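To illustrate why clock-for-clock emulation is so costly: in a naive interpreter, every guest instruction expands into several host-level steps (fetch, decode, dispatch, execute). This is only a toy model of the overhead involved in running 3.2GHz PowerPC code on 1.6GHz Jaguar cores; real emulators use far more sophisticated techniques like binary translation:

```python
# Toy guest-CPU interpreter: one guest instruction costs several host
# operations, which is why emulating a faster CPU on a slower one is hard.
def run(program, regs):
    pc = 0
    while pc < len(program):             # fetch
        op, a, b, dst = program[pc]      # decode
        if op == "add":                  # dispatch
            regs[dst] = regs[a] + regs[b]    # execute
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
        pc += 1
    return regs

regs = run([("add", "r0", "r1", "r2"),   # r2 = 2 + 3 = 5
            ("mul", "r2", "r2", "r3")],  # r3 = 5 * 5 = 25
           {"r0": 2, "r1": 3, "r2": 0, "r3": 0})
print(regs["r3"])   # 25
```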

True enough. I remember Jak & Daxter not working, though it was kind of a no-brainer, being one of the games that went to extreme lengths to exploit performance gains by using the EE&GS in unique exotic ways. From this point of view, it's of course frustrating, because these games required some form of patching, which if I remember correctly, at least made it more or less playable.

I think the significant detail, which you said in your previous post, is that it isn't the games that required patching but the firmware, i.e. the emulator itself. This is the convention for computer, console and game system emulators: somebody writes the emulator, the emulator evolves, compatibility and like-for-like performance improve, and new versions of the emulator are released.

I am curious whether Microsoft will be updating their 360 game packages for Xbox One and whether they will roll out minor delta updates.

I think one of the main problems with BC in regards to PS3 is that they never could really emulate the GS - a basic rasterizer with huge bandwidth. Even with a much more complex GPU with dedicated hardware, it's difficult, if not next to impossible to emulate it. How well that in the end can be achieved, depends on how the software was written for it.

The aspect of the GS that makes it challenging to emulate is the 2,560-bit internal bus. I'm sure I've told this story before, but I'll tell it again. Back in 1998/99 I was on a secondment to the UK's Department of Trade & Industry (the government department that deals with commerce issues), which was home to the Export Control Organisation (ECO). The ECO is a regulator that controls the export of strategic goods (like military equipment), dual-use goods (like advanced manufacturing equipment) and advanced technology (like high-grade cryptography and supercomputers). Some things need a licence before they can be exported (or indeed now, traded between overseas countries) because of their inherent opportunity for misuse.

We received an enquiry from Sony Europe about this new supercomputer technology they had and wanted to come and talk to us about, so they could work out how to manage exports of a new product between EU Member States. They came in and told us the product was the "successor to the PlayStation" and they believed it was classified as a supercomputer under the EU Dual-Use regulations (the EU's law). Bear in mind, this was before Sony had announced PlayStation 2. We were sceptical, of course, but the Sony guys explained that the internal bus was 2,560 bits wide and operated at whatever frequency, and the EU had a definition of supercomputers that included a threshold for data throughput of about half the bandwidth of the GS. We spent some time on this and eventually concluded that it wasn't a supercomputer because the bandwidth was not user-addressable, i.e. it was high bandwidth but limited in use-case scenarios.
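For scale, the arithmetic is simple. The figures below are from public PS2 specs (2,560-bit bus, roughly 147.456MHz GS clock), not from the enquiry described above:

```python
# Back-of-envelope GS embedded-DRAM bandwidth, from public PS2 specs.
bus_width_bits = 2560                    # GS internal bus width
clock_hz = 147.456e6                     # GS clock, ~147.456 MHz
bytes_per_cycle = bus_width_bits // 8    # 320 bytes moved per clock
bandwidth_gb_s = bytes_per_cycle * clock_hz / 1e9
print(round(bandwidth_gb_s, 1))          # 47.2, i.e. the often-quoted ~48 GB/s
```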

I assume Sony had similar discussions with other regulators, and I recall the press story carried in many places that Iraq's Saddam Hussein was planning to build a supercomputer comprised of networked PS2s.

As a PlayStation owner it was utterly cool to know that PS2 was coming but awful that I couldn't tell anybody about it. My most relevant claim to non-fame thanks to the Official Secrets Act. Thanks, Government :yep2:

The potential financial gains is a separate issue and seemingly not the only reason for Sony's participating in Blu-ray. Imo their level of err... vestedness regardless of the nature was clearly so high that a digital only PS3 (even one version) was simply out of the question. You seem to downplay its inclusion in PS3 as just a reason Sony was visible with Blu-ray, but they went to great lengths to include it in.

Actually, the financial return is entirely the reason Sony (and most other companies) develop products. Cast your mind back to the launch period for PS3, when the prevailing view, which you can find with minimal googling, was that PS3 was Sony's Blu-ray Trojan horse; Google shows 84,200 similar results. The anti-Sony narrative was that Sony would do almost anything to make Blu-ray succeed and "destroy HD-DVD" (573,000 Google results), oblivious that Sony was one of a large number of companies with similar interests and not the company with the largest financial gain from Blu-ray were it to become popular.

However if you have evidence, or even a theory, as to Sony's pursuance of Blu-ray as a standard that isn't related to financial return I'd like to see/hear it. :yep2:

Its inclusion in PS3 alone shows their large commitment to it. PS3 was late and too expensive, and this was mainly due to Blu-ray.
Actually not true, if you ignore the rabid internet and actually look at attributable press statements. There were diode production problems that pushed PS3 back a few months, but the principal reason for PS3 being as far behind the 360's launch was Nvidia and RSX.

Let's not allow internet fanboyism to rewrite history or the 2006/07 Sony vitriol to distort reality. :nope:
 
Yep, it makes no sense. BR wasn't overly important to Sony, yet by including it they gave MS a head-start and a big price advantage... so again I say, why include BR if it didn't mean that much to them? It makes zero sense.

Because Blu-ray was important. Sony doesn't/didn't just receive income from royalties on Blu-ray disc sales; there are also royalties on the drives themselves. Sony is/was also in the business of manufacturing and selling the drives and discs (Sony got out of the drive business back in 2012 when it dissolved Sony Optiarc).

The motivation was to push Blu-ray to become the de facto replacement for DVD. While optical disc royalties amount to pennies per unit, the licensing fees related to the drives themselves are measured in dollars per unit. It's $2.00 now for a DVD drive, but I recall it being as high as $8-$15 per drive. And we aren't talking only DVD players, but the ubiquity of DVD drives in the PC market. The DVD format was driving tons of profits for those involved. Hindsight being 20/20, Sony, pre-2006, had no idea how quickly the internet would disrupt the optical business. It just saw how much money DVD was generating for itself and other stakeholders and wanted to make sure DVD was replaced by Blu-ray. HD-DVD's biggest advantage was that it was the cheaper format. HD-DVD wasn't trying to outduel Blu-ray on specs; the strategy was to drive adoption by getting to prices more attractive to the mainstream more quickly. The PS3 and its subsidized price basically nullified that advantage.

A quote from Howard Stringer in 2007:
https://www.techdirt.com/articles/20070117/104753.shtml

"The people who like Blu-ray are the people who play PlayStation 3, just as people who play PS2s were the early proponents of the DVD format. It drove the DVD format."
 
Actually, the financial return is entirely the reason Sony (and most other companies) develop products. Cast your mind back to the launch period for PS3, when the prevailing view, which you can find with minimal googling, was that PS3 was Sony's Blu-ray Trojan horse; Google shows 84,200 similar results. The anti-Sony narrative was that Sony would do almost anything to make Blu-ray succeed and "destroy HD-DVD" (573,000 Google results), oblivious that Sony was one of a large number of companies with similar interests and not the company with the largest financial gain from Blu-ray were it to become popular.

However if you have evidence, or even a theory, as to Sony's pursuance of Blu-ray as a standard that isn't related to financial return I'd like to see/hear it. :yep2:

How about just having control and presence over a situation instead of giving them to someone else? You think MS was thinking to make big bucks supporting HD DVD? A decision can be financial AND something else at the same time.

Actually not true, if you ignore the rabid internet and actually look at attributable press statements. There were diode production problems that pushed PS3 back a few months, but the principal reason for PS3 being as far behind the 360's launch was Nvidia and RSX.

Let's not allow internet fanboyism to rewrite history or the 2006/07 Sony vitriol to distort reality. :nope:

An article from May 2005... There was no need for the graphics chip to be ready by then. It was pretty much an off-the-shelf PC chip/design that launched on desktop in the summer of 2005, well before the PS3. I wouldn't be surprised if even the Xenos GPU in the 360 wasn't ready by then. Blu-ray had huge issues, not allowing Sony to do a worldwide launch, and they wanted the console to have HDMI 1.3 output, mainly because of Blu-ray. HDMI 1.3 was finalized in the summer of 2006, and Blu-ray production wasn't really running great at that point either.

I think you should improve your google-fu instead of pretending to remember how things were :)
 
How about just having control and presence over a situation instead of giving them to someone else?
To what end? What's in it for Sony? Don't peddle non-specific open conjecture; evidence or theorise why Sony would do this. This is your position; I'm sure you've thought it through.

You think MS was thinking to make big bucks supporting HD DVD? A decision can be financial AND something else at the same time.

You should probably read this (Ars Technica).

An article from May 2005... There was no need for the graphics chip to be ready by then. It was pretty much an off-the-shelf PC chip/design that launched on desktop in the summer of 2005, well before the PS3.

Modern GPU cores (which includes RSX) are more portable than ever, but the core isn't the challenge: adapting the new core to work with GDDR3 (should have been easy) and XDR (more challenging) across new bus types (much more complicated) is the real trick. In addition, developers will want production-representative silicon as far in advance as possible so games can be ready for launch.

Blu-ray had huge issues, not allowing Sony to do a worldwide launch, and they wanted the console to have HDMI 1.3 output, mainly because of Blu-ray. HDMI 1.3 was finalized in the summer of 2006, and Blu-ray production wasn't really running great at that point either.

Blu-ray had issues that affected production and therefore PS3's launch, but they couldn't have launched much earlier because of RSX and the fact that the OS obviously wasn't ready; basics like background downloading weren't even a thing until March 2007. So let's not be hyperbolic and blame everything on Blu-ray. That is patently untrue and well documented. Of course, if you want to link to what you believe are more accurate accounts from reputable and knowledgeable sources, feel free. How about some reputable sources to back up your opinion?

I think you should improve your google-fu instead of pretending to remember how things were

Google is great for finding things on the internet, but most of what's on the internet is bullshit. Just because a view is popular doesn't mean it's accurate.
 