PS3 Strategy/Confidence Retrospective

I agree with much of that post, Joshua, except this bit. Including an HDD would add all of $30 to the XB360, and its use is far more prominent now than on XB because of the sudden growth in downloadable content. You only need to look at the uptake of HDD-enabled XB360 SKUs to see that the primary gamers buying it value the HDD, and MS, judging by their price-point positioning of the SKUs, clearly wanted the HDD in there. If we see a noteworthy adoption of HDD-less SKUs where that $30 makes a difference, then I'll concede that it was an okay choice, but at the moment, looking back to the systems' launches, the HDD-less SKU did MS no favours. It hasn't netted them sales, has added an extra bump for developers, and offers a lower income potential due to lack of storage, where including the HDD and attracting owners onto Live! would be beneficial. The choice being made on the outcome of last gen was a little short-sighted IMO. XB's HDD wasn't taken advantage of principally because the lead console, PS2, didn't have one, so developers didn't bother to leverage XB's. Of the XB's exclusives, some couldn't have happened without the HDD, and those were the major product differentiation points with PS2. I think the HDD-less SKU was just to get a $300 launch price, but expectations that that would attract custom have been proven false, and the result is a system where they no longer have the option to go HDD-only, limiting HDD use to XB-like functions and giving developers an unnecessary headache where 90% of the market (PS3 and XB360) has HDDs but developers still have to develop around its absence.

I will step back into the time machine and preface my position from August 2005: I strongly disliked the non-standard HDD and wrote one of my infamous epic-style posts on the topic looking at the pros and cons. I 100% agree that making the HDD standard, compensating for the cost through DLC and microtransactions (even music and movie downloads), and leveraging demos and the like to generate new revenue streams as well as encourage increased software adoption could, in an online-enabled console, justify the cost.

But I still think Microsoft made the right financial decision.

1. The HDD is only $30... ok, but that $30 difference allows a couple things. (a) It allows MS to create a more expensive SKU with a price point determined by value over cost. The 360 Pro at $399 was financially a win for MS over the $299 Core. (b) I still believe in price points in the market, and have long held the non-HDD SKU wasn't a year 1-3 move, but a year 4-6 move. In lifetime sales I would gamble that Microsoft will sell more units at-and-below $199 than above. An extra $30 means they either have to wait longer to hit that price point or have to lose more money. Instead they cut the $30 and have a nice upsell in the form of a Pro SKU and HDD add-on. (c) Consumer perception is a powerful device. $299 is easier to stomach than $399 and sits better with consumer frame of mind/perception... so you save up for the $299 360 knowing you need some games and such, so at the check-out counter you also pick up a game or two, an extra controller, and start looking at the cables and decide that the Pro SKU, after the cost of cables, isn't quite as bad as you thought and "justify" the upsell. Upselling is very popular--it isn't uncommon at all to lead with the "lowest price of entry" item as your flier headline (loss leader even) only to capture the upsell at the point of sale.

You are right that what, 10% (?), of 360 owners have the non-HDD SKU. But the effects of the SKU, imo, aren't directly tangible in sales at this point in time, for the above reasons.

2. the sudden growth in downloadable content... True, true! But on the flip side, isn't it only 60% of 360 owners who have accessed online (Silver) at all? It appears to me that a core percentage of users accounts for the majority of the activity, with some mildly involved and 40% not at all. As the product begins penetrating more casual markets, I wouldn't expect the HDD to be as valuable in that market. If the first 10M early-adopting gamers use it at 60%, the more casual consumers thereafter will (in general) be less inclined. Online gamers are essential to capture early on, but I don't expect the trends/impact seen in this group to be equally strong among all consumers as the market base grows.

3. It hasn't netted them sales... It has netted them about 1M sales :p Bringing this back to Sony, what has more sales impact at this point: a standard HDD, or mimicking MS's move and offering a lower-cost SKU without a HDD?

4. and offers a lower income potential due to lack of storage, where including the HDD and attracting owners onto Live! would be beneficial... It does lower theoretical potential, but does going without a standard HDD translate into real lost potential? Consumers can get on Live without a HDD. They can even download arcade games and some DLC. And I could be wrong, but I think the strongly-active online consumers are (a) a small, but important, demographic that (b) happily sprang for the HDD model. I think the sub-$200 crowd won't find the HDD as compelling a feature. They game less and buy fewer games/accessories (we here are the ones who inflate the attach rate, with a median of 26 games on our primary last-gen console!). I am not downplaying online services, as online is an important sales point to early adopters and is extremely important for various reasons, but I don't see the "birthday and Christmas crowd" hot and heavy on it... and those that are can easily obtain Live and a HDD.

5. XB's HDD wasn't taken advantage of principally because the lead console, PS2, didn't have one, so developers didn't bother to leverage XB's... I agree the PS2 held it back, but I think the problem is deeper than that (in regards to games). I have been a PC gamer since the 80s and can safely say that most genres, notably those selling on the consoles (shooters, RPGs, racers, sports, etc.), work great without a HDD. I can think of very few PC games that make use of a HDD in a way that cannot be compensated for with a memory card--and even fewer of those games are in genres that appeal to sit-on-the-couch console gamers.

The killer app for HDDs is MMOs, and those haven't done well on the consoles compared to the PC (even though a console like the Xbox had a larger gaming base than the PC). Even now the Xbox 360, with over 10M consoles with HDDs, lacks a significant MMO. Mods are another popular outlet for HDD usage, but on the PC you are looking at a couple million consumers who even engage in such. The Xbox 360 Core SKU didn't prevent MMOs from coming to the 360.

At this point I don't think either of those items justifies the "standard" cost when people can add it on. The 60% of users who have Silver accounts does raise an eyebrow... until you look at their activity levels and realize they probably have a HDD anyhow :smile: The 10% who don't have a HDD aren't holding back DLC at all, nor would they be helping the situation if they had a HDD.

Of the XB's exclusives, some couldn't have happened without the HDD, and those were the major product differentiation points with PS2.

Which titles do you have in mind? I cannot think of many Xbox titles that couldn't have been done with just a memory card. The big differentiators on the Xbox, IMO, were, technically, the graphics and online. The big games like Halo and Halo 2 and... (haha) didn't scream "HDD only". A HDD made life easier, but that didn't stop ports and same-genre (and design) games from being on the PS2 and GCN.

I think the HDD-less SKU was just to get a $300 launch price

That and to get to $249, $199 and $149 as quickly as possible.

but expectations that that would attract custom have been proven false

That it would attract the attention of early adopters has been proven false. Then again, someone willing to shell out $300 for a console (and $40 for a memory card) probably won't bat an eye in most cases at $400. Price sensitivity of the different consumers plays a major factor.

has added an extra bump for developers... giving developers an unnecessary headache where 90% of the market (PS3 and XB360) has HDDs but developers still have to develop around its absence.

True, true, and it takes away a feature that I love. I personally hate the move as a gamer. And with some smart design choices I think MS could have averted the situation. E.g. they could have put TRCs in place to effectively state "expect no more than 4GB" and eventually shipped a model with 4GB of flash memory (or even 2GB). So in 2005/2006 you ship only HDD-based models, and in 2007+ you can address the market: if a non-HDD SKU would give you the financial flexibility you need, you have it, or you can relax the TRCs knowing everyone has a HDD and continue to exploit and explore the advantages of upselling DLC and microtransactions... expansions even (an upsell missing from the console market!)

But, in the end, I think MS wanted the $299 launch price. It is the same price the PS2 and Xbox shipped at. It reduced the sticker shock of a $399 console and actually looked "cheap" compared to the PS3.

Which brings us full circle.

A $499/$599 PS3 was a huge gamble.

The Core may have fed into this, to a degree. E.g. the HDD wasn't said to be standard in early 2006. In that window we saw 1) the Core largely ignored and 2) a large demand for the Xbox 360 w/ HDD at $399. The demand was strong into early 2006, and we can only guess how many units they would have sold in the strong holiday period, and how many people, after widespread shortages, took a "wait and see" approach for the PS3 in spring. If I am Sony, I am feeling pretty good about $499. It is, after all, only $100 more--and you get the really strong PlayStation brand as well as a Blu-ray drive. Easily a $100 value.

But value is a funny thing, as sales bases differ and it isn't even linear. The higher your price, even at a similar or greater value ratio, the more the market shrinks due to cost of entry. Likewise, I think there is an "early adopter" factor where some adopt first, and then only adopt later releases when they offer something they cannot have with their initial investment. I would also venture the guess that the hardcore gamers who came over to the Xbox brand from the PC are more willing to throw down a lot of cash for gaming, over and above what most typical console gamers are used to. $400 for a console is nothing when you pay that much for a GPU alone. $500 is a lot for a console when the most you have ever paid is $300 (PS2). Two different mindsets.

I think this may have given Sony some false confidence. I know it inflated my view of PS3 sales. I thought 6M was going to be a cakewalk by Spring 2007, simply because the inferior Xbox brand had sold amazingly well at $400 with really strong demand during the first 4 months. Sony was able to hit 3 territories and be strong in each (MS was flat in Japan, and the Xbox wasn't overtly strong in Europe) and appeared to have covered a lot of bases--Blu-ray for AV geeks, Cell (and stuff like Folding) for technophiles, RSX/NV and Epic/UE3 to ensure a heftier share of good PC ports, a free online network, good BC, and so forth.

Execution was horrible though, software wasn't compelling, and the price obscured all the value. The big selling points (HDD, Blu-ray, Cell, PSN, RSX) didn't translate into software that delivered a superior experience, which made justifying the additional price even more difficult.

Looking back, I do wonder if the Core at $299 had the double impact of lowering the cost of entry in consumers' consciousness as well as giving a false sense of confidence about how consumers perceived value. Demographic differences, while undocumented, would probably tell us a lot in this regard. We know there are territorial differences in price sensitivity, brand appeal, and software tastes. FPS in Japan, and Nintendo in general, show this much.
 
And as for Blu-ray, I'm still waiting for counterarguments to a lot of my questions on how a 16x increase in RAM (from the PS2->PS3 perspective) is not going to require a significant growth in asset requirements (where you could question whether a 5x increase - taking the 9GB that PS2 DVDs could hold - won't even be a limitation).

The only solution that Microsoft has for the future is basically to have HD installed games.

Still? Only?

I started a full reply, but this sort of banter isn't real discussion. The first part of your statement has been answered before. Long explanations about the cost of content generation and how time is a significant factor, examples from the PC world (which has had large pools of memory and HD resolutions for many years), even practical ways in which developers are using space more judiciously and using better compression techniques have been submitted. Putting aside the PR trumpeting from exclusive developers, the most pressing issue facing independent developers right now isn't storage space. Costs of development (both content creation and getting your game out on time) are far more pressing issues and they have raised their ugly heads already. I don't see 3rd parties clamouring for more space for content they cannot afford to create. Especially when most games do fit quite nicely on a single-layer DVD. We know there will be (and are) exceptions...

Which leads to the "only". Two words: Disk Spanning.

It works great for a lot of games; the PC, PlayStation, and Xbox 360 have all done quite well with it so far. And MS has done a lot of work in disk compression. HDD storage, compression techniques, online storage, disk spanning, etc. are all things MS has available and has actively engaged in. They are not limited to only one recourse in terms of advancement or options.

And even if this was all true, this is one of the problems with the PS3: Potential.

It is so closely tied to Cell and Blu-ray as sales points to justify the cost, and when they don't deliver at the point of purchase (there isn't a PS3 game out that couldn't be done reasonably on the 360 or PC) the onus is moved away from why they aren't delivering now and back to the "potential" card and how the competition has little recourse. Great... but that opinion isn't resonating with general consumers. And it makes the assumption that consumers will see these differences, if they show up, as compelling selling points.

I also find it backwards. Console gaming is very much about delivering games (not technology) to consumers. Games like GTAIV get delayed due to the PS3 version lagging behind--but it is the developers' fault for not designing the game correctly. Yet these same developers will be depended upon to demonstrate the importance of these underutilized selling points. And quite frankly that is the problem the PS3 is facing: software. Blu-ray isn't solving the problems developers have right now. It is hurting their game sales due to the install base issues, but if you ever want to see a BDR filled by a 3rd party with real content, they need to sell games now, not tomorrow. And for all of Cell's potential, how it was implemented into the PS3 (memory issues, RSX assistance, design complexity meeting game complexity) is being counterproductive to this very point.

Sony needs to focus on what they have now--not what they may have tomorrow. Ultimately the focus on potential makes discussion circular and pointless as it relies heavily on PR and less on tangible realities consumers can purchase.

When the discussion breaks down to where the basic facts become obscured, you have a circular discussion that never addresses the issue. And that is where we are at with the storage issue--MS has but one option, and all the counterexamples just don't exist.

Even if true the early issues paint an uphill battle: what developer is going to create a game for NA/European release that doesn't work on the 360? Install base is very relevant and there is no incentive to invest more time/money into content when the larger consumer base won't even have access to it.


Talking to myself, the storage issue should have been pretty obvious this generation. PCs were already churning out high-resolution games with high-quality graphics for years. But more important is how most games use their content. You would think that a sandbox game would be huge. But to the contrary, these games need gamers to be able to interact with the entire world and travel in any direction. So streaming into system memory is a huge issue, and transfer speeds are a huge bottleneck. Having 10GB of content with 512MB of memory and a 15MB/s transfer speed is suicide! How data is transferred is very unpredictable, so you need to keep the source content within budget to stream effectively. Further, objects need to be really interactive and accessible. Which brings into view why games like Oblivion and GTA are quite small. The odds of a developer needing tens of GBs of data are slim--the ability/$$$ to create such a world, and then to execute it so it streams effortlessly into main memory, remains a huge hurdle.
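To put rough numbers on that point, here is a back-of-the-envelope sketch (Python, using the figures above purely as assumptions, not measurements from any real game):

# Back-of-the-envelope streaming budget (assumed figures from the paragraph above).
ram_mb    = 512.0   # console main memory
read_mb_s = 15.0    # assumed sustained optical read speed
disc_gb   = 10.0    # hypothetical amount of unique asset data on disc

# Refilling the whole memory pool from the optical drive takes over half a minute:
print(f"Full 512 MB refill: {ram_mb / read_mb_s:.0f} s")

# In an open world the player can turn around at any moment, so whatever has to
# appear within a couple of seconds must fit this streaming window:
latency_budget_s = 2.0
print(f"New data deliverable in {latency_budget_s:.0f} s: {read_mb_s * latency_budget_s:.0f} MB")

# The per-area working set is bounded by RAM and that window, not by the disc,
# so even a modest disc dwarfs what can actually be kept 'hot' at any moment:
print(f"Disc-to-RAM ratio: {disc_gb * 1024 / ram_mb:.0f}x")

In other words, it's the drive and the memory pool, not the disc capacity, that set the practical ceiling on how much content an open-world game can actually exploit.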

On the other hand, linear games can be more aggressive. Player progression is much more predictable, hence caching patterns are more predictable. Further, you control everything the gamer sees, so you can afford to concentrate your art on the linear path. Instead of a lot of generic items repeated (as in a GTA game, and spread out), a linear game can have a slew of unique content. A lot of RPGs fall right into this mold. And this problem can be addressed the old-fashioned way: disk spanning. Your "base world" can be on both disks, but you can separate story progression. Once a story-driven movie sequence has been played, it is no longer necessary due to the linear nature of the game. And an even cruder solution works for large games with MP and SP (just split them).

Some games won't have elegant solutions. Driving games, which can leverage a lot of resources sampled from the real world, are an example where storage size gets hit in its weak spot, as spanning causes frequent and tedious disc swaps because gameplay isn't linear. But even then, few companies can afford to create 8 cities with nearly 200 courses (plus 2 tracks with another couple dozen courses).

Yeah, using uncompressed audio, packing your data with repeated content for better streaming speeds, poor compression, 'bonus' language tracks, etc. can all inflate your "need" for space, but they aren't necessary (or always desirable). Most game designs are fitting well within current DVD limitations and have reasonable solutions. There will be exceptions, but game design and content creation costs are huge limiters to such.

No way is the 360 still around in 10 years' time. The PlayStation 3 is the only one that might be. The main question in that regard is how successful Microsoft is in shortening the console lifespan, and at the cost of which market that will be.

The PS1 and PS2, according to Sony, are 10-year platforms. The PS1 continued to be one after DVD was introduced, the PS2 after Blu-ray. Pure marketing spin from Sony, put out in the context of messaging concerns about the longevity of the investment (right from Kaz himself).

What determines the longevity of a platform is consumer adoption and developer support.
 
Still? Only?

I started a full reply

Considering this post, I shudder to think what a full reply would be like. ;) You're the only poster here making longer posts than I do, I think ... :p

but this sort of banter isn't real discussion. The first part of your statement has been answered before. Long explanations about the cost of content generation and how time is a significant factor

Which holds during the first generation, but not necessarily during the second generation. Also, technologies like MegaTexture and lots of other third-party and in-house developed technologies help overcome these problems.

examples from the PC world (which has had large pools of memory and HD resolutions for many years)

... but PC developers rarely have the budgets to make real use of them, as well as having been technology limited, and basically incapable of doing much better for the simple reason that they're hardware limited even in terms of distribution media. It took them longer than on any other platform before they could get away with not also offering CD-based releases (which ended up being many discs towards the end), because they couldn't assume DVD as a standard even much later than consoles could.

even practical ways in which developers are using space more judiciously and using better compression techniques have been submitted.

But with PS2 to PS3 going from 32 to 512 MB (16x) and the Blu-ray disc space going up 'only' 5x over the PS2 disc size in comparison (which could in fact hold 9GB for games, rather than the 7.4GB on 360), even compression techniques making much more drastic improvements than historical development suggests might just as well only be enough to compensate for the gap between the 16x improvement in RAM and the 5x improvement in ROM (or 7x if you base it on the 360's ROM format). Compression was arguably even more essential last gen anyway, because the ROM-to-RAM factor was much worse than now, so it's not like it wasn't used before.
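Spelling that ratio argument out as simple arithmetic (a sketch using only the figures quoted in this thread; the ~45GB Blu-ray number is just '5x the PS2 disc' taken literally, not an official capacity claim):

# Ratio argument from the post above, using the figures quoted in the thread.
ps2_ram_mb, ps3_ram_mb = 32, 512
ps2_rom_gb  = 9.0        # what a PS2 dual-layer DVD could hold for games
x360_rom_gb = 7.4        # usable space quoted for a 360 DVD
bd_rom_gb   = 9.0 * 5    # 'only 5x the PS2 disc', i.e. ~45 GB, per the post

ram_growth  = ps3_ram_mb / ps2_ram_mb   # 16x
rom_vs_ps2  = bd_rom_gb / ps2_rom_gb    # 5x
rom_vs_x360 = bd_rom_gb / x360_rom_gb   # ~6x, the rounded '7x' figure above

# If asset footprints scaled with RAM, compression/reuse would need to make up
# roughly this factor just to stay even with last generation:
print(f"RAM growth {ram_growth:.0f}x vs ROM growth {rom_vs_ps2:.0f}x "
      f"-> ~{ram_growth / rom_vs_ps2:.1f}x gap to close")
print(f"Versus the 360's DVD, the ROM growth is ~{rom_vs_x360:.0f}x")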

Putting aside the PR trumpeting from exclusive developers

Let's move on from disputable differences like having all the languages on one disc, which several 360 games aren't managing, but which is not likely to matter to the majority of consumers. Though in Europe I hear some complaints, for instance from people only being able to play Halo 3 with German voices and not the original English, but for now these are small groups I'm sure, at least for adult games like Halo. In the Netherlands, it's a big thing for us, though, that when kids watch, say, Disney's Aladdin, they can watch with Dutch language (not being proficient enough to read the subtitles), while adults can and prefer to watch in English (Robin Williams!). Pixar/Disney feature films in cinemas will be aired with (still high-profile) Dutch voice actors during the day, and with the original English voices during the night. But Resistance and Heavenly Sword do offer a lot of different languages to play in. It may matter at some point, even in the US - can you get Halo 3 in Spanish? Is it an option on the disc to switch between the two? It seems trivial now ...

But, as I said, let's move on. ;) What about games like Ratchet & Clank and Uncharted? The first is out now, and the second is coming out at the end of this month. They are very clear examples of games fitting only on Blu-ray. I don't see them happening on 360. Stranglehold has had its share of issues, sure, but it's out now and does feature the best version of Hard Boiled, with some crazy nuts buying the PS3 game more for the movie than for the game. Either way, it's a clear value-add (and note to EU guys, it does not have a region lock, so it's importable). Also due out in December now is SingStar, with 30 HD video clips on disc.

the most pressing issue facing independent developers right now isn't storage space. Costs of development (both content creation and getting your game out on time) are far more pressing issues and they have raised their ugly heads already. I don't see 3rd parties clamouring for more space for content they cannot afford to create. Especially when most games do fit quite nicely on a single-layer DVD. We know there will be (and are) exceptions...

They have raised their ugly heads, but that's not new, and I still believe they will be quickly overcome. Look at R&C, a 12-hour game, all beautiful and shiny, and produced in just a year. After all, now that they have the engine running (obviously still improving, but the basics were there from Resistance, and they probably had most of those improvements like streaming textures already figured out even before the Resistance release date), it's all about art creation, but especially about clever art creation.

Which leads to the "only". Two words: Disk Spanning.

It works great for a lot of games; the PC, PlayStation, and Xbox 360 have all done quite well with it so far. And MS has done a lot of work in disk compression. HDD storage, compression techniques, online storage, disk spanning, etc. are all things MS has available and has actively engaged in. They are not limited to only one recourse in terms of advancement or options.

You've repeated this, but the combination of needing disc spanning this early in the generation (content generation is only going to get easier, as everything else is basically fixed, so all the effort will go into optimising the art pipeline, and already has if you've been paying attention), spanning not being equally trivial for all games, and the lack of a HDD present by default in the 360, is a problem. It's one they can fix (make the HDD required), but the combination of all these issues is a mess that will continue to build.

It is so closely tied to Cell and Blu-ray as sales points to justify the cost, and when they don't deliver at the point of purchase (there isn't a PS3 game out that couldn't be done reasonably on the 360 or PC)

That is, if the noise of a DVD drive spinning at 12x (or 8x if you want to use dual-layer discs, apparently) isn't a factor during purchasing. For me, all else being equal, I'll gladly pay for a quiet system, though I'm sure they can make the system quieter in other ways eventually. But that's the same discussion of potential.

But even then, we disagree. I don't believe that Uncharted can be done on the 360. I don't think Ratchet & Clank can be done either, though maybe by spanning multiple DVDs you could get close. Every DVD you bring it down from the 4-5 it would take, though, is going to mean a sacrifice in some area of the package.

I also find it backwards. Console gaming is very much about delivering games (not technology) to consumers. Games like GTAIV get delayed due to the PS3 version lagging behind

Isn't that false? I remember that it was the 360 version being shown to the decision makers, and based on that version, the decision to postpone was made. Even if there is in fact a deal in place that requires the two versions to be released on the same date (which is possible), do you honestly believe they needed a full year more just to get the PS3 programming issues out of the way, despite having a HDD by default? I find that highly, highly unlikely. The game just wasn't done, and wasn't good enough, and they needed more time. Considering that this would also bring them a bigger install base at the time of release, that made the delay a little less painful to them (though more so to Microsoft).

And quite frankly that is the problem the PS3 is facing: software.

While I agree with you that software is its main problem in the market right now, it won't be in the very near future. To be precise, when this November and December have come and gone.

Blu-ray isn't solving the problems developers have right now.

Correction: multi-platform developers. They don't have the problem, simply because the 360 is the lowest common denominator, and because in the early days creating content is problematic when you don't yet know the limits of your game engine. Once you've gotten past that, the 360 will be what actually constrains you as the lowest common denominator. It will then depend on its importance, the PC factor, and the competing games that target only PS3/PC to see how soon it becomes profitable to make additional use of Blu-ray for future games, or whether it becomes cheaper to design art at a higher level and then downgrade texture quality or other aspects to suit the 360.

Initially, the problem that PS3 developers have had has been coming to grips with the PS3's memory pool division, and with the OS taking up more memory. But this memory has slowly been given back to developers, and the difference is much smaller today than it was at the PS3's launch.

Also, initially, developers were building much more on existing code bases and practices that were not well suited to threaded, and particularly multi-core, development. But this is changing quickly. Perhaps not fast enough. And even the original Xbox got many more PC ports than the PS2 ever did, which is partly where all the booing and hissing from developers comes from (they may correct me if I'm wrong if they are here on the forums, but I think most of them come from the PC), yet those ports didn't make the difference back then. Will that be different this generation? I don't know.

And for all of Cell's potential, how it was implemented into the PS3 (memory issues, RSX assistance, design complexity meeting game complexity) is being counterproductive to this very point.

Maybe, though I don't think that's really it. The RSX itself may be an issue in some respects, as I think especially PC developers expected more from it compared to the 360's GPU (which is a triumph for ATI for now), with Sony probably not ready to take a risk in improving the RSX specs despite the year's delay (which, again, especially PC developers and consumers would probably have expected). On the other hand, something like the RSX made the PS3 more accessible to PC developers compared to, say, the PS2. But Cell doesn't have much to do with it. Rather, the OS issues and reserving too much RAM for it I think made things harder, and the concept of texturing from main memory was something that I think wasn't picked up as quickly either.

The big thing here, I think, is that Microsoft got the software part of the package down faster and better initially, in many respects (tools, SDKs, Live standards), and that's where Sony has been lacking by far the most. Part of my issue with the whole Blu-ray discussion is that those aspects are greatly overlooked. Even rumble, which diehard gamers initially considered a bigger thing in the online discussions than motion sensing (a much less proven technology, and something developers take time to make the most of), seems in PR terms to have been more damaging for now. However, if the DualShock 3 does become a success, then the 360 will be left as the console whose controller has the lesser rumble (I think anyway; it's in some respects really half the DS2's rumble, which has two motors, like the DS3) and no motion-sensing features.

Sony needs to focus on what they have now--not what they may have tomorrow. Ultimately the focus on potential makes discussion circular and pointless as it relies heavily on PR and less on tangible realities consumers can purchase.

True, in these days of the interwebs, PR is more rapidly deconstructed than 6 years ago. But it is also important to realise that for all the PS3 does, it IS more ambitious in terms of hardware, and it does take more time to get the rewards out of it. That said, the developers that have made clever choices, like Infinity Ward and Criterion, are I think how all multi-platform games are going to be done in the future.

Even if true the early issues paint an uphill battle: what developer is going to create a game for NA/European release that doesn't work on the 360? Install base is very relevant and there is no incentive to invest more time/money into content when the larger consumer base won't even have access to it.

It's a question of being clever. A small developer that does not have the resources to develop for more than one platform may look at one particular platform and develop a game that makes use of that platform's strengths, to help it stand out in a crowded market. Then, if it is a large success, it can still decide to re-engineer the game to some extent, if at all possible, and try to win market share on the other platform(s), this time with the advantage of having been a standout title on another platform (though that's not always a guarantee of success either, as we've witnessed in the past).

Talking to myself, the storage issue should have been pretty obvious this generation. PCs were already churning out high-resolution games with high-quality graphics for years. But more important is how most games use their content. You would think that a sandbox game would be huge. But to the contrary, these games need gamers to be able to interact with the entire world and travel in any direction. So streaming into system memory is a huge issue, and transfer speeds are a huge bottleneck. Having 10GB of content with 512MB of memory and a 15MB/s transfer speed is suicide! How data is transferred is very unpredictable, so you need to keep the source content within budget to stream effectively. Further, objects need to be really interactive and accessible. Which brings into view why games like Oblivion and GTA are quite small. The odds of a developer needing tens of GBs of data are slim--the ability/$$$ to create such a world, and then to execute it so it streams effortlessly into main memory, remains a huge hurdle.

Perhaps, but on the other hand I think Bizarre Creations have already shown that you can create interesting worlds quickly with photo textures. That's not to say it's the best way to do it, but it looks good enough (I don't hear many people complain) and it is a relatively fast way to create a lot of content. With PGR3 you can set your own tracks through the city, so they must have solved a big part of that problem (there's certainly some LOD work going on). And for all its faults, Lair also shows a lot of potential in streaming huge worlds with a lot of detail. In the next months I think we're supposed to see more of The Getaway, which I think is using a similar system to PGR with photo textures and other stuff.

I think that games like Oblivion and GTA, having huge game worlds, have so obviously been forced to be economical with their assets that they can't do anything else but think up a very efficient system, and, considering the size of the project, they have to allow some room for error. The end result was that they stayed well within the margins of error. But if someone were to design a game with Blu-ray and HD as a given, then I'm sure they could easily use the space to the benefit of the game. Just think of the use of MegaTextures in the article in the B3D news, or similar technology.

On the other hand, linear games can be more aggressive. Player progression is much more predictable, hence caching patterns are more predictable. Further, you control everything the gamer sees, so you can afford to concentrate your art on the linear path. Instead of a lot of generic items repeated (as in a GTA game, and spread out), a linear game can have a slew of unique content.

Which is why we're seeing the advantages of Blu-ray in those types of games first, no doubt (again, R&C and Uncharted seem to me prime examples, though I have to wait 5 more days).

A lot of RPGs fall right into this mold. And this problem can be addressed the old-fashioned way: disk spanning. Your "base world" can be on both disks, but you can separate story progression. Once a story-driven movie sequence has been played, it is no longer necessary due to the linear nature of the game. And an even cruder solution works for large games with MP and SP (just split them).

That's a repeat argument, but recently we've seen RPGs (say, the FFX series) move to a setup which allows you much more freedom in which part of the world you access and in which order (like FFX-2).

The MP/SP split is a more obvious one, and works well for some genres, though. But it already happened with MGS3 Subsistence on the PS2.

Some games won't have elegant solutions. Driving games, which can leverage a lot of resources sampled from the real world, are an example where storage size gets hit in its weak spot, as spanning causes frequent and tedious disc swaps because gameplay isn't linear. But even then, few companies can afford to create 8 cities with nearly 200 courses (plus 2 tracks with another couple dozen courses).

Unless, again, you take the approach that Bizarre Creations took for PGR3, where basically 'all' they did was create 8 cities and then add a course editor, which was available to themselves as well as to the gamers. If they can do 6-8 cities for the first game, then why not create as many for the second game and ship the first set (maybe slightly improved) on the same disc? If you have the space. Also, at 200,000 polygons and varying amounts of texture and sound data, I imagine GT5's cars also take up some space. BC, though, don't necessarily have to limit themselves to the 360 now that they're with Activision, so we'll just have to see if they can afford to do something for PS3 or not.

The PS1 and PS2, according to Sony, are 10-year platforms. The PS1 continued to be one after DVD was introduced, the PS2 after Blu-ray. Pure marketing spin from Sony, put out in the context of messaging concerns about the longevity of the investment (right from Kaz himself).

Even today they release a SingStar PS2 pack in the US, Buzz is still new to the US, etc. At $99, it's still something you can sell. God of War 2 is still a great game that wowed even in the first year of next-gen. With the advent of firmware upgrades, even more possibilities exist for keeping a platform relevant. But for all this, you do need forward-looking hardware.

What determines the longevity of a platform is consumer adoption and developer support.

Those are also obviously important factors. But if I may revert back to one of my favorite examples, the Amiga 500 vs the Atari ST: at the beginning of the lifecycle, the cost and development difficulty of the Amiga held it back against the Atari ST. But by the end of the lifecycle, the relative importance of the two had reversed. It's not that hard to imagine that a chip like Cell, the value of the SPU architecture, and developing for multiple cores are going to be more mainstream among developers down the line.
 
Disclaimer: I've not fully read recent parts of this thread but let's post some viewpoints that seem relevant...

  • Blu-ray is quieter than 12x DVD.
  • Blu-ray is good for anti-piracy. Piracy is rampant for 360 and Wii but not for PS3.
  • Blu-ray is good for games in the same sense that Blu-ray movie discs can have more extra content. This extra content can include all kinds of advertisements.
  • HDD is good for advertisement. Non-game industries can join in to push their latest product info to customers and sell goods, as seen in Home, GT5P, and Dress.
  • HDD is good for anti-piracy. Piracy is common in China/Korea, and MMOGs are relatively popular there; MMOGs require a HDD. Another thing that is as harmful as piracy for publishers is used games. SCEI sued used-game sellers in Japan in 1999 but lost the case, so it's now completely out of control for publishers, unlike the movie business in Japan. This has been a big issue; most PS2 games sell on day 1 and cease to move after that. Publishers have to add their games to the "PS2 Best" series ASAP, which is just another name for a price cut to counter used-game sellers. If you remember Kutaragi's patent about game DRM, it was against used games. Games on PSN are free from used-game sellers. GT5P is released on BD and as a PSN download. DLC is also an attempt to extend the product life of BD-based games before they go to used-game sellers. 100Mbit connections are common in Japan, so it's likely 10GB+ games will be on PSN in a few years.
  • The 40GB PS3 has WiFi, which is not found in the 20GB 360. It's useful for the PSP.

These features show PS3 is a publisher-friendly system while developers may have headaches in other areas. What is not friendly to publishers is the larger development cost, but actually multiplatform development can mitigate the impact.

BTW, there's no Blu-ray player market in Japan, unlike in other countries; only a Blu-ray recorder market. Likewise, there's no DVD player market; there are DVD-playing devices such as the PS2 and PCs, and a DVD recorder market. The latest recorder models from Sony all feature BD, and they say they won't release DVD-only recorders in the future. On the PC side, BD-R LTH discs are appearing at a cheaper price, with 4x burners. Overall, the year 2008 is promising for Blu-ray, with more demand for HDTVs for the Olympic Games.
 
As long as the developers put it to good use (whether to store more game data, or to save development time, or to stream more at the same time), it will be worthwhile.

Worthwhile for late generation gamers, but this is a thread about Sony's strategy, and I don't think it was worthwhile from that perspective at all.
 
Disclaimer: I've not fully read recent parts of this thread but let's post some viewpoints that seem relevant...
  • Blu-ray is good for anti-piracy. Piracy is rampant for 360 and Wii but not for PS3.
  • Blu-ray is good for games in the same sense that Blu-ray movie discs can have more extra content. This extra content can include all kinds of advertisements.
These features show PS3 is a publisher-friendly system while developers may have headaches in other areas.

Point 1 is kinda moot. Would you rather release your game on a piracy-rampant console with a 10 million user base and an attach ratio of 10:1, or a piracy-free console with 5 million owners and an attach rate of 3:1? It's software sales that matter as a developer: whether people will buy your game in numbers. Unless you can show that 2+ million XB360 owners are using pirated titles, the impact of piracy isn't going to harm the viability of the console unless it truly is rampant, and not just big-for-a-tiny-niche.

Creating extra content is also a dubious claim. If that extra content doesn't net you more income, it's a greater investment for lower returns. If you can sell 1 million copies of your game on DVD with a DVD's worth of content, and moving up to Blu-ray and filling it still only gets you 1 million sales, was the move worth it? Advertisements are limited to certain genres and games, and you can't fit too many in without diluting their presence. If you offered an advertising opportunity to a business partner where your gamers will see their advert once every 6 months of play on average, because they see a different ad every time out of zillions on the disc, I doubt it'll prove popular!
 
Creating extra content is also a dubious claim.
They have to devise a business model that shows how you can utilize the space without putting in your own money. Don't build your own house on the space; rent it out to make money instead. Advertising content or demos from partners, unlockable content from third parties (you buy the key online), user mods, etc. The same content can be distributed online, but the offline solution may be more economical in some cases.
If piracy is so rampant on the 360, why does it have by far the best attach rate?
RRoD discouraging hardware crackers? ;) Publishers will never stop thinking "if there's no piracy the attach rate would be even higher".
 
Sony says the 40GB is still 90nm. Supposedly, the power reduction is done in other areas of the console.
link

Thanks a lot for the information; that sounds very good, because with the 90nm process they can sell at US$399.

Maybe by March/April '08, with the leap to 65nm, they can hit a price level below US$349.

(It's very impressive that they can go from 200 watts down to 135 watts with the 90nm process.)
 
Irrelevant. What matters is that 3rd party publishers get good returns on their investment. X360 is the only next-gen platform providing that right now.
 
Publishers will never stop thinking "if there's no piracy the attach rate would be even higher".
Just so I'm clear, are you suggesting that developers feel the PS3's anti-piracy measures (i.e. BR) are preferable to having the better sales via the higher attach rate on the 360?

If so, that's laughable! There have been dozens of quotes from various developers and publishers stating the PS3's sales (hardware and software) are disappointing, and numerous threats to pull the plug. Hell, do you even remember last month's article titled "Sony begs developers not to abandon the PS3"?

I appreciate that anti-piracy measures are a good thing, but you can't simply laugh off better sales + some piracy as a better situation than terrible sales and no piracy.
 
Irrelevant. What matters is that 3rd party publishers get good returns on their investment. X360 is the only next-gen platform providing that right now.

Only?

By the way, we'll have to wait and see whether the security aspect of Blu-ray is going to be relevant. Certainly the PS2 had some pretty rampant piracy eventually, but it's not certain that this won't happen elsewhere too.

On the other hand, you should also not underestimate how short-term attach rates weigh against long-term piracy concerns with publishers. ;) I'm sure they at the very least appreciate Sony's efforts (as I'm sure Microsoft's aggressive attempts at RRoD'ing chipped 360s are too).
 
On the other hand, you should also not underestimate how short-term attach rates weigh against long-term piracy concerns with publishers.
Has anyone got real figures on piracy? The production companies for all media whinge and moan, but I've never seen real figures, and the phrase 'rampant piracy' could mean 50% or 0.5% for all I know.

Are developers really going to be concerned that in future XB360 owners won't be buying games because they all use pirated copies? Are you saying that developers will prefer a smaller install base of piracy-free consoles where the owners don't buy much software, over a larger install base where more software is sold? That makes no sense to me whatsoever. Given a choice between 10 million consoles with 30% piracy and a 5:1 attach ratio (roughly 35 million games sold) and 5 million consoles at a 2:1 attach ratio (10 million games sold), isn't it incredibly obvious which market you want to go after? The concern about piracy only makes sense if it would impact one market enough to make it non-viable, and you'd need an insane level of piracy for that to happen.
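To put the same point in plain arithmetic (a rough sketch; the helper function and the piracy percentages are just one illustrative way of modelling piracy displacing sales, not real data):

# Rough market-size comparison from the paragraph above (illustrative numbers only).
def legit_sales(install_base_m, attach_ratio, piracy_share=0.0):
    # Legitimate software units (millions), assuming piracy displaces that share
    # of the potential attach ratio.
    return install_base_m * attach_ratio * (1.0 - piracy_share)

big_pirated = legit_sales(10, 5, piracy_share=0.30)  # ~35M units
small_clean = legit_sales(5, 2)                      #  10M units

print(f"10M base, 5:1 attach, 30% pirated -> ~{big_pirated:.0f}M legit sales")
print(f" 5M base, 2:1 attach, no piracy   -> {small_clean:.0f}M legit sales")
# Even under heavy piracy, the bigger, higher-attach market still sells several
# times more software, which is the point above.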

I can't see any hope for PS3 on the strength of anti-piracy!
 
And with the sword of Damocles of being banned from Xbox Live, piracy is not really on the minds of Xbox owners who connect their consoles…
As for piracy's influence, I remember that a lot of PS1, and later PS2, owners chose Sony products because you could "easily" use copies of games, so for a console manufacturer, games piracy can help sales…
 
Has anyone got real figures on piracy? The production companies for all media whinge and moan, but I've never seen real figures, and the phrase 'rampant piracy' could mean 50% or 0.5% for all I know.
The data is from Europe in 2006 and it is in Spanish, but you get the idea (I got the data from here: http://www.adese.es/pdf/Datos-industria-2006.pdf ):
[Charts from the linked ADESE report showing estimated piracy rates by platform.]
 
How do they measure piracy amounts?

Edit: Even if accurate (which I seriously question), it pegs the average piracy figure at <10%. That's easily well below an amount that'll totally skew the appeal of a market away from install base and attach ratios. The XB360 has 2x the market and 2x the software buying rate, making it 4x more appealing in that respect; minus 10%, it's still 3.6x more appealing. Who's going to look at the PS3 and say 'but it has zero piracy, so let's develop for that instead'!
 
Yeah, I wonder too. Copies are like a black market; it is hard to gauge the size of piracy.
 
Publishers will never stop thinking "if there's no piracy the attach rate would be even higher".

Or perhaps they'll be thinking about how bad the PS3 attach rate will get once it gets cracked. The extra security just means extra costs once it's been circumvented.
 
Only?

By the way, we'll have to wait and see whether the security aspect of Blu-ray is going to be relevant. Certainly the PS2 had some pretty rampant piracy eventually, but it's not certain that this won't happen elsewhere too.

On the other hand, you should also not underestimate how short-term attach rates weigh against long-term piracy concerns with publishers. ;) I'm sure they at the very least appreciate Sony's efforts (as I'm sure Microsoft's aggressive attempts at RRoD'ing chipped 360s are too).

Long-term piracy concerns will be there regardless of platform. No one expects Blu-ray to be pirate-proof forever. It won't happen.

Piracy won't become a real concern in terms of dictating whether or not a game is released on a platform until piracy drastically affects sales. No one is going to choose the PS3 over the 360 strictly because of piracy on the 360, until software sales of 360 titles are affected by piracy to the point that the PS3 and Wii give better overall returns and the returns on the 360 aren't worth the investment.

Until the PS3 can show that it can drive more legitimate software sales than the 360, there will never be a financial incentive (unless provided by Sony) to go PS3-only, regardless of whether piracy eats 1% or 99% of your potential 360 sales.
 