*Game Development Issues*

PS2 didn't come close to delivering, in its 500+ mm^2 of silicon, the performance that NAOMI2 did in its ~370 mm^2; PSP's graphics core didn't come close to delivering the performance or battery life of the more affordable MBX Pro; and the Cell architecture wasn't efficient enough to compete in the wider embedded consumer electronics market. Management at the PlayStation division has been replaced as a result.

The new management should be tasked with forming design teams that achieve better performance/area/power-consumption in their creations.
 
For a console, though, you could make the argument that backwards compatibility decreases the value of the upcoming library. It certainly happened that way with the Atari 5200. And when the PS3 library is this small and the PS2 library is so huge, full BC has the possibility of making the PS3 look more like a Blu-ray player that plays PS2 games in high-def.

Yet full BC didn't hamper the PS2 by making it look like a DVD player that plays PS1 games. Backwards compatibility with the biggest Xbox titles didn't harm the 360 in this way either, at least as far as I can see.

I'd suggest that if BC ever were decreasing the value of your upcoming library, the real problem wouldn't actually be BC but your upcoming library.

Well, that was a case where the reverse happened. Rather than hamper the success of the Genesis, it was simply a feature that nobody cared about. And in fact, one that many people didn't even know about -- "Sega what system?" The vast majority of Genesis owners thought the Genesis, AKA the Mega Drive, was the first console Sega produced.

Perhaps for Genesis owners, but a lot of Mega Drive owners were aware that before the MD there was the MS. The MS was perhaps the most successful console since the 2600 in parts of Europe. I'd still agree that (practically) no one cared about the add-on that enabled BC, though. If BC hadn't needed the converter, I think it would have been of far more value, at least in the MD's early days.
 
PS2 didn't come close to delivering, in its 500+ mm^2 of silicon, the performance that NAOMI2 did in its ~370 mm^2; PSP's graphics core didn't come close to delivering the performance or battery life of the more affordable MBX Pro; and the Cell architecture wasn't efficient enough to compete in the wider embedded consumer electronics market. Management at the PlayStation division has been replaced as a result.

The new management should be tasked with forming design teams that achieve better performance/area/power-consumption in their creations.

I'd suggest they should also work on hardware and software that allows their machines to deliver competitive titles on the first day the platform hits the streets. It's no good having "big chips" and "lots of flops" in a system if in its make-or-break first two years it has a crap version of Madden (compared to its older, cheaper rival platform).

There is an idea bandied about on message boards that a "complex" system will have longer legs, and survive on the market longer, if it delivers relatively poor performance at first and grows over the years to look better and better. Any system will look better over the years, and it's a software library that sells a system over time, not some proportion of utilised theoretical power.

If a console's fancy chips haven't helped it in the first couple of years, you've pretty much wasted the time, effort and money you spent designing and manufacturing them. If you're going to splash out on making a powerful system, make something that can show its power out of the gate (something the PS3 has really failed to do).
 
PS2 didn't come close to delivering, in its 500+ mm^2 of silicon, the performance that NAOMI2 did in its ~370 mm^2; PSP's graphics core didn't come close to delivering the performance or battery life of the more affordable MBX Pro,...
Since you seem to have good knowledge on the matter, would you care to elaborate on why the NAOMI2 and the MBX Pro are not dominating the market if they contained such superior technologies in comparison to the PS2 and PSP?

Are they still superior with regard to your criteria or was that just a transient period during the launch window of the PS2 and the PSP?
 
There is an idea bandied about on message boards that a "complex" system will have longer legs, and survive on the market longer, if it delivers relatively poor performance at first and grows over the years to look better and better. Any system will look better over the years, and it's a software library that sells a system over time, not some proportion of utilised theoretical power.

If a console's fancy chips haven't helped it in the first couple of years, you've pretty much wasted the time, effort and money you spent designing and manufacturing them. If you're going to splash out on making a powerful system, make something that can show its power out of the gate (something the PS3 has really failed to do).

I can agree that the PS3 offers poor performance in relation to a high-end PC. But in relation to the X360, its overall performance is very similar. At least it seems to be when you compare multiplatform titles (slight difference to none). The question is how much will both consoles improve in the next couple of years?
 
To be honest, I see the biggest problem with the PS3 as the current industry stigma, which seems to propose that if you're developing a game for the platform, you somehow have to either pull out every ounce of power or do everything you can to try and attain parity with Xbox 360 titles in terms of visual fidelity.

This obsessive focus on graphics the industry currently has is only detrimental to the industry as a whole. Making smaller developers avoid attempting development on the PS3, because they are made to feel like they have to "fight for graphics" to have their product accepted, goes against everything that attracted gamers to the PlayStation brand in the first place.

Imagine if developers stopped caring about "competing" with Xbox 360 games in terms of graphics and actually spent their time concentrating on using the hardware to innovate technologically in other areas (physics, AI, animation, etc.). The PS2 couldn't keep up with the Xbox or GC in terms of graphics (regardless of what was in the box), but the majority of gamers didn't care. A good game is a good game regardless, and most consumers place more value in "well presented art" than "technical merit". I just wish we could go back to focusing on telling a good story and making a rich, deep and fun game first, and then worrying about adding the visual sundries later. Nintendo have been doing this for years, and even with the current success of the Wii, many STILL haven't learnt what matters most to consumers in this industry.

It's like when 3D was first introduced to the industry. Many saw value in it not because 3D games looked better, but because they added so much more to the experience in terms of new types of gameplay, new definitions of what fun can be and new ways to tell a story.

At the end of the day there are many ways to enrich a game, and visual fidelity is just one (and not the most important by far). So the day we spend more time and money investing in fleshing out the others (which are considerably cheaper) will be the day the industry gets back on track and begins to move forward again, in my view.

/rant

Edit: I fear this post is slightly off topic, and if so then please delete.
 
But at least for some time the PS2 was the graphical powerhouse compared to the Dreamcast, and was, very much like the PS3, made to seem that way by the media and possibly corporate execs. By the time the other two systems hit the market, the momentum of the PS2 could not be stopped, and with Nintendo and Microsoft launching so close to one another, neither one could truly pull away and really compete with the PS2 (the same thing might be happening right now), even though one launched for less and the other for the same price.

If I remember correctly, the PS2 had almost 20 million consoles shipped by the time the other two came out. It was no contest; you had to develop for the PS2 if you wanted to make money last generation, despite the difficulty of developing on the machine. This gen, things are turning out a little differently, except for the fact that the PS3 is also considered a difficult machine to work with.
 
If PS3 sold for a lower price and sold better, you wouldn't have these complaints about the tools or the architecture. You had some complaints about having to use the VU units on the PS2 but people got over it because PS2 sold well and games for it did well.

Maybe, just maybe, if PS3 had a newer GPU and more RAM, they might have been able to pull off that price.
 
Backwards compatibility with the biggest Xbox titles didn't harm the 360 in this way either, at least as far as I can see.
That's because it isn't really there, is it? The BC support in the 360 is pretty weak and doesn't cover a large percentage of the Xbox's library all that well. But then, nobody cares, especially now that Halo 3 is out.

I'd suggest that if BC ever were decreasing the value of your upcoming library, the real problem wouldn't actually be BC but your upcoming library.
Of course. But there's little you can do to make the really desirable games come out sooner than they're going to. At least BC you can do something about. Sure, people are eager to see MGS4 or FFXIII or WKS or whatever, but they'll be out when they're out. Until then, someone waiting for those titles may not have a reason to see the PS3 as a PS3.

Imagine if developers stopped caring about "competing" with Xbox 360 games in terms of graphics and actually spent their time concentrating on using the hardware to innovate technologically in other areas (physics, AI, animation, etc.). The PS2 couldn't keep up with the Xbox or GC in terms of graphics (regardless of what was in the box), but the majority of gamers didn't care.
Would be nice, but unfortunately, the die is already cast. In the case of the PS2, sure, it didn't quite compare to the Xbox or GC, but it came out earlier, had a large library by that point, and its prior competitor was the Dreamcast, with which it was graphically competitive from the get-go. And as people's understanding of the hardware grew, it also stretched its graphical legs to at least be semi-competitive with the Xbox and GC.
 
It's not a problem, and neither should programming an SPU be. I'm with nAo on this: no senior project member on the programming level should ever have to moan about things such as this, but rather get their fingers out of their arses, think like a software engineer and provide the basic framework. Your point about the PS3 needing "super geeks" is wrong; nAo is right, it's not a complex machine. Getting the best out of it in the longer run might require a little bit of ingenuity. For an example of what a quality technical lead should be doing, I refer you to this post by Mike Acton: http://forum.beyond3d.com/showthread.php?t=44542

If this so-called "basic framework" is so basic and every team out there has to make it, why not have Sony do it? It's one thing to say, hey, all you lazy bums, start writing code, but then when on THE competing platform I can get the SAME game to run better, faster, sooner, for LESS money ... what the hell are you talking about then?!
See, I have a lot of respect for guys like nAo and Mike Acton, but I think these guys can do SO MUCH better if they look at the big picture and apply their talents there. I mean, Mike recommends ditching Sony's SPURS framework and rolling up your own sleeves and coding it from scratch, because "it's so much better this way". I'm not going to defend SPURS as being the best, BUT, in the big picture, this is crazy! In times when AGILE is the ONLY way to stay on top of tasks and schedules, when a slight delay or slip could mean cancelled projects and companies going under, NOT thinking about how to leverage your knowledge across people and teams is irresponsible. The game industry needs to move forward; we need to leverage code not just from the last project, but from the last generation too. We need to leverage code across teams, across games, across industries. And all this requires strong abstraction and consistent hardware that can deliver on this abstraction, the same way it worked for Win/x86 and D3D/GPUs. Look at Intel: they are STILL "stuck" with x86 even in their ground-breaking Larrabee design. Is this a problem for them? No, because they evolve and move the frontier forward. IBM, on the other hand, kept screwing around with PPC, and, well, it's dead now.
Cell is not the future. Sony has ditched it already. Toshiba is already modifying the design. The truth lies with Larrabee: homogeneous cores, 1-cycle L1 caches, unified memory access, hyperthreads for managing latencies, hardware and software support for MASSIVE multithreading. I can code with good old OOP patterns, or I can optimize things for a streaming model if I need to and if I can. This is how it should be. Between Revolution and Evolution, Evolution ALWAYS wins.
 
You can look at things from a lot of different angles. I've always looked at Cell as an evolution of the asymmetric architecture of the EE, with the PPU replacing the MIPS core and the SPUs replacing the VU units.
 
Isn't it already a lot of work to get a game up and running, period? Especially these next-gen games, where people would like to be oohed and awed. Yeah, next gen they do need to stick with Cell and not change up again, otherwise all the work they are going through now and for the next 5 to 6 years to get the PS3 to be more comfortable for devs would be lost. They should have figured out a way to stick with the EE again this generation; it would have likely given them an advantage in development over the 360, but it's too late for that. They should learn from this experience or find themselves right back where they were a few months ago.
 
Barbarian

Why can't you use OOP with Cell?

Is your complaint that abstraction cannot be done, or that it isn't done for you already? (I should say, to a larger extent...)

What makes you think Sony has ditched Cell?

What makes you think the CBEA won't be improved upon?
 
Though winning the vast majority of designs for 3D-hardware-equipped mobile/low-power devices, accounting for tens (and soon hundreds) of millions of units, most definitely makes MBX dominant, and winning the designs of a major console and the standard-bearing amusement/arcade platforms of its generation makes NAOMI2's Series 2 architecture a success, sales aren't proof of the proficiency of a technology.
 
Though winning the vast majority of designs for 3D-hardware-equipped mobile/low-power devices, accounting for tens (and soon hundreds) of millions of units, most definitely makes MBX dominant, and winning the designs of a major console and the standard-bearing amusement/arcade platforms of its generation makes NAOMI2's Series 2 architecture a success, sales aren't proof of the proficiency of a technology.

OK, so those performance criteria were not really important in the case of the PS2 and PSP, as they have been quite successful. The PSP may even end up in a mobile phone some day, if you think that is important, but I doubt that has been a high priority at Sony.

I find it hard to follow your logic when, in your previous post, you said:

The new management should be tasked with forming design teams that achieve better performance/area/power-consumption in their creations.

I doubt that is Sony's main problem.
 
Barbarian

Why can't you use OOP with Cell?

Is your complaint that abstraction cannot be done, or that it isn't done for you already? (I should say, to a larger extent...)

What makes you think Sony has ditched Cell?

What makes you think the CBEA won't be improved upon?

Regarding OOP: you realize the limited SPU storage and the separate address space make a lot of OOP approaches impractical or impossible. Just think for a second about polymorphic behavior (virtual tables and addresses), correct copy behavior for complex C++ objects (which DMA obviously doesn't provide), lack of space for larger functions (possibly because of template expansion) -- basically anything that is modeled after pointer semantics, iterators, etc. just breaks down or needs to be somehow translated (preferably in a generic way, but consider that the "this" pointer cannot easily be changed to be a special pointer type), and so on. These are just things off the top of my head.
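To make the polymorphism point concrete, here is a minimal, purely illustrative C++ sketch (plain host code, no Cell SDK required; the Entity/Player types are made up) of why an object with a vtable can't simply be blasted into SPU local store by DMA, which from the language's point of view is just a raw byte copy into another address space:

Code:
#include <cstdio>
#include <cstring>

// A typical polymorphic game object: the compiler embeds a hidden
// vtable pointer in every instance.
struct Entity {
    virtual ~Entity() {}
    virtual void update() { std::printf("base update\n"); }
    float position[3];
};

struct Player : public Entity {
    virtual void update() { std::printf("player update\n"); }
};

int main() {
    Player player;

    // Pretend this buffer is SPU local store and the memcpy is the DMA
    // transfer: both are raw byte copies with no C++ semantics attached.
    // (For non-trivially-copyable types this is formally undefined
    // behaviour, which is exactly the problem.)
    unsigned char* localStore = new unsigned char[sizeof(Player)];
    std::memcpy(localStore, &player, sizeof(Player));

    Entity* copied = reinterpret_cast<Entity*>(localStore);

    // In this single-address-space simulation the call below happens to
    // work, because the embedded vtable pointer still points at valid
    // code. After a real DMA into SPU local store it would be a PPU-side
    // address that means nothing to the SPU, so the virtual call breaks.
    copied->update();

    delete[] localStore;
    return 0;
}

The same goes for any member that is a pointer, an iterator or a "this"-relative reference: the bits survive the transfer, but the addresses they encode belong to the other side.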

On abstraction: my complaint is that Cell forces you to adopt the streaming model of computation, regardless of whether you need it or not, and fails to provide transitional steps to make this easy. It's trivial to ask programmers to rewrite their legacy code to fit on SPUs, yet in reality that approach rarely works. Lack of legacy support has killed many technologies before: DEC Alpha, Itanium, BeOS, etc.
Apple managed to pull off the PPC -> Intel transition mostly because they ported all their own software to it. Making other companies do the same didn't go so well. Adobe took a year to port Photoshop, even though they supposedly already had an Intel/Win version running for years. According to my source at Adobe, the reason was that Apple's dev tools couldn't cope with the size and complexity of Adobe's codebase. And the PPC -> Intel transition doesn't even approach the complexity of moving from x86 -> Cell.
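To give newcomers a flavour of what that move actually involves, below is a minimal, hypothetical sketch of the streaming shape SPU code gets pushed into: fetch a chunk into local store, transform it, write it back. It uses the Cell SDK's spu_mfcio.h DMA intrinsics, but the chunk size, tag and the trivial scale operation are invented for illustration, and real code would also double-buffer to hide the DMA latency:

Code:
// Built with spu-g++ as part of an SPU program; this is only a sketch.
#include <spu_mfcio.h>

static const unsigned int CHUNK_BYTES = 4096; // multiple of 16, <= 16 KB per DMA
static const unsigned int TAG         = 3;    // any MFC tag id in 0..31

// Local-store buffer the chunks are streamed through.
static float buffer[CHUNK_BYTES / sizeof(float)] __attribute__((aligned(128)));

// Stream total_bytes of floats at main-memory effective address ea,
// scaling each element in place.
void scale_stream(unsigned long long ea, unsigned int total_bytes, float scale)
{
    for (unsigned int offset = 0; offset < total_bytes; offset += CHUNK_BYTES) {
        // 1. Pull the next chunk from main memory into local store.
        mfc_get(buffer, ea + offset, CHUNK_BYTES, TAG, 0, 0);
        mfc_write_tag_mask(1 << TAG);
        mfc_read_tag_status_all();          // wait for the get to complete

        // 2. Work only on data that is now guaranteed to be local.
        for (unsigned int i = 0; i < CHUNK_BYTES / sizeof(float); ++i)
            buffer[i] *= scale;

        // 3. Push the result back out and wait before reusing the buffer.
        mfc_put(buffer, ea + offset, CHUNK_BYTES, TAG, 0, 0);
        mfc_write_tag_mask(1 << TAG);
        mfc_read_tag_status_all();
    }
}

Code built around pointer-rich object graphs has to be reorganised into flat, chunkable data before it even fits this pattern, which is the transitional step the platform doesn't really help you with.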

And on Sony ditching Cell: http://www.engadget.com/2007/10/18/sony-sells-cell-to-toshiba/
 
If this so-called "basic framework" is so basic and every team out there has to make it, why not have Sony do it? It's one thing to say, hey, all you lazy bums, start writing code, but then when on THE competing platform I can get the SAME game to run better, faster, sooner, for LESS money ... what the hell are you talking about then?!
See, I have a lot of respect for guys like nAo and Mike Acton, but I think these guys can do SO MUCH better if they look at the big picture and apply their talents there. I mean, Mike recommends ditching Sony's SPURS framework and rolling up your own sleeves and coding it from scratch, because "it's so much better this way". I'm not going to defend SPURS as being the best, BUT, in the big picture, this is crazy! In times when AGILE is the ONLY way to stay on top of tasks and schedules, when a slight delay or slip could mean cancelled projects and companies going under, NOT thinking about how to leverage your knowledge across people and teams is irresponsible. The game industry needs to move forward; we need to leverage code not just from the last project, but from the last generation too. We need to leverage code across teams, across games, across industries.
I'm pretty sure the framework Mike is writing will be reused in many agile development projects to come later.

Also, if you run a company, it's important to differentiate your framework from others'. Reusing is good, but if it's easily reused by your competitor it's not so good. The R&D put into Cell is hard to apply to other consoles. This is a double-edged sword, but worth taking the risk.

BTW has SPMM in this patent already replaced SPURS?
http://www.freepatentsonline.com/20070074212.html
The truth lies with Larrabee: homogeneous cores, 1-cycle L1 caches, unified memory access, hyperthreads for managing latencies, hardware and software support for MASSIVE multithreading. I can code with good old OOP patterns, or I can optimize things for a streaming model if I need to and if I can. This is how it should be. Between Revolution and Evolution, Evolution ALWAYS wins.
How did you conclude that your OOP code won't run pretty inefficiently on Larrabee? Then, it's 32 cores * 4-way SMT; how do you split the workload of games by reusing your older code, designed for a far smaller number of cores, with only the little changes you suggested? Core-to-core communication overhead will be massive.
 
How did you conclude that your OOP code won't run pretty inefficiently on Larrabee? Then, it's 32 cores * 4-way SMT; how do you split the workload of games by reusing your older code, designed for a far smaller number of cores, with only the little changes you suggested?

I didn't conclude that, of course. I very carefully said that my legacy or even new code, OOP or not, will run at some speed, and as long as I'm comfortable with that, why should I optimize anything? If I'm writing a word processor, do I have to use every last cycle left in Larrabee? If I'm writing a Super Mario Galaxy game (which scored 9.5), do you not believe I can have it running on one Larrabee core?
 
Lack of legacy support has killed many technologies before: DEC Alpha, Itanium, BeOS, etc.
Are things like FX!32 and x86 emulation on Itanium not legacy support?
And the PPC -> Intel transition doesn't even approach the complexity of moving from x86 -> Cell.
How about EmotionEngine -> Cell?
I didn't conclude that, of course. I very carefully said that my legacy or even new code, OOP or not, will run at some speed, and as long as I'm comfortable with that, why should I optimize anything? If I'm writing a word processor, do I have to use every last cycle left in Larrabee? If I'm writing a Super Mario Galaxy game (which scored 9.5), do you not believe I can have it running on one Larrabee core?
I thought you demanded a relatively efficient way to make a game. If a Super Mario Galaxy game is what you want, why don't you count high-level things like PSSG, which is available for the PS3?
 
How about EmotionEngine -> Cell?

Well, so far the transition hasn't been terribly smooth except for one or two studios (known for their rocket science abilities before), so I can't see why you are pointing it out as an example.
 