*Game Development Issues*

Since when is cell phone gaming more profitable? It's a small, niche market with low ASPs. You can't hit GTA home runs there. Maybe it will become bigger later, but your argument is rather meaningless today.

No, but substitute the DS for the cellphone and you have something. Of course, id already develops for the DS (and the cellphone, for that matter).
 
Now now now. ;) Thing is, the developer landscape has changed in a few important ways.

First of all, it has become easier for PC developers to release games on consoles. This is mostly thanks to the Xbox, but this generation the PlayStation has also become a viable platform to release on. Which they are doing, in droves, especially with the PC having been in a long sales slump these last years (excluding stuff like The Sims and WoW, of course).

However, now that console releases are becoming the norm, PC developers who have never been confronted with anything like the PS2 are running into things on the PS3 that are very different from (and much harder than) what they know from the PC or the 360. This is, again, very understandable.

Secondly, because projects have become so big and complex, a lot of developers like to work on the PC because it gives them more tools, is cheaper, offers more resources, etc. Even Square Enix has shifted towards developing Final Fantasy on the PC over the course of FFXIII's development.

And finally, because it is still relatively early in the console cycle, it is too expensive to develop games that specifically target the new hardware. It's also less important: distinguishing yourself technically matters less early in the cycle, when everything looks impressive compared to the last generation anyway. Another advantage is that it is easier to recruit people that can work on a PC and/or teach them to work on a PC.

A lot of these factors will be changing in the future, though there is no telling how much. The PC will eventually pull too far ahead of the consoles again, developers will search for more performance to distinguish themselves and/or innovate, and art creation will become more efficient (including user-generated content).

One thing remains certain though - multi-platform development isn't going away soon, and partly thanks to Microsoft it's very clear that we'll be seeing a shift towards supporting developers in all sorts of ways to make their lives easier, from libraries to tools.
 
When you look at the big picture and take budget into account, I don't think it's fair to say that coding for efficiency on the PS3 benefits the 360, and in fact it's quite the opposite.
It also benefits 360 from a performance standpoint. I have this habit of striving for excellence, not mediocrity, but I don't manage budgets. I guess a game or technology director might have another point of view.

If we are talking about getting things done while not caring that much about the technology involved, then yes, the 360 is very good and a PC is probably even better (though the 360, being a closed and fixed platform, has better dev tools).

Whereas if we are working on some (let's say) 360-exclusive game and we want to really push the hardware, then we'd probably write code that maps fairly well to the PS3 anyway, and vice versa (talking about the CPU side of things here).
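To make that concrete, here's a minimal sketch (purely illustrative, not taken from any actual codebase) of the kind of CPU-side code that tends to map well to both machines: structure-of-arrays data cut into small, self-contained chunks. On the PS3 you'd DMA each chunk into an SPU's local store; on the 360 or PC the same layout keeps the working set cache-resident and vectorises cleanly.

[code]
// Illustrative only: process data in small, self-contained chunks.
// On Cell each chunk would be DMA'd into SPU local store; on Xenon/PC
// the same layout stays cache-friendly and is easy to SIMD-ify.
struct ParticleChunk
{
    enum { kCount = 1024 };                           // ~24 KB: fits local store easily
    float posX[kCount], posY[kCount], posZ[kCount];   // structure-of-arrays layout
    float velX[kCount], velY[kCount], velZ[kCount];
};

void integrateChunk(ParticleChunk& c, float dt)
{
    // Straight-line loop over contiguous arrays: no pointer chasing,
    // no shared state, trivially vectorisable with VMX or SPU intrinsics.
    for (int i = 0; i < ParticleChunk::kCount; ++i)
    {
        c.posX[i] += c.velX[i] * dt;
        c.posY[i] += c.velY[i] * dt;
        c.posZ[i] += c.velZ[i] * dt;
    }
}

int main()
{
    ParticleChunk chunk = {};                         // zeroed demo data
    integrateChunk(chunk, 1.0f / 60.0f);
    return 0;
}
[/code]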
 
But yeah you are right, I could "tough it out", be hardcore, and stick to the PS3 way where I'd:

1) compile and run
2) setup the scene
3) run replay, capture and time
4) see the results
5) tweak the shader
6) go back to step 1
Someone who does this to tweak a shader is not hardcore; they are, let's say... not very smart.
What about recompiling and reloading a shader on the fly? It's not really rocket science.
'Hardcore programmers' can certainly spend half a day writing and debugging some code to do it, and save a lot of time afterwards.
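That half a day buys you something like the sketch below. This is just an illustration of the idea, nothing platform-specific: CompileShader/BindShader are hypothetical stubs standing in for whatever your shader pipeline exposes (on the PS3 that would be the Cg compiler plus the libgcm bind), and polling the source file's timestamp once per frame is the simplest trigger that works.

[code]
// Rough sketch of on-the-fly shader reloading for a debug build.
// CompileShader/BindShader are hypothetical stubs standing in for the
// real platform calls; the point is the timestamp-poll-and-rebind loop.
#include <sys/stat.h>   // POSIX stat(); adjust for your platform
#include <cstdio>
#include <ctime>
#include <string>

typedef int ShaderHandle;                              // stand-in for a real handle

static ShaderHandle CompileShader(const std::string& path)   // stub
{
    std::printf("recompiling %s\n", path.c_str());
    return 1;
}

static void BindShader(ShaderHandle)                   // stub
{
    std::printf("rebinding shader\n");
}

class HotReloadShader
{
public:
    explicit HotReloadShader(const std::string& path)
        : m_path(path), m_mtime(0), m_handle(0) {}

    // Call once per frame from the debug build's update loop.
    void poll()
    {
        struct stat st;
        if (stat(m_path.c_str(), &st) != 0)
            return;                                    // unreadable: keep the old shader
        if (st.st_mtime != m_mtime)
        {
            m_mtime  = st.st_mtime;
            m_handle = CompileShader(m_path);          // recompile the edited source
            BindShader(m_handle);                      // swap it in without restarting
        }
    }

private:
    std::string  m_path;
    std::time_t  m_mtime;
    ShaderHandle m_handle;
};

int main()
{
    HotReloadShader frag("fragment4.cg");              // hypothetical shader source file
    for (int frame = 0; frame < 600; ++frame)          // stands in for the game loop
        frag.poll();
    return 0;
}
[/code]

Map a forced reload to a pad button as well and you have the "reload on the fly" step; the remaining tedium is the recapture/reprofile pass discussed further down.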
 
Sorry, I wasn't aware that the PS2 started out with a 100M+ install base. So there were no games other than 1st-party games on the PS2 'til the install base reached 100M+? ...Nothing at the 10M, 20M, or 30M mark, huh? After all, the PS2 is supposedly the most difficult of all the systems to develop for, right? It must have been really rough for the PS2. How many 3rd-party games were on the PS2 by the time the install base reached 20M+?

Somehow that reason doesn't make sense to me. Can anyone tell me why?

The PS2 marketshare rocketed out of the gates and never looked back. IIRC, by the time Ninny and MS got their first holiday under their belts, Sony had 24M consoles in the market, and almost double that a year later, when the other two combined were struggling to reach 10M in sales. I do remember some developers complaining about the EE and some GS issues (like AA) early on... but it is hard to bite the hand that feeds you.

I would hazard a guess that some of the irritation this time around was "not again". When the PS3 was late and failed to capture early marketshare leadership, the grumbles became louder (even though by most accounts it wasn't as bad as the PS2, although the size/complexity of games as well as staff size/skill need to be factored in).


I think we may agree on most of these points! Most...

PC developers who have never been confronted with anything like the PS2

I wouldn't say this, exactly. The PS2 did, after all, have a huge following. The number of developers who touched the platform is pretty significant, especially when you consider employee migration. While a small number of AAA PC-only dev houses (I am thinking of people like Valve, Epic, id, etc.) hadn't bothered much with the PS2, plenty of the big dogs did, as did many of the second-tier studios.

What is interesting is that Epic's next-gen offerings have been pretty good, and id's id Tech 5 may be technically impressive as well.

because projects have become so big and complex

This is only going to get worse. As I was just mentioning Epic: the new Gears game is much more complex than the first. We knew this would happen based on last gen (games get more technically advanced, feature-rich, content-rich, and push the bounds of the systems' limitations as the generation advances). If complexity outpaces (a) training people to work proficiently on a difficult platform or (b) efforts to make the difficult platform(s) more accessible, that poses a problem.

At a minimum, as the market expands and individual game budgets (and complexity) expand, there will continue to be a dilution of skilled developers. I think this will dictate that future consoles be even more accessible than today's versions.

it is easier to recruit people that can work on a PC and/or teach them to work on a PC

Yep.

A lot of these factors will be changing in the future though

I think they will be changing for the worse, to be frank. In the short term, yes, the PS3 is going to catch up a lot in dev savvy and best practices. It already is in many ways. But with budgets and dev times growing to the point of unprofitability, there will be a growing need to tame them, as well as to tap a pool of cheaper labor.

The path of least resistance is platforms with easier-to-tap performance and a forgiving performance environment, one that deals with the reality that the increased complexity of games puts suboptimal hands on performance-critical parts of games. In this context, some of the chirping about using managed code for large chunks of game development may have a leg to stand on.

partly thanks to Microsoft it's very clear that we'll be seeing a shift towards supporting developers in all sorts of ways to make their lives easier

All three are going to have to pull out all the stops on this last point next gen (and through this gen via continued improvements). The risk factor alone makes getting the most out of the hardware with the fewest people, the smallest budget, and the shortest time of prime importance. I think this factor will be a major guiding principle in the design of all three new consoles.
 
What about recompiling and reloading a shader on the fly? It's not really rocket science.

Already done that, but profiling is still tedious that way. The best we can do now on the PS3 is recompile the shader, reload it on the fly (with the shader reload functionality mapped to a button click), recapture the command buffer in gcmReplay, and finally reprofile it. It works, but it's still slower that way.

I want to be able to do it all right in gcmReplay. I.e., I have my scene captured and I'm staring at the HLSL code for, say, fragment shader #4 in some replay window. Then I change a line of HLSL, hit a button, and gcmReplay recompiles just that shader and reprofiles the scene on the spot.

I know support for displaying HLSL in gcmReplay is coming, it's just unknown when. After that they can look into edit-and-change functionality, but that's unlikely to arrive this year :(
 
The PS2 marketshare rocketed out of the gates and never looked back. IIRC, by the time Ninny and MS got their first holiday under their belts, Sony had 24M consoles in the market, and almost double that a year later, when the other two combined were struggling to reach 10M in sales. I do remember some developers complaining about the EE and some GS issues (like AA) early on... but it is hard to bite the hand that feeds you.

I would hazard a guess that some of the irritation this time around was "not again". When the PS3 was late and failed to capture early marketshare leadership, the grumbles became louder (even though by most accounts it wasn't as bad as the PS2, although the size/complexity of games as well as staff size/skill need to be factored in).
That must be revisionist history. The PS2 did NOT rocket out of the gates. Its 2nd year was more like a rocket, though. I think you have to try again.
 
Sorry, I wasn't aware that the PS2 started out with a 100M+ install base. So there were no games other than 1st-party games on the PS2 'til the install base reached 100M+? ...Nothing at the 10M, 20M, or 30M mark, huh? After all, the PS2 is supposedly the most difficult of all the systems to develop for, right? It must have been really rough for the PS2. How many 3rd-party games were on the PS2 by the time the install base reached 20M+?

Somehow that reason doesn't make sense to me. Can anyone tell me why?
Just how big was the PS2's installed base when the Xbox and GCN launched? The Dreamcast only sold around 10.6 million units in total.

That must be revisionist history. The PS2 did NOT rocket out of the gates. Its 2nd year was more like a rocket, though. I think you have to try again.
"The PS2 marketshare rocketed out of the gates".
 
I think we may agree on most of these points! Most...

We both have little babies ... this makes us soft. ;)

PC developers who have never been confronted with anything like the PS2

I wouldn't say this, exactly. The PS2 did, after all, have a huge following. The number of developers who touched the platform is pretty significant, especially when you consider employee migration. While a small number of AAA PC-only dev houses (I am thinking of people like Valve, Epic, id, etc.) hadn't bothered much with the PS2, plenty of the big dogs did, as did many of the second-tier studios.

I don't know about this. A lot of outsourcing happened here... Turnover for coders, especially junior ones, is high, and Unreal-like engines were already available. And you don't need that many coders for engines anyway. Something like programming a shader wasn't even really relevant in the PS2 days (though of course there are some similarities, these were rarely used), which is another reason there is more influx from PC coders into the console space these days.

What is interesting is that Epic's next-gen offerings have been pretty good, and id's id Tech 5 may be technically impressive as well.

For these companies, the PS3's difficulties as well as the PC cross-over factor are a blessing rather than a curse. It makes software like theirs all the more valuable.

This is only going to get worse. As I was just mentioning Epic: the new Gears game is much more complex than the first. We knew this would happen based on last gen (games get more technically advanced, feature-rich, content-rich, and push the bounds of the systems' limitations as the generation advances). If complexity outpaces (a) training people to work proficiently on a difficult platform or (b) efforts to make the difficult platform(s) more accessible, that poses a problem.

Historically this is what you would expect, but I'm not sure that is necessarily the case. I think publishers should, and eventually will, focus more on developing smaller download games. This allows for smaller experiences and smaller projects with less risk and greater reward. It should help both innovative titles (less risk, test the waters online) and casual titles (no need to justify a $60 price tag).

But even big projects still have more of a chance of standing out on one particular platform, and it is not unlikely that we will see less overlap in target demographics between the different platforms as time progresses. Look at Gears and Halo on the 360, for instance, or Wii motion-control games, etc. Once the 360 and PS3 cross that 20 million install base mark, things will change even for bigger projects. I don't think Gears is the best example to take for multi-platform development anyway. ;) Apart from the obvious issue of it not being multi-platform, it's also the result of Epic's work on the Unreal Engine, and it functions partly as a showcase title (these days maybe even more so than Unreal Tournament, witness how they used Gears to show off Unreal Engine innovations).

At a minimum, as the market expands and individual game budgets (and complexity) expand, there will continue to be a dilution of skilled developers. I think this will dictate that future consoles be even more accessible than today's versions.

Agreed. However, the need for a platform to stand out will also remain important. There's a constant struggle in that respect between the UE3 and id Tech 5 type projects on one extreme, and Nintendo's domination of their own platforms with their own titles on the other. It is hard to predict how much these two will cancel each other out, but Nintendo at the very least shows the potential is there. Of course, with that machine not stressing graphical prowess, it's a relatively easy and cheap platform to develop for anyway.

I think they will be changing for the worse, to be frank. In the short term, yes, the PS3 is going to catch up a lot in dev savvy and best practices. It already is in many ways. But with budgets and dev times growing to the point of unprofitability, there will be a growing need to tame them, as well as to tap a pool of cheaper labor.

It will happen though. I don't think the drive to distinguish yourself technically from your competitors is going to go away completely. I'm sticking to the theory that this is something that grows in importance as the console generation progresses, and also that some of the technologies that are relatively alien now may be considered the norm in a few years' time. This too has happened before.

All three are going to have to pull out all the stops on this last point next gen (and through this gen via continued improvements). The risk factor alone makes getting the most out of the hardware with the fewest people, the smallest budget, and the shortest time of prime importance. I think this factor will be a major guiding principle in the design of all three new consoles.

Perhaps, but we'll see. I think we won't see that many technological differences in terms of innovation, but software support for those innovations will improve and will meet a higher standard from day one (or rather, probably before that time even ;) ).

At the same time, and maybe far more importantly, I think being able to do titles at a different scale through download services will become a bigger factor, perhaps even the biggest, in taking away a lot of the risk and cost. I have a feeling that on the PS3, for instance, the Warhawk and GT5 Prologue projects have been a (quiet, don't want to upset retailers too much) success, with Insomniac now following with a Ratchet & Clank title, and PSN getting more and more positive attention in the press in this regard. Meanwhile, the 360 keeps upping the download size limit for Arcade games too, and Nintendo has just launched WiiWare.

Edit: coincidentally, I just came across Peter Moore's take on this matter:

http://www.gamesindustry.biz/articles/peter-moore-part-two
Edit: also on the same site, Shuhei Yoshida, the new head of Sony Worldwide Studios:
http://www.gamesindustry.biz/articl...rning-opportunity-for-developers-says-yoshida

I think he and Phil Harrison could see eye to eye in their new positions. ;)
 
Already done that, but profiling is still tedious that way. The best we can do now on the PS3 is recompile the shader, reload it on the fly (with the shader reload functionality mapped to a button click), recapture the command buffer in gcmReplay, and finally reprofile it. It works, but it's still slower that way. :(

Just a little tip: use the HUD. The real-time performance counters are pretty neat. It won't give you your final frame time, but you should be able to calculate that from your initial replay capture.
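I.e., something along these lines (a back-of-the-envelope estimate that assumes the edit doesn't change how the draw batches overlap):

[code]
// Estimate the new frame time by substituting the HUD's live reading for
// the tweaked batches' cost into the original gcmReplay capture numbers.
float EstimateNewFrameTimeMs(float capturedFrameMs,   // whole frame, from the capture
                             float capturedShaderMs,  // old cost of the tweaked batches
                             float hudShaderMs)       // new cost read off the HUD
{
    return capturedFrameMs - capturedShaderMs + hudShaderMs;
}
[/code]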
 
I don't know if this lies within the realm of opinion, since Carmack claims there's a limiting or gating factor with the PS3 version. It's not like his earlier comments about Cell.

However, it is like his earlier comments about the PS3's memory (specifically main memory).
 
I don't know if this lies within the realm of opinion, since Carmack claims there's a limiting or gating factor with the PS3 version. It's not like his earlier comments about Cell.

It's perfectly in line with his earlier comments; it's just that early Sony owners only listened to "will have a bit more peak power" and forgot all about "it will be easier to exploit the available power on the 360."
 
Just how big was the PS2's installed base when the Xbox and GCN launched? The Dreamcast only sold around 10.6 million units in total.


"The PS2 marketshare rocketed out of the gates".
My post was really about the number of games at certain points in the install base, as can be read in the post you quoted. We should stay closer to the topic, which is not about the Dreamcast selling 10.6 million units.

First, people said the reason 3rd-party PS2 games were created despite the supposed tremendous complexity (when the PS2 is supposedly more difficult to develop for than the PS3) was the ridiculously high install base. Then I mentioned the first year of the PS2 (when that large install base didn't exist). That's when the argument changed to marketshare. I guess the excuses will always change without getting to the heart of the matter. Oh well, I'm done with it.
 
It's perfectly in line with his earlier comments; it's just that early Sony owners only listened to "will have a bit more peak power" and forgot all about "it will be easier to exploit the available power on the 360."
I didn't claim that he contradicted himself. It has been said that developers (e.g. Carmack) have differing opinions on the PS3 being harder to develop for. My point was that while his opining on Cell may be philosophical and thus moot, the GPU and memory limitations aren't. In that spirit, I have little hope that BioShock on the PS3 will turn out well.
 
Just a little tip: use the HUD. The real-time performance counters are pretty neat. It won't give you your final frame time, but you should be able to calculate that from your initial replay capture.

I used gcmHud early on as well, but it's unstable, alas; it will crash in different ways in different builds. The current version of gcmHud will crash when I try looking at textures, but only on certain draw batches. At this point in the optimization, though, I really need the extra detail gcmReplay provides anyway.
 
Sorry, I wasn't aware that the PS2 started out with a 100M+ install base. So there were no games other than 1st-party games on the PS2 'til the install base reached 100M+? ...Nothing at the 10M, 20M, or 30M mark, huh?
Earlier on, the ratio of PS2 consoles to the competition was even higher, so instead of quadrupling your target market by going through the PS2 pains, you'd be making it 10 times bigger. Heck, for a while it was the only console of its generation, so you had no choice whatsoever.

In other words, you're only further proving my point by bringing this up.
 
It also benefits 360 from a performance standpoint.
I understood that from your original post, but the point is that if there were no PS3, such optimization would be deemed not worth it.
I have this habit of striving for excellence, not mediocrity, but I don't manage budgets. I guess a game or technology director might have another point of view.
Making a game run as fast as it can on a console is not the same as making a game as good as it can be with a given budget. They're only partially correlated optimization goals.

For any cross platform game that runs worse on the PS3, we have no idea how the overall game would be affected if the studio put more time into PS3 optimization and less into other aspects of the game (art, gameplay/design, testing, etc). We can only assume that the director for a game chose the best balance because that's what he's paid to do.

The same thing applies to multi-platform games developed on the PS3 first. There's a good chance that the increased productivity from better tools would have led to a better game had the 360 been chosen as the primary dev platform, and possibly a better game on the PS3 too, even though the PS3 version ends up relatively slower than the 360's when taking this route.
 
I've got a question to PS3 developers.

Doesn't the multicore situation of today, which forces multithreading, pose a much more difficult challenge concerning high-performance software (such as in game development) than any differences between the PS3 and the 360?

In other words, how different are the PS3 and the 360, compared with the overall jump from last gen's software development to today's?

BTW, even though I'm not a game dev, I work on high-performance/availability server software and I am facing several multithreading challenges (when things just have to sync up).
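To give an idea of what I mean by "things just have to sync up", here's a trimmed-down sketch of the usual pattern (generic C++ with std::thread purely as a stand-in; the consoles obviously use their own threading/job APIs): slice the work into independent jobs, let the workers chew on them, and join at a single sync point. From what I've read, Cell adds the 256 KB local store and explicit DMA on top of this, but the decomposition problem itself looks the same.

[code]
// Generic illustration of job-style decomposition: independent slices of
// work, shared state touched only at well-defined points, one join/sync.
#include <atomic>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Job { int first, count; };                     // a self-contained slice of work

static void runJob(const Job& j, std::atomic<long>& total)
{
    long sum = 0;
    for (int i = j.first; i < j.first + j.count; ++i)
        sum += i;                                     // placeholder for real per-item work
    total += sum;                                     // touch shared state once, at the end
}

int main()
{
    std::atomic<long> total(0);

    std::vector<Job> jobs;
    for (int i = 0; i < 8; ++i)
        jobs.push_back(Job{ i * 1000, 1000 });        // pre-sliced, no shared mutable data

    std::vector<std::thread> workers;
    for (const Job& j : jobs)
        workers.emplace_back(runJob, std::cref(j), std::ref(total));

    for (std::thread& t : workers)
        t.join();                                     // the sync point the frame waits on

    std::printf("total = %ld\n", total.load());
    return 0;
}
[/code]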
 
I've removed the trollage and the PR-talk tangent. I can see the relation to the topic, but I feel it's outside the particulars of discussion about the difficulties and issues faced by multi-platform development, with too much heated politics. Please keep discussion to the tools, titles, and developer comments, and leave the corporate viewpoints to the smoky backrooms.

I'll add for clarification that I deem anyone referencing E3 '05 or "corporate promises" off topic.
 
I always had this picture in my head of JC being the uber-nerd who preferred the hard way to the easy way.

I don't know if I am wrong, but since the heyday of Quake he has started to look more at the business side, around Quake 3 or so, so I don't think it's any surprise that he prefers things "as he knows them".

The sad thing is that he would be the perfect choice if you wanted someone to really explore the depths of Cell.

How did you come to that conclusion? It could be that, given X time/budget, he'd rather work on a system that will deliver enough whiz-bang-ooh-aah for the consumers.

Just because we have done ASM in the past doesn't mean we don't want higher-level languages today. Why do we use XML, relational databases, OOP? We want to reduce development cost, because at the end of the day it's a business meant to make money, not something done for the cool factor.

I would say that he has grown into a role that is more than just a technical junkie.
 