*Game Development Issues*

Arwin didn't add his as an official source. He's offering a different rumour he 'heard'. So far no-one's actually linked to sources for either side.

What do you mean? I did link to a source--you guys are just dismissing it.

So all are equally worthless as a basis for any argument, like all bloke-down-the-pub 'facts'.

I just did link to more information--and it was all linked at B3D before. And I cannot really "link" to people talking to me :???: I (re-)linked to the old articles. Take-Two paints the picture (a technical issue on one platform made a simultaneous release impractical... note they don't say they couldn't have released one version if they were determined to) and the general flow of information points at the PS3. I agree with the MSNBC writer when they say it is a "poorly kept secret".

And put GTAIV aside--we have developers right here sharing the technical difficulties they have had on the PS3 which have caused issues (NOT a "lack of focus and less investment"!). How many PS3 games have been delayed due to unforeseen technical issues requiring more time-intensive development than budgeted for?

The fact we have fans like wco81 implying that these developers aren't trying as hard on the PS3 is a stark contrast to what I am told directly and what we have read here a lot: Developers are spending a LOT of time working on the PS3 versions.

And that was a complaint from some developers from the beginning: the time required to fully tap the PS3 is substantial (and mainly something pursued by exclusive developers), because your typical 3rd party would need to work harder on the PS3 version just to get it on par with the 360 version, let alone fully exploit the latent power left unharnessed due to time constraints.

So is all the dev chatter about the PS3 taking more effort to get equal performance all FUD lies? If so, why doesn't someone address that head on?
 
The fact we have fans like wco81 implying that these developers aren't trying as hard on the PS3 is a stark contrast to what I am told directly and what we have read here a lot: Developers are spending a LOT of time working on the PS3 versions.

Hey I wasn't accusing people of anything.

The blog entry by Booth and a couple of developers here have said they have to work much harder just to get parity on the PS3.

They implied that the extra effort may not be justified by the economics.

I can understand if they think it's a PITA to get the same results on the PS3 that they can get much more easily on the X360.

I NEVER said they weren't trying as hard. Just that they thought it was a problem that they had to expend more time and resources on the PS3 version.
 
Why would RockStar/Take Two delay both versions if the PS3 version was behind?

Why would they do Sony that kind of favor and take the hit to their own stock price, especially when MS paid big for the exclusive content?

The phrase, "contractually obligated" comes to mind....

What we do know is that there were some technical issues. Take Two stated, "Certain elements of development proved to be more time-intensive than expected, especially given the commitment for a simultaneous release on two very different platforms" and further noted that the decision to delay came down to "almost strictly technological challenges". Take Two declined to get into specifics because they wouldn't be helpful. So we know that:

1. Parts of development (not all) were more time-intensive than anticipated

2. Notably in the context of a simultaneous release--i.e. one was taking longer than the other

3. The cause of the delay was primarily related to said technological issues

Shorthand: One of the platforms is taking a lot more time and effort to get into a shippable shape.

I don't know what information you have beyond the above, but I think you're stretching those quotes a bit to get to your shorthand version. To me that just as easily could mean that both versions aren't in a shippable state and both presented unanticipated challenges.

Common sense tells me that the 360 version was in a more advanced state, though, since AFAIK no one has even seen the PS3 version. So, the idea that the 360 version was (solely) holding up the release is ludicrous in the extreme.

I can second what Arwin has referenced from the 1UP podcasts. To them, after what they had seen, the only reason the delay was a surprise was Rockstar's track record of hitting release dates. What they saw was not in a state they would have expected if the game was going to be able to hit its original release.
 
First thing: I wish you'd say something about how you manage parallel programming on the 360, since you completely skipped those questions.

Trust me I do wish I had the time to write all these nice things.
If you can't afford to work for a few weeks on something that is going to save you months then someone took the wrong decision here. Sony makes mistakes, MS makes mistakes..and developers make mistakes too.
And it's great that your team had a few nice chaps to sit down and design that system, and lucky for you, you had 3+ years of exclusive platform development.
?? I worked exclusively on PS3 for 19 months.

in 6 months we went from "the game barely compiles on PS3" to the first E3 demo.
The important thing about X360 and I'll repeat that again is that "It Just Works".
Sorry but I strongly disagree, especially on the CPU side of things. Your junior programmer is not going to write code that just works, and if it does it will run like crap. No amount of prefetching or unrolling will turn some O(N^N) implementation into an O(log N) one :)
And that takes very little time because of the nature of the unified memory address space. [edit - and by that I mean that the multiple cores all see the same memory and can read and write to it within the same address space, the only "problem" being that you need to protect (synchronize) any shared resources]
You can do exactly the same on SPUs as well, if you want to. Just because access to external memory is not direct doesn't mean it can't be done (abstract it!)
Moreover if you plan for it since the beginning you will greatly improve performance on BOTH platforms.
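
To make the "unified memory plus synchronization" model being debated here concrete, a minimal sketch (written with std::thread/std::atomic as portable shorthand rather than any console SDK; the names are illustrative): every worker sees the same array, and the only shared state that needs protecting is the work cursor.

Code:
#include <atomic>
#include <cstddef>
#include <thread>
#include <vector>

void process(float& v) { v *= 2.0f; }              // stand-in for real per-item work

void run_parallel(std::vector<float>& items, int num_threads) {
    std::atomic<std::size_t> next(0);              // shared work cursor: the one thing to synchronize
    std::vector<std::thread> workers;
    for (int t = 0; t < num_threads; ++t) {
        workers.emplace_back([&] {
            for (;;) {
                std::size_t i = next.fetch_add(1); // atomically claim the next item
                if (i >= items.size()) break;
                process(items[i]);                 // all cores read/write the same address space directly
            }
        });
    }
    for (auto& w : workers) w.join();              // wait for everyone to finish
}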
I wanna remind you that on B3D we were discussing this stuff 3 years ago, when we knew nothing about SDKs and stuff like that, and we already knew what kind of problems we were going to meet with these new in-order multicore CPUs.
Were we fortune tellers? I don't think so.

On the contrary, on PS3, you need top engineers devoted just to get basic stuff off the ground, and you NEED to use SPUs to get decent performance out of RSX.
Emh? There are a few shipped PS3 games out there, widely praised for their graphics/3D engines, that strongly disagree with your opinion (and they don't use SPUs in exotic ways to speed up RSX).
Is it easier to extract performance from Xenos than RSX? You can bet on that!
Do you need to pull the most amazing tricks in the world to have good performance on RSX? Hell no!
And let me go back to the topic of SPU programming since I forgot to mention a few aspects of it. The SPU has a local address space, so any pointers in your data have to be translated, DMAed, etc., which is a PITA.
I didn't know that spending 10 minutes to write a stupid function to generate a DMA transfer given start and end addresses is a PITA. I guess we have different definitions/visions of what our job is.
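
For concreteness, here is roughly what such a "stupid function" looks like - a minimal sketch assuming the Cell SDK's spu_mfcio.h intrinsics (mfc_get, mfc_write_tag_mask, mfc_read_tag_status_all) and 16-byte-aligned addresses and sizes; the name and chunking policy are illustrative, not anyone's shipping code.

Code:
#include <stdint.h>
#include <spu_mfcio.h>

// Pull an arbitrary range of main memory into SPU local store by splitting it
// into <=16 KB MFC transfers, then block until the whole tag group completes.
void dma_get_range(void*    local_store,   // destination in local store (16-byte aligned)
                   uint64_t ea_start,      // effective address of first byte in main memory
                   uint64_t ea_end,        // one past the last byte to fetch
                   uint32_t tag)           // DMA tag group (0..31)
{
    uint8_t* ls = (uint8_t*)local_store;
    while (ea_start < ea_end) {
        uint32_t chunk = (uint32_t)(ea_end - ea_start);
        if (chunk > 16384) chunk = 16384;  // hardware limit per MFC transfer
        mfc_get(ls, ea_start, chunk, tag, 0, 0);
        ls       += chunk;
        ea_start += chunk;
    }
    mfc_write_tag_mask(1u << tag);         // select our tag group...
    mfc_read_tag_status_all();             // ...and stall until its transfers are done
}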
The memory is limited so you have to partition the problem.
Which is generally a good thing.
The code might not even fit initially, and you might not be able to run a debug version of the code.
Reduce your dataset in those cases, this is how I worked around this particular problem.

and now you have to find the SPU code in memory before it gets uploaded and set a break point there, very similar to VU programming back in the day.
Put a breakpoint explicitly in the code and you're done.
A "HALT" on the SPU could mean anything
Sure, if you don't know what to do/where to look..
especially if it's inside a middleware SPU job.
So what do you do when some middleware code linked with no symbols crashes on 360? Do you blame MS?
It could be their bug, or yours, someone stomping the code or data, or it could be stack overflow. You get basically zero tolerance for error. That is not what I call "It Just Works".
Again, this happens with any middleware on any platform.

Sony with all their open source/Linux fundamentals, how can they screw it up so badly that they have a worse offering than MS?
Personally I don't think Sony screwed up on the tools level, they're doing a much better job now than what they did during the PS2 era and you can't expect big companies to change overnight.
Maybe their biggest mistake was not to offer an easy path for multiplatform developers, a kind of best practices manual to make your code run well on PS3 and the 'other platforms'.
 
Sorry but I strongly disagree, especially on the CPU side of things. Your junior programmer is not going to write code that just works, and if it does it will run like crap. No amount of prefetching or unrolling will turn some O(N^N) implementation into an O(log N) one :)

Well, I fully expected you'd strongly disagree, but trust me, in a few years' time you'll change your mind, because, guess what, I used to share the same opinion that you defend so well here. Maybe I'm getting old or lazy or both, but needless to say, the days when I enjoyed wrestling with the hardware are gone. I clearly remember being so happy manually optimizing my VU asm loops to extract every last cycle. I still get a glimpse of that on SPUs, and it's great, and then I'm reminded that I have a giant pile of actual features that I need to finish, and that even if my code runs 2x faster on the SPU, nobody will ever notice. In the end some artist just needs to place a few more "boxes" in the level and my efforts are wiped out.
But I digress. I want to answer some of the questions in regards to Cell programming and best practices.
First, I absolutely agree that abstracting some of the SPU complexity is a must. Of course, one has to wrestle with it first for some time to understand what all the gotchas are. That takes time and experience. As Joker indicated, the second time around things get better, and their team are ahead of us in that regard. It might be possible to have a nice abstraction written in 3 weeks, but in my experience, it will be 6 months before it's fully ironed out. After all nAo's team only implemented it on one platform. A truly well developed abstraction proves itself when you test it against a second platform.
Second, on the topic of junior programmers - yes, they could write O(N*N) algorithms and no optimization can fix this, but they can also easily take advantage of STL/Boost or similar high-level abstractions and get O(log N) with no sweat (a small sketch of the difference follows at the end of this post). Tell me how one would leverage 10+ years of experience and C++ template power on an SPU? I really dislike it when platform manufacturers feel they should make me program like back in the stone age of software engineering.
Third, IBM - for whatever reason, Sony discourages using whatever tools are coming out of IBM. Either they don't trust them or they prefer to stick to their own solutions. I clearly remember a discussion a few years ago about IBM's XL compiler and their auto-parallelizing, auto-vectorizing efforts, and Sony basically laughed at them, saying true programmers would, of course, like to do it manually, for _maximum_ performance. In retrospect IBM might have been right. What is better: people not using SPUs because programming them is too complicated, or using all of them all the time at 50% efficiency?
Looking at Intel's Larrabee effort, it seems Intel agrees that if you throw 64 cores at programmers, they will definitely screw it up, so some of the complexity needs to be taken away, even at the cost of performance loss (from the point of view of one of these cores).
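
To make that second point concrete, a minimal sketch (generic C++, nothing SPU-specific): the same membership test written as a naive linear scan, and again using an off-the-shelf STL algorithm that gets O(log N) per query over sorted data.

Code:
#include <algorithm>
#include <vector>

// Naive version: O(N) per query, O(N*N) across N queries.
bool contains_naive(const std::vector<int>& ids, int id) {
    for (int v : ids)
        if (v == id) return true;
    return false;
}

// STL version: keep the data sorted and let std::binary_search do O(log N) per query.
bool contains_fast(const std::vector<int>& sorted_ids, int id) {
    return std::binary_search(sorted_ids.begin(), sorted_ids.end(), id);
}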
 
?? I worked exclusively on PS3 for 19 months. in 6 months we went from "the game barely compiles on PS3" to the first E3 demo.

I wanted to double check this before I questioned it, but I did see Heavenly Sword at E3 2005 in a playable demo. The game shipped September 2007. If I can still count correctly that's about 2 years and 4 months, plus 6 months to the demo, so 34 months development time then?
 
MSNBC is a professional news site; they are always careful to give the warning, "We are an MSFT Affiliate," but that doesn't prevent them from praising the competition--or running op-eds saying Halo 3 is overhyped, or reporting on the RRoD.
... or providing rumours and FUD.

Microsoft's business practices are not always that rosy; just ask Netscape.

Actually, I don't think either of those rumours tells the complete truth. I think the game simply wasn't ready; such a long delay close to the release date indicates there was more than just some polish missing.

Perhaps some focus-group tests did not work out as planned and forced them to make some changes to the gameplay or the storyline.

BTW the PS3 version of GTAIV is not a port, Rockstar has a developer team for each version.
 
Third, IBM - for whatever reason, Sony discourages using whatever tools are coming out of IBM. Either they don't trust them or they prefer to stick to their own solutions. I clearly remember a discussion a few years ago about IBM's XL compiler and their auto-parallelizing, auto-vectorizing efforts, and Sony basically laughed at them, saying true programmers would, of course, like to do it manually, for _maximum_ performance. In retrospect IBM might have been right. What is better: people not using SPUs because programming them is too complicated, or using all of them all the time at 50% efficiency?
IMHO IBM and Sony have different goals and people like this guy think so too.
http://www.next-gen.biz/index.php?option=com_content&task=view&id=7637&Itemid=2&limit=1&limitstart=1
According to Andrew Richards (pictured), CEO of Scottish programming tools company Codeplay, the problem is that games work differently to scientific work.

“We’ve looked at various parallel programming languages such as OpenMP, which is an industry-standard way of doing parallelism, but it relies on data being in arrays rather than in data structures,” he says. Incidentally, OpenMP is the basis of IBM’s own research into compilers for getting the most out of Cell. “This assumption is fair enough for scientific applications but isn’t true of games,” Richards continues. “A lot of game data is in complex data structures because game enemies don’t line up in a long line. They move around, so you need a data structure that can handle issues such as which enemies are visible at any time.”

Equally, when it comes to games, the type of processing carried out on a CPU is varied, typically involving tasks such as artificial intelligence, audio and physics calculations, as well as setting up the rendering to be sent over to the graphics card. In addition, many developers have existing engines that run to hundreds of thousands of lines of code. Retrospectively getting these into shape to get the most out of the Cell part of the equation just adds further complexity to the task.

That’s why, after originally setting up shop to offer its VectorC compiler for PlayStation 2, Andrew Richards and his team decided to take a different approach with PlayStation 3. Instead of working on compilers, which have to be written directly for each piece of hardware to automatically make your code run well, Codeplay’s solution is its general-purpose Sieve C++ Parallel Programming System, which acts as a frontend to official compilers provided by hardware companies.

“In the past we took the view that we would take your existing program and automatically parallelize it across various processors, but with PlayStation 3 we’ve come to the conclusion that what you need to do is rewrite parts of your code,” Richards says. “The trick is that we can ensure it’s a small change and one that our tools will help you make.”
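
To illustrate the array-versus-data-structure point with a minimal sketch: this is the flat, independent-iteration style of loop OpenMP parallelizes trivially; walking a scene graph or a linked list of enemies, where each step depends on pointers discovered along the way, doesn't map onto this model nearly as cleanly.

Code:
#include <vector>
// Requires an OpenMP-enabled compiler (e.g. -fopenmp); the pragma is ignored otherwise.

// Array-style work: each iteration is independent, so OpenMP can split the
// index range across cores with a single directive.
void scale_positions(std::vector<float>& x, float s) {
    #pragma omp parallel for
    for (int i = 0; i < (int)x.size(); ++i)
        x[i] *= s;
}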
 
STL and Boost seek to be algorithmically efficient, but they do not target efficient use of specific hardware, in the sense of being cache/LS aware or being excellent (or even safe) players in a multi-threaded game. Some STL containers allow for the provision of custom allocators, but not all of them (it depends on conformance etc.), and for those which do, this only addresses a few issues with STL. Someone must write these allocators, and if there is shared access to an STL container there are synchronization issues that the programmer must deal with alone. Without explicit intervention STL containers will use dynamic memory, which can lead to memory fragmentation. Debugging STL code isn't always a walk in the park either.

As much as these libraries are potential productivity boosters in some cases, in others they are a source of serious issues for a game which a junior programmer should not be left to deal with.
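
As a rough illustration of the custom-allocator point (a minimal sketch, not any shipping engine's code): an STL container can be handed an allocator that carves memory out of a fixed arena, which sidesteps heap fragmentation, but note that someone still has to write it, and it does nothing about shared-access synchronization.

Code:
#include <cstddef>
#include <new>
#include <vector>

// Fixed-size arena: hands out memory from one preallocated block, no frees
// until reset(), so containers using it never touch the general-purpose heap.
class Arena {
public:
    Arena(void* buffer, std::size_t size)
        : base_(static_cast<char*>(buffer)), size_(size), used_(0) {}
    void* allocate(std::size_t bytes, std::size_t align) {
        std::size_t p = (used_ + align - 1) & ~(align - 1);   // align must be a power of two
        if (p + bytes > size_) throw std::bad_alloc();
        used_ = p + bytes;
        return base_ + p;
    }
    void reset() { used_ = 0; }       // "free" everything at once
private:
    char*       base_;
    std::size_t size_;
    std::size_t used_;
};

// Minimal C++11 allocator adapter so STL containers can draw from the arena.
template <typename T>
struct ArenaAllocator {
    using value_type = T;
    explicit ArenaAllocator(Arena* a) : arena(a) {}
    template <typename U>
    ArenaAllocator(const ArenaAllocator<U>& other) : arena(other.arena) {}
    T* allocate(std::size_t n) {
        return static_cast<T*>(arena->allocate(n * sizeof(T), alignof(T)));
    }
    void deallocate(T*, std::size_t) {}   // no per-object frees; the arena is reset in bulk
    Arena* arena;
};
template <typename A, typename B>
bool operator==(const ArenaAllocator<A>& x, const ArenaAllocator<B>& y) { return x.arena == y.arena; }
template <typename A, typename B>
bool operator!=(const ArenaAllocator<A>& x, const ArenaAllocator<B>& y) { return !(x == y); }

int main() {
    static char buffer[64 * 1024];
    Arena arena(buffer, sizeof(buffer));
    std::vector<int, ArenaAllocator<int>> v((ArenaAllocator<int>(&arena)));
    for (int i = 0; i < 1000; ++i) v.push_back(i);   // all growth comes out of the arena
    return 0;
}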
 
Very long discussion about the difficulty of bringing a new game to the PS3....

Usually, with experience, the library of tricks for a system grows bigger. It should be better in the next titles - or maybe not, because of the symmetric architecture...

Usually, games are praised on the artistic level over the specification. It's a shame that two machines share the same games and are dependent in a way...

Who's pulling the plug?
 
What do you mean? I did link to a source--you guys are just dismissing it.
Let's change your bolding...
msnbc said:
Although analysts had predicted that "GTA 4," “Halo 3” and “Madden 08” would account for a third of all game sales this holiday, Take-Two Interactive, the game’s publisher, acknowledged in August that the title just wasn’t ready for prime time. Or more specifically, the PlayStation 3 version of the game wasn’t ready for prime time. Take-Two won’t comment on that part, but it’s about as big a secret as Joan Rivers’ plastic surgery.
They tell us factually that GTA4 is delayed, and then add the rumour that it's because of PS3 troubles. The reason for the delay is officially unknown, and the PS3 isn't the only possibility. e.g. perhaps they were having trouble getting the HDD-less version of GTA4 to run up to snuff? Add to that difficulties in optimizing for PS3, and the reasons for the delay could be manifold.


So is all the dev chatter about the PS3 taking more effort to get equal performance all FUD lies?
No. Clearly not. I know some people like to think otherwise, but the truth is when you get people saying 'we find this hard', it's because they find it hard! But if you're going to present an example to back up your argument, it is important to have a credible statement from a trusted source. The MSN site saying 'here is what Take Two said about PS3 troubles' and quoting them would be fine - they're not going to make up a quote - but the MSN site saying 'the game's delayed and we're saying the PS3 is the reason' isn't, as there's no validation at all, and that idea might be no more credible than any other.
 
I wanted to double check this before I questioned it, but I did see Heavenly Sword at E3 2005 in a playable demo. The game shipped September 2007. If I can still count correctly that's about 2 years and 4 months, plus 6 months to the demo, so 34 months development time then?

I think nAo is referring to the time he spent on the project. I.e. presumably his part of the job was done after 19 months working on the game.
 
Let's change your bolding...
They tell us factually that GTA4 is delayed, and then add the rumour that it's because of PS3 troubles. The reason for the delay is officially unknown, and the PS3 isn't the only possibility. e.g. perhaps they were having trouble getting the HDD-less version of GTA4 to run up to snuff? Add to that difficulties in optimizing for PS3, and the reasons for the delay could be manifold.


No. Clearly not. I know some people like to think otherwise, but the truth is when you get people saying 'we find this hard', it's because they find it hard! But if you're going to present an example to back up your argument, it is important to have a credible statement from a trusted source. The MSN site saying 'here is what Take Two said about PS3 troubles' and quoting them would be fine - they're not going to make up a quote - but the MSN site saying 'the game's delayed and we're saying the PS3 is the reason' isn't, as there's no validation at all, and that idea might be no more credible than any other.

Michael Pachter has said the PS3 version was the issue here as well.

http://www.gamedaily.com/articles/features/pachter-you-can-blame-the-ps3-for-the-gta-iv-delay/70754/


Take-Two wasn't prepared to give the "real" reason for the delay to GTA IV yesterday, but Wedbush Morgan's Michael Pachter is convinced the title was pushed back due to contractual obligations to Sony. He also thinks the new management really screwed up on this one...

Everybody has their own theory, but more than a few fingers have been pointed at the PS3 version of the game as the culprit.
 
Michael Pachter has said the PS3 version was the issue here as well.

http://www.gamedaily.com/articles/features/pachter-you-can-blame-the-ps3-for-the-gta-iv-delay/70754/

Everybody has their own theory, but more than a few fingers have been pointed at the PS3 version of the game as the culprit.

The special thing about Michael Pachter though is that he doesn't have any backup for his claim either. Of course he's always right so ... wait.

It's not a good idea to have this discussion here. There has been speculation, but no evidence, and those that have actually seen the 360 version didn't believe it would make it. Irrespective of that, I simply refuse to believe that the difference between the PS3 and the 360 version was that large. If it was, then they could have seen a delay coming miles away and/or made adjustments to the PS3's development team.

Don't we have quotes by the way somewhere that both the 360 and the PS3 team have received support in the form of actual coders from Microsoft and Sony respectively?
 
It's easy to point the finger at the PS3, but R* hasn't said anything, so there is NO given reason for the delay... my guess is both versions were behind schedule, as it was supposed to be released last month and barely any gameplay footage has been shown on the 360 either (has any?)... I've only seen trailers etc.
 
There is another opinion on the above matter, by N'gai Croal

http://blog.newsweek.com/blogs/leve...ation-on-the-factors-behind-gta-iv-delay.aspx

And the question is - who is more credible - Pachter or Croal?

Neither are worth much.

In the end, I think we could all agree it's most likely that it was the PS3 which would have missed the release date, but we can never be sure. Certainly the most likely scenario though, given PS3 delays we've seen throughout the industry.

In my opinion, given Rockstar's previous history, I doubt either version would've been ready for a Nov launch.

But given the widespread problems developers are having, and the fact that the 360 version was the one used for press demos, the PS3 version was probably much further behind.
 
It's my belief that the culprit was Rockstar being overly ambitious with its title.

GTA has never been a visually pleasing title, usually forgoing the visuals and offering tons of content with a great storyline added in to produce a great game.

However, with the trailers it seems Rockstar wanted to offer all the great content and huge locale without sacrificing visuals at all.

Simply, I think Rockstar bit off more than it could chew and thus could not meet the release date.
 
Mod Helmet of Divinity +100%
Alright folks... just a thread title change to be something more general.
It is nice to have some developers stop by with some insight, so let's keep the discussion civil, please, or else the thread will be locked... again.

:)
 