John Carmack on PS3 Video

ERP said:
Sure, but how many embarrassingly parallel tasks are there in your average game?
And how many can you add that actually add value to the game?

Here's my current thought on this, having not gone through an entire dev cycle, so I'm likely to change my mind. I am pretty positive the limiting factor will be the speed of the PPU; we'll offload all the easy stuff, and probably add stuff that can be easily offloaded. But I don't believe that at their core most game engines will be particularly parallel.

I think there is a real desire to scale gameplay. I want to put 100 people in the streets of a city to populate it instead of the 20 or so in GTA. I don't want them to disappear when I turn around. I'd like them to exhibit reasonable behavior in response to what's happening, and I'd rather they didn't run into each other all the time and look stupid. All this stuff will likely increase the load on the main game thread.

Most of the game engines I've seen don't even attempt to batch physics queries, or deal with the fact that they might be asynchronous. It's very much: do a raycast here, change state in response to it.
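To make the contrast concrete, here is a rough sketch of what a batched, asynchronous query interface could look like -- purely illustrative names and types, not taken from any shipping engine:

```cpp
// Minimal sketch of batching raycasts instead of issuing them inline.
// Hypothetical types and names -- not from any real engine.
#include <vector>

struct Vec3      { float x, y, z; };
struct RayQuery  { int actorId; Vec3 from, to; };
struct RayResult { int actorId; bool hit; };

class PhysicsQueue {
public:
    // Game code submits queries as it runs; nothing blocks here.
    void Submit(const RayQuery& q) { pending_.push_back(q); }

    // Later (possibly on another core or an SPE), resolve the whole batch at once.
    std::vector<RayResult> Flush() {
        std::vector<RayResult> results;
        results.reserve(pending_.size());
        for (const RayQuery& q : pending_)
            results.push_back({ q.actorId, CastRay(q) });
        pending_.clear();
        return results;
    }

private:
    // Stand-in for the real narrow-phase test against collision geometry.
    static bool CastRay(const RayQuery&) { return false; }

    std::vector<RayQuery> pending_;
};
```

The point isn't the data structures; it's that game code stops assuming the answer comes back on the very next line.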

Parallelism isn't trivial in general, and it's easy to actually reduce performance if you're not careful. Even something as seemingly trivial as creating an object asynchronously has subtle gotchas that will screw you if you're not careful. And unfortunately a lot of things appear to work until they don't.
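One concrete example of that kind of gotcha (my own illustration, not ERP's): spawning objects into a shared list from more than one thread.

```cpp
// Illustration of an "appears to work until it doesn't" gotcha.
#include <mutex>
#include <vector>

struct Entity { int id; };

std::vector<Entity*> g_entities;
std::mutex           g_entitiesLock;

// Spawn an entity from any thread. Without the lock below, concurrent
// push_back calls can corrupt the vector -- and usually won't during
// testing, which is exactly why this class of bug hides for months.
Entity* SpawnAsync(int id) {
    Entity* e = new Entity{ id };
    std::lock_guard<std::mutex> lock(g_entitiesLock);
    g_entities.push_back(e);
    return e;
}
```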

IMO It's going to be a long slow transition to effective parallelism in game engines.
Yes, that's why the Pentium 4 was scheduled to hit 10 GHz some time ago, and the preferred (and original) goal of the Xbox360 was to have a single 10 GHz processor.

Alas, nobody is able to produce those. So we all have to make do with the second best, like three processors at 3.33 GHz. And we all have to learn to live with that, and adapt. Or become obsolete.
 
DiGuru said:
Simple breakdown: John Carmack isn't happy about changing his habits. While there is much to gain by doing it the PS3 way, it's a lot of hard work. And he was doing very well doing what he did, thank you very much. And the Xbox360 is very much how he is used to things, Visual Studio and all the common classes and objects.

Cynical? Yes. True? Yes. The higher people are elevated, the less they like having to start over and prove themselves once again. When you're considered a demi-god, you're not eager to change territory and run the risk of getting your ass kicked by the inhabitants.

I don't even think that's it. I'm sure his next engine will be just as good as or better than anything most of the PS2 devs are going to come up with.

BTW I'm talking technically, not gameplay or artistically.
 
DiGuru said:
Cynical? Yes. True? Yes.

Cynical? Yes.

True? In your judgemental opinion, which really adds nothing to this thread.

Should we apply the same argumentum ad hominem to the other developers who have voiced similar concerns? Starbreeze made similar comments about Cell and the SPEs last week. Gabe Newell had the same complaints last year. There are quite a few examples, actually.

While I think the answer is pretty obvious, I do wonder why certain posters make a concerted effort to insult Carmack instead of engaging his arguments. If his points are incorrect, then we can dismiss his concerns. It's pretty simple. I see no benefit in ignoring his reasons for his position (which he has outlined before) and engaging in psycho-babble mind reading about what he is really thinking. How does that benefit a technical forum at all?

His concerns, echoed by other developers, seem to have more substance than you are giving them credit for. We know that Cell has some advantages that current CPU architectures really cannot keep pace with; but the question is whether, for *these* developers, it solves the problems they are having in game development, and whether the payoff is worth the extra development time (and thus money).

There are a lot of questions his comments raise about CELL and Xenon, and multi-core in general, as we move forward. I fail to see how a long-established developer who has impacted the industry through his games and technical designs deserves such digs.

Even if you don't agree, you could at least give reasons why you think he is wrong, or at least explain how, in your development experience, you see the problem/solution equation differently.

PS: DiGuru, what company do you work for? Are you working on a PS3 product? Do you mind associating your comments with your name?
 
pegisys said:
I don't even think that's it. I'm sure his next engine will be just as good as or better than anything most of the PS2 devs are going to come up with.

BTW I'm talking technically, not gameplay or artistically.

That is a given industry-wide, not just in regards to PS2 devs. Considering that, of the thousands of games released every year, only a couple dozen are any good puts it into perspective. I personally cannot stand his games (the games themselves), but not many developers have as long a history as id Software of making quality game engines.
 
DemoCoder said:
Ease of use is a factor if you want massive adoption at all levels. For a console, I may not want shovelware developers. I may want top-tier developers who can wrench maximal performance from my system and come up with innovative designs. I want my top-tier developers to have ease of use and good tools, but one shouldn't design a system where the hands of those developers will be tied.

Is a top-tier developer one who can obtain maximal performance from a system, or one who makes profitable games?

Programming proficiency doesn't directly determine how playable a game is or how popular it will be, and I don't like the idea that if you can't throw a lot of resources at a game it somehow doesn't deserve to get made.

The most successful series of games this generation, the Grand Theft Auto games, used Renderware and was multiplatform. I can think of a great many games that impressed me a great deal more with how they looked and moved.
 
OK, I'll try to be a bit more nuanced.

First, I'm just an ordinary programmer that does things like databases and robots.

Let's start with another example. If I buy a new car, I would want it to have massive acceleration. 0 to 60 MPH in less than a second, like being launched in a rocket! Because, not only would I get a kick out of that, but I would never have to worry about having enough power.

But, not only is it impossible to mass-produce such cars for a decent price, but I would pay an arm and a leg on fuel.

So, I have to buy a car that requires a lot of skill to make it perform in the same league as that supercar. Or I could just not bother and accept that it will perform much worse when I drive it like I'm used to.

That's about the gist of it. Things changed, they're not evolving as we would have liked them to, and so we have to change our habits. Like, coming up with a better model than the single game loop.

Because that's how games work(ed). You have a very large, single loop that does all the things that have to be done, serially. It's easy, it works, and it's how everyone does and did things, from the start. Like, we always had a single, serial processor to run that.
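Schematically, that loop is something like this (a sketch of the idea, not any particular engine's code; the stages are empty stubs):

```cpp
// The classic single serial game loop, in schematic form.
bool  QuitRequested()        { return true; }    // stubs so the sketch compiles
float TickClock()            { return 0.016f; }  // ~60 Hz timestep
void  ReadInput()            {}
void  UpdateAI(float)        {}
void  UpdatePhysics(float)   {}
void  UpdateGameLogic(float) {}
void  Render()               {}

void RunGame() {
    while (!QuitRequested()) {
        float dt = TickClock();   // one big loop, every stage run in order
        ReadInput();
        UpdateAI(dt);
        UpdatePhysics(dt);
        UpdateGameLogic(dt);
        Render();                 // the frame only goes out after everything above finishes
    }
}
```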



It's like a factory that is used to sending out a single truck each day to deliver their products. But they are able to produce much more. And they get the choice between sending out three trucks each day, or using a much more efficient, but completely different delivery model. And the logistics department isn't happy. Because they have no idea how they should split their deliveries into three different ones. Their administration and software cannot handle it. And it's scientifically a "hard" problem to calculate the optimal loads and routes! They don't know how to do it, and doubt that it ever can be done. But, at least it is more or less what they're used to.

But that other model sucks! Let's say that it uses pneumatic pipes for delivery. But you have to insert a package of a specific size every fifteen minutes, instead of just loading everything on a truck once a day in one go. The logistics department will go nuts! They'll tell everyone that it's impossible. It cannot be done!

But then, some people from engineering give a presentation, showing that they can deliver double the amount each day using the new pipe system, and twice as fast! Great! So the company is able to produce twice the amount of products! Great!

And while some of the younger people are happily working on and dreaming of such a system, the senior logistics expert is very unhappy, and goes on television to tell everyone that it is a very bad system and that the old ways are best. And that the new system with multiple trucks sucks less than the new pipe system. THAT isn't going to work well whatsoever. And it is a dead end. The only good way to do it is by building mega trucks, doubling the width of all roads and inventing a lot of new stuff needed to make all that possible.

While the truck builders have given up on bigger trucks and are investing lots of money in pipe transportation systems. And most other people are happily designing cool new systems that use all the new possibilities offered by the very cool new pipe system. And others are happily implementing and selling new systems to make better use of the multiple truck systems, for the time being.



Like they say: go with the times, or become obsolete. It's your choice. And a sure sign of getting older. ;)
 
For some odd reason, I like Tim Sweeney a lot more than John Carmack nowadays. For one, Sweeney isn't complaining about difficult things and on the whole seems to be a lot more eager to try new things and learn new ways of doing them. Also, the UE3.0 engine is widely used today, whereas I think you can count the number of games using the Doom 3 base on one hand...
 
Btw, I did come up with an alternative model in another thread here. ERP even said that that's what he is using at the moment. So I guess I'm in the creative camp, that likes the challenge. Even while I'm not writing (console) games. (And I did respond to and discuss all John's points on this forum, Acert! They're not new.)
 
I haven't heard much about Doom 3's toolset, whereas the main value of UE3.0 is its tools. The engine part of things is not really top-notch other than to say it's pretty suitable for everything. But the engine part is not what's difficult for any company -- it's making good, powerful tools that's hard, and that's the main reason why UE3 dominates.

I'll say one thing, though... there is no game made on any 3rd-party engine whose team didn't have to rip the darn codebase apart. If not for tools, I don't think major developers would license engines very often at all.

Carmack is a guy who's focused on tight underlying pipelines. As much as there are some nice tools and technologies coming out of there every so often, the implementation is still his big concern. Megatextures, for instance, probably wouldn't have come out of iD if there wasn't a nice efficient way to implement them. As a programmer on his own, I'm sure he would enjoy the challenge of PS3, but when he makes statements like this, he's speaking as Tech Dir. of iD Software, not as l33t hax0r JC. In general, I tend to hear a lot of talk about how fun the SPEs are, but that's not what gets a product out the door.
 
Like they say: go with the times, or become obsolete. It's your choice. And a sure sign of getting older. ;)

I don't think his problem, or his comments, are about not being with the times or getting older. I think his concerns, right or wrong, are based more in the problems of development and the "solutions" available on the PC, 360, and PS3.

1. Carmack has stated his belief that parallelism is the future. So he is onboard here. He actually has more experience here than most PC or Xbox developers.

2. He thought the consoles' trade-off, more cores that are individually less efficient, was a poor one, and that they should have waited until the following generation if that was the only option (i.e. yes, but not now). If he can influence these companies with his comments, he will. It could also be marketing: he chose the lesser of two evils for his particular goals and is now voicing the reasons for that decision. This in turn lets consumers know where he stands (conjecture on my part, of course).

3. He believes asymmetric designs, at least the one in CELL, are "less than optimal". Hence he has chosen the PC/360.

What sticks out to me in his comments is that he (a) says the PS3 is not bad, which indicates these are his honest opinions and (b) this is not about becoming obsolete, but what he thinks is the best solution--and then supporting that.

To be clear, his criticism is not of multi-core processors. His criticism is of the tradeoffs in CELL (and Xenon). If we think back, Carmack has worked with multi-CPU systems before, and he has been pretty progressive in terms of design, at least on the renderer side. If "go with the times" means just jumping on whatever hardware Sony offers, then that is one thing. But Carmack is pretty influential compared to most developers. He also does not need to "get with the times" (=Sony) if there are other options. I am not sure we can consider him antiquated because he thinks Sony made a poor design decision. Yes, his concerns won't change their choice. But then again he doesn't need to support their hardware either.

Obviously if the PS3 was the only platform on the market "getting with the times" would be a fair criticism. But he is a programmer and business owner (something most developers are not). He has gotten with the times--it just is not Sony. And he is saying why.

Now why does he choose to keep commenting? Obviously people hold him in high regard. Doom and Quake will do that considering their impact on the industry. He is known to be very smart. He probably is also answering these questions to cut through the "bull" as he did last year where he complained about Sony AND MS; but I don't doubt the possibility that some of his comments now are aimed at saying, "I chose MS, here is why, now buy MS's console so you can play my game".

So there are various motivations here; I just don't see being old or "not getting with the times" as being among them. And his concerns, as you say, are not new. But that doesn't mean they are going away or are not worth discussing.

Last year's drubbing of Xenon and Cell was pretty fair, I thought, in that we had been slammed with a lot of PR and buzz, and it gave another perspective on it. He has done that again, just as others (like Factor 5) have taken the OTHER position (i.e. pro Cell).

Personally, I can see how Factor 5 working on CELL is great for their talents and games, and how id sticking with the PC and 360 works best for their focus. They both are meeting new challenges, but more in line with what they are known for.

If you are good at making trucks, you don't want to make sedans. New technologies, engines, tires, and so forth require new designs and ways of doing things, but expecting these companies to completely abandon their fanbase wouldn't be a good idea. It kind of goes back to what Newell said of both MS and Sony: they are creating solutions for problems he does not have. Newell went as far as saying the 360 does not solve a SINGLE problem he has as a developer and just creates more.

And THAT is where I think the big devs are upset. They see games going north and need more silicon dedicated to that; yet chip makers are taking their designs south.

There is a tension there. Of course there are compromises on both sides, but right now Carmack is saying he does not like Sony's compromises as much as MS's. I personally see the issue as pretty complex and industry-wide; it shows how many forces are pulling the industry in different directions. With the PC, 360, and PS3 we also have three very different solutions to the same problems.

Pretty exciting IMO.

DiGuru said:
(And I did respond to and discuss all John's points on this forum, Acert! They're not new.)

This topic has been closed 2x because of the nature of his comments (PS3 v 360) and because of the comments directed at him instead of at what he said. I am not questioning your contribution to the forums (which is far superior to mine!), only your comments in the last post, which were directed more toward him than toward what he was saying. That's what got this thread deleted the last times. Implying he is old, lazy (others, not you, did this), rigid, not willing to get with the times, etc. really doesn't have much to do with his points and is just conjecture, and to my knowledge few of us know him. Put another way, I would feel the same way if someone said the same things about ERP or DeanoC.

Nothing personal :smile: I just think what he is saying has some applicability to the industry as he is one of the biggest PC developers in the world. I don't think what he is saying is universally true, but I also don't think it is an issue of being out of the loop. He is moving forward, he is just being critical of Sony, just as he was of MS last year.

Pretty ballsy if you ask me! It keeps things in perspective. But I don't see it as whining, more as using his influence to let his opinion be known (it's not like he is refusing to make next-gen games and has folded up shop, or that he refuses to use multiple cores). Maybe next time Sony will ask him what he wants ;)
 
ShootMyMonkey said:
As a programmer on his own, I'm sure he would enjoy the challenge of PS3, but when he makes statements like this, he's speaking as Tech Dir. of iD Software, not as l33t hax0r JC. In general, I tend to hear a lot of talk about how fun the SPEs are, but that's not what gets a product out the door.

Well said, ShootMyMonkey. In many ways he is speaking from the position of the small independent developer with smaller teams. I know Newell's concern with the PS3 was similar when he stated that he was worried some junior engineer could ruin the code and that just adding people was not a solution.
 
DiGuru said:
Simple breakdown: John Carmack isn't happy about changing his habits. While there is much to gain by doing it the PS3 way, it's a lot of hard work. And he was doing very well doing what he did, thank you very much. And the Xbox360 is very much how he is used to things, Visual Studio and all the common classes and objects.

Cynical? Yes. True? Yes. The higher people are elevated, the less they like having to start over and prove themselves once again. When you're considered a demi-god, you're not eager to change territory and run the risk of getting your ass kicked by the inhabitants.

So JC doing the rendering engine on cell-phone games isn't proof enough that he's not lazy or "accommodated" to the x86? You're oversimplifying things.

And reports from "PS2" devs saying PS3 is not so bad have to be taken in perspective too. If their company has exclusive agreements with Sony (insert other devs and MS/Nintendo) they only have to learn about coding for the PS3 so of course they'll say it's not so bad. I'm sure that if you ask Bungie they'll say the xbox 360 rocks their world.

Given where JC comes from and where he wants to go, i.e. releasing the next game on PC/PS3/Xbox360, his comments are perfectly valid and, IMO, far more interesting than those of any "x exclusive" developer out there.
 
ERP said:
I think there is a real desire to scale gameplay. I want to put 100 people in the streets of a city to populate it instead of the 20 or so in GTA. I don't want them to disappear when I turn around. I'd like them to exhibit reasonable behavior in response to what's happening, and I'd rather they didn't run into each other all the time and look stupid. All this stuff will likely increase the load on the main game thread.

Yeah, but why is that necessarily a serial problem? It seems a lot of core game logic these days ends up being done in an embedded scripting language sitting on top of some native primitives like path finding and visibility checks. If they are using a scripting language to run the core behaviors of the NPCs, it can't be that intensive; otherwise some enterprising individual suffering from CPU-bound issues would implement a native compiler for their core scripting language, or they'd end up just using a high-level C/C++ library.

Seems to me, looking at old Quake and Unreal script code, that much of game logic ends up being a simple rules engine. This should be highly parallelizable for each actor, and you don't need to worry about actors being out of sync with world state, IMHO, because only *cheating* AI NPCs have perfectly up-to-date knowledge of the entire state of the world. Real people make mistakes by forming plans based on semi-accurate, semi-stale information. In fact, you don't even need to simulate all actors at the same tick rate or give them all access to the latest entire state of the game world. It would be acceptable, IMHO, to let some NPCs work on stale data as a local copy, which won't have contention issues.

I'm leaving out physics. But take away the I/O handling code, the render loop, and the physics simulation: what's left that is inherently serial and can't scale?
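To sketch what that "stale local copy" approach might look like (my own illustration with made-up names, not DemoCoder's code): each NPC plans against a read-only snapshot of the world taken at the start of the frame, so per-actor updates can run in parallel without locks.

```cpp
#include <cstddef>
#include <thread>
#include <vector>

// Read-only copy of whatever world state the AI is allowed to see,
// taken at the start of the frame (contents are placeholders).
struct WorldSnapshot { /* positions, recent events, nav data... */ };

struct Npc {
    void Think(const WorldSnapshot& world) {
        // Plan against possibly slightly stale data; per the argument above,
        // only "cheating" AI needs perfectly current knowledge.
        (void)world;
    }
};

// Split the actor list across worker threads. Each NPC only reads the
// shared snapshot and writes its own state, so no locks are needed.
void UpdateNpcsParallel(std::vector<Npc>& npcs,
                        const WorldSnapshot& snapshot,
                        unsigned workers) {
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&npcs, &snapshot, w, workers] {
            for (std::size_t i = w; i < npcs.size(); i += workers)
                npcs[i].Think(snapshot);
        });
    }
    for (std::thread& t : pool) t.join();
}
```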
 
Thanks for the long reply, Acert. I stand corrected. And I agree.

But nonetheless, you see the developers divided into two camps on this issue, and John is one of the most respected and vocal ones. And while I shouldn't have made it personal, I do have more respect for the camp that is actively trying to come up with great ways to make their games run as much in parallel as possible.

But I agree, that from a business point of view, his comments are very realistic. It takes a lot of time to build a great engine from scratch, and if you don't need to do so, you save lots of time and money.

But even Intel and AMD are going the Cell way, although they want to keep the basic instruction set the same for all the cores, and to add or remove certain units from most of them. While that makes it easier for developers to get up to speed, I'm not sure a clean design that focuses primarily on the strong points isn't much better, once that initial hurdle is cleared.

I agree with DemoCoder that you can use models that make most of the common tasks "embarrassingly parallel". But that does require a completely different way of looking at and thinking about it. Like, doing as much work each frame as possible, instead of just running the loop and seeing how long that takes. Or a frame/task manager that batches small tasks and can spawn them into streams if needed. And a completely different object model. Which are all things that definitely have to happen to make good use of multiple cores. If not now, then next generation.
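For what it's worth, that frame/task manager could start out as simple as the following (my own sketch with assumed names; a real one would also handle priorities, dependencies and spawning streams of work):

```cpp
#include <cstddef>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

// Tasks are queued up during the frame, then the whole batch is drained
// across worker threads in one go.
class FrameTaskManager {
public:
    void Submit(std::function<void()> task) {
        std::lock_guard<std::mutex> lock(mutex_);
        tasks_.push_back(std::move(task));
    }

    // Called once per frame: run everything queued so far on N workers.
    void RunFrame(unsigned workers) {
        std::vector<std::function<void()>> batch;
        {
            std::lock_guard<std::mutex> lock(mutex_);
            batch.swap(tasks_);
        }
        std::vector<std::thread> pool;
        for (unsigned w = 0; w < workers; ++w) {
            pool.emplace_back([&batch, w, workers] {
                for (std::size_t i = w; i < batch.size(); i += workers)
                    batch[i]();
            });
        }
        for (std::thread& t : pool) t.join();
    }

private:
    std::mutex mutex_;
    std::vector<std::function<void()>> tasks_;
};
```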

Let's wait and see.
 
The product life cycle for each console could have a lot to do with the product we see today.

If Sony plans to keep the PS3 on the market for 8-10 years, then you could likely see more and more of the Cell coming into play. If the SPE CPU model does take off, it'll still be a while before it becomes mainstream, so during the sunset period of the product cycle you could still see benefits from the tech.

If MS is only going for a 4-year product cycle, then they're better off introducing technology that devs are more comfortable with immediately and that will be mainstream for the next few years. Thus Xenon and Xenos, seeing that unified architecture will be mainstream during this time.
 
RobertR1 said:
The product life cycle for each console could have a lot to do with the product we see today.

If Sony plans to keep the PS3 on the market for 8-10 years, then you could likely see more and more of the Cell coming into play. If the SPE CPU model does take off, it'll still be a while before it becomes mainstream, so during the sunset period of the product cycle you could still see benefits from the tech.

If MS is only going for a 4-year product cycle, then they're better off introducing technology that devs are more comfortable with immediately and that will be mainstream for the next few years. Thus Xenon and Xenos, seeing that unified architecture will be mainstream during this time.
Yes, good point. But that does mean that most of the extra power those consoles are capable of is going to waste.
 
DiGuru said:
Yes, good point. But that does mean that most of the extra power those consoles are capable of is going to waste.

Likely. The prime example of this is the visible improvement of titles over time. If nothing were going to waste, the launch titles would look as good as the end-of-cycle titles, which has never been the case.

Citing your car example, the whole thing is a compromise. You have to spend the right amount of time on engine, chassis, suspension, tyres, electronics, and aero development, then find the perfect balance. Even if you have an 800 hp engine, with no downforce you won't really be using the engine to its maximum to achieve the fastest lap time. So you start improving your aero, which will reveal weaknesses in other areas. Before you know it, it's time to develop a whole new car and you're left with "what ifs"; but during that time, did your car improve? Yes. :) 100% efficiency when multiple elements must interact is not possible, thus theoretical numbers are PR talk. All they give you is a glass ceiling that you're never going to reach anyway.
 
RobertR1 said:
Likely. The prime example of this is the visible improvement of titles over time. If nothing were going to waste, the launch titles would look as good as the end-of-cycle titles, which has never been the case.

Citing your car example, the whole thing is a compromise. You have to spend the right amount of time on engine, chassis, suspension, tyres, electronics, and aero development, then find the perfect balance. Even if you have an 800 hp engine, with no downforce you won't really be using the engine to its maximum to achieve the fastest lap time. So you start improving your aero, which will reveal weaknesses in other areas. Before you know it, it's time to develop a whole new car and you're left with "what ifs"; but during that time, did your car improve? Yes. :) 100% efficiency when multiple elements must interact is not possible, thus theoretical numbers are PR talk. All they give you is a glass ceiling that you're never going to reach anyway.
Agreed.

But there is a difference between using all of it in a brute-force way (start of lifetime) and optimizing things so you have the resources to do them even better. And all of the launch titles are more or less single-core ports that use the other resources (cores) for a few small tasks. They're not used, most of the time.

So, it's like the difference between running that engine full speed but in an inefficient way at the start, and simply not using two thirds (or more) of the power available. Like, never shifting to second gear or higher.
 
hey69 said:
but he said the PS3 has more peak performance...

and how are 3 cores symmetrical, actually?

Hmm... you know, I've been wondering something. I don't at all understand the inner workings of CPUs, but it seems to me that a design like Cell would offer a developer more control over the processor's abilities, whereas a design like the X360 CPU, which is more of a traditional PC design, offers less control and unfortunately doesn't share the OOE (out-of-order execution) of PC CPUs to automatically make the best use of its resources. It seems to me that if the two designs are approached from a standard PC viewpoint, Cell would be bad and Xenon would be less bad, whereas with a more hands-on approach, Cell would be good and Xenon not so good.

Can anyone satisfy my suppositions, or crush them bitterly into the ground?

The last game id developed on Console was Doom for the Atari Jaguar...

Didn't iD itself do the Commander Keen port to GBA? I think they may have done a cellphone or PDA game themselves.

Then again, what solution was there? CPU makers hit the wall 3 years ago; there were not many options. It seems to be more an industry problem, and we should have been looking at parallelization in hardware, tools, and software years ago. Then again, he is pretty up front that he thinks both consoles are not bad (from a dev perspective), unlike last gen, when his comments were all negative.

They probably could have held out a bit longer. The wall hadn't been hit yet in the design of either Xenon or Cell, and as evidenced by Conroe, it still doesn't seem to have been hit. I think Xenon's development was more about time to market + end costs (buying chips from Intel or AMD would be too expensive), and I think Sony was honestly trying to make a very CPU-focused system with as minimal a graphics processor as possible (similar to PS2), but ran into problems along the way and/or realized that they wouldn't be able to outdo a dedicated graphics processor, so the Cell almost seems like baggage brought along from an earlier vision. Playstation 3, as a whole package, seems like it could use another year in development IMO (price and Blu-ray adoption included).
 