*Game Development Issues*

What role is that? The reason we listen to Carmack is because he's a technical junkie, not because of, say, his game-making expertise.

He's the technical director of a company. Part of his role is to deliver not only technology, but to deliver it within budget and on time. That's his role. The reason you/we listen to him is because (you're right) he knows his stuff.

I find it very odd that many people here call him lazy, or say he prefers things "as he knows them", when most people here don't know him personally and yet judge him. id is not a large development house; to call him lazy or whatnot, just because he has to work within the parameters of his company and shareholders... I just don't know...

Yeah, maybe he's lazy or living in the past... but when making that claim, please provide irrefutable evidence. Otherwise we can pretty much call every developer lazy, because we're not writing everything in asm.
 
I see your point, and I agree. On the other hand, I don't think Carmack has the influence he had, say, in the late 90s/early 00s, when everything used the Q3 engine. But we still act like every word of Carmack's will shape the future of gaming. Right now he's just one more tech lead, albeit a good one.
 
How did you come to that conclusion? It could be that, given X time/budget, he'd rather work on a system that will deliver enough wiz-bang-oh-ah for the consumers.

Just because we wrote ASM in the past doesn't mean we don't want higher-level languages today. Why do we use XML, relational databases, OOP? Because we want to reduce development cost; at the end of the day, it's a business meant to make money, not something done for the cool factor.

I would say that he has grown into a role that is more than just technical junkie.

I think that's what I wrote? The "as he knows them" may not be the best wording :) But he clearly prefers PC development; it's only money that made him move to consoles. In the old days, I think it would have been his curiosity and junkie needs.

I used to read his .plan files like I read this forum: lots of juicy technical stuff, some of which would rub off so I learned something. At some point, I think he started to explain his choices with money reasons: "We chose to do this because that earns us more money."

I dunno if by "people" you mean me, but I was the only one that wrote "as he knows them". I don't think I required him to write in asm? And I sure as hell didn't call him lazy. Maybe you are reading things that were never written? This guy plays around with Java games for mobile phones, wrote Linux drivers for fun, and plays with real spaceships in his free time (I guess he exchanged the spaceships for his Ferrari fun). If anyone could be considered a super nerd it's him; it's just too bad for the PS3 that he never found it interesting enough to play around with, which was my point.
 
I understood that from your original post, but the point is that if there were no PS3, such optimization would be deemed not worth it. Making a game run as fast as it can on a console is not the same as making a game as good as it can be with a given budget. They're only partially correlated optimization goals.

Well, I kinda have to disagree here. Without handing out too much Kool-Aid, I think it's safe to say that we're seeing a pretty painful transition to a more async-job oriented programming model. This is not really something that is limited to the PS3, but also something we see a lot on 360. What is holding us back here is mostly the PC, where you simply cannot assume that you have a couple of cores just yet. This relates to the "PC guys moving over to consoles" situation, which gives you tons of straight single-threaded code that is being ported to the consoles. Complete rewrites are simply not that much in fashion right now. :)
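To make "async-job oriented" concrete, here is a minimal single-threaded C++ sketch of the idea (all names and sizes are invented for illustration; a real scheduler adds worker threads, atomics and job dependencies):

```cpp
#include <cstdio>

// A job is just a function plus the data it works on.
struct Job {
    void (*run)(void* data);
    void* data;
};

// Tiny fixed-size queue; a real system guards this with atomics
// or uses per-worker queues to avoid contention.
static Job g_jobs[256];
static int g_head = 0, g_tail = 0;

void pushJob(const Job& j) { g_jobs[g_tail++ % 256] = j; }
bool popJob(Job* out) {
    if (g_head == g_tail) return false;
    *out = g_jobs[g_head++ % 256];
    return true;
}

// Example job: integrate a batch of positions.
struct MoveBatch { float* x; int count; float dt; };
void moveJob(void* data) {
    MoveBatch* b = static_cast<MoveBatch*>(data);
    for (int i = 0; i < b->count; ++i)
        b->x[i] += 1.0f * b->dt;   // unit velocity, purely illustrative
}

int main() {
    float xs[64] = {};
    MoveBatch batch = { xs, 64, 0.016f };
    Job j = { moveJob, &batch };
    pushJob(j);

    // Drain loop: on a console, each hardware thread (or SPU) runs
    // this, pulling jobs instead of executing one big serial frame.
    Job cur;
    while (popJob(&cur))
        cur.run(cur.data);
    std::printf("x[0] after one step: %f\n", xs[0]);
    return 0;
}
```

The painful part of the transition is exactly the rewrite from "one big serial frame" into self-contained batches like this.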

Anyway, back to topic. If you want to stay competitive (among AAA titles, at least), you will need to get as much processing power out of your consoles as possible. This is not only true for graphics.
To get good performance out of both Xenon and Cell, however, you can't really do PC-like pointer-chasing, or do multiple passes over big sets of data. You need chunks that fit into cache, and you need stream operations to work on these chunks. Basically, you turn your CPU cores into SPUs, with the cache acting as local store. This is something you cannot really get around on PS3, while you *can* get relatively far on 360 with the PC-like model. On the other hand, considering 360 memory bandwidth, I don't think you really want to fetch cache lines more often than needed.

So yes, right now, it seems that PS3 forces a programming model on you that seems unnecessarily different from what you are used to. But I'm pretty sure it's a model we will see a lot more of.
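To sketch the chunk/stream idea in code (a toy example of mine, not from any shipped engine): keep the data contiguous and walk it in blocks sized to fit cache, or an SPU's local store, instead of chasing pointers:

```cpp
// Contiguous layout: one linear stream, no pointer chasing.
struct Particle { float px, py, pz, vx, vy, vz; }; // 24 bytes each

void integrate(Particle* p, int count, float dt) {
    const int CHUNK = 1024; // ~24 KB per block: fits in L1 / local store
    for (int base = 0; base < count; base += CHUNK) {
        const int n = (count - base < CHUNK) ? (count - base) : CHUNK;
        Particle* c = p + base;
        // Stream operation over one resident chunk.
        for (int i = 0; i < n; ++i) {
            c[i].px += c[i].vx * dt;
            c[i].py += c[i].vy * dt;
            c[i].pz += c[i].vz * dt;
        }
    }
}
```

The PC-style alternative, a linked list or scene graph of heap-allocated objects, touches a new cache line per object; that is exactly the pointer-chasing both Xenon and Cell punish.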
 
Thanks T.B., for saving me from having to type that, as they were my thoughts exactly (and have been voiced by others before).
 
Well, from the things I hear regarding programming on the PS3, the Cell doesn't sound as efficient as Sony probably expected, considering their long investment in the chip.

It looks like IBM, Toshiba and Sony worked on this chip for years, only to come up with a solution that may not differ much from other, simpler CPU solutions in terms of performance, yet is more complex to program for, while those can do the same things the Cell can.

So my question is, what advantages does Cell really offer compared to the Xenon or any other current multi core CPU?
 
Bandwidth, local store, and being extremely fast at data-stream manipulation, also in terms of how the instruction and data pipelines have been set up. There are definitely a lot of things at which the Cell is extremely efficient, and I don't think this is disputed that much, even on Beyond3D in general (just read some of the old threads on this). IBM is also showing this with Cell in their blades and 'supercomputers'.

The problem really is that it requires a significantly different way of thinking about software design that currently poses problems both for programmers in general and for multi-platform development in particular.

On the 360, you can often get a project you developed on PC up and running in literally days with half-decent performance, even more so if you use Microsoft tools on both platforms. This is much harder on the PS3, for both software and hardware reasons, although it is still much easier than it has ever been on the PS2 (the RSX being a familiar GPU, and the Cell still having one PPU core that is the same as one of the 360's cores).

But even for a high-level, single platform developer like Insomniac, it takes time to figure out how to make best use of the Cell. The biggest paradigm shift perhaps is to realise that you can run your complete engine from the SPEs rather than the so-called 'main' PPU Core, which is relegated to only dealing with some basic housekeeping stuff.

That's a tall order for multi-platform development, but there is one advantage: the engine team is becoming an increasingly small part of the budget, and as time goes on, programming models will shift towards what the Cell wants, simply because that's where things are going on the PC side as well. As repeatedly said, this will benefit performance on the 360 too. Not to mention that there will be a technology race on the software side again, especially towards the middle and end of the console's lifespan (made easier by programmers' greater familiarity with the hardware by that time, an advantage of console hardware being fixed and on the market for a very long time).
 
Yes, local store is huge (if you'll excuse the pun). Think about it: every SPU has an MFC and total control over its LS. You can do all kinds of buffering, blocking and caching approaches, with relatively easy control over what gets transferred. Yes, I said "easy". :)
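For the curious, the canonical LS trick is double-buffering DMA against compute. A rough sketch of the shape of it, using the Cell SDK's C intrinsics from spu_mfcio.h; this is written from memory as an illustration (and assumes total is a multiple of CHUNK), so treat the details as assumptions rather than copy-paste code:

```cpp
#include <spu_mfcio.h> // mfc_get, mfc_write_tag_mask, mfc_read_tag_status_all

#define CHUNK 16384    // 16 KB per buffer; the whole LS is only 256 KB
static char buf[2][CHUNK] __attribute__((aligned(128)));

static void process(char* data, int n) {
    for (int i = 0; i < n; ++i) data[i] ^= 0xFF; // placeholder compute
}

void stream(unsigned long long ea, int total) {
    int cur = 0;
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);           // kick first transfer
    for (int off = CHUNK; off < total; off += CHUNK) {
        const int nxt = cur ^ 1;
        mfc_get(buf[nxt], ea + off, CHUNK, nxt, 0, 0); // start next transfer
        mfc_write_tag_mask(1 << cur);                  // wait on current only
        mfc_read_tag_status_all();
        process(buf[cur], CHUNK);                      // compute overlaps DMA
        cur = nxt;
    }
    mfc_write_tag_mask(1 << cur);                      // wait for last chunk
    mfc_read_tag_status_all();
    process(buf[cur], CHUNK);
}
```

You decide exactly what is resident and when it goes away; that's the "easy control" part.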

On the other hand, the SPUs have some pretty insane features. Sure, it is hard to make the most out of dual-issue, but you have pretty damn short instruction latencies (FP mul is 6 cycles, as opposed to 10 cycles latency on the PPU, IIRC) and a huge register file, which lets you deal with them. Then you have separate buses, so you can fetch from XDR and write to DDR, which is pretty groovy when you think about it. There is also the atomic store cache, which allows pretty damn fast communication between the SPUs. Things like that. It's basically going back to assembly, or at least intrinsic-heavy coding, but you have to consider what you want to do there: most of that stuff needs good optimization anyway. Since Insomniac was already mentioned: their "SPU Shaders" approach is very smart, even if I don't like the name much. :)

And yes, moving your renderer to the SPUs is very, very attractive. This doesn't necessarily mean that you need highly optimized code, BTW. Just use it as a normal core.
 
So my question is, what advantages does Cell really offer compared to the Xenon or any other current multi core CPU?

Apart from it being very "fast", I think the other benefit of the Cell is that time will benefit the developers and the performance they can suck out of it; in other words, the potential takes longer to maximise on the Cell than on other multicore CPUs. This is a drawback right now, but hopefully it will be a plus later on. Especially if there is a Cell in the PS4 :)
 
The engine team is becoming an increasingly small part of the budget, and as time goes on, programming models will shift towards what the Cell wants, simply because that's where things are going on the PC side as well,
Just one point: I don't see this happening, caches are here to stay.
Anyway, as time passes, devs will get a grip on the Cell, that's for sure ;)

One of the huge advantages of the Cell (in a gaming system) is also that the Xenon is simply not good.
 
Well, caches are nice for unpredictable behaviour, or if you just can't be bothered.
Basically, caches are a "free" speedup without adding extra work for the programmer. If you look at high-performance code these days, you will see a *lot* of prefetching going on. This is kinda backwards, as prefetches are hints to load cache lines, not really direct orders. What you usually want is to fetch a large block, keep it in cache and flush it when you're done. This is really hard to ensure, unless you spend quite some time locking cache lines. In other words, prefetching is a hack! (There! I said it!)
On the other hand, cache coherency becomes more and more expensive to ensure, making cache misses more expensive. Having LS means you only have to care for coherency during DMA transfers. That's a huge win.
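To show what the "prefetching is a hack" point looks like in code, here is the usual pattern, using GCC's __builtin_prefetch as a stand-in (a hint the hardware is free to ignore):

```cpp
#include <cstddef>

float sum(const float* a, std::size_t n) {
    float s = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
        if (i + 16 < n)
            __builtin_prefetch(&a[i + 16]); // *ask* for a line 64 B ahead...
        s += a[i];                          // ...and hope it arrived in time
    }
    return s;
}
// Nothing guarantees the line is resident when needed, or that it stays
// in cache until used. With a local store, you DMA the block in and it
// simply *is* there -- an order, not a hint.
```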

BTW, I can't really agree that Xenon is bad. It has some really cool features but (IMHO) suffers from the layout of the memory subsystem. But that's NDA land. :)
 
Well, from the things I hear regarding programming on the PS3, the Cell doesn't sound as efficient as Sony probably expected, considering their long investment in the chip.
You must be hearing from some pretty whack sources! Look outside the console space to see how Cell is accelerating processing jobs elsewhere, quite commonly by an order of magnitude. That's even with third parties free to choose whatever processor they want. HPC shows Cell is a great high-speed processor, thanks to the design decisions they made.

Returning to the console space: how that processing power is leveraged in creating games, whether the Cell has to prop up the GPU to do the same jobs as Xenos in the XB360 or monster GPUs on PC, and the difficulties developers have in getting performance from Cell, both through software design and through working with Sony's tools, are why you have underperforming titles on what is supposedly superior hardware. Cell is a better CPU in terms of performance. Whether the PS3 is a better system to program, and can do more work than other systems, is quite a different issue from just the processor.
 
Do some devs have an idea about what could have pushed Namco to make the choices they did on SC IV?

I can't think of a valid technical reason.

Even considering the process of game validation Joker454 spoke about, I don't get it.
The games could have been identical; I don't understand what motivated Namco's choices.
 
Isn't it almost the culmination of this thread, and the direct opposite of what joker said was a trend in the industry? Considering how Namco had access to PS3 hardware very early on, and how they already released a fighting game on it, it's likely that PS3 was lead platform. And, if Nao is correct, choices made optimizing for PS3 can be used to the 360's benefit as well. So instead of releasing identical games, they pushed the 360 version a little more and made it pump out more pixels.

After all, after the last Ridge Racer, is anyone really going to accuse Namco of being lazy when it comes to the PS3?
 
id's post-apocalyptic open-world shooter Rage (PC, PS3, 360, Mac) will look worse on Xbox 360 due to the compression needed to fit the game's assets on two DVDs, programmer John Carmack revealed at tonight's QuakeCon keynote.


According to Carmack, the royalty fees to include a third disc in the Xbox 360 version would be so high that it simply isn't a feasible solution, with the programmer hoping for Microsoft to make a concession. He stressed that the issue has nothing to do with the Xbox 360 hardware itself, and is merely a storage problem.

Carmack also noted his belief that neither Doom 4 nor Rage will be digitally distributed, as id just isn't looking into it at this point.

During last year's QuakeCon talk, Carmack stated that the PlayStation 3 edition of Rage would ship on a single Blu-ray disc, with the PC and Mac versions likely to arrive in both Blu-ray and DVD form.

http://www.shacknews.com/onearticle.x/53976

Interesting, I wonder how expensive it is to go to 3 discs. It's not like it would be the first game that needed more than 2 discs.
 
Do some devs have an idea about what could have pushed Namco to make the choices they did on SC IV?

I've been trying to sort this one out since it's one of the more puzzling choices to come along in a while. Here are some pure guesses on my part, so take them all with a cube of salt:

1) One of the 360's TCRs is that all games must ship with some form of AA. Some studios/franchises have the power to muscle this requirement aside, but assuming Soul Calibur could not, they would have to use either hardware MSAA or their own software method to meet it. It seems fairly clear from the resolution choice that they decided against hardware MSAA and tiling, so they decided to supersample instead. For 720p output, their supersampling method would meet the TCR. However, for 1080p output, it's possible that relying purely on the 360's hardware upscaler would not count as meeting the requirement. But, in a perverse twist of logic, rendering to a 1365x960 buffer, supersampling it down to 720p in software, then letting the hardware scale it back up again to the client's native 1080p display would meet the TCR (the scale factors involved are sketched after this list).

2) Their code, shaders and/or post-process steps are either hardcoded to work with their custom 1365x960 rez, or would somehow need extensive rewriting/reworking to flexibly support multiple resolutions. They may have deemed this not worth the effort, given that most people would have a hard time telling the difference anyway on typical TVs.

3) They realized that they don't have to do anything for 1080p support; the 360 handles it all. They were running out of time, and rather than QA a second solution custom-built for 1080p, they just let it roll with what they had.

4) Data gathered from XBLive or wherever suggested that the majority of 360 users have the machine set to 720p, so they decided to focus on making that resolution look the best, ignoring 1080p and just letting the machine deal with it.

Yeah, they all sound kinda goofy, but that's about all I could come up with.
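Just to put numbers on option 1 (my arithmetic, nothing confirmed by Namco), the scale factors in that round trip work out as follows:

```cpp
#include <cstdio>

int main() {
    const double rw = 1365, rh = 960;  // internal render target
    const double ow = 1280, oh = 720;  // software downsample target (720p)
    std::printf("supersample: %.2fx horiz, %.2fx vert = %.2fx the pixels\n",
                rw / ow, rh / oh, (rw * rh) / (ow * oh));
    std::printf("then HW upscale to 1080p: %.2fx per axis\n", 1080.0 / 720.0);
    return 0;
}
// Prints: supersample: 1.07x horiz, 1.33x vert = 1.42x the pixels
//         then HW upscale to 1080p: 1.50x per axis
```

Note how lopsided the sampling is: barely any extra resolution horizontally, a third more vertically.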
 
I would actually bet money on option 1. Not that I can confirm the existence of that TCR point, or that Microsoft is being adamant about it if you're not Epic. I just like betting. Ahem.
 

If this one is true... whoever-the-"heck" wrote that TCR needs to be shot, or at least needs a lesson in over/super-sampling (and the, um... "nature of scaling", multiple times :p). :|

Hm... somewhat less incredulously... I'm curious: how difficult is it to deal with anamorphic resolutions :?:

Although, on the other hand, it seems like id Software is one of the few developers to have implemented flexible anamorphic control (see the idTech4 games), and even that was only added in a later patch for Doom 3 on PC. So presumably it was not a great priority...

Re: 3) and 4)... hm, they seem plausible considering the HUD is 720p and presumably drawn last, i.e. 1080p = afterthought. But why not composite a pixel-perfect HUD instead, and then let the hardware scale the rest up or down :?: Memory constraint?
 
This is certainly the trend we've been seeing from cross platform games and developer comments. The latest Carmack interview on 1up sure ain't Sony friendly.

The only new thing in this interview (at least to me) is that he clearly states that he thinks the PS3 has an advantage because of Blu-ray.

The other stuff should be nothing new to anyone here: development is easier on the 360, Xenos > RSX, it takes more work to get Cell's superior power to actually show, and the memory partitioning and constraints on the PS3 are annoying.

He mentions 8 SPUs; AFAIK it's 6 (7). I wonder if that was a slip, or if he really never did write code for it :)

It would be very interesting to hear from the guy who actually writes the PS3 code for id, and get his thoughts.
 