SPU usage in games

Claiming a power-utilization percentage pulled from nowhere is pretty much PR shit.
Of course, the point of those claims is not really educating the public on numbers, but conveying the same idea I happen to believe: that first-, second-, and even third-party Cell utilization will improve significantly over time.
Exactly, and I don't understand why you have an issue with that. It seems that if they had said 'we've got loads more we can do' you'd have been happy, without any estimate of whether 'loads more' means 10x more or 2x more. Instead, by choosing a reference ball-park figure, we get a more appropriate idea of how much further they think they can go, yet because they've used a number you take offence.

Would you rather an application or download gauge tell you an estimate of how much longer it'll take in guessed minutes, or just say 'it'll be a while' and leave you guessing as to what 'a while' is?
 
For those questioning Jobe's remark about using 7 SPEs: if you played the Warhawk beta (before the patch was applied), you may have noticed that the game would consistently crash (I won't bore you with the details, but it had to do with a bug in the low-level core code running on the SPEs) and thus hang the PS3. Normally the PS3 can recover from a game crash by returning control to the OS, but when a game explicitly takes over the OS SPE (the 7th) like Warhawk does, the PS3 cannot recover and you're forced to perform a cold start.
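For context: the PS3 game-OS SDK is under NDA, but on the open Linux/Cell side this is roughly what launching code on an SPE looks like via libspe2. Treat it as an analogous sketch only; the embedded program name here is made up.

#include <stdio.h>
#include <libspe2.h>

/* SPU-side program embedded at link time; the name is hypothetical. */
extern spe_program_handle_t my_spu_job;

int main(void)
{
    /* Create a context for one SPE and load the program into it. */
    spe_context_ptr_t ctx = spe_context_create(0, NULL);
    if (ctx == NULL) { perror("spe_context_create"); return 1; }
    if (spe_program_load(ctx, &my_spu_job) != 0) { perror("spe_program_load"); return 1; }

    /* Run it; this call blocks until the SPU program stops. */
    unsigned int entry = SPE_DEFAULT_ENTRY;
    if (spe_context_run(ctx, &entry, 0, NULL, NULL, NULL) < 0)
        perror("spe_context_run");

    spe_context_destroy(ctx);
    return 0;
}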

You can't take over the reserved SPU. It's reserved.
 
How can an SPU with no interrupt support (I believe?) gain total control of a system!? Just seems to be a deadlock to me (a non-PS3 programmer).
SPUs can be isolated from the rest of the system...
 
How can an SPU with no interrupt support (I believe?) gain total control of a system!? Just seems to be a deadlock to me (a non-PS3 programmer).

It may be a deadlock, the SPURS manager may not be very resilient, or it could be anything else.

Honestly, it's not even clear the problem is due to SPU usage.

Do we even know whether the PPE has proper hard interrupts for OS support?

If that's the case, I don't see why you would need a full system restart unless the crash messes up low-level system calls as well.

Exactly, and I don't understand why you have an issue with that. It seems that if they had said 'we've got loads more we can do' you'd have been happy, without any estimate of whether 'loads more' means 10x more or 2x more. Instead, by choosing a reference ball-park figure, we get a more appropriate idea of how much further they think they can go, yet because they've used a number you take offence.

I don't know why you think I have an issue with those arbitrary numbers, or that I take offense.
If Stringer had said "10%" initially and ND/Incognito raised the number to "15%", I wouldn't care either. In that case, would you really get a different "appropriate idea"?

Would you rather an application or download gauge tell you an estimate of how much longer it'll take in guessed minutes, or just say 'it'll be a while' and leave you guessing as to what 'a while' is?

Honestly, I would rather see concrete performance data based on specific tasks than mostly made-up PR numbers derived from an undefined notion of 'full power'.
 
Maybe you can if you are a first first first party ;)
Heh... no. 1st-party teams have access to exactly the same hardware as everyone else. There's no special rule for them (us?). As far as I'm aware, anyone saying they're using 7 SPUs (on a PS3 kit) is either in the OS group, or in error.

Dean
 
As far as I'm aware, anyone saying they're using 7 SPUs (on a PS3 kit) is either in the OS group, or in error.
That fills me with confidence. Teams working with multiple cores of massive FP performance doing complex maths...can't even count properly :???:
 
That fills me with confidence. Teams working with multiple cores of massive FP performance doing complex maths...can't even count properly :???:

Isn't Dylan Jobe a designer?

If so, I don't know why anyone would take his word for it without further confirmation from someone with more direct knowledge of the tech.
 
Insomniac and SPUs

From R&C LOD thread:
OK guys,

I can speak to this:

1. We have tried out a few different LOD methods that we like for different reasons, including our announced support for texture streaming, and we've decided that not doing the work for LOD on the SPUs is a win for us right now. We're always re-evaluating everything, so this is subject to change, but what we've got looks pretty good and is fast. We might go over the details at a later time.

2. Let me say very clearly, we find the SPUs very flexible. We do in fact do some graphics pre-processing on them, including work for culling and clipping and RSX data building as I've mentioned elsewhere. At the same time, graphics alone don't make a game, and we've got lots of other systems running on the SPUs. The SPUs are absolutely central to how we write everything now - it's no longer a matter of "porting" things to the SPU - the SPUs are where everything should go by default. We've still a lot we can do, but we're definitely taking advantage of the Cell.

3. Insomniac does not use EDGE. Insomniac did participate in its development (a few of the Insomniac engine guys spent quite a lot of time over at our pal Naughty Dog's place, doing what they could to help put it together in the early days), but it simply turned out very different from where we want(ed) to go with the engine, and our current tech is very much Insomniac's own.

Mike.
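
To give a rough idea of the kind of SPU-friendly culling work point 2 is talking about, here's a generic sphere-vs-frustum batch test over flat arrays. This is not Insomniac's actual code; every name below is made up.

#include <stddef.h>

/* Plane stored as (nx, ny, nz, d): dot(n, p) + d >= 0 means "inside". */
typedef struct { float nx, ny, nz, d; } plane_t;

/* Batch-cull bounding spheres against the 6 frustum planes.
 * Flat arrays like these are easy to DMA into SPU local store in
 * fixed-size chunks and process without touching main memory. */
size_t cull_spheres(const plane_t planes[6],
                    const float *cx, const float *cy, const float *cz,
                    const float *radius, size_t count,
                    unsigned int *visible /* out: indices that survive */)
{
    size_t n_visible = 0;
    for (size_t i = 0; i < count; ++i) {
        int inside = 1;
        for (int p = 0; p < 6; ++p) {
            float dist = planes[p].nx * cx[i]
                       + planes[p].ny * cy[i]
                       + planes[p].nz * cz[i]
                       + planes[p].d;
            /* Sphere entirely behind this plane: cull it. */
            if (dist < -radius[i]) { inside = 0; break; }
        }
        if (inside)
            visible[n_visible++] = (unsigned int)i;
    }
    return n_visible;
}

On a real SPU you'd vectorize the plane tests with the SPU intrinsics and double-buffer the DMA transfers, but the flat structure-of-arrays layout is the part that makes the job SPU-friendly.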
 
"Percent Utilization" is not a real metric

Obviously 30 percent or even 50 percent is an estimate of either where they think they are (an educated guess) or a gauge of what something like the EDGE tools are telling them (or a combination of both). People should also bear in mind that these percentages/estimates are based on current code and usage.

Yeah, it's always going to be a rough estimate based on knowledge of the platform and what you think is possible to get out of it.

I think the whole idea of discussing "percent utilization" here is a bit silly. You guys are talking about it as though it's some rigorous standard which can be accurately measured. It can't - it doesn't even make any sense really to try, and I don't think anyone seriously does.

I'm sure sometimes these numbers are just based on the amount of time the SPUs spend not calculating anything. Some devs might even throw in a bit more if they're running legacy code on the SPUs that they know they can optimize.

But I can certainly speak for us when I say that whenever we talk about "how much of the Cell we're using", it's almost entirely an educated guess based on a mix of our profiles, our knowledge of the code and our current ideas on how we could optimize things further.

The real answer is: "Ask us in five years. Then we'll be able to tell you how much of the Cell we're using now, at least by comparison."

Also:
GFLOPS is not a useful metric for measuring "utilization" - having the SPUs calculate a few extra really big FFTs would increase it, but wouldn't affect the product itself. Utilization conversations are really about how much more you can get out of the hardware for some specific purpose.

i.e. If there is a game that has all the SPUs pegged doing stuff 100% of the time, that doesn't mean that they're 100% utilized. Perhaps that code can be optimized further. Maybe some things don't need to be done at all. Maybe a new data format or algorithm would change things completely. Who knows? There's just no correlation at all.

Mike.
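
To make that concrete: about the only thing a profiler gives you directly is something like a busy fraction, and even that is a poor proxy - padding the SPUs with pointless work (those big FFTs) would push it toward 100% without making the game any better. A hypothetical sketch of that measurement (all names invented):

#include <stdio.h>

/* Naive "utilization" from profiler samples: the fraction of samples
 * in which an SPU was running a job rather than sitting idle. Note
 * this says nothing about whether the busy code is efficient,
 * necessary, or could be replaced by a better algorithm entirely. */
double spu_busy_fraction(const unsigned char *busy_samples, size_t n)
{
    size_t busy = 0;
    for (size_t i = 0; i < n; ++i)
        busy += busy_samples[i] ? 1 : 0;
    return n ? (double)busy / (double)n : 0.0;
}

int main(void)
{
    /* Toy data: 1 = SPU busy at that sample, 0 = idle. */
    unsigned char samples[] = { 1, 1, 0, 1, 0, 0, 1, 1, 1, 0 };
    printf("busy: %.0f%%\n",
           100.0 * spu_busy_fraction(samples, sizeof samples));
    return 0;
}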
 
Found this amongst the new inFamous interview:

GS: What's it been like working on the PlayStation 3 for the first time?

BF: For us, the most exciting part of the PS3 has been the cell processor, the SPUs specifically. In our highest density scenes right now, we are currently using about 30 percent of the SPUs' capabilities--with the SPUs doing lots of heavy lifting for us on rendering, visibility, particle systems, skinning, animation blending, and so on...this with scores of pedestrians, cars, fires, etc., all going on. And the best part? We've not made any significant attempts to even optimize the SPU code. I think it's reasonable to guess we could put 10 times as much stuff on the SPUs and still make our frame budgets. It's really pretty amazing.

full interview here:

http://www.gamespot.com/ps3/action/infamous/news.html?sid=6175383
 
I assume optimisation will reduce that 30 percent significantly. I guess it's going to be some time before devs get anywhere near 80 percent Cell usage with optimised code.

I think the biggest gains (in CPU usage) are going to come about 5 to 7 years in, although first-party games are already showing some very good uses now (Killzone 2 with its destructible environments, Uncharted with its procedural animation, etc.).

The funny thing is the industry has complained that the choke point of a PC, for example, is its CPU. With a powerful cutting-edge CPU and a PC-"like" RSX, one wonders what talented devs will be able to do with the PS3 once the initial steep learning curve is behind them and improved tools come out.

Given the sheer power of the Cell, one wonders whether devs will bring out the "God of War 2" of PS3 games (i.e. the benchmark title) in seven years, or whether it will come quicker due to the Cell being less complex?
 
It tweens between key animation sequences, so it's true to say its animation is created via a procedure.

Errr, exqueeeze me, but that would make like 90% of the games produced in the past 5 years "procedurally animated".

The boundary between advanced compression (e.g. skeletal animation) and procedural generation is very thin, and subject to stretching by technical marketing (e.g. the 360 uses compression because of its puny insufficient DVD, while PS3 games use procedural generation because of the mighty Cell). Please don't go there.
 
It tweens between key animation sequences, so it's true to say its animation is created via a procedure.

Shifty, that would make every game released for a long time procedural.

I thought he meant actual procedurally generated animations, which would be super impressive...
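
For what it's worth, "tweening" in this sense is just interpolating between authored keyframes, which nearly every engine does. A minimal sketch of one lerped channel (all names made up; real engines do this per joint, usually with quaternion slerp for the rotations):

/* Linear tween of one animation channel between two keyframes.
 * The frames themselves are authored data; only the in-betweens
 * are computed. Assumes keys are sorted by time and count >= 1. */
typedef struct { float time, value; } keyframe_t;

float tween_channel(const keyframe_t *keys, int count, float t)
{
    if (t <= keys[0].time) return keys[0].value;
    for (int i = 1; i < count; ++i) {
        if (t <= keys[i].time) {
            float span = keys[i].time - keys[i - 1].time;
            if (span <= 0.0f) return keys[i].value; /* coincident keys */
            float u = (t - keys[i - 1].time) / span;
            return keys[i - 1].value + u * (keys[i].value - keys[i - 1].value);
        }
    }
    return keys[count - 1].value;
}

The poses are still authored data, which is a long way from computing the poses themselves procedurally at runtime.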
 