The Game Technology discussion thread *Read first post before posting*

No...

What the PDF says is that they use the SPUs for several things, including procedural textures (which not many games used before, iirc), and they also state several times that they use 5 SPUs for their work, although they have 6 available to the game.

In the Killzone PDF, in the chart showing SPU usage, the 6th SPU was basically idling the whole time, except for two minor things (sound and unzipping), which took less than 5% of the time in the chart. So they basically still have 1/6 of the SPU capacity, plus the idle time on the other SPUs, to put more code on. Not too bad as room for improvement, actually.

Or do they not use the 6th SPU because of some TRC set by Sony? (We did hear something a long time ago about the OS being able to take away one SPU at short notice for OS work... maybe to avoid any stalls they just don't use it at all.)
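Just to sketch what "budget for 5, leave the 6th alone" could look like in practice - purely illustrative plain C++ with made-up names, not anything from Guerrilla's or Sony's actual code - the game simply never schedules its jobs on the reserved worker, so the OS grabbing it can't stall a frame:

```cpp
// Purely illustrative sketch (made-up names, not Guerrilla/Sony code):
// a fixed pool of 5 worker threads standing in for 5 SPUs. Game jobs only
// ever go to these workers; the 6th "SPU" is never touched, so the OS can
// claim it without stalling a frame.
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobPool {
public:
    explicit JobPool(int workers) {
        for (int i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Run(); });
    }
    ~JobPool() {                       // drains remaining jobs, then joins
        { std::lock_guard<std::mutex> lock(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void Submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lock(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }

private:
    void Run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (jobs_.empty()) return;        // only empty here once done_ is set
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();   // e.g. skinning, culling, particles, audio mixing...
        }
    }
    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    JobPool pool(5);                              // 5 workers; the 6th is left idle on purpose
    for (int i = 0; i < 10; ++i)
        pool.Submit([i] { std::printf("job %d done\n", i); });
}   // destructor waits for the queue to empty
```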
The PDF says they use 5 SPUs for several things, but it didn't say they ONLY use 5 SPUs for everything. Procedural textures were mentioned, BUT only in passing (as a bullet point). There wasn't a breakdown of what is being done where.

I also remember reading about ProFX's solution. It mentioned using 1 to 2 SPUs to compute the textures to be handed to the GPU. That's why I asked if that COULD be it. So you are saying this isn't a possibility?
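For what it's worth, here is the kind of thing I mean by "compute the textures on the SPUs and hand them to the GPU" - just a toy value-noise generator in plain C++ with made-up names, not ProFX's actual API; the point is only that the GPU never sees anything but the finished pixels:

```cpp
// Toy sketch (made-up names, not ProFX's real API): generate a small
// procedural texture on a worker and hand the finished pixels to the GPU.
// On PS3 this inner loop is the kind of job you'd push onto 1-2 SPUs.
#include <cstdint>
#include <vector>

// Cheap hash-based value noise in [0, 1) - a stand-in for whatever the
// middleware really evaluates (Perlin octaves, pattern layers, etc.).
static float Noise(int x, int y) {
    uint32_t n = static_cast<uint32_t>(x) * 374761393u
               + static_cast<uint32_t>(y) * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return static_cast<float>(n & 0xFFFFu) / 65536.0f;
}

// Fill a w x h 8-bit grayscale buffer; the result would then be uploaded
// (or DMA'd) to the GPU as an ordinary texture.
std::vector<uint8_t> GenerateTexture(int w, int h) {
    std::vector<uint8_t> pixels(static_cast<size_t>(w) * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            pixels[static_cast<size_t>(y) * w + x] =
                static_cast<uint8_t>(Noise(x / 4, y / 4) * 255.0f);
    return pixels;
}

int main() {
    // 256x256 at 1 byte/pixel = 64 KB, i.e. a chunk that would fit in an
    // SPU's 256 KB local store alongside the code working on it.
    std::vector<uint8_t> tex = GenerateTexture(256, 256);
    return tex.empty() ? 1 : 0;
}
```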

I would say your last paragraph is improbable because of Naughty Dog's claim of using nearly 100% of the clock cycles. They also claim that's due to figuring out how to pipeline across all cores. If that's true, it would kind of disqualify your last paragraph, right?
 
You sure it isn't a memory thing perhaps? It's huge - 60MB is the largest PDF I've come across so far.
 
The PDF says they use 5 SPUs for several things, but it didn't say they ONLY use 5 SPUs for everything. Procedural textures were mentioned, BUT only in passing (as a bullet point). There wasn't a breakdown of what is being done where.

I also remember reading about ProFX's solution. It mentioned using 1 to 2 SPUs to compute the textures to be handed to the GPU. That's why I asked if that COULD be it. So you are saying this isn't a possibility?

I would say your last paragraph is improbable because of Naughty Dog's claim of using nearly 100% of the clock cycles. They also claim that's due to figuring out how to pipeline across all cores. If that's true, it would kind of disqualify your last paragraph, right?

I was just curious why they only use 5 instead of 6, just as Killzone 2 did... nothing else. And I also noticed they will use procedural textures, which not many games (not just on SPUs, but in general) use, afaik.

And Naughty Dog's claim doesn't state at all how many SPUs they actually DO use in the end. They might also go with 5 SPUs but utilize them 100%, while the 6th idles so it can possibly be taken away by the OS or whatever. It always depends on what/how you measure this. Maybe they say "well, since we cannot put full load on the 6th SPU, we won't really use it, and thus we won't include it in our load calculation". But I am just guessing here, because, as those two PDFs state, the 6th SPU in both games is "not" used.

And talking of KZ2... I am actually quite impressed by how much they leverage the SPUs to help the RSX render stuff. Imagine Sony having shipped the PS3 with a truly next-gen GPU instead of a year-old one... This also seems to me to be the culprit behind less spectacular multiplatform games, as those optimizations in the rendering pipeline are a lot of work that isn't needed on the 360.
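To make "SPUs helping the RSX" a bit more concrete, this is the sort of work that gets pulled off the GPU - a generic frustum-culling pass in plain C++ with hypothetical structures, not Guerrilla's actual code; on the PS3 the surviving objects would be turned straight into an RSX command buffer:

```cpp
// Generic illustration of the kind of work that gets pulled off the GPU:
// coarse frustum culling on "SPU-style" workers so the RSX never even sees
// draw calls for off-screen objects. Hypothetical structures, not
// Guerrilla's actual code.
#include <cstddef>
#include <vector>

struct Plane  { float a, b, c, d; };        // ax + by + cz + d = 0, normal pointing inward
struct Sphere { float x, y, z, radius; };   // object bounds in world space

// True if the bounding sphere is at least partly on the inside of all six planes.
bool SphereInFrustum(const Sphere& s, const Plane planes[6]) {
    for (int i = 0; i < 6; ++i) {
        const float dist = planes[i].a * s.x + planes[i].b * s.y + planes[i].c * s.z + planes[i].d;
        if (dist < -s.radius) return false;  // completely outside this plane
    }
    return true;
}

// Produces the list of visible object indices; on PS3 the survivors would be
// written out as a command buffer for the RSX instead of going back to the PPU.
std::vector<size_t> CullVisible(const std::vector<Sphere>& objects, const Plane planes[6]) {
    std::vector<size_t> visible;
    visible.reserve(objects.size());
    for (size_t i = 0; i < objects.size(); ++i)
        if (SphereInFrustum(objects[i], planes))
            visible.push_back(i);
    return visible;
}
```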

I'd like to see some load statistics of multiplatform games and how they use the SPUs, if at all.
 
It is complaining about a custom (paid for) font called Interstate. PDFs should package the fonts inside themselves so they do not need to be present on the target machine - the error comes when it tries to extract this particular font for display (on page 2).

It could be a Reader 7 issue - I'll try with a newer Reader on an open machine.
 
Looking at the Killzone 2 slides, I can safely say that it would be a shame if they didn't make a Killzone 3 for PS3 using what they've built and implementing what they've learned.
 
And Naughty Dog's claim doesn't state at all how many SPUs they actually DO use in the end. They might also go with 5 SPUs but utilize them 100%, while the 6th idles so it can possibly be taken away by the OS or whatever. It always depends on what/how you measure this. Maybe they say "well, since we cannot put full load on the 6th SPU, we won't really use it, and thus we won't include it in our load calculation". But I am just guessing here, because, as those two PDFs state, the 6th SPU in both games is "not" used.
Naughty Dog's claim doesn't give you a number of SPUs, but we know how many SPUs the developers have at their disposal. So nearly 100% of the clock cycles available to them would mean 6 SPUs, right? Is there a logical way of adding that up that wouldn't come to 6 SPUs being used?

It's not the case that both PDFs state the 6th SPU is basically "not" used. The Killzone 2 PDF does, by showing where all the systems are running, but the God of War 3 PDF doesn't. You are assuming that from what was revealed to serve their purpose. It may actually be only 5 SPUs used for God of War 3, but right now it's just a guess based on the little that was revealed.

I'd like to see some load statistics of multiplatform games and how they use the SPUs, if at all.
I would love to see that as well. Do you think that info will ever be unveiled?
 
I bought Tekken Dark Resurrection off PSN the other day and was surprised to find the framebuffer drops to a very low resolution in some arenas. Particularly obvious is the Misty Meadow arena. It's very sparse graphically; the only obvious rendering performance hog could be the alpha blending of the clouds. There's also a Halloween arena (there may be two different ones) that looks lower resolution, but not as obviously as the misty one.

The big question is: why such a huge drop? Why go with 1/4 of the rendering resolution (if not worse) just for some misty clouds? It can't really be a matter of backbuffer bandwidth, can it?
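For what it's worth, the usual motivation for dropping resolution when lots of alpha is on screen is fill rate rather than anything artistic. Rough back-of-the-envelope numbers (entirely made up - I have no idea what Namco actually renders), assuming the alpha-heavy layers are drawn at quarter area, i.e. half resolution in each axis:

```cpp
// Back-of-the-envelope fill-rate math for alpha-heavy scenes. All numbers
// are mine and purely illustrative - I don't know what Tekken actually does.
#include <cstdio>

int main() {
    const long full_w = 1280, full_h = 720;       // full-resolution backbuffer
    const long quarter_w = 640, quarter_h = 360;  // quarter the pixels (half per axis)
    const long overdraw = 8;                      // say 8 overlapping layers of cloud sprites

    const long full_fill    = full_w * full_h * overdraw;        // blended pixels per frame, full res
    const long quarter_fill = quarter_w * quarter_h * overdraw;  // same sprites into the smaller buffer

    std::printf("full res:    %ld blended pixels/frame\n", full_fill);     // 7,372,800
    std::printf("quarter res: %ld blended pixels/frame\n", quarter_fill);  // 1,843,200
    // At 60 fps that's ~442M vs ~111M blended pixels per second - a 4x cut in
    // read-modify-write traffic on the backbuffer just for those layers.
}
```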
 
Hmmm...I've always assumed that to be some sort of artistic choice to go along with the theme of the stage. It's odd, yes, but I can see no other reason for it to be THAT blurry.
 
I don't think so. Selecting Lili and one of the lower-resolution-looking levels, the stepping on her white legs is much more pronounced - it ought to be reduced with a blur! It's definitely not a straight downsized render buffer that's causing this issue, though. In Practice mode, select the second stage from the left under Random Stages for the misty field (for those who don't know). There's a vertical ghosting of the rendering, very visible in the trees but applied to the whole image. Looks like a render bug to me. I can't see this being a deliberate choice.
 
A little late to the thread, but I think the early KZ2 video showing the realtime SPU stats said that they do indeed push the SPUs to their limits on full multiplayer levels. So the white gaps in the PDF's SPU graphs probably represent cases where they don't yet push to the limit. That said, I'd bet they haven't fully optimized all the SPU jobs yet, and would bet on even cooler stuff for KZ3.

Also, if they do cap SPU usage at 60% for frame-rate-dependent work (as someone here posted, which sounds odd to me), that still means 40% left over for background (non-frame-dependent) jobs...
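Just to put that split into per-frame terms - my assumptions (30 fps, 5 worker SPUs), not figures from the slides:

```cpp
// Rough frame-budget arithmetic for the 60% / 40% split mentioned above.
// The frame rate and SPU count are my assumptions, not figures from the slides.
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 30.0;  // 30 fps target -> ~33.3 ms per frame
    const int    spus     = 5;              // worker SPUs assumed available to the game
    const double cap      = 0.60;           // share reserved for frame-critical jobs

    const double frame_jobs_ms      = spus * frame_ms * cap;          // ~100 ms of SPU time per frame
    const double background_jobs_ms = spus * frame_ms * (1.0 - cap);  // ~66.7 ms for streaming, decompression, etc.

    std::printf("frame-critical SPU time per frame: %.1f ms\n", frame_jobs_ms);
    std::printf("background SPU time per frame:     %.1f ms\n", background_jobs_ms);
}
```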
 
Quick question for you guys: would you say that the rate at which you can read data from an optical medium (a DVD in a game's case) is directly proportional to how large your streaming buffer is in memory?
 
Quick question for you guys: would you say that the rate at which you can read data from an optical medium (a DVD in a game's case) is directly proportional to how large your streaming buffer is in memory?

No. Why would you think it should be?
 
In a discussion of FFXIII tech, an individual who claims to be making his own game engine stated that the 360 has an advantage because of its faster read speed - which reduces the size of the buffer needed for streaming...

So he claimed. But I had never heard anything about that.

In the overall discussion it was a small and trivial point - but I had just never heard of it before.
 
In a discussion of FFXIII tech, an individual who claims to be making his own game engine stated that the 360 has an advantage because of its faster read speed - which reduces the size of the buffer needed for streaming...

So he claimed. But I had never heard anything about that.

If you have higher read speeds you can have smaller streaming "zones".
That means you can have either higher data density (like better textures), or a smaller buffer with the same assets - the buffer being the memory set aside for the next zone's data, which gets filled while the camera is still in the current zone.
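A toy example of that relation (all figures made up, plain C++ just to show the arithmetic):

```cpp
// Toy numbers showing why a faster drive lets you pack denser assets or use
// smaller streaming zones/buffers. All figures are made up for illustration.
#include <cstdio>

int main() {
    const double traversal_s = 20.0;  // fastest the player can cross the current zone
    const double fast_mb_s   = 8.0;   // assumed sustained read speed, faster drive
    const double slow_mb_s   = 4.0;   // assumed sustained read speed, slower drive

    // The next zone must be fully loaded before the player can reach it, so
    // the most data a zone (and the buffer that holds it) can contain is:
    const double max_zone_fast_mb = fast_mb_s * traversal_s;  // 160 MB per zone
    const double max_zone_slow_mb = slow_mb_s * traversal_s;  //  80 MB per zone

    std::printf("fast drive: up to %.0f MB per zone\n", max_zone_fast_mb);
    std::printf("slow drive: up to %.0f MB per zone\n", max_zone_slow_mb);

    // Equivalently: with the *same* assets, the faster drive can get away with
    // zones (and a streaming buffer) roughly half the size, because it refills
    // them in half the traversal time.
}
```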

In the end what he says is true (regarding the relation between memory and streaming speed), and incidentally it's the opposite of the question you asked. :)

However, whether the 360 has a streaming-speed advantage is a matter of debate. Even if the DVD data is placed on the faster side of the disc, the PS3 can use both the HDD and BD.
 
In a discussion of FFXIII tech, an individual who claims to be making his own game engine stated that the 360 has an advantage because of its faster read speed - which reduces the size of the buffer needed for streaming...

So he claimed. But I had never heard anything about that.

In the overall discussion it was a small and trivial point - but I had just never heard of it before.

If 'you' stream from the HDD, the PS3 has an advantage. ;)
 
Most 360 developers are instructed not to rely on the HDD being present for their game's engine to use - as Rockstar stated with GTA IV, for example.
 