SPU usage in games

Shifty that would make every game released for a long time procedural.
Well yes, most walk/run cycles are tweened and procedural! Purely procedural, physics/behaviour-based animation isn't out there in any form yet. The only implementation I can think of is probably SW: Force Unleashed, although from what I remember of the Stormtroopers hanging on, the supposedly procedural animations there looked quite scripted.

Still, the advances with tweened anims that Uncharted has come up with are impressive progress that has created a more natural look when they flow successfully.
 
AFAIK Uncharted uses procedural animation which blends animation data from proximity information, pre-defined contextual animations and IK to generate final in-game animations which differ all the time.

So for example the pre-canned animation may be to run in response to player movement, but a bullet flying over the character's head may cause him to dynamically duck and stumble a little (perception info) using an extra pre-defined animation (duck), adjusted for the current rig's pose (using IK information) and all blended with the current run anim.

I'd say that's basically a procedural animation system, if you ask me.
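The layered blending described above can be sketched as a per-joint weighted mix of poses. The pose representation (joint name to angle) and the example joints here are hypothetical, just to illustrate the idea, not anything from Naughty Dog's actual system:

```python
def blend_poses(base, layer, weight):
    """Per-joint linear blend of two poses (joint name -> angle in
    degrees); weight 0 keeps the base pose, 1 gives the layer pose."""
    return {j: (1.0 - weight) * base[j] + weight * layer[j] for j in base}

# Base run cycle pose, plus a 'duck' layer triggered by a near-miss bullet
run_pose = {"spine": 10.0, "neck": 0.0}
duck_pose = {"spine": 45.0, "neck": 20.0}

# Halfway through the duck blend-in
current = blend_poses(run_pose, duck_pose, 0.5)  # {'spine': 27.5, 'neck': 10.0}
```

The IK adjustment would then be applied on top of the blended result, nudging the final joint angles so hands and feet land where the environment requires.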
 
Does it use IK/FK? In some cases it clearly doesn't, when the animation transitions are abrupt and the character skips to a new location ready to jump or such. The floating above the rocks also suggests to me a lack of any kinematic methods, as they're not adjusting the animations to fine degrees.
 
The Uncharted devs called it layered animation. So an animation doesn't have to finish for another to start...
 
I read in an interview with the Uncharted devs that it had procedural animation (which is why I made my statement). Also, in regards to Shifty's comments, the devs stated in an interview that the trailer shown (GDC, I believe) only had 1/3 of the final animations. This would account for animation stutter if it was trying to call animations that were not there. I also read in the interview that facial animation reacts to the situation, and that animations for ducking or taking cover were never the same twice due to their animation engine.

Now, whether all this together with the posts above means that the animations are procedural is something I'd leave to the more tech-minded to discuss. What is plain, though, is that the animations used are the most sophisticated in any game environment. Uncharted might very well do for this generation of gaming (in terms of animation) what Prince of Persia did for its generation (i.e. be the benchmark for human animation).
 
The Uncharted devs called it layered animation. So an animation doesn't have to finish for another to start...
Considering the way they talk about it, it also suggests something about having multiple animations (even those affecting the same bones) and weighting between them. I think one of the real pains of this is actually defining a good weighting scheme and a good set of blend-in and blend-out curves. Simple approaches like lerping or sine easing work on paper but look stiff and unnatural in practice. You'd pretty much have to have something user-defined.
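To illustrate the difference, here's a minimal sketch (the function names are mine) comparing a plain linear ramp with an eased curve over a normalized blend window. The eased curve has zero blend velocity at both endpoints, which avoids the visible pop when a layer starts or stops:

```python
def linear_weight(t):
    """Plain lerp ramp over a blend window t in [0, 1]: constant blend
    speed, with abrupt velocity changes at t=0 and t=1."""
    return t

def smoothstep_weight(t):
    """Eased ramp (classic smoothstep): the blend rate starts and ends
    at zero, so the incoming layer fades in and out without a jerk."""
    return t * t * (3.0 - 2.0 * t)

# Both agree at the endpoints and midpoint...
assert smoothstep_weight(0.0) == 0.0 and smoothstep_weight(1.0) == 1.0
# ...but the eased curve moves much more slowly near the start
assert smoothstep_weight(0.05) < linear_weight(0.05)
```

In practice the user-defined curve the post calls for would be an artist-authored spline per transition, with smoothstep only as the default fallback.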

Also, in regards to Shifty's comments, the devs stated in an interview that the trailer shown (GDC, I believe) only had 1/3 of the final animations. This would account for animation stutter if it was trying to call animations that were not there.
Ummm... I believe you meant to say that you'd get stutter if you didn't have a suitable solution for a certain case because the animation profile was incomplete, so it just popped. You wouldn't actually attempt to play non-existent animations, especially when you don't know which animations they'll actually be or how many there really will be. "1/3" would at best have been his rough guess at the time, based on some roadmap.
 
Funny thing is the industry has complained that the choke point of a PC for example is its CPU.

Not really; on the PC the GPU is by far the bottleneck. Most PC games aren't even multithreaded. Even for Crysis, the devs themselves have said that an E6600 Core 2 Duo CPU is all you need to play it to the max, CPU-wise.
 
Not really; on the PC the GPU is by far the bottleneck. Most PC games aren't even multithreaded. Even for Crysis, the devs themselves have said that an E6600 Core 2 Duo CPU is all you need to play it to the max, CPU-wise.

But is that because the CPU is left out of most of the game engine in the first place, because involving the CPU more closely with GPU work is simply not an option?
 
But is that because the CPU is left out of most of the game engine in the first place, because involving the CPU more closely with GPU work is simply not an option?

It is an option; you can do much of the same stuff on the PC. But you wouldn't necessarily want to, because the PC hardware landscape changes very fast. Your technique may be clever one day, and then do more harm than good in the near future when someone swaps out their video card. On consoles you have no choice but to try alternative methods, because you're stuck with the hardware for its lifespan.
 
Mike Ball - IGN Blog

Luckily this sort of thing is just perfect for the PlayStation 3 with its all-conquering CELL processor. The SPUs' vector processing capabilities are just perfect for procedural tasks, and there are seven of them in the PlayStation 3.

This is just one of the game features that the CELL’s SPUs have enabled. They’re a really interesting architecture. As a developer, you can get code running on them easily and immediately receive good performance due to the fact that you now have the task running in parallel with the other cores. And yet the architecture offers so many optimisation strategies that, if you're that way inclined, you can really hone your code until the virtual cows come home. It’s with this in mind that I’m really looking forward to seeing how games develop over this generation because hardcore developers are really going to be able to push it hard. It’s gonna be a really interesting ride!

So today we calculate how each segment of each strand of hair behaves according to physics and how each segment performs collision checks against the other strands of hair and against the body of the heroine. It’s actually a lot of work for such a seemingly simple item! When the simulation is complete and ready to render, we use a custom shader to create multiple highlights along its length to simulate the way that light transports through each hair.

It’s not perfect though - some things just have to slip. If you look closely you’ll see that the heroine’s hair is not affected by the blades. That’s not because we couldn’t do it, it’s just that the blades move so fast in her combat strikes that if the hair were imparted with the energy from the strike it would fire off all over the place. This is just one of those situations where games and reality have to diverge. Let’s face it: if it were truly real physics, she would just accidentally cut her hair off!
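The per-segment hair simulation described in that blog post is commonly done with Verlet integration plus distance constraints. The sketch below is a guess at the general shape of such a system (in Python for brevity, certainly not the actual SPU code), with the collision checks against the body omitted:

```python
import math

def step_strand(points, prev_points, dt, seg_len, anchor, iters=4):
    """One simulation step for a single hair strand.
    points/prev_points: current and previous (x, y) per segment joint;
    anchor: scalp attachment point (the root joint is pinned to it)."""
    gx, gy = 0.0, -9.8
    # Verlet integration: new = 2*cur - prev + gravity*dt^2
    cur = [(2 * x - px + gx * dt * dt, 2 * y - py + gy * dt * dt)
           for (x, y), (px, py) in zip(points, prev_points)]
    cur[0] = anchor  # root stays attached to the head
    # Relax distance constraints so each segment keeps its rest length
    for _ in range(iters):
        for i in range(len(cur) - 1):
            (x0, y0), (x1, y1) = cur[i], cur[i + 1]
            dx, dy = x1 - x0, y1 - y0
            dist = math.hypot(dx, dy) or 1e-9
            corr = 0.5 * (dist - seg_len) / dist
            if i == 0:
                # root is pinned: the child takes the full correction
                cur[i + 1] = (x1 - 2 * corr * dx, y1 - 2 * corr * dy)
            else:
                cur[i] = (x0 + corr * dx, y0 + corr * dy)
                cur[i + 1] = (x1 - corr * dx, y1 - corr * dy)
    return cur, points  # (new state, new "previous" state)
```

A real implementation would also run sphere/capsule collision checks against the head and body between relaxation iterations, which is exactly the per-segment work the post describes as "a lot of work for such a seemingly simple item".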
 
Not really; on the PC the GPU is by far the bottleneck. Most PC games aren't even multithreaded. Even for Crysis, the devs themselves have said that an E6600 Core 2 Duo CPU is all you need to play it to the max, CPU-wise.
Did it change in 2 years? Are unified shaders enough to take load off the CPU?
Also, I remember Alan Wake was demoed on an Intel quad-core CPU machine.
These are from GDC 2005:

(attached slides: msdev08.jpg, msdev09.jpg)
 
Did it change in 2 years? Are unified shaders enough to take load off the CPU?
Also, I remember Alan Wake was demoed on an Intel quad-core CPU machine.
These are from GDC 2005:

You usually demo on the fastest machine you can find, especially early on when your code isn't fully tuned. Those slides seem old to me; they seem to date back to the days when Intel wasn't making any progress. For years there weren't any big improvements to CPU design, just small incremental clock-rate bumps. That might be why they forecast eventually hitting a CPU wall. Plus the "multithread to multicore" comment is also telling. You certainly won't be making good use of any CPU setup if you don't thread your code.

Still, I don't agree with their "games are CPU limited" comment. The solution for every game that has a bad framerate so far is to drop resolution or tweak graphics settings. Do you see that changing in the foreseeable future? Maybe one day every game will have a "CPU settings" config screen, but for now they still mostly have a million graphics settings you can tweak to get performance back.

Or maybe Alan Wake is doing some fantastically complex CPU calculations. If so, their PC audience will be tiny, and good luck porting it to the consoles!
 
The Alan Wake demo on a 3.73GHz Core 2 Quad can be watched here:
http://www.gametrailers.com/umwatcher.php?id=15409
According to the developer interview video it uses Havok.
http://www.hexus.tv/show.php?show=13

Crytek and Intel demoed Crysis on a Core 2 Quad; it's only a matter of Intel showcasing their 'flagship'. I certainly don't believe a quad core is needed for Alan Wake, especially since it doesn't seem to pull as much CPU-bound stuff as Crysis does.

Perhaps those with Core 2 Quad CPUs can enable some special setting for much more advanced stuff compared to other PCs/console versions? ;)

Still, I don't agree with their "games are CPU limited" comment. The solution for every game that has a bad framerate so far is to drop resolution or tweak graphics settings. Do you see that changing in the foreseeable future? Maybe one day every game will have a "CPU settings" config screen, but for now they still mostly have a million graphics settings you can tweak to get performance back.

Or maybe Alan Wake is doing some fantastically complex cpu calculations. If so, their pc audience will be tiny, and good luck porting it to the consoles!

I have to fully agree: the GPU is what limits performance in almost all games. Soft shadows, high-res shadows and fire/dense particle effects are real GPU killers, amongst other things.
 
Oh ya, I remember that video. I'm not sure if that's a demo for Alan Wake or an Intel commercial ;)
Or both ;) We don't know the exact configuration of that demo, but the general idea is that with a 3.73GHz Core 2 Quad we can see a hurricane with that number of objects processed by 5 threads; probably that's the max, as suggested by the nature of the demo, which is to show off the CPU. We'll see fewer objects on currently available CPUs such as the Core 2 Extreme QX6700 (2.66GHz). The game will be released next year, and some gamers with a Core 2 Duo E6600 may complain "hey, this game is nothing like the demo I saw 2 years ago!! It's CPU limited!!" :LOL:

Then again, your comment about Intel's recent progress has another ramification: fewer CPU bottlenecks in mainstream CPUs will urge PC game developers to use more CPU resources. This may affect console ports in future.
 
Or both ;) We don't know the exact configuration of that demo, but the general idea is that with a 3.73GHz Core 2 Quad we can see a hurricane with that number of objects processed by 5 threads; probably that's the max, as suggested by the nature of the demo, which is to show off the CPU. We'll see fewer objects on currently available CPUs such as the Core 2 Extreme QX6700 (2.66GHz). The game will be released next year, and some gamers with a Core 2 Duo E6600 may complain "hey, this game is nothing like the demo I saw 2 years ago!! It's CPU limited!!" :LOL:

Then again, your comment about Intel's recent progress has another ramification: fewer CPU bottlenecks in mainstream CPUs will urge PC game developers to use more CPU resources. This may affect console ports in future.

I doubt it, since the game is supposed to come to the PS3 and Xbox 360. Unless the PC version is different and more advanced. ;)

EDIT: New interview; Alan Wake running perfectly well on a dual-core CPU and an X1900 series card at high resolution with AA on! ;) :cool:
http://www.yougamers.com/articles/4489_remedy_interview_-_part_3_the_power_behind_wake-page5/
So what's this "secret" we've been mentioning about the demo? Well, okay, it's not really that much of a secret but remember how Remedy ran Alan Wake in IDF on a quad core machine? Well, they showed it to us on a dual core processor in a machine with around 2GB of RAM (perhaps more) and a single ATI Radeon X1900 series graphics card - all running perfectly well on a massive HD screen with anti-aliasing enabled.
 
Still, I don't agree with their "games are cpu limited" comment. The solution for every game that has a bad framerate so far is to drop resolution or tweak graphics settings. Do you see that changing in the forseeable future? Maybe one day every game will have a "cpu settings" config screen, but for now they still mostly have a million graphics settings you can tweak to get performance back.

You're wrong; you're forgetting that you cannot access the GPU directly on the PC.
All access to the GPU goes through the driver + API, and this is killing your performance.
So in fact, ALL PC games are CPU limited these days.
And because you can do nothing about it (you cannot change the way the driver or Direct3D behaves), games use settings that reduce graphics quality, which in turn reduces the load on the API and CPU.
Of course you can always make any game GPU limited by pumping up resolution or AA/AF settings, but in most cases this means that you just stress ONE GPU component (namely the ROPs), and this causes a GPU bottleneck.
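A common way to cut that per-call driver/API overhead is to batch work before submitting it. This sketch shows the idea; the object fields and the notion of one draw call per batch are my illustration, not anything from the thread:

```python
from collections import defaultdict

def batch_by_state(objects):
    """Group renderable objects by (texture, shader) so each group can
    be submitted as one draw call instead of one call per object --
    fewer trips through the driver/API layer means less CPU overhead."""
    batches = defaultdict(list)
    for obj in objects:
        batches[(obj["texture"], obj["shader"])].append(obj)
    return batches

# 300 objects sharing 3 textures collapse into 3 submissions
scene = [{"texture": "tex%d" % (i % 3), "shader": "lit"} for i in range(300)]
assert len(batch_by_state(scene)) == 3
```

This is also why reducing graphics settings helps a CPU-limited game: fewer visible objects and effects mean fewer state changes and API calls per frame, not just less GPU work.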


As for Uncharted's animation: they use "clever" blending + IK.
So it's not "procedural", but it's way more advanced than anything a PC GPU can do, just because it's SPU-based.
And if you have just one IK agent, it's not hard to compute proper IK solutions.
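For reference, the single-chain IK case the post calls easy really is closed-form. A standard two-bone (law of cosines) solver looks roughly like this; the names and the 2D simplification are mine:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Closed-form planar two-bone IK: given bone lengths l1, l2 and a
    target (tx, ty) relative to the root joint, return (root_angle,
    joint_angle) in radians via the law of cosines."""
    d = math.hypot(tx, ty)
    d = max(abs(l1 - l2), min(d, l1 + l2))  # clamp target to reachable range
    cos_joint = (d * d - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    joint = math.acos(max(-1.0, min(1.0, cos_joint)))
    root = math.atan2(ty, tx) - math.atan2(l2 * math.sin(joint),
                                           l1 + l2 * math.cos(joint))
    return root, joint
```

Planting a foot, for instance, would call two_bone_ik(thigh_len, shin_len, foot_x, foot_y) for hip and knee angles, then blend the result with the authored pose, which is the "clever blending + IK" combination described above.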
 