AI Co-Processors: could this be the best thing to happen to gaming since the GPU?

You might want to post that in the Drive Club thread, because it has nothing to do with AI co-processors. ^_^

I'm curious about Drive Club vs MotorStorm AI myself.
 
The nice thing about an AI co-processor is knowing you have a fixed spec for it.

Put out something good for $100, have it required for some popular games, and boom, there you go.
 
Bah, I don't believe in any of this AI co-processor bunk. The future of AI is tied to the cloud as far as I'm concerned; that way they can leverage a mix of machine code and farmed human behavior, and eliminate predictability in the process.
 
Quoted for truth.
 
Is AI a big performance problem with games today? I understand the quality of AI varies from game to game, but is the processing a major factor in slowing down today's games?

Judging from the discussion here, a dedicated co-processor for AI may be a bit much, and that transistor real estate is most likely better spent on something else. If it's something GPGPU could handle nicely with HSA, then there's our solution. Joker's comment on the cloud being the future of AI also opens up some nice possibilities, and it's a workload that wouldn't require a ton of bandwidth or super-low latency. I look forward to seeing AI move to and evolve on the cloud this coming generation.
 
Pathfinding and things like adaptive AI would give the Jaguar cores a pretty hard time, so taking that work off of them with co-processors, GPGPU, or the cloud would help a lot.
 
I'm surprised by how much developers were able to do with meager computational power back in the day. I remember playing Half-Life 1 under software rendering on a 166 MHz Pentium MMX and facing marines sent to assassinate you who would not only use cover but throw grenades to flush you out, and even try to flank you while another kept you pinned with suppression fire. Sure, some of that was scripting, but the CPU was also doing most of the graphics rendering. Considering this, I'd be disappointed if AI fodder in a PC FPS like the consolized Crysis 2 took more than an average of 5% of CPU resources on any decent quad core.
 
Pathfinding and things like adaptive AI would give the Jaguar cores a pretty hard time, so taking that work off of them with co-processors, GPGPU, or the cloud would help a lot.
Or, don't use Jaguar cores. The current consoles use Jag cores and haven't got AI processors - nothing to discuss there. Ergo, the discussion is whether AI processors would bring value to a new machine. I don't see the value on a decent machine with a decent CPU: whatever AI silicon you put in there, you could just put in extra CPU silicon instead. Unless there is such a thing as a hardware A* pathfinder that provides remarkable efficiencies over a CPU.
 
Unless there is such a thing as a hardware A* pathfinder that provides remarkable efficiencies over a CPU.
Of course, since it'd be a hardware block, it couldn't ever do anything but that particular function - useless for any game title that doesn't require lots of AI entities. Also, if you build anything into hardware, you'd better make sure it's universal enough in nature that it won't soon be obsoleted by technological (algorithmic?) progress.
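Out of curiosity, this is roughly the workload being discussed - a minimal grid-based A* sketch in Python. The 4-connected grid, uniform step costs, and Manhattan heuristic are illustrative assumptions, not how any shipping engine necessarily does it:

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 4-connected grid: 0 = walkable, 1 = blocked."""
    def h(a, b):                      # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start, goal), 0, start)]   # entries are (f, g, node)
    came_from, g_cost = {start: None}, {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:              # walk parents back to rebuild the path
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        if g > g_cost[node]:          # stale heap entry, skip it
            continue
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = node
                    heapq.heappush(open_heap, (ng + h(nxt, goal), ng, nxt))
    return None                       # no route exists
```

Run enough of these per frame over a large map and the cost adds up - presumably the case a hardware block would be pitched at, though as said above, it's hard to see it beating the same silicon spent on general CPU.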
 
I remember playing Half-Life 1 under software rendering on a 166 MHz Pentium MMX and facing marines sent to assassinate you who would not only use cover but throw grenades to flush you out, and even try to flank you while another kept you pinned with suppression fire. Sure, some of that was scripting, but the CPU was also doing most of the graphics rendering.

Not that familiar with how things are done today, but Half-Life 1's AI was based on the level designer manually placing 'nodes' on the map, which the NPCs used to navigate the world. The designer had to analyze the layout and note all possible player positions, then find vantage points, flank routes, etc. and mark everything with these nodes.
This also required lots of testing, and it was easy to break the system if the player chose some unexpected route or cover position. The NPCs couldn't just walk around the map on their own, and I guess scripting was used to keep them confined to an area and to manually initiate more aggressive attacks or retreats based on some criteria.
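To make the node idea concrete, here's a toy sketch in Python - the node names, positions, and tags are all invented for illustration:

```python
import math

# Hand-placed navigation nodes, as a level designer would author them:
# a position plus tags the AI can query. All names/values are made up.
NODES = {
    "crates_cover":  {"pos": (4.0, 7.5),  "tags": {"cover"}},
    "balcony":       {"pos": (9.0, 2.0),  "tags": {"vantage"}},
    "side_corridor": {"pos": (1.5, 12.0), "tags": {"flank_route"}},
}

def nearest_node(pos, want_tag=None):
    """Pick the closest authored node, optionally filtered by a tag."""
    best, best_dist = None, math.inf
    for name, node in NODES.items():
        if want_tag is not None and want_tag not in node["tags"]:
            continue
        d = math.dist(pos, node["pos"])
        if d < best_dist:
            best, best_dist = name, d
    return best

# An NPC under fire asks for the nearest cover spot the designer marked:
print(nearest_node((3.0, 6.0), want_tag="cover"))  # -> crates_cover
```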

Halo 1 did something similar but combined it with making the AI's decision making, and the conditions it was based on, far more obvious to the player. There's also a finite state machine where the AI can be scared or berserk or curious, and that changes the type of action it will use to respond to events (sketched below). They also had memory of the last known player position and information about nearby allies and such.
I'm sure driving games also have invisible metadata on the track geometry marking ideal paths and braking points and such.
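That sort of state machine is simple to sketch in code; something like the following, with states, events, and thresholds invented for illustration rather than taken from Halo's actual implementation:

```python
from typing import Optional

class EnemyAI:
    """Toy finite state machine in the spirit described above.
    All states, events, and thresholds are invented for illustration."""

    def __init__(self):
        self.state = "idle"                       # idle/curious/attacking/...
        self.morale = 1.0                         # drops as the squad takes losses
        self.last_seen: Optional[tuple] = None    # remembered player position

    def on_event(self, event, player_pos=None, allies_alive=3):
        if event == "heard_noise" and self.state == "idle":
            self.state = "curious"                # investigate, don't commit yet
        elif event == "saw_player":
            self.last_seen = player_pos           # memory persists after LOS breaks
            self.morale = allies_alive / 3.0
            self.state = "berserk" if self.morale < 0.4 else "attacking"
        elif event == "lost_sight" and self.last_seen is not None:
            self.state = "searching"              # converge on last known position
        elif event == "took_damage" and self.morale < 0.5:
            self.state = "scared"                 # fall back and look for cover

    def choose_action(self):
        # The current state picks the *type* of response, as described above.
        return {"idle": "patrol", "curious": "investigate",
                "attacking": "fire_from_cover", "berserk": "charge",
                "searching": "sweep_last_seen", "scared": "retreat"}[self.state]
```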

Basically, in-game AI is about packaging the human intelligence of the level designer into the game.
The actual AI computation is relatively simple and not too resource-intensive. Where it requires performance is in gathering sensory input (usually both vision and hearing require raycasting) and in keeping track of other AI agents when their numbers get high.
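To give a rough idea of why the sensing is the expensive bit, here's a 2D vision check in Python where the cheap range and FOV gates run before the costly line-of-sight test - the wall-segment intersection loop stands in for whatever raycast the engine actually provides:

```python
import math

def _segments_cross(p1, p2, p3, p4):
    """True if 2D segments p1-p2 and p3-p4 properly intersect."""
    def orient(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (orient(p3, p4, p1) * orient(p3, p4, p2) < 0 and
            orient(p1, p2, p3) * orient(p1, p2, p4) < 0)

def can_see(agent, facing, target, walls, fov_deg=120.0, max_range=40.0):
    """Vision sense: cheap rejections first, the 'raycast' last.
    `facing` is a unit vector; `walls` is a list of ((x1,y1),(x2,y2))."""
    to_target = (target[0] - agent[0], target[1] - agent[1])
    dist = math.hypot(*to_target)
    if dist > max_range:
        return False                      # out of range: fail fast
    if dist < 1e-9:
        return True                       # standing on top of the target
    cos_angle = (to_target[0] * facing[0] + to_target[1] * facing[1]) / dist
    if cos_angle < math.cos(math.radians(fov_deg / 2)):
        return False                      # outside the view cone
    # The expensive part: test the sight line against every occluder.
    return not any(_segments_cross(agent, target, w[0], w[1]) for w in walls)
```

Multiply that by every agent running vision and hearing checks against every potential target each frame, and it's clear where the cycles go.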
 
There's an Uncharted presentation on this somewhere that details how their AI looks around and searches for paths.

There's a very detailed discussion that I just found and hadn't seen before here:

http://www.gamasutra.com/view/feature/134566/the_secrets_of_enemy_ai_in_.php

Uncharted excels in friendly AI and enemy smarts.

KZ2 has fantastic AI-driven animation but I couldn't find the presentation anymore.

Found these KZ1 and 2 AI presentations instead:

Killzone's AI: Dynamic procedural combat tactics (KZ1)
http://www.cgf-ai.com/docs/straatman_remco_killzone_ai.pdf

The PlayStation 3's SPUs in the Real World: A KZ2 Case Study
http://www.slideshare.net/guerrilla...he-real-world-a-killzone-2-case-study-9886224

The AI of Killzone 2's Multiplayer Bots
http://aarmstrong.org/notes/paris-2009/the-ai-of-killzone-2s-multiplayer-bots
 
Looking at this makes me believe more than ever that the PS4 is going to have some kind of co-processor to make up for the SPUs not being there.

But then again, they have been talking a lot about compute, so maybe it really will replace the SPUs.
 
Looking at this makes me believe more than ever that the PS4 is going to have some kind of co-processor to make up for the SPUs not being there.
Or they just use the CPU. Look at slide 56: AI processing is a teeny part of the SPUs' workload. The Jag cores aren't going to be doing any of the graphics work Cell had to do, so proportionally far more CPU can be dedicated to AI.

There is no AI processor. If there was, we'd have been told about it. And there's no need - that's what the CPU is for.
 