Benefits of higher performance Console CPUs (Game AI) *SPAWN*

But is AI so game dependent? I mean... you had awesome AI in the Half-Life 1 soldiers, awesome AI in the F.E.A.R. 1 soldiers... but that was ages ago, and the sequels to those games were lacking in AI. Couldn't the earlier AI algorithms have been copied and expanded?

PS: Another recent example: Killzone 2's AI is IMHO way superior to Killzone 3's, and the same goes for Uncharted 2 versus Uncharted 3 (the mission on the Tibetan temple roofs was great). So I suppose that for good AI, NPCs must look for you, lose sight of you while you're hiding, and attack aggressively rather than passively when they find you, i.e. the opposite of a COD game.

But generally the number of opponents increases in a sequel at the same time... so more AI paths and scripts, more data-synchronisation bugs, and therefore a need for less complexity, no?
 
Most MMOs run combat on the server, but that has much more to do with security than with the server being the best place to run it.
The general rule of thumb for these games is that you assume all client side state is compromised.
Though you might be able to relax that to some extent on a closed box.
Interesting. That's the complete opposite of the view expressed by CheezDoodles in the framerate thread where internet lag was raised. Could you go into a little more detail about the nature of modern online gaming and the communications between server and clients? What are the expected bandwidth requirements for users, and how much headroom is there for server-run data? Things like physics I don't imagine are worth communicating to clients. In the case of AI, I guess you'd just send character positions and motions, and handle the whole server learning side on the server, closed off to the clients.
 
I'm certainly not an expert on MMOs; most of what is server side versus client side can be determined by looking at what hacks allow a user to do, and at what happens as network performance degrades.
For example, in WoW you can have speed hacks, which implies player movement and positioning are client side with some server-side check; that's a function of wanting the game to remain responsive.
Again in WoW, if you have a shitty connection your spell timeouts don't clear, which implies those are server side.

The way most online games resolve the latency issue is to predict the motion of anything that is not client side. More recent games tend to run the same simulation/AI on both ends and incrementally transmit state from the master to correct drift. There is a good GDC presentation by Glenn Fiedler that's worth reading if you're interested.
http://gafferongames.com/2010/03/11/gdc-2010-networked-physics-slides-demo/
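To make that concrete, here's a minimal sketch of that pattern (the names, blend factor and snapshot layout are my own illustration, not anything from Glenn's slides): the client keeps simulating every frame and nudges its local state toward the last authoritative snapshot it received, so drift gets corrected without visible snapping.

```cpp
// Minimal drift-correction sketch: the client predicts every frame and blends
// toward the last authoritative snapshot it received from the master.
// All names and values here are illustrative.
#include <cstdint>

struct Vec3 { float x, y, z; };

static Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

struct Snapshot {          // what the master periodically sends
    uint32_t tick;
    Vec3 position;
    Vec3 velocity;
};

struct RemoteEntity {
    Vec3 position{};
    Vec3 velocity{};

    // Run every local frame: extrapolate from the last known velocity.
    void predict(float dt) {
        position.x += velocity.x * dt;
        position.y += velocity.y * dt;
        position.z += velocity.z * dt;
    }

    // Run when a snapshot arrives: blend toward the authoritative state
    // instead of snapping, so small drift is invisible to the player.
    void applySnapshot(const Snapshot& s, float blend = 0.1f) {
        position = lerp(position, s.position, blend);
        velocity = s.velocity;
    }
};
```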

MMOs are complicated because the server is more than a single machine. In MMOs that don't zone, players can see and interact with other players and objects that are owned by a server other than the one that owns the player.
The original Asheron's Call actually resolved this with messaging between the servers, but having spoken to the programmers, dealing with two players owned by different servers fighting each other was very complicated. I believe subsequent games like LOTR Online nominate a single server as the owner of everything involved in a combat.

It would be better if more of the master state ownership could be passed to the clients, but you can't trust that the client isn't compromised. For example, if the client assessed or assigned damage, it would be trivial to hack the client to assign a massive amount of damage, or to land hits very quickly against a target.
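As a toy illustration of why the server keeps ownership of damage (every name and threshold below is made up for the example), the authoritative side treats client reports as requests and re-checks them before applying anything:

```cpp
// Toy server-side validation sketch: the client only *requests* a hit;
// the server re-checks rate, range and damage caps before applying it.
// All names and limits are illustrative.
#include <cmath>
#include <cstdint>
#include <unordered_map>

struct HitRequest {
    uint32_t attackerId;
    uint32_t targetId;
    float    claimedDamage;
};

class CombatServer {
public:
    bool applyHit(const HitRequest& req, double nowSeconds) {
        // 1. Rate limit: reject implausibly fast attacks.
        double& last = lastAttackTime_[req.attackerId];
        if (nowSeconds - last < kMinSecondsBetweenAttacks) return false;

        // 2. Range check against *server-side* positions (stubbed here).
        if (distanceBetween(req.attackerId, req.targetId) > kMaxMeleeRange) return false;

        // 3. Never trust the client's damage number; recompute or clamp it.
        float damage = std::fmin(req.claimedDamage, kMaxDamagePerHit);

        health_[req.targetId] -= damage;
        last = nowSeconds;
        return true;
    }

private:
    float distanceBetween(uint32_t, uint32_t) const { return 1.0f; } // stub
    static constexpr double kMinSecondsBetweenAttacks = 0.5;
    static constexpr float  kMaxMeleeRange   = 3.0f;
    static constexpr float  kMaxDamagePerHit = 50.0f;
    std::unordered_map<uint32_t, double> lastAttackTime_;
    std::unordered_map<uint32_t, float>  health_;
};
```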
 
Stupid me, I missed the MMO part. I hadn't been following the conversation closely enough. In more general terms of AI, where do the client/server limits lie? If someone is hosting a game on their console, how could distributed AI be managed? High-level AI shouldn't be time critical, and local AI (duck!) is easy to compute on the client. I'm just unsure where the traffic issues lie, as I know diddly squat about network traffic. ;) I know it's kept to a minimum to keep packets fast and fit in with expected poor upload speeds, but I wonder where the limits lie? A 1 megabit upload, which is a higher-end target I think, offers all of ~4 KB per 1/30th-second frame. That seems a significant limiting factor to what data can be shared. The end result would be one machine handling all the AI and just sending NPC data to the clients.
 
AI for conversation, complete with a speech engine.
Is it doable?

It would be nice if the next Elder Scrolls had something like that. I could have a long conversation with an "arrow in the knee" town guard and ask him to join my party, haha. Then allow each NPC to interact with every other NPC.

BTW, in Mafia 2 the NPCs will talk to each other when they meet, and the conversations are connected. With a larger database of recorded speech and animation, it seems NPCs would feel more lively.

Then there's also the problem of NPCs thinking too much and starting to kill everyone....
(Oblivion's Radiant AI before it was dumbed down --> https://forums.playfire.com/general-discussion/thread/98580 )
 
Not a hope of a natural-language conversation engine; that's beyond even academic researchers in the field. I do hope they try some more incidental or 'flavor' speech like Mafia 2, but ultimately that's just more scripting rather than AI.
 
Stupid me, I missed the MMO part. I hadn't been following the conversation closely enough. In more general terms of AI, where do the client/server limits lie? If someone is hosting a game on their console, how could distributed AI be managed? High-level AI shouldn't be time critical, and local AI (duck!) is easy to compute on the client. I'm just unsure where the traffic issues lie, as I know diddly squat about network traffic. ;) I know it's kept to a minimum to keep packets fast and fit in with expected poor upload speeds, but I wonder where the limits lie? A 1 megabit upload, which is a higher-end target I think, offers all of ~4 KB per 1/30th-second frame. That seems a significant limiting factor to what data can be shared. The end result would be one machine handling all the AI and just sending NPC data to the clients.

Glenn's slides linked in my previous post cover some of this, and they're a pretty good read.
FWIW, the amount of upstream bandwidth games rely on is MUCH less than that. I haven't looked at any hard data lately, but it's probably closer to 100 kbits per second, which is about 400 bytes per 1/30th.
There are several different sorts of data: data that has to get there, and data where you can lose a few packets at some loss of fidelity.
Some state can be perfectly projected.
But in general clients have varying views of the game state. For example, you may not have received an update for a distant NPC for seconds.
Strict client-server models are less prevalent than they used to be; it's better to nominate master state to the machine that provides the best experience.
In a co-op game, if I shoot an NPC it makes sense to resolve that on my machine, even if I don't have completely accurate NPC state.
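To put rough numbers on that budget (taking the ~100 kbit/s figure above; the tick rate, per-entity payload and header sizes below are my own guesses), a quick back-of-the-envelope check of how many entity updates fit in one packet:

```cpp
// Back-of-the-envelope bandwidth budget using the ~100 kbit/s figure above.
// Tick rate and per-entity payload sizes are assumptions for illustration.
#include <cstdio>

int main() {
    const double upstreamBitsPerSec = 100'000.0;   // ~100 kbit/s
    const double ticksPerSec        = 30.0;        // one packet per 1/30 s
    const double bytesPerTick = upstreamBitsPerSec / 8.0 / ticksPerSec; // ~416 B

    // Say a quantized entity update costs ~12 bytes (id + packed pos + vel).
    const double bytesPerEntity = 12.0;
    const double headerBytes    = 28.0;            // rough UDP/IP overhead
    const int entitiesPerPacket =
        static_cast<int>((bytesPerTick - headerBytes) / bytesPerEntity);

    std::printf("%.0f bytes/tick -> roughly %d entity updates per packet\n",
                bytesPerTick, entitiesPerPacket);   // ~416 bytes -> ~32 entities
}
```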
 
Ken Levine filmed at a BAFTA Q&A. "From Shodan, to Big Daddy, to Elizabeth: The Evolution of AI Companions."

 
Not a hope of a natural-language conversation engine; that's beyond even academic researchers in the field. I do hope they try some more incidental or 'flavor' speech like Mafia 2, but ultimately that's just more scripting rather than AI.

Actually it's not as hard as you think.
 
Glenn's slides linked in my previous post cover some of this, and they're a pretty good read.
FWIW, the amount of upstream bandwidth games rely on is MUCH less than that. I haven't looked at any hard data lately, but it's probably closer to 100 kbits per second, which is about 400 bytes per 1/30th.
There are several different sorts of data: data that has to get there, and data where you can lose a few packets at some loss of fidelity.
Some state can be perfectly projected.
But in general clients have varying views of the game state. For example, you may not have received an update for a distant NPC for seconds.
Strict client-server models are less prevalent than they used to be; it's better to nominate master state to the machine that provides the best experience.
In a co-op game, if I shoot an NPC it makes sense to resolve that on my machine, even if I don't have completely accurate NPC state.

From what I read before, clients usually have to act on the most recent data they have. When authoritative state information finally arrives (from the server or an elected host), they make adjustments to positions, or arbitrate/finalize the death of an enemy/NPC/player.

For security, doing everything on the server side would be easier (and likely safest), but there are business and operational considerations. I'd guess most games use a mix of client-side and server-side logic to achieve their design goals.

The CPU specs are important but unlikely to affect the outcome in major ways. I presume combat AI is usually lightweight; the heavyweight parts are usually the economy, the world simulation and such. For a network game, especially an MMO, these are mostly handled on the server/host side.

As for a natural interface, if it relies on speech recognition it's probably not a real-time need anyway, so streaming it to the server to handle (like Siri and Google Now) is OK too. They can also use the collective speech patterns to predict likely speech input.
 
The scripted NPCs that I see in most games don't strike me as something that is a CPU power problem. Strategy games on the other hand.... But those aren't very popular on consoles.
 
Actually, tower defense isn't bad either. I like them for their simplicity. They can be challenging too.

EDIT:
Oh wait, you changed your post. 8^)

The scripted NPCs that I see in most games don't strike me as something that is a CPU power problem. Strategy games on the other hand.... But those aren't very popular on consoles.

FPS bots can be more interesting moving forward.

Strategy games would have more scope, but they can't be too hard to play either. The Eye of Judgment AI was really aggressive (but fun). I consider it near the upper bound of AI difficulty/challenge. Anything beyond that is either too difficult, or warrants a complaint about cheating AI. :p


... which reminds me of Carcassonne on iOS. They have different AI levels. The higher-level ones are so aggressive that they kind of miss the spirit of the game; they adopt "dirty" and abrasive techniques to sabotage the player. Drove my wife mad. I remember she spent a full week defeating the highest-level AI and then never touched the game again.
 
One thing I wonder about is how well AI can be moved to separate threads. This seems like it would be very difficult and have a lot of performance scaling problems. Jaguar cores are not so amazing individually, so threading is critical and obviously what they are relying on for speed.

What always comes to mind for me is Supreme Commander, which has all of the simulation running in one thread. In multiplayer games versus the AI, you can swamp a 4 GHz Ivy Bridge core.
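For what "moving AI to separate threads" usually looks like in practice, here's a sketch of my own (not how Supreme Commander or any particular engine works): per-agent decision work parallelises well as long as each agent reads a frozen world snapshot and writes only its own state; the serial part is applying results back to the shared simulation.

```cpp
// Sketch of fanning per-agent AI updates out across threads. Agents read an
// immutable world snapshot and write only their own state, so no locks are
// needed during the parallel phase; results are applied back serially.
// Illustrative only; not any particular engine's job system.
#include <algorithm>
#include <thread>
#include <vector>

struct WorldSnapshot { /* read-only copy of whatever the AI needs */ };

struct Agent {
    float decision = 0.0f;
    void think(const WorldSnapshot& world) {
        // Expensive perception / planning goes here, touching only `this`.
        decision += 1.0f;
        (void)world;
    }
};

void updateAI(std::vector<Agent>& agents, const WorldSnapshot& world) {
    const unsigned workerCount = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned w = 0; w < workerCount; ++w) {
        workers.emplace_back([&, w] {
            // Each worker handles a strided slice of the agent list.
            for (size_t i = w; i < agents.size(); i += workerCount)
                agents[i].think(world);
        });
    }
    for (auto& t : workers) t.join();
    // Serial phase: apply decisions back to the shared simulation here.
}
```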
 
I would push the question further: whether, and how effectively, GPGPU can help with AI. E.g.,

A* Algorithm for Multicore Graphics Processors
http://publications.lib.chalmers.se/records/fulltext/129175.pdf

We have provided three improvements to the parallel A* algorithm to allow it to work faster and on larger maps. The first improvement is the use of pre-calculated paths for commonly used paths. Secondly, we allow multiple threads to work on the same path and thirdly we have implemented a scheme for Hierarchical Breakdown. Instead of computing the complete path as a whole, the path is calculated in many segments. This makes it possible to calculate more paths concurrently on big maps than was possible before. Very large maps are broken down into many clusters and paths are computed at a higher level of abstraction using path abstraction. All the segments of a path are then joined together to make a complete path using path refinements.
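For reference, the serial baseline those parallel and hierarchical variants build on is plain A* over a grid. A compact sketch follows (the grid encoding and unit step cost are my own choices, not the paper's):

```cpp
// Compact serial A* over a 4-connected grid, as a baseline for the parallel /
// hierarchical variants discussed in the paper. Grid encoding is illustrative.
#include <algorithm>
#include <climits>
#include <cstdlib>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

struct Node { int x, y; };

// grid[y][x] == true means the cell is walkable.
std::vector<Node> aStar(const std::vector<std::vector<bool>>& grid,
                        Node start, Node goal) {
    const int H = static_cast<int>(grid.size());
    const int W = static_cast<int>(grid[0].size());
    auto idx  = [W](int x, int y) { return y * W + x; };
    auto heur = [&](int x, int y) { return std::abs(x - goal.x) + std::abs(y - goal.y); };

    std::vector<int> cost(W * H, INT_MAX), parent(W * H, -1);
    using Entry = std::pair<int, int>;   // (f score, cell index), min-heap
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> open;

    cost[idx(start.x, start.y)] = 0;
    open.push({heur(start.x, start.y), idx(start.x, start.y)});

    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        int current = open.top().second; open.pop();
        int cx = current % W, cy = current / W;
        if (cx == goal.x && cy == goal.y) break;
        for (int d = 0; d < 4; ++d) {
            int nx = cx + dx[d], ny = cy + dy[d];
            if (nx < 0 || ny < 0 || nx >= W || ny >= H || !grid[ny][nx]) continue;
            int g = cost[current] + 1;               // unit step cost
            if (g < cost[idx(nx, ny)]) {
                cost[idx(nx, ny)] = g;
                parent[idx(nx, ny)] = current;
                open.push({g + heur(nx, ny), idx(nx, ny)});
            }
        }
    }

    // Walk parents back from the goal to recover the path (empty-ish if the
    // goal was never reached; a real implementation would check for that).
    std::vector<Node> path;
    for (int i = idx(goal.x, goal.y); i != -1; i = parent[i])
        path.push_back({i % W, i / W});
    std::reverse(path.begin(), path.end());
    return path;
}
```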
 
The scripted NPCs that I see in most games don't strike me as something that is a CPU power problem. Strategy games on the other hand.... But those aren't very popular on consoles.

The actual decision making isn't usually a big CPU problem. It's building and maintaining the world view (LOS, obstacles, etc.), animating, and, depending on how bad your data structures are, moving the character over the terrain and pathfinding that are the CPU sinks.

Havok is commonly used to move characters over terrain (IMO a stupid use of a general physics engine); it's on the order of 20x (possibly more) slower at it than something built for just that purpose, but it's easy to get running so it tends to ship.
LOS is better dealt with using volume collisions than rays, but it's commonly done with several rays because that's cheaper.
If you have dynamic obstacles in the world, pathfinding can get expensive in edge cases.

I actually don't think many of these things will get a lot more expensive going forward.
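As an illustration of the ray-based LOS test being described (done on a grid here for brevity; a real engine would cast against collision geometry), this is roughly what one such ray looks like, and why several of them, or a swept volume, are needed to handle partial cover:

```cpp
// Single-ray line-of-sight test over an occupancy grid (Bresenham walk).
// Real engines raycast against collision geometry and usually fire several
// rays (head, chest, weapon) because one ray alone misses partial cover.
#include <cstdlib>
#include <vector>

// blocked[y][x] == true means the cell occludes sight.
bool hasLineOfSight(const std::vector<std::vector<bool>>& blocked,
                    int x0, int y0, int x1, int y1) {
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    while (true) {
        if (blocked[y0][x0]) return false;   // an occluder sits on the ray
        if (x0 == x1 && y0 == y1) return true;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```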
 
The actual decision making isn't usually a big CPU problem. It's building and maintaining the world view (LOS, obstacles, etc.), animating, and, depending on how bad your data structures are, moving the character over the terrain and pathfinding that are the CPU sinks.

Havok is commonly used to move characters over terrain (IMO a stupid use of a general physics engine); it's on the order of 20x (possibly more) slower at it than something built for just that purpose, but it's easy to get running so it tends to ship.
LOS is better dealt with using volume collisions than rays, but it's commonly done with several rays because that's cheaper.
If you have dynamic obstacles in the world, pathfinding can get expensive in edge cases.

Pathfinding in a destructible/deformable world!

e.g. In Uncharted, it would be interesting if the enemies and Nate could move the ladders around to change the map routes on the fly. Right now, the enemies always put the ladder in one place and that's it.
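One cheap way to cope with that kind of change (a sketch of my own, not how Uncharted or any shipping game does it) is to keep each agent's current path and only invalidate the ones that cross cells the moved object touched, so a full re-plan happens only for the agents actually affected:

```cpp
// Sketch: when a dynamic object (say, a ladder) moves, invalidate only the
// cached paths that pass through the cells it touched, and re-plan those.
// Data layout and names are illustrative.
#include <unordered_set>
#include <vector>

struct Cell { int x, y; };

struct AgentPath {
    std::vector<Cell> cells;   // the path currently being followed
    bool dirty = false;        // true -> needs a re-plan next AI tick
};

void onObstacleMoved(const std::vector<Cell>& touchedCells,
                     std::vector<AgentPath>& paths, int gridWidth) {
    std::unordered_set<int> touched;
    for (const Cell& c : touchedCells)
        touched.insert(c.y * gridWidth + c.x);

    for (AgentPath& p : paths) {
        for (const Cell& c : p.cells) {
            if (touched.count(c.y * gridWidth + c.x)) {
                p.dirty = true;            // re-run A* for this agent only
                break;
            }
        }
    }
}
```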
 
Pathfinding in a destructible/deformable world!

e.g. In Uncharted, it would be interesting if the enemies and Nate could move the ladders around to change the map routes on the fly. Right now, the enemies always put the ladder in one place and that's it.

This I agree with. Pathfinding, when it works, is a "solved" problem, but it is too smart and too optimal. To look human, an agent should have a view of the world and dynamically update it based on what it sees or is informed of, and there needs to be some degree of noise in the solution.
 
That's a problem with AI in general, such as perfect aiming with perfect deflection on moving targets. Things like that can be comfortably randomised to add human error, but something like pathfinding needs intelligent obfuscation so that agents select routes in a manner that matches human efforts. I guess that requires observing the surroundings and making a selection by evaluating criteria in a human sort of way, one that misses some options. Perhaps nodes in a path could have weightings, with weaker nodes only selected by path-finding agents with a higher rating for perception or training. Dynamic terrain would necessitate algorithmic ways of defining those weightings.
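One way that idea could be expressed in the pathfinder's cost function (purely a sketch; the "perception" stat and the noise scale are invented knobs): each agent plans against costs biased by its perception plus a bit of stable per-agent noise, so low-perception agents simply never "see" the cheapest routes.

```cpp
// Sketch: bias the pathfinder's edge costs per agent, so less perceptive
// agents plan against a noisier view of the world and miss optimal routes.
// "perception" and the noise scale are invented knobs for illustration.
#include <random>

struct AgentProfile {
    float perception;   // 0 = oblivious, 1 = perfect knowledge
    unsigned seed;      // stable per-agent seed so the bias is consistent
};

// Returns the cost this particular agent *believes* the edge has.
float perceivedCost(float trueCost, float hiddenPenalty,
                    const AgentProfile& agent, int edgeId) {
    // Deterministic per-agent, per-edge noise (stable across replans).
    std::minstd_rand rng(agent.seed ^ static_cast<unsigned>(edgeId));
    std::uniform_real_distribution<float> noise(0.0f, 1.0f);

    // Low perception inflates the cost of "hard to notice" routes (a vent,
    // a flanking path) and adds jitter, so A* naturally avoids them.
    float missedRouteBias = hiddenPenalty * (1.0f - agent.perception);
    float jitter = trueCost * 0.25f * (1.0f - agent.perception) * noise(rng);
    return trueCost + missedRouteBias + jitter;
}
```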

Although my preference for next gen would be an end to invisible barriers, and for all terrain interaction to be physics based, so every clearly achievable movement, every climb and jump that the player knows the character could make, is possible. Nothing is worse than trying to escape up a rocky outcrop into cover only for there to be a no-go barrier! It became laughable in Uncharted where, on the one hand, Nate can make superhuman leaps to reach impossible areas, and yet on the other he couldn't even climb over a rock or up a small hill because he wasn't supposed to.
 
Yup, they would need to retune all the encounters if they made the AIs as versatile as Nate.

One of the desert skirmishes in U3 was extremely tough partly because Nate was essentially boxed in. He couldn't get to any safe area without being quickly followed by the enemies, and there were more of them too.

In a situation like this, I should be able to blow a hole somewhere, or create a diversion to escape.
 