Benefits of higher performance Console CPUs (Game AI) *SPAWN*

Been done several times over the years in various forms.
Problem is, like all user-generated content, quality is all over the place and 99% of it is crap, so <1% of your user base dominates, which makes it no fun for the majority of your players.

This is off topic, but it's not that game AI (and I use the term to describe enemy behavior rather than sim-like behavior) is simple because that's all that can be done; it's simple because it's what provides the experience most game designers are trying to create. They want to control tempo and pace in a level, and you can't do that with a bunch of autonomous AIs running around.

There is a computational cost in evaluating the AI's current world view: line of sight, tracing possible paths to a target, etc.

When most people describe AIs as being dumb, they're usually referring to some unhandled edge case: running into a wall, standing there stupidly and letting you shoot it. Those are just really hard to find and fix regardless of the actual decision-making mechanism. Often it's just bad data in the level, or undersampling of cast rays; the AI's world view is incorrect, it makes a decision based on that view, and it does something that appears incredibly stupid.
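
To make the undersampling point concrete, here's a minimal sketch of the difference between a single centre-to-centre visibility ray and a sampled check. All names and the sphere-based stand-in geometry are mine, purely for illustration:

```cpp
// Hypothetical sketch: why undersampled visibility checks produce "dumb" AI.
// A single centre-to-centre ray can clip a doorframe and report "no line of
// sight" even though most of the target is plainly visible (or vice versa).
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 c; float r; };  // stand-in for level geometry

// Returns true if the segment from 'from' to 'to' hits any blocker.
bool segmentBlocked(Vec3 from, Vec3 to, const std::vector<Sphere>& world) {
    Vec3 d = sub(to, from);
    float len2 = dot(d, d);
    for (const Sphere& s : world) {
        Vec3 m = sub(s.c, from);
        float t = len2 > 0 ? dot(m, d) / len2 : 0;  // closest point on segment
        t = t < 0 ? 0 : (t > 1 ? 1 : t);
        Vec3 p = {from.x + d.x * t, from.y + d.y * t, from.z + d.z * t};
        Vec3 e = sub(s.c, p);
        if (dot(e, e) < s.r * s.r) return true;
    }
    return false;
}

// Undersampled check: one ray, eye to target centre. Cheap but brittle.
bool canSeeNaive(Vec3 eye, Vec3 target, const std::vector<Sphere>& world) {
    return !segmentBlocked(eye, target, world);
}

// Better-sampled check: rays to several points on the target's body.
// If any sample is clear, the AI's world view records the target as visible.
bool canSeeSampled(Vec3 eye, Vec3 target, const std::vector<Sphere>& world) {
    const Vec3 offsets[] = {{0, 0.8f, 0}, {0, 0.0f, 0}, {0, -0.8f, 0},
                            {0.3f, 0.4f, 0}, {-0.3f, 0.4f, 0}};
    for (Vec3 o : offsets) {
        Vec3 p = {target.x + o.x, target.y + o.y, target.z + o.z};
        if (!segmentBlocked(eye, p, world)) return true;
    }
    return false;
}
```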

Having said that, for anyone who's ever done a raid in WoW or any other MMO with an inexperienced group, there are plenty of Real Intelligences that will stand in the fire until the healer runs out of mana, then complain about the shitty healing job.

I remember Bungie also did an extensive write-up on their AI implementation, and that in order to make it seem more realistic to human players they had to "dumb it down" (not their words). Things like making the AI deliberately pause, take longer than an average human player to aim, always miss on the first shot, restrict view distance, etc. Basically, if they modeled the AI after a halfway decent human player, then actual human players would think the AI was unrealistic and dumb.
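
Those tweaks are easy to picture as a handful of tuning knobs layered on top of an otherwise perfect shooter. A hypothetical sketch (none of these names or values are Bungie's):

```cpp
// Hypothetical "humanizing" knobs of the kind those write-ups describe.
// Nothing here is Bungie's actual code; it just shows how deliberate delays
// and forced first-shot misses might be layered on top of a perfect aimbot.
struct HumanizeParams {
    float reactionDelay   = 0.4f;  // seconds before the AI may respond
    float aimTime         = 0.8f;  // extra time to settle the aim
    float maxViewDistance = 40.f;  // metres; beyond this, pretend blindness
    bool  missFirstShot   = true;  // guarantee the opening shot whiffs
};

struct Engagement {
    float timeSinceSpotted = 0.f;
    bool  firedYet         = false;
};

// Returns true if the AI is allowed to take an accurate shot this frame.
bool mayFireAccurately(const HumanizeParams& p, Engagement& e,
                       float targetDistance, float dt) {
    if (targetDistance > p.maxViewDistance) return false;  // "can't see"
    e.timeSinceSpotted += dt;
    if (e.timeSinceSpotted < p.reactionDelay + p.aimTime) return false;
    if (p.missFirstShot && !e.firedYet) {
        e.firedYet = true;   // fire, but the shot is forced to miss
        return false;
    }
    return true;
}
```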

It's amazing how many dumb things human players do in multiplayer games, as you pointed out MMORPGs are a great place to find them.

Regards,
SB
 
In perf/watt Jaguar is a top dog. This is good if Sony didn't want a 100 watt CPU inside, as at least if they go with a 1.8-2 GHz clock they'll have a decent performer.

In perf/watt there are probably ARM processors that can match or beat Jaguar. The latest Atoms seem pretty good in this regard and whatever is in the Wii U might even be competitive on 32 nm.

Jaguar is an ideal tablet and ultrabook processor, but while it's undoubtedly a big step up from Cell/Xenon, it's pretty hard to get excited about it in a "powerhouse" console.

And you don't have to go to 100W to get a CPU faster than 2 x Jaguar. A 65W quad core desktop Richland (including GPU) is both faster and uses only half the cores, despite being son of Bulldozer. 2 x Jaguar seems to be the best of what was available on the required processes, not necessarily anyone's dream CPU (dat inter-module cache-hit latency).

Still, much better than last time!
 
I'm more excited by what an improved CPU can bring to next gen games than an improved GPU. I hope Jaguar's up to the task.
 
I remember Bungie also did an extensive write-up on their AI implementation, and that in order to make it seem more realistic to human players they had to "dumb it down" (not their words). Things like making the AI deliberately pause, take longer than an average human player to aim, always miss on the first shot, restrict view distance, etc. Basically, if they modeled the AI after a halfway decent human player, then actual human players would think the AI was unrealistic and dumb.

It's amazing how many dumb things human players do in multiplayer games, as you pointed out MMORPGs are a great place to find them.

Regards,
SB

I remember that article. They had to make things really obvious for the players to notice the AI, such as repeating the same behavior. ^_^

Then again, it’s an FPS example. There should be more interesting AI ideas overall, including summoning real humans to stand in as a boss in Demon’s Souls, and learning player preferences (from "Psycho Mantis reading your memory card" to Big Data). Starhawk mixes building and a little resource management with TPS, GTA has a whole city of NPCs, SimCity has both NPCs and real players everywhere. We should be able to find plenty of opportunities for improved AI nextgen.
 
I'm more excited by what an improved CPU can bring to next gen games than an improved GPU. I hope Jaguar's up to the task.

We should be able to see more cooperation between the CPU and GPU nextgen. It’s no longer just the CPU or the GPU working separately. Let’s not forget the specialized hardware units too. A regular setup may rely solely on the CPU to do "everything else".
 
At 3.2 GHz, four Piledriver cores are much bigger in size and TDP than eight Jaguar cores. And eight Jaguar cores at 2 GHz will give you better FPU performance than that.

Choosing 3.2 GHz for Piledriver and 2 GHz for Jaguar is a bit arbitrary, isn't it? And there's more to CPU performance than rawr flolps. Even in PC games (often console ports) that can utilise 6+ cores, performance doesn't scale linearly with core count.
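
To put a rough number on that non-linear scaling: plug a guessed serial fraction into Amdahl's law and the extra cores stop looking so dramatic. The 20% serial figure below is purely an assumption for illustration:

```cpp
// Back-of-envelope Amdahl's law: speedup = 1 / (s + (1 - s) / n), where s is
// the serial fraction of the frame and n the core count.
#include <cstdio>

int main() {
    const double s = 0.2;                  // assumed serial fraction
    const int cores[] = {1, 2, 4, 6, 8};
    for (int n : cores) {
        double speedup = 1.0 / (s + (1.0 - s) / n);
        std::printf("%d cores: %.2fx\n", n, speedup);
    }
    // With s = 0.2, eight cores come out at ~3.33x, nowhere near 8x.
}
```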

Yeah, Jaguar is small and power efficient, but at the 15~25W expected for a quad core in the 1.6~1.85 GHz range, the power savings for an 8-core Jaguar setup over a lowish-clocked (~3 GHz) Trinity aren't likely to be anything like as spectacular as you're making out.

Jaguar is the pragmatic and/or compromise choice and years behind desktop alternatives in terms of absolute performance. The Jaguar hype in console land is just a bit too much, and I don't think diluting the kool aid would do any harm.

Tablet land is another matter though, and Jaguar should rock the house in the < 10W and especially < 5W x86 world.
 
Choosing 3.2 GHz for Piledriver and 2 GHz for Jaguar is a bit arbitrary, isn't it? And there's more to CPU performance than rawr flolps. Even in PC games (often console ports) that can utilise 6+ cores, performance doesn't scale linearly with core count.

Yeah, Jaguar is small and power efficient, but at the 15~25W expected for a quad core in the 1.6~1.85 GHz range, the power savings for an 8-core Jaguar setup over a lowish-clocked (~3 GHz) Trinity aren't likely to be anything like as spectacular as you're making out.

Jaguar is the pragmatic and/or compromise choice and years behind desktop alternatives in terms of absolute performance. The Jaguar hype in console land is just a bit too much, and I don't think diluting the kool aid would do any harm.

Tablet land is another matter though, and Jaguar should rock the house in the < 10W and especially < 5W x86 world.

3.2 GHz is the clock of the CPU in your aforementioned quad-core Richland at 65W, per this: http://www.techpowerup.com/cpudb/1501/AMD_A8-6500.html (although to be fair, other sources say it is 3.5 GHz).
 
The numbers I've seen for 65W Richland quad cores are 3.5 and 3.7 GHz depending on model, with ≥800 MHz 384-shader GPUs to boot. Currently only the mobile versions seem to be officially released, with 35W getting you a jolly respectable 2.1 or 2.5 GHz quad core (plus GPU). It's unlikely that the 25W Jaguars will be able to match that 2.5 GHz Richland for perf/watt, but then again at high clocks Jaguar is probably getting out of its element.

Bulldozer was a hot mess, but for a full-fat CPU and GPU on an ageing process, Trinity/Richland are really doing very well. Silicon be damned, I'd take 4 Piledrivers next gen over 8 Jaguars, even if it meant fewer shaders. Gameplay before hair physics!*

*But preferably both.
 
Not sure if it’s meaningful to compare raw CPU power w.r.t. AI. Console CPU doesn’t need to run the DX11 software layer. It also has specialized hardware units to help out. There should be room left over for physics and AI, as long as it can feed the GPU quick enough.

The consoles’ GPGPU may present new opportunities for compute use as well since it does not have the command latency issues seen on PCs. If you look at AMD’s slides, they have an example of path finding running on their GPGPU.
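
The way pathfinding usually gets mapped to GPU compute is as a wavefront (brushfire) expansion, where every cell relaxes against its neighbours in parallel instead of walking A*'s serial priority queue. A CPU-side sketch of the general technique (not AMD's actual demo code):

```cpp
// Wavefront (brushfire) pathfinding: every cell relaxes against its
// neighbours each pass, a formulation that maps directly onto a GPU compute
// kernel (one work-item per cell). Here it's just a loop for clarity.
#include <cstdio>
#include <vector>

constexpr int W = 8, H = 8, INF = 1 << 20;

int main() {
    int walls[H][W] = {};                      // 0 = open, 1 = wall
    for (int y = 1; y < H - 1; ++y) walls[y][4] = 1;  // a barrier to route around

    std::vector<int> dist(W * H, INF);
    dist[0] = 0;                               // goal at (0, 0)

    // Jacobi-style relaxation: each pass reads the old field, writes a new one.
    bool changed = true;
    while (changed) {
        changed = false;
        std::vector<int> next = dist;
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                if (walls[y][x]) continue;
                int best = dist[y * W + x];
                const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
                for (int k = 0; k < 4; ++k) {
                    int nx = x + dx[k], ny = y + dy[k];
                    if (nx < 0 || ny < 0 || nx >= W || ny >= H) continue;
                    if (dist[ny * W + nx] + 1 < best) best = dist[ny * W + nx] + 1;
                }
                if (best < next[y * W + x]) { next[y * W + x] = best; changed = true; }
            }
        dist = next;
    }
    // An agent at any cell just walks downhill in 'dist' to reach the goal.
    std::printf("distance from (7,7) to goal: %d\n", dist[7 * W + 7]);
}
```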

What’s more, developers are predicting the death of pure single player experiences in several years. If true, server-based AI and game logic may enter the scene. New ideas/concepts may analyze data from multiple players to provide a connected single player experience.

I doubt nextgen will make/break because of low level hardware specs.
 
Let’s see ! 8^D
It is also possible for a cool implementation to draw gamers in. Demon’s Souls is a great example.

I see some good ideas even in a few iOS games my family play.

Some of this cloud AI data doesn’t have to be always-connected either. It can be refreshed transparently in batches.
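
A sketch of what that could look like: gameplay always reads a local snapshot, and whenever a batch happens to arrive from the server it gets swapped in behind the scenes. All names here are hypothetical:

```cpp
// Hypothetical batched refresh: gameplay reads a local snapshot of AI tuning
// data; a completed server batch is swapped in atomically. Offline play just
// keeps using the last snapshot it received.
#include <memory>
#include <mutex>

struct AiTuning {
    float aggression   = 0.5f;   // how boldly NPCs push the player
    float flankingBias = 0.3f;   // how often squads try to flank
};

class TuningStore {
    std::shared_ptr<const AiTuning> current_ = std::make_shared<AiTuning>();
    mutable std::mutex m_;
public:
    // Gameplay calls this every frame; it never blocks on the network.
    std::shared_ptr<const AiTuning> snapshot() const {
        std::lock_guard<std::mutex> lock(m_);
        return current_;
    }
    // The network layer calls this whenever a batch download completes;
    // offline it simply never fires, and the last snapshot keeps working.
    void applyBatch(const AiTuning& fresh) {
        std::lock_guard<std::mutex> lock(m_);
        current_ = std::make_shared<const AiTuning>(fresh);
    }
};
```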
 
Not sure if it’s meaningful to compare raw CPU power w.r.t. AI. Console CPU doesn’t need to run the DX11 software layer. It also has specialized hardware units to help out. There should be room left over for physics and AI, as long as it can feed the GPU quick enough.

The consoles’ GPGPU may present new opportunities for compute use as well since it does not have the command latency issues seen on PCs. If you look at AMD’s slides, they have an example of path finding running on their GPGPU.

What’s more, developers are predicting the death of pure single player experiences in several years. If true, server-based AI and game logic may enter the scene. New ideas/concepts may analyze data from multiple players to provide a connected single player experience.

I doubt nextgen will make/break because of low level hardware specs.

I just want to see AI finally get pushed to the next level. I'm tired of enemies acting like idiots and only being difficult because they are able to withstand a barrage of bullets at point blank range.
 
I just want to see AI finally get pushed to the next level. I'm tired of enemies acting like idiots and only being difficult because they are able to withstand a barrage of bullets at point blank range.

Good AI isn't a processing power problem. Some of the best AI was made in nearly decade-old games and can run on hardware of that era. Good AI is a software coding problem. Which really means it's a "how much time and money developers/publishers want to waste on AI" problem.
 
Good AI isn't a processing power problem. Some of the best AI was made in nearly decade-old games and can run on hardware of that era. Good AI is a software coding problem. Which really means it's a "how much time and money developers/publishers want to waste on AI" problem.

But is AI so game-dependent? I mean... you had awesome AI in the Half-Life 1 soldiers, awesome AI in the F.E.A.R. 1 soldiers... but this was ages ago, and the sequels to these games were lacking in AI. Couldn't the prior AI algorithms have been copied and expanded?

PS: Another recent example: Killzone 2's AI is IMHO way superior to Killzone 3's, and the same goes for Uncharted 2 and Uncharted 3 (the mission on the Tibetan temple roofs was great, so I suppose that to have good AI, NPCs must look for you, lose sight of you while you're hiding, and not attack passively when they find you; the opposite of a COD game).
 
But is AI so game-dependent? I mean... you had awesome AI in the Half-Life 1 soldiers, awesome AI in the F.E.A.R. 1 soldiers... but this was ages ago, and the sequels to these games were lacking in AI. Couldn't the prior AI algorithms have been copied and expanded?

PS: Another recent example: Killzone 2's AI is IMHO way superior to Killzone 3's.
The AI in the original Far Cry wipes the floor with anything that's come after it, too: the way the enemies in that game intelligently and dynamically flanked you, took cover, called for support, smartly advanced in squads, barricaded themselves into sheds (when they realized they were overrun), etc.

I remember playing the demo (starting in that rubber raft) a few dozen times, trying a different approach every run, and the AI kept surprising me with scarily smart and dynamic reactions and intelligent moves.

EDIT: Great read in that respect: http://www.ign.com/articles/2003/09/30/far-cry-iq-test

Given what they achieved in that game in 2003, I really don't think processing power is that much of a limiting factor when it comes to implementing a good AI. It's mostly about making smart design decisions around the notion of good, dynamic gameplay.

As a matter of fact, I'd even argue that any really good AI is based around core routines that are as smart as they are simple: identifying what's most important in order to create efficient reactions to any possible event is the key. The 2011 AI Challenge was very interesting in that respect; the guy who won it basically seemed to have used the fewest lines of code of any competitor: http://xathis.com/posts/ai-challenge-2011-ants.html
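
Boiled right down, that "smart because it's simple" shape is often just a fixed priority list of reactions where the first applicable one wins. A toy illustration (my names, not the contest winner's actual code):

```cpp
// The "smart because it's simple" idea, boiled down: a fixed priority list
// of reactions, take the first one that applies. Purely illustrative.
#include <cstdio>

struct Perception {
    bool enemyVisible   = false;
    bool underFire      = false;
    bool resourceNearby = false;
};

const char* decide(const Perception& p) {
    if (p.underFire)      return "take cover";   // survival first
    if (p.enemyVisible)   return "engage";       // then combat
    if (p.resourceNearby) return "gather";       // then economy
    return "explore";                            // default behaviour
}

int main() {
    Perception p;
    p.enemyVisible = true;
    std::printf("action: %s\n", decide(p));      // prints "action: engage"
}
```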
 
Great find !

I think analytical applications (e.g., matchmaking, analyzing and relating player preferences, computer recognition) and large-scale simulation (e.g., game economy) are interesting too.
 
Ignorant question. We know Free-2-Play (F2P)/microtransactions and MMO/co-op features are likely to come down the pipe with this gen. I was just curious: where is the AI processed for these kinds of games? Since every player needs to have exactly the same actions from an enemy represented, where does the AI run? I would have thought that would be on a server somewhere, but doesn't the 360 use some metric whereby the box with the most reliable and quick network connection handles it? Just curious whether this would have any effect on those kinds of games/features.
 
Depends on what the game wants. I think the server is mostly in charge of exchanging common data between players and the "system".

Quick, local action like combat can be run on the client side for responsiveness, and the local results sent to the server to arbitrate and tally. Game economy is probably run on the server side.

Some behavior can be changed on the server side without patching the client. I vaguely remember MAG acts this way but I am not sure. At the end of the day, the devs make the call given their objectives and constraints.
 
Most MMOs run combat on the server, but that has much more to do with security than with the server being the best place to run it.
The general rule of thumb for these games is that you assume all client-side state is compromised.
Though you might be able to relax that to some extent on a closed box.
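
In practice, "assume the client is compromised" means the server never trusts a reported result outright; it re-checks every claim against its own authoritative state. A toy sketch of that arbitration step (all field names and limits invented for illustration):

```cpp
// Toy server-side arbitration: the client reports a hit claim, the server
// re-checks it against its own authoritative state before applying damage.
#include <cmath>

struct HitClaim {        // what the client sends
    int   attackerId, targetId;
    float reportedDamage;
    float attackerX, attackerY;  // where the client says the attacker was
};

struct ServerState {     // what the server knows to be true
    float x, y;
    float weaponMaxDamage;
    float weaponMaxRange;
};

bool validate(const HitClaim& c, const ServerState& attacker,
              const ServerState& target) {
    // Reject impossible damage for the equipped weapon.
    if (c.reportedDamage > attacker.weaponMaxDamage) return false;
    // Reject claims where the client lies about its own position.
    float dx = c.attackerX - attacker.x, dy = c.attackerY - attacker.y;
    if (std::sqrt(dx * dx + dy * dy) > 2.0f) return false;  // tolerance
    // Reject shots beyond the weapon's range, using *server* positions.
    float tx = target.x - attacker.x, ty = target.y - attacker.y;
    if (std::sqrt(tx * tx + ty * ty) > attacker.weaponMaxRange) return false;
    return true;  // plausible; apply damage server-side
}
```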
 