In perf/watt, Jaguar is top dog. This is good if Sony didn't want a 100 W CPU inside; at least if they go with a 1.8-2 GHz clock, they'll have a decent performer.
Yes, but an i3 generally runs at around twice the speed.
Been done several times over the years in various forms.
Problem is, like all user-generated content, quality is all over the place and 99% of it is crap, so <1% of your user base dominates, which makes it no fun for the majority of your players.
This is off topic, but it's not that game AI (and I use the term to describe enemy behavior rather than sim-like behavior) is simple because that's all that can be done; it's simple because that's what provides the experience most game designers are trying to create. They want to control tempo and pace in a level, and you can't do that with a bunch of autonomous AIs running around.
There is a computational cost in evaluating the AI's current world view: line of sight, tracing possible paths to a target, etc.
When most people describe AIs as dumb, it's usually some uncovered edge case: running into a wall, standing stupidly and letting you shoot it. Those are just really hard to find and fix regardless of the actual decision-making mechanism. Often it's just bad data in the level, or undersampling of cast rays; the AI's world view is incorrect, it makes a decision based on it, and it does what appears to be incredibly stupid.
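To make the undersampling point concrete, here's a toy line-of-sight check (purely illustrative, not from any engine; isSolid and the wall are made up). With too few samples the ray steps right over a thin wall, and the AI "sees" a target it shouldn't:

```cpp
#include <cstdio>

struct Vec2 { float x, y; };

// Hypothetical world: a thin vertical wall occupying x in [5.02, 5.08].
static bool isSolid(const Vec2& p)
{
    return p.x >= 5.02f && p.x <= 5.08f;
}

// Sampled line-of-sight test: walk N points along the segment and ask the
// world whether any of them lies inside solid geometry.
static bool hasLineOfSight(Vec2 from, Vec2 to, int samples)
{
    for (int i = 1; i < samples; ++i) {
        float t = float(i) / float(samples);
        Vec2 p { from.x + (to.x - from.x) * t,
                 from.y + (to.y - from.y) * t };
        if (isSolid(p))
            return false;   // blocked by an occluder
    }
    return true;
}

int main()
{
    Vec2 ai  { 0.0f, 0.0f };
    Vec2 foe { 10.0f, 0.0f };
    // 10 samples step straight over the 0.06-unit wall: the AI's world view
    // is wrong, and whatever it decides next will look incredibly stupid.
    std::printf("10 samples:  LoS = %d\n", hasLineOfSight(ai, foe, 10));
    std::printf("200 samples: LoS = %d\n", hasLineOfSight(ai, foe, 200));
}
```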
Having said that, for anyone who's ever done a raid in WoW or any other MMO with an inexperienced group, there are plenty of Real Intelligences that will stand in the fire until the healer runs out of mana, then complain about the shitty healing job.
I remember Bungie also did an extensive write-up on their AI implementation, and that in order to make it seem more realistic to human players they had to "dumb it down" (not their words). Things like making the AI deliberately pause, take longer than an average human player to aim, always miss with the first shot, restrict view distance, etc. Basically, if they modeled the AI on a halfway decent human player, actual human players would think the AI was unrealistic and dumb.
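Something like this (purely illustrative, not Bungie's actual code; the fields and numbers are invented) captures the flavor of those handicaps:

```cpp
// Deliberate handicaps layered on top of an otherwise "perfect" shooter AI.
struct AimHandicaps {
    float reactionDelay = 0.4f;   // pause before responding at all (seconds)
    float aimTime       = 0.8f;   // slower than a decent human's snap-aim
    bool  missFirstShot = true;   // guaranteed whiff to telegraph the attack
    float maxViewDist   = 40.0f;  // don't spot targets a human couldn't
};

struct EnemyAI {
    AimHandicaps handicaps;
    float timeTargetVisible = 0.0f;
    bool  hasFiredOnce      = false;

    // Called once per frame; returns true when the AI may take an *accurate*
    // shot. The caller fires a deliberately wide shot for the first miss.
    bool updateAndMayFire(float dt, float distToTarget, bool targetVisible)
    {
        if (!targetVisible || distToTarget > handicaps.maxViewDist) {
            timeTargetVisible = 0.0f;          // target effectively forgotten
            return false;
        }
        timeTargetVisible += dt;
        if (timeTargetVisible < handicaps.reactionDelay + handicaps.aimTime)
            return false;                      // still "reacting" and "aiming"
        if (handicaps.missFirstShot && !hasFiredOnce) {
            hasFiredOnce = true;               // burn the first shot as a miss
            return false;
        }
        return true;
    }
};
```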
It's amazing how many dumb things human players do in multiplayer games; as you pointed out, MMORPGs are a great place to find them.
Regards,
SB
I'm more excited by what an improved CPU can bring to next gen games than an improved GPU. I hope Jaguar's up to the task.
A 65W quad-core desktop Richland (including the GPU) is both faster and uses only half the cores.
At 3.2 GHz, 4 Piledriver cores are much bigger in die size and TDP than 8 Jaguar cores. And 8 Jaguar cores at 2 GHz will give you better FPU performance than that.
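Back-of-envelope peak numbers make the FPU claim concrete, assuming the commonly cited per-core figures (Jaguar: 8 SP FLOPS/cycle from its two 128-bit pipes, no FMA; Piledriver: 16 SP FLOPS/cycle per two-core module through the shared 2x128-bit FMAC with FMA):

```latex
% Peak single-precision throughput (back-of-envelope):
8 \text{ Jaguar cores} \times 2\,\mathrm{GHz} \times 8\,\tfrac{\mathrm{FLOPS}}{\mathrm{cycle}} = 128\ \mathrm{GFLOPS}
\qquad
2 \text{ Piledriver modules} \times 3.2\,\mathrm{GHz} \times 16\,\tfrac{\mathrm{FLOPS}}{\mathrm{cycle}} = 102.4\ \mathrm{GFLOPS}
```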
Choosing 3.2 GHz for Piledriver and 2 GHz for Jaguar is a bit arbitrary, isn't it? And there's more to CPU performance than raw FLOPS. Even in PC games (often console ports) that can utilise 6+ cores, performance doesn't scale linearly with core count.
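To put a number on that non-linear scaling, Amdahl's law is the usual back-of-envelope; with p the parallelisable fraction of the frame (a made-up 0.8 here, purely for illustration):

```latex
S(n) = \frac{1}{(1-p) + p/n},
\qquad
S(8)\big|_{p=0.8} = \frac{1}{0.2 + 0.8/8} \approx 3.3
```

So eight cores buy you barely 3.3x on that workload, even before memory bandwidth and synchronisation get a vote.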
Yeah, Jaguar is small and power-efficient, but at the 15-25W expected for a quad core in the 1.6-1.85 GHz range, the power savings of an 8-core Jaguar setup over a lowish-clocked (~3 GHz) Trinity aren't likely to be anything like as spectacular as you're making out.
Jaguar is the pragmatic and/or compromise choice, and years behind desktop alternatives in absolute performance. The Jaguar hype in console land is just a bit too much, and I don't think diluting the Kool-Aid would do any harm.
Tablet land is another matter, though, and Jaguar should rock the house in the <10W and especially <5W x86 world.
Not sure if it's meaningful to compare raw CPU power w.r.t. AI. A console CPU doesn't need to run the DX11 software layer, and it has specialized hardware units to help out. There should be room left over for physics and AI, as long as it can feed the GPU quickly enough.
The consoles' GPGPU may present new opportunities for compute as well, since it doesn't have the command-latency issues seen on PCs. If you look at AMD's slides, they have an example of pathfinding running on their GPGPU.
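For a feel of why pathfinding maps well to GPGPU, here's a sketch (mine, not AMD's; the grid size and cost model are invented) of one relaxation sweep over a path-cost field. Every cell reads only the previous iteration's values, so each cell could be one GPU thread and each sweep one compute dispatch:

```cpp
#include <algorithm>
#include <vector>

constexpr int W = 64, H = 64;

// dist[y*W + x] holds the best known path cost from the goal. Initialise it
// to a huge value everywhere except 0 at the goal, then call relaxSweep()
// until nothing changes; agents then just walk downhill along the field.
void relaxSweep(std::vector<float>& dist, const std::vector<char>& blocked)
{
    std::vector<float> next = dist;          // double-buffer, like a GPU pass
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            if (blocked[y * W + x]) continue;
            float best = dist[y * W + x];
            if (x > 0)     best = std::min(best, dist[y * W + x - 1] + 1.0f);
            if (x < W - 1) best = std::min(best, dist[y * W + x + 1] + 1.0f);
            if (y > 0)     best = std::min(best, dist[(y - 1) * W + x] + 1.0f);
            if (y < H - 1) best = std::min(best, dist[(y + 1) * W + x] + 1.0f);
            next[y * W + x] = best;
        }
    }
    dist.swap(next);
}
```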
What's more, developers are predicting the death of pure single-player experiences within several years. If true, server-based AI and game logic may enter the scene. New ideas/concepts could analyze data from multiple players to provide a connected single-player experience.
I doubt next-gen will make or break on low-level hardware specs.
I just want to see AI finally get pushed to the next level. I'm tired of enemies acting like idiots and only being difficult because they can withstand a barrage of bullets at point-blank range.
Good AI isn't a processing-power problem. Some of the best AI was made in nearly decade-old games and can run on hardware of that era. Good AI is a software problem, which really means it's a "how much time and money developers/publishers want to spend on AI" problem.
The AI in the original Far Cry wipes the floor with anything that's come after it, too: the way the enemies in that game intelligently and dynamically flanked you, took cover, called for support, smartly advanced in squads, barricaded themselves in sheds (when they realized they were overrun), etc.

But is AI so game-dependent? I mean... you had awesome AI in the Half-Life 1 soldiers and awesome AI in the F.E.A.R. 1 soldiers, but that was ages ago, and the sequels of those games were lacking in AI. Couldn't those prior AI algorithms have been copied and expanded?
PS: Another recent example: Killzone 2's AI is, IMHO, way superior to Killzone 3's.
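Even a minimal sketch of that classic structure shows why it ought to be portable. This is illustrative only, not the actual Far Cry/Half-Life/F.E.A.R. code; the states and thresholds are made up:

```cpp
// Illustrative only: a tiny finite-state decision function covering the squad
// behaviours described above. Real games layer far more sensing, squad
// coordination, and animation logic on top, but the skeleton is this old.
enum class SoldierState { Patrol, Flank, TakeCover, CallSupport, Retreat };

struct SoldierContext {
    bool  seesPlayer;
    bool  underFire;
    int   squadmatesAlive;
    float healthFraction;   // 0..1; retreat threshold below is hypothetical
};

SoldierState decide(const SoldierContext& ctx)
{
    if (ctx.healthFraction < 0.25f)                return SoldierState::Retreat;
    if (ctx.underFire && ctx.squadmatesAlive == 0) return SoldierState::CallSupport;
    if (ctx.underFire)                             return SoldierState::TakeCover;
    if (ctx.seesPlayer)                            return SoldierState::Flank;
    return SoldierState::Patrol;
}
```

Nothing in that skeleton is tied to a particular game, which is exactly why it seems like it could have been carried forward into the sequels.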