Should Cloud AI players be created to beef up multiplayer games?

Shifty Geezer

uber-Troll!
Moderator
Legend
Just thinking about Lawbreakers, reportedly a good game let down by a lack of players: if it had bots, they could bolster player numbers until the human population was large enough to take over.

And then it occurred to me that nowadays, it should be possible to create AI game playing agents that interface with a game like a human and can play it. Potentially a company could develop game playing AI and rent out virtual players to provide a good base for a new game, or a library could be provided. Importantly, the gameplay AI could be generic in terms of interfacing with the same controls the players have (no direct code for actions, but using the same IO routines), and machine learning would have crazy amounts of streamed gameplay to analyse.
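The "same controls the players have" idea can be sketched as a controller interface: the game loop only ever sees an abstract controller, so a remote AI agent can stand in for a gamepad. Everything below is invented for illustration (there is no such API in any real engine).

```python
# Hypothetical sketch: bots plug into the same input interface as humans.
# The game never knows which kind of controller is driving a slot.

class Controller:
    def get_input(self, observation):
        raise NotImplementedError

class HumanController(Controller):
    def __init__(self, device):
        self.device = device

    def get_input(self, observation):
        return self.device.poll()  # read the physical gamepad

class CloudBotController(Controller):
    """Stands in for a rented remote agent; here it's a trivial stub."""
    def get_input(self, observation):
        return {"move": (1, 0), "shoot": observation.get("enemy_visible", False)}

def game_tick(controller, observation):
    # The game loop treats every slot identically.
    return controller.get_input(observation)

bot = CloudBotController()
print(game_tick(bot, {"enemy_visible": True}))
```

In a real service, `CloudBotController.get_input` would be a network call to the rented agent rather than a local stub, but the point is the same: no bot-specific code paths inside the game.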

How plausible is this and would it really be beneficial to games and gamers?
 
EA publicly talked about AI training of bots in BF1

https://www.theverge.com/2018/3/22/17150918/ea-dice-seed-battlefield-1-ai-shooter

Their target at the moment is internal testing with large numbers of "players" on the battlefield that at least interact with the environment, and they say it's a long way off from being even close to competitive with humans. I do see it perhaps becoming a benefit eventually for filling slots to get games started faster, especially with large 64-player maps, then swapping bots out as more humans join. It would also help legacy games: having more "players" available helps when a game isn't so popular anymore.
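The fill-then-swap idea is simple enough to sketch. This is a toy lobby manager, with all names invented; a real matchmaker would obviously track sessions, teams and latency too.

```python
# Toy sketch of "start full of bots, swap them out as humans join".

class Lobby:
    def __init__(self, capacity):
        self.capacity = capacity
        self.humans = []
        self.bots = []

    def fill_with_bots(self):
        """Top up empty slots with bots so the match can start immediately."""
        while len(self.humans) + len(self.bots) < self.capacity:
            self.bots.append(f"bot_{len(self.bots)}")

    def human_joins(self, name):
        """A joining human replaces a bot if the lobby is full."""
        if len(self.humans) + len(self.bots) >= self.capacity and self.bots:
            self.bots.pop()  # kick one bot to make room
        self.humans.append(name)

lobby = Lobby(capacity=64)
lobby.fill_with_bots()              # match starts instantly with 64 bots
for i in range(10):
    lobby.human_joins(f"player_{i}")

print(len(lobby.humans), len(lobby.bots))  # 10 humans, 54 bots
```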

I believe WW2 COD utilizes bots in multiplayer scenarios as general filler, with humans taking over their slots as they die?
 
I'm sure it'd also be useful for QA testing (and stealing people's jobs, but I digress). :p
 
As long as the cloud A.I. bots also ragequit when they get teabagged, I'm good.


Seriously though, the problem with that solution is that the dedicated servers would need to do A.I. processing, which is a lot more demanding than simply hosting a game. That could increase the per-player server cost dramatically.

One other thing I know from talking to people who do ML research is that simultaneous training and inference (what they call online machine learning) is very hard to achieve and doesn't really exist outside simpler case studies for now. Each server could instead have two NNs active at the same time: one that runs inference, and a second that trains on the results of the first, with the two switching roles after a given time or between rounds. But this would require even more compute power per server and lead to even larger costs on the cloud side.
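The two-network role swap described above could be scheduled like this. The "models" here are trivial stand-ins (no real neural network or gradient step), just to show the bookkeeping:

```python
import random

# Toy sketch: one model serves (inference only), the other trains on the
# served model's results, and they swap roles every few rounds.

class Model:
    def __init__(self, name):
        self.name = name
        self.updates = 0

    def act(self, state):
        return random.choice(["move", "shoot", "jump"])

    def train_on(self, experience):
        self.updates += 1  # placeholder for a real training step

serving, training = Model("A"), Model("B")
SWAP_EVERY = 5  # rounds between role swaps

for round_no in range(1, 11):
    experience = [serving.act(s) for s in range(100)]  # serving model plays
    training.train_on(experience)                      # shadow model learns
    if round_no % SWAP_EVERY == 0:
        serving, training = training, serving          # swap roles

print(serving.name, serving.updates, training.name, training.updates)
```

After ten rounds each model has taken five training steps, so neither falls behind, at the cost of running two networks per server as the post notes.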

Besides, I thought neural-network enemy bots had long been found to be less effective than well-scripted bots. In that case it might be better to keep the processing on the console.
However, I do acknowledge that "higher difficulty enemy A.I." usually just means that our character loses more HP per shot, our shots deal less damage, and at best the enemies become glorified sniper turrets with infinite range regardless of weather or light conditions (I'm looking at you, Fallout 4!).


BTW: Rocket League does exactly this. It fills empty player slots with bots to avoid long waiting times or teams with uneven numbers of players. I don't know where the computing for the bots happens, though.
 
As long as the cloud A.I. bots also ragequit when they get teabagged, I'm good.


Seriously though, the problem with that solution is that the dedicated servers would need to do A.I. processing, which is a lot more demanding than simply hosting a game. That could increase the per-player server cost dramatically.
If you run the AI locally, yes. However, I'm more suggesting a service run remotely that provides remote AI players that can be connected up to any multiplayer game, just like a human logging in.

Besides, I thought neural-network enemy bots had long been found to be less effective than well-scripted bots.
To date, AFAIK, there's never been the opportunity to train AI bots in FPS skills generically, because there hasn't been a suitable dataset. ML could watch through squillions of hours of Twitch and YT, learn what to do in whatever situation, and then apply that to lots of different FPSes that fundamentally work the same. Anything that deviates significantly (Fortnite's building) would need specialist learning, but on the whole it's run, aim, shoot, and jump about constantly, across all FPSes.
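The "learn what to do in whatever situation" idea is essentially behaviour cloning. A deliberately tiny sketch, with hand-written situation/action labels standing in for what a real system would have to extract from video frames:

```python
from collections import Counter, defaultdict

# Hypothetical behaviour-cloning sketch: mine logged (situation, action)
# pairs, then imitate the most common recorded response.

footage = [
    ("enemy_visible", "shoot"), ("enemy_visible", "shoot"),
    ("enemy_visible", "strafe"), ("no_enemy", "run"),
    ("no_enemy", "run"), ("low_health", "retreat"),
]

# "Training": count which action each situation most often produces.
policy = defaultdict(Counter)
for situation, action in footage:
    policy[situation][action] += 1

def act(situation):
    """Imitate the most common recorded response to this situation."""
    return policy[situation].most_common(1)[0][0]

print(act("enemy_visible"))  # shoot
print(act("low_health"))     # retreat
```

A real pipeline would replace the lookup table with a trained network over pixels and controller inputs, but the supervised "watch, then imitate" structure is the same.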

BTW: Rocket League does exactly this. It fills empty player slots with bots to avoid long waiting times or teams with uneven numbers of players. I don't know where the computing for the bots happens, though.
Bots aren't anything new; they're basically what you need for single player anyway, so you can already run AI. The problem is creating an AI for your game if it's multiplayer, and getting it good enough to play against other players. The idea of cloud-based AI agents is that they'd already be trained and capable, and you could hire them to populate your game during testing or in the early days to fill player slots, saving every dev the same overhead of creating bots for their own game by having a shared, universal bot solution.

Actually, that's really opening the door for the Robot Rebellion, training AI in combat... :neutral:
 
Hmm, Quake 3 Arena comes to mind with this topic. AI was a big part of that game, as there was no traditional single player. There were multiple modes with different types of challenging bots.
It was very much a training ground to get used to the maps and learn the weapons before getting stomped online.

It's possible this could have helped ease Lawbreakers players into the more challenging aspects of the game.
 
There are claims that PUBG uses bots in its mobile ports. When they first launched, people were winning chicken dinners at a much faster pace than on PC and Xbox, and running into so-called players who didn't quite react the way a normal player would (slow to shoot). There seemed to be a mechanic where the number of actual human players per game increased as you ranked up.

I am not sure why you would need cloud or AI training. AI in most games is dumbed down for the player's benefit. And human behavior in games doesn't tend to be all that complex, especially in multiplayer.
 
I am not sure why you would need cloud or AI training.
To alleviate the need for developers to include AI and bots themselves.
AI in most games is dumbed down for the player's benefit.
Good AI wouldn't need to be, if you want to provide an authentic multiplayer experience.
And human behavior in games doesn't tend to be all that complex, especially in multiplayer.
Making it ideal for simulating on a computer and employing over the internet as virtual players. ;)
 
I am not sure why you would need cloud or AI training. AI in most games is dumbed down for the player's benefit. And human behavior in games doesn't tend to be all that complex, especially in multiplayer.

Certain features are dumbed down...
  • Shot accuracy.
  • Reaction times.
  • Awareness of players in the environment.
  • Basically all the easy stuff, where it's easy to make a bot better than a human ever could be.
However, AI in games is very far behind the sometimes complex decision-making processes that human players perform unconsciously. This is the computationally expensive part. Not only is it limited in order to save CPU time, it's limited because it's difficult to replicate the varied and almost infinitely subtle ways in which different human players will approach the same situation.

Imagine having a chess-like AI for every NPC in a game, except the AI has to deal with far more varied and far more fluid situations than exist in chess. And it has to do it in real time, in fractions of a second, compared to the time a chess AI can take.

Now, it doesn't need to be at the level of a good chess AI to fool people. Some fuzzy logic and an AI that can decide between a few hundred choices per second would likely do well enough. But even that would be computationally intensive with multiple AIs. And even that would likely eventually be exploitable by people who play long enough to learn the limitations of the AI.
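The "fuzzy logic deciding between choices" approach might look like a utility scorer: each candidate action gets a weighted score from the game state, plus a little noise so the bot isn't robotically consistent. All the weights, state fields and actions below are invented for illustration:

```python
import random

# Toy utility-based action selection with a fuzzy jitter term.

def score(action, state):
    utilities = {
        "attack":  state["enemy_close"] * 0.5 + state["health"] * 0.5,
        "retreat": (1 - state["health"]) * 0.9,
        "reload":  (1 - state["ammo"]) * 0.7,
    }
    jitter = random.uniform(-0.05, 0.05)  # keeps behaviour from being predictable
    return utilities[action] + jitter

def choose(state):
    return max(["attack", "retreat", "reload"], key=lambda a: score(a, state))

healthy = {"enemy_close": 1.0, "health": 0.9, "ammo": 0.8}
dying   = {"enemy_close": 1.0, "health": 0.1, "ammo": 0.8}
print(choose(healthy))  # attack
print(choose(dying))    # retreat
```

This is cheap enough to run for dozens of bots per frame, which is exactly why it has been the workhorse over learned policies, and also why, as noted above, players eventually learn its seams.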

A simple learning AI might make it more challenging for some players by exploiting a given player's weaknesses, but that is also unrealistic in a large-scale multiplayer game, where not all of your opponents are better than you and good at exploiting your errors. And that becomes more complex as you introduce more players, again increasing the CPU computation needed per AI.

AIBran mentioned something akin to Microsoft's Drivatar.

That might be the way to go. Throw machine learning at the problem of creating hundreds or thousands of different NPC AI behaviors by examining millions of hours of player behavior, i.e. a much larger dataset than was used to create the Drivatars in Forza, to create more believable behavior. Then randomly inject some of them into games when player numbers drop below some threshold.

TL;DR for a multiplayer game vs a single-player game

Not only do you need to craft AI that is good, you need to craft AI that isn't good: AI that almost always makes good decisions and AI that makes a lot of bad decisions; AI that reacts to the player's actions and AI that attempts to predict them; AI that knows the map and AI that doesn't; AI that is good at using consumables and AI that isn't. Etc.

And AI everywhere in between those extremes, mixing and matching variations of all of them with each other to create a broad spectrum of believable AI.
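That spectrum could be produced by sampling per-bot skill parameters rather than hand-crafting each personality. A sketch, with invented parameter names and ranges:

```python
import random

# Sketch: sample a skill profile per bot so a lobby contains good, bad,
# and in-between "players" instead of 64 identical ones.

def make_bot_profile(rng):
    return {
        "accuracy":      rng.uniform(0.2, 0.95),  # hit probability
        "reaction_ms":   rng.uniform(150, 600),   # humans: roughly 150-300 ms
        "map_knowledge": rng.uniform(0.0, 1.0),   # 0 = lost, 1 = knows routes
        "aggression":    rng.uniform(0.1, 0.9),   # push vs. camp tendency
    }

rng = random.Random(42)  # fixed seed only to make the sketch reproducible
lobby = [make_bot_profile(rng) for _ in range(64)]

best = max(lobby, key=lambda b: b["accuracy"])
worst = min(lobby, key=lambda b: b["accuracy"])
print(round(best["accuracy"], 2), round(worst["accuracy"], 2))
```

A Drivatar-style system would go one better and fit these parameters (or whole policies) to real players rather than drawing them uniformly.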

Regards,
SB
 
Battlefield had bots and it was awesome.
Loved playing the D-Day map with friends, defending against huge waves of bots.
 
Certain features are dumbed down... However, AI in games is very far behind the sometimes complex decision making processes that human players do unconsciously. [...] Not only do you need to craft AI that is good, you need to craft AI that isn't good. [...]

Regards,
SB
With some good fuzzy logic, a computationally reasonable pathfinding algorithm like A*, and several decently programmed finite state machines, you are good to go. The idea is to develop good AI for both online and offline play that doesn't break the illusion, so you can't tell whether you're playing against a human or a machine.
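For reference, the A* half of that recipe fits in a few lines. A minimal 4-connected grid version with a Manhattan-distance heuristic (the map and coordinates are made up for the example):

```python
import heapq

# Minimal A* on a grid: 0 = walkable, 1 = wall.

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # no path exists

grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = astar(grid, (0, 0), (2, 0))
print(len(path) - 1)  # 6 moves around the wall
```

In a real game you'd path over a navmesh rather than a grid and hand the waypoints to the FSM, but the search itself is exactly this.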

Shifty's idea is good overall, but imho it has a huge flaw. Just imagine jumping into a game with good AI for both the multiplayer mode and the single-player mode; that would be superb.

However, the flaw is that for people like me, who aren't big fans of multiplayer and online gaming against humans (with some exceptions), and given that connections can be metered or slow and unstable, depending on a cloud AI can be no fun.

I love AI. I have ever since I played against the Reaper bot in Quake and noticed it could be a decent rival on many maps. It was incredibly fascinating at the time that it learnt on the go, moving around the map. It left a deep mark on me, and I had some epic (if sometimes unfair) battles in some stages; on certain maps it could be totally abused.

Drivatar AI was also an incredible concept. I preferred it in the original Forza for the Xbox. It "studied" how you took every turn and apex, your driving style and so on. It was extremely interesting to see your Drivatar drive for you (which we did many times, so fun!), and that was my second big turning point when it comes to AI.

The biggest drawback of Shifty's approach, imho, is cost and the necessity of an online connection just to play against an AI. I'd prefer a strong local AI, as difficult as that may be to program without using many resources.
 
Shifty, are you talking about an AI that receives as input simply the actual rendered frame and sound, and outputs controls?
We must be decades, at least, from this being feasible.
 
The biggest drawback of Shifty's approach, imho, is cost and the necessity of an online connection just to play against an AI. I'd prefer a strong local AI, as difficult as that may be to program without using many resources.
I wouldn't use it for single player. For conventional offline play, use conventional local AI agents. Only for networked games that already require a network connection would it make sense (to me) to use cloud agents, either to bolster numbers or maybe to provide bots for PvE.
 
Shifty, are you talking about an AI that receives as input simply the actual rendered frame and sound, and outputs controls?
Ideally, although you could supply an augmented render that draws objects in key colours, for example, to separate players from the environment. You could draw grids on the ground for distance comparison, and generally help the robots out with their perception.
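The key-colour idea in miniature: instead of making the agent solve vision, render each object class as a flat colour so perception becomes a lookup. Labels and colours below are invented:

```python
# Toy "augmented render": map semantic class labels to flat key colours.
KEY_COLOURS = {
    "player":      (255, 0, 0),
    "teammate":    (0, 0, 255),
    "environment": (0, 255, 0),
}

def augment(frame):
    """frame: 2D grid of class labels -> 2D grid of flat RGB key colours."""
    return [[KEY_COLOURS[label] for label in row] for row in frame]

frame = [["environment", "player"], ["teammate", "environment"]]
print(augment(frame)[0][1])  # (255, 0, 0): an enemy pixel
```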
We must be decades, at least, from this being feasible.
I have no idea! I think the technology to implement it exists now. I don't know how well the machine learning and implementation would work, though: whether you'd need to set the robots gawping at Twitch now to be able to play COD in 2034, or whether they'll be ready to hand people their butts in Fortnite by 2020.

But maybe I'm wrong and the AI technology isn't that advanced yet? @DSoup probably knows, working on the UK's robot army, but he won't be able to talk without reprogramming our printers to kill us all...
 