DeepMind, a Google-backed AI project, has already used advanced neural networks to learn how to play the millennia-old game Go and dominate human champions. Now, DeepMind is taking on a more modern challenge in gaming.
At 1PM EST today, Blizzard livestreamed DeepMind’s AI, called AlphaStar, playing StarCraft II.
Complicated strategy games like StarCraft II and Dota 2 are hard for AI to master. OpenAI beat human Dota 2 opponents in August, but did so with significant handicaps. Tencent developed AI that can take down StarCraft II's own bots at the highest difficulty, but human players have still consistently beaten AIs at StarCraft II. Today, DeepMind prevailed.
AlphaStar took on Dario “TLO” Wünsch and Grzegorz “LiquidMaNa” Komincz in a series of ten matches, with each pro facing the AI in five one-on-one games. The matches were close, but AlphaStar won every single one.
The DeepMind team trained AlphaStar by having it watch StarCraft II match replays for three days straight. The team then split AlphaStar into several distinct instances and had them compete against each other in tournaments for seven days.
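The self-play phase described above can be illustrated with a toy loop. This is not DeepMind's actual training code, and every name and number here is a made-up simplification: each "agent" is reduced to a single skill value, a match is a weighted coin flip, and after each round-robin the weakest agent is replaced by a slightly mutated copy of the strongest.

```python
import random

def play_match(skill_a, skill_b, rng):
    """Return True if agent A wins; win chance is proportional to skill."""
    return rng.random() < skill_a / (skill_a + skill_b)

def league_round(skills, rng):
    """Every agent plays every other agent once (a round-robin tournament)."""
    wins = [0] * len(skills)
    for i in range(len(skills)):
        for j in range(i + 1, len(skills)):
            if play_match(skills[i], skills[j], rng):
                wins[i] += 1
            else:
                wins[j] += 1
    # Replace the weakest agent with a perturbed copy of the strongest,
    # a crude stand-in for the population improving over many rounds.
    best = max(range(len(skills)), key=lambda k: wins[k])
    worst = min(range(len(skills)), key=lambda k: wins[k])
    skills[worst] = skills[best] * (1 + rng.uniform(-0.05, 0.05))
    return skills

rng = random.Random(0)
skills = [rng.uniform(0.5, 1.5) for _ in range(4)]  # four toy agents
for _ in range(100):
    skills = league_round(skills, rng)
```

The real system pits full neural-network agents against each other and updates their weights from the outcomes, but the shape of the process (population, matches, survivors seeding the next round) is the same.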
AlphaStar did have some guardrails during the games. StarCraft II has three playable races—Zerg, Terran, and Protoss—and AlphaStar learned only how to play Protoss, and how to play against it. Wünsch doesn’t normally play Protoss, but Komincz does. Again, the games were close, but the AI won every time.
“It was very hard to judge what AlphaStar was doing,” Komincz said on a livestream that showed some of the matches. “It was an incredible experience.”
“Even though I was defeated, I’m confident that, if I had more time playing Protoss, I could win a significant amount of the matches…if I had another week to train Protoss, I could have defeated these agents,” Wünsch said.
Replays of all ten matches will be available on the DeepMind website.
As the MIT Technology Review pointed out in a recent article, StarCraft II requires players to use their memory, strategize, and plan ahead simultaneously, making it difficult for AI—but perhaps not insurmountable.
In 2016, Blizzard released StarCraft II’s API along with a pack of 65,000 match replays, which DeepMind used to train the AI. Soon after, the AI had learned some sly StarCraft II tactics like the tower rush—when an opponent sends a worker to build a defensive tower inside your base early in the game. While this was surprisingly effective, it was clear that the AI had a long way to go to truly master the game.
“Once it started to grasp the basic rules of the game, it started exhibiting amusing behavior such as immediately worker rushing its opponent, which actually had a success rate of 50% against the ‘Insane’ difficulty standard StarCraft II AI,” Blizzard said of DeepMind at Blizzcon in November.
In a blog post on Tuesday, Blizzard noted that DeepMind has gotten a lot better at StarCraft II since Blizzcon. In a little over two months, the AI has apparently learned to defend against cheesy tactics as well as deploy them, according to the blog post.