Blizzard and DeepMind turn StarCraft II into an AI research lab
StarCraft II has been a target of Alphabet’s DeepMind AI research for a while now. The UK-based AI company took on Blizzard’s sci-fi strategy game last year, announcing plans to build an open AI research environment around the game so that others could contribute to the effort of creating a virtual agent that can beat the top human StarCraft players in the world. Now, DeepMind and Blizzard are opening the doors to that environment with new tools, including a machine learning API, a large game replay dataset, an open source DeepMind toolset and more.
The new release of the StarCraft II API on Blizzard’s side includes a Linux package built to run in the cloud, along with support for Windows and Mac. It also supports offline AI vs. AI matches and includes anonymized game replays from real human players for training agents; that dataset starts at 65,000 complete matches and will grow to more than 500,000 over the coming weeks.
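As a rough sketch of what getting started looks like, the commands below follow the pattern documented in the public README of DeepMind’s open source toolset, PySC2; the package name and the `pysc2.bin.agent` entry point come from that project, while map names and flags may vary by release, and a local StarCraft II install is assumed.

```shell
# Install DeepMind's Python environment wrapper (assumes the StarCraft II
# game itself is already installed, e.g. via Blizzard's Linux package).
pip install pysc2

# Launch the bundled random agent on a standard map to verify that the
# environment runs end to end.
python -m pysc2.bin.agent --map Simple64
```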
StarCraft II is such a useful environment for AI research largely because of how complex and varied its matches can be, with multiple open routes to victory in each individual game. Players also have to do many things simultaneously, including gathering and managing resources, commanding military units and deploying defensive structures. On top of that, not all information about the map is visible at once, so players have to make assumptions and predictions about what the opposition is up to.
It’s such a big task, in fact, that DeepMind and Blizzard are including “mini-games” in the release, which break different subtasks down into “manageable chunks,” teaching agents to master skills like building specific units, gathering resources or moving around the map. The hope is that compartmentalizing these areas of play will let researchers test, compare and refine techniques on each one before eventually combining them in complex agents that attempt to master the whole game.
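For context, each released mini-game map isolates a single skill, and the same PySC2 agent entry point can be pointed at any of them. The map names below are taken from the PySC2 project’s documentation; this is a sketch assuming a working PySC2 and StarCraft II install.

```shell
# A few of the mini-game maps shipped with the release, each isolating
# one subtask:
#   MoveToBeacon          - basic unit movement toward a target
#   CollectMineralShards  - resource gathering with a small squad
#   BuildMarines          - producing a specific unit type
#
# Run the random agent on one of them:
python -m pysc2.bin.agent --map CollectMineralShards
```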
The whole goal here is to come up with AI that can play StarCraft II better than any human can, in much the same way that DeepMind’s AlphaGo software did with the ancient board game Go. DeepMind wants this to propel the existing research forward, hence its appeal to the larger research community and this open release of tools.