Competitions are a great way to share practical solutions to hard real-world problems, and to produce some fun results along the way. CIG 2017 has a huge array of competitions to enter, from old classics to funky new ideas:
The General Video Game AI Competition
(Learning Track, Level Generation Track and Rule Generation Track)
StarCraft AI Competition
Angry Birds Level Generation Competition
Geometry Friends Cooperative Game AI Competition
microRTS AI Competition
The Text-Based Adventure AI Competition
Fighting Game AI Competition
Visual Doom AI Competition 2017
The Ms. Pac-Man Vs Ghost Team Competition
The Showdown AI Competition
Dota2 Bot Competition
Game Data Mining Competition
The General Video Game AI Competition
The GVG-AI Competition explores the problem of creating controllers for general video game playing. How would you create a single agent that is able to play any game it is given? Could you program an agent that is able to play a wide variety of games, without knowing which games are to be played? Could you program a rule or level generator that creates good games?
Learning Track:
Organizers: Jialin Liu, Diego Perez-Liebana, Simon M. Lucas
CIG 2017 will host, for the first time, the learning track of this competition, where agents will face the problem of learning how to play unknown games without a forward model. Instead, they will be given a certain amount of time on each game to train before the final evaluation.
Level Generation Track:
Organizers: Ahmed Khalifa, Diego Perez Liebana, Julian Togelius, Simon Lucas
CIG 2017 will host the level generation track. This is the second time this track has been run (the first was at IJCAI 2016). The level generation track explores the ability of level generators to generalize across multiple games described in the Video Game Description Language (VGDL). Competitors submit level generators that must generate levels for any given set of game rules. Level generators will be given a certain amount of time to generate the levels before the final evaluation.
Rule Generation Track:
Organizers: Ahmed Khalifa, Diego Perez Liebana, Julian Togelius, Simon Lucas
CIG 2017 will host the first rule generation track competition. The rule generation track explores the generation of game rules and winning conditions for a fixed level of a game described in the Video Game Description Language (VGDL). Rule generators will be given a certain amount of time to generate the rules before the final evaluation.
StarCraft AI Competition
Organizers: Kyung Joong Kim, Seonghun Yoon
The IEEE CIG StarCraft competitions have driven steady progress in the development of StarCraft bots. Participants have applied a wide range of approaches, including hidden Markov models, Bayesian models, case-based reasoning, potential fields, and reinforcement learning, which has in turn enriched game AI more broadly. Developing AI for the game remains quite challenging, however: a bot must handle large numbers of units and buildings while also managing resources and high-level tactics. The purpose of this competition is to advance RTS game AI and to tackle its open challenges, such as uncertainty, real-time decision making, and unit management.
Angry Birds Level Generation Competition
Organizers: Jochen Renz, Julian Togelius, Lucas Ferreira, Matthew Stephenson, XiaoYu (Gary) Ge
The Angry Birds AI competition is now in its sixth year. Its main goal is to build AI agents that can play new game levels better than the best human players. At CIG 2017 our focus will, for the second time, be on generating new game levels that are interesting and fun for human players yet hard for AI agents to solve. Participants in the level generation competition will develop procedural level generators for Angry Birds that take desired level characteristics as input and output a game level satisfying those characteristics. There will be two prizes: one for the participant who creates the most fun levels to play, as determined by a panel, and one for the participant who creates the hardest (but still solvable) levels, as determined through a combination of live play by humans and AI agents. Participants will be provided with a platform for level generation and testing, as well as a baseline game level generator. The level generation competition will be held jointly with IJCAI 2017. Remote participation is possible, but attendance is encouraged.
Geometry Friends Cooperative Game AI Competition
Organizers: Rui Prada, Francisco S. Melo, João Dias
The goal of the competition is to build AI agents for Geometry Friends, a 2-player collaborative physics-based puzzle platformer. Each agent controls a different character (circle or rectangle) with distinct characteristics. Their goal is to collaborate to collect a set of diamonds across a set of levels as fast as possible. The game presents combined task and motion planning problems and promotes collaboration at different levels. Participants can tackle cooperative levels, with the full complexity of the problem, or single-player levels, which involve task and motion planning without the complexity of collaboration.
microRTS AI Competition
Organizer: Santiago Ontañón
The microRTS competition has been created to motivate research in the basic research questions underlying the development of AI for RTS games, while minimizing the amount of engineering required to participate. Also, a key difference with respect to the StarCraft competition is that the AIs have access to a “forward model” (i.e., a simulator), with which they can simulate the effect of actions or plans, thus allowing for planning and game-tree search techniques to be developed easily.
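The value of a forward model can be illustrated with a short sketch. The code below is not the microRTS API; it is a hypothetical, minimal example of one-ply search against a simulator: clone the state, simulate each candidate action, and pick the action whose successor state scores best.

```python
import copy

class ToyState:
    """Hypothetical RTS-like state with a forward model (not the microRTS API)."""
    def __init__(self, resources=0, t=0):
        self.resources = resources
        self.t = t

    def legal_actions(self):
        return ["harvest", "idle"]

    def simulate(self, action):
        """Forward model: return the successor state without mutating this one."""
        nxt = copy.deepcopy(self)
        nxt.t += 1
        if action == "harvest":
            nxt.resources += 1
        return nxt

def one_ply_search(state, evaluate):
    """Pick the action whose simulated successor evaluates best."""
    return max(state.legal_actions(), key=lambda a: evaluate(state.simulate(a)))

best = one_ply_search(ToyState(), evaluate=lambda s: s.resources)
```

The same clone-and-simulate pattern is what makes deeper game-tree search and Monte Carlo methods straightforward to build once a forward model is available.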
The Text-Based Adventure AI Competition
Organizers: Tim Atkinson, Hendrik Baier, Tara Copplestone, Sam Devlin and Jerry Swan
Before the widespread availability of graphics displays, text adventure games such as Colossal Cave Adventure and Zork were popular in the role-playing gaming community. Building a fully autonomous agent for an arbitrary text-adventure game is AI-complete. However, we provide a graded series of test cases, allowing competitors to gradually increase the sophistication of their approach to handle increasingly complex games. We believe that our competition can foster research in fields such as natural language processing and automatic model acquisition, as well as shed light on the relative merits of model-based and model-free approaches. The competition will be scored according to two independent criteria: the score on an unseen game instance (objective), and freedom from a priori domain knowledge (a subjective decision by the judges, used only in the case of a tie).
Fighting Game AI Competition
Organizer: Ruck Thawonmas
What are promising techniques for developing fighting-game AIs whose performance is robust across a variety of settings? The Java-based FightingICE platform is used, which also supports Python programming and the development of visual-based AIs. Two leagues (Standard and Speedrunning) are associated with each of the three character types: Zen, Garnet, and Lud (whose data are unknown in advance). In the Standard League, the winner of a round is the AI whose HP is above zero when its opponent's (another entry's) HP reaches zero. In the Speedrunning League, the winner for a given character type is the AI with the shortest average time to beat our sample MCTS AI. The overall competition winner is decided based on the 2015 Formula-1 scoring system.
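As a rough illustration of how such league results can be combined: the 2015 Formula-1 system awards 25, 18, 15, 12, 10, 8, 6, 4, 2 and 1 points to the top ten finishers. The sketch below (the league results and bot names are made up) sums those points across leagues to rank entries.

```python
# Points for positions 1-10 under the 2015 Formula-1 scoring system.
F1_POINTS = [25, 18, 15, 12, 10, 8, 6, 4, 2, 1]

def score(leagues):
    """Sum F1 points over per-league rankings (each ranking ordered best-first)."""
    totals = {}
    for ranking in leagues:
        for pos, entry in enumerate(ranking):
            pts = F1_POINTS[pos] if pos < len(F1_POINTS) else 0
            totals[entry] = totals.get(entry, 0) + pts
    return totals

# Hypothetical results from two leagues for three entries.
totals = score([["BotA", "BotB", "BotC"], ["BotA", "BotC", "BotB"]])
winner = max(totals, key=totals.get)
```

Summing points across leagues rewards consistency: an entry that finishes high in every league can beat one that wins a single league but places poorly elsewhere.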
Visual Doom AI Competition 2017
Organizers: Wojciech Jaśkowski, Michał Kempka, Marek Wydmuch
Participants in the Visual Doom AI competition submit a controller that plays Doom from pixels. Our ViZDoom framework gives real-time access to the screen buffer as the only information on which the agent can base its decisions. The winner of the competition will be determined by a multiplayer deathmatch tournament. Although participants may use any technique to develop a controller, the design and efficiency of ViZDoom allow and encourage the use of machine learning methods such as deep reinforcement learning.
The Ms. Pac-Man Vs Ghost Team Competition
Organizers: Piers R. Williams, Diego Perez-Liebana and Simon M. Lucas
A competition to promote research into partial observability, in which entrants design controllers for either Ms. Pac-Man or the Ghost Team.
The Showdown AI Competition
Organizers: Scott Lee and Julian Togelius
The Showdown AI Competition lets you challenge other players to Pokemon-esque turn-based combat. Competitors submit agents that choose which of hundreds of creatures to include in their team, and then select an action each turn based on the opponent's actions. This is a challenging adversarial game featuring stochasticity as well as a great deal of hidden information. Agents should preferably be written in JavaScript, but competitors are free to use other languages at their own risk. The Showdown AI Competition is in no way affiliated with or endorsed by Nintendo or The Pokemon Company International.
Dota2 Bot Competition
Organizer: Tobias Mahlmann
This competition is about writing controllers (bots) for the popular team-based MOBA Dota 2, with a particular focus on collaboration between agents. The competition uses the original game with a custom-made mod that allows external processes to control a player's hero. Each bot controls one hero and has to coordinate with its teammates without any backchannel communication; no communication besides the chat wheel and map pings is permitted. Detailed rules of the competition will be published shortly, but as this is the first year we are proposing this competition, we expect some bumps in the road ahead, and the rules are subject to change. We encourage you to check out our bot framework's page below.
Game Data Mining Competition
Organizers: Kyung-Joong Kim, Du Mim Yoon, JiHoon Jeon, Sung Il Yang, Sang-Kwang Lee, Eun-Jo Lee, Boram Kim
In the game analytics field, data mining is recognized as one of the important tools for understanding players' behavior. It can help game companies predict players' churn/retention or purchase behavior from game log data. Although game log mining is important to the game AI community, few public datasets are available to researchers, which limits the growth of the field. In this competition, participants can access large-scale game log data recorded by NCSOFT, one of the biggest game companies in South Korea. The goal of this competition is to predict players' engagement with a commercial MMORPG from this massive game log data. Notably, the game has changed its payment policy from a fixed-charge system to free-to-play. The competition will evaluate the robustness of each entry's predictions on test datasets.