NeurIPS 2019 Will Host Minecraft Reinforcement Learning Competition

A group of AI experts from top US universities is organizing a sample-efficient reinforcement learning competition, MineRL, which will start on June 1, 2019. The organizers want to increase group participation in reinforcement learning and are encouraging people to “play to benefit science”. The competition’s top ten participants will present their work in a NeurIPS 2019 workshop this December in Vancouver.

Competitors are tasked with developing artificial intelligence agents that can obtain “diamond” rewards in the popular video game Minecraft. Although standard training procedures require months or more to bring a system to human-level performance in complex games such as StarCraft or Dota 2, the MineRL challenge training time is limited to only four days.

In Minecraft, obtaining diamonds requires a sequence of eight steps, from wood gathering to diamond mining. The AI agent must learn both how to perform these steps efficiently and the correct order in which to perform them:

1. Gather wood
2. Create a wooden pickaxe
3. Mine stone and create a stone pickaxe
4. Mine iron ore with the stone pickaxe
5. Create a furnace
6. Smelt iron and create an iron pickaxe
7. Search for diamonds
8. Mine diamonds
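The steps above form a dependency chain that an agent must respect. A minimal sketch of that structure, assuming hypothetical step names and prerequisites (this is illustrative and not part of the competition's API):

```python
# Hypothetical prerequisite graph for the diamond-obtaining task.
# Each step maps to the steps that must be completed before it.
PREREQS = {
    "gather_wood": [],
    "wood_pickaxe": ["gather_wood"],
    "mine_stone": ["wood_pickaxe"],
    "stone_pickaxe": ["mine_stone"],
    "mine_iron_ore": ["stone_pickaxe"],
    "furnace": ["mine_stone"],  # assumption: a furnace requires stone
    "iron_pickaxe": ["mine_iron_ore", "furnace"],
    "mine_diamond": ["iron_pickaxe"],
}

def plan(goal, prereqs):
    """Return steps in an order satisfying every prerequisite (DFS topological sort)."""
    order, seen = [], set()
    def visit(step):
        if step in seen:
            return
        seen.add(step)
        for dep in prereqs[step]:
            visit(dep)
        order.append(step)
    visit(goal)
    return order

print(plan("mine_diamond", PREREQS))
# → ['gather_wood', 'wood_pickaxe', 'mine_stone', 'stone_pickaxe',
#    'mine_iron_ore', 'furnace', 'iron_pickaxe', 'mine_diamond']
```

The hard part for a learning agent is that this ordering is never given explicitly; it must be discovered from sparse reward signals or inferred from demonstrations.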

Organizers created an imitation learning dataset comprising over 60 million frames of human player data. These video inputs can help AI agents determine the logical relationships between the various steps in a short time and solve three types of tasks (navigation, obtaining, and survival) that embody some of the most difficult challenges in reinforcement learning, such as sparse rewards and hierarchical policies.
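The simplest way to exploit such demonstrations is behavioral cloning: fit a policy to predict the expert's action from the observed state. A minimal sketch on toy data, assuming an illustrative state representation and action set (real MineRL observations are video frames, and this is not the actual dataset loader):

```python
import numpy as np

# Toy stand-in for (state, action) pairs from human demonstrations.
rng = np.random.default_rng(0)
STATE_DIM, N_ACTIONS, N = 8, 4, 500

true_W = rng.normal(size=(STATE_DIM, N_ACTIONS))
states = rng.normal(size=(N, STATE_DIM))
actions = np.argmax(states @ true_W, axis=1)  # "expert" action labels

# Behavioral cloning via softmax regression with gradient descent.
W = np.zeros((STATE_DIM, N_ACTIONS))
onehot = np.eye(N_ACTIONS)[actions]
for _ in range(2000):
    logits = states @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    W -= 0.5 * states.T @ (probs - onehot) / N  # cross-entropy gradient step

accuracy = np.mean(np.argmax(states @ W, axis=1) == actions)
print(f"imitation accuracy: {accuracy:.2f}")
```

Cloning alone rarely suffices for a task this long-horizon, which is why the competition pairs the demonstrations with a limited budget of environment interaction.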

The MineRL competition runs from June 1st to October 25th, 2019. All submissions should be made through CrowdAI. A leaderboard on the competition webpage will display the latest high scores.

Participants must first train their AI agents to play Minecraft and submit the trained models for evaluation. The top 10 participants on the leaderboard will advance to the second round. In Round 2, participants can submit their code up to four times, and each submission will generate a score after training. The best scores will determine the final ranking.

Prizes will be provided by competition sponsors such as NVIDIA, which will present top teams with three GPUs. More details are on the way as the organizers finalize agreements.

Visit the MineRL website for more information.


Author: Hongxi Li | Editor: Michael Sarazen
