AI can now play the bestselling game Minecraft — and accomplish a complex, multi-step task without being trained on any human gameplay data.
A study published on Wednesday in the scientific journal Nature showcases an AI system called Dreamer, developed by Google DeepMind researchers in San Francisco. In a first for AI, Dreamer figured out on its own how to accomplish a difficult Minecraft task: collecting diamonds.
“Dreamer marks a significant step towards general AI systems,” Google DeepMind researcher and study co-author Danijar Hafner told Nature. “Every time you play Minecraft, it’s a new, randomly generated world.”
Minecraft, which has sold more than 300 million copies, asks its more than 200 million monthly active players to navigate a virtual world consisting of different environments, like forests and deserts. Players use the resources at their disposal, like wood from trees, to build objects and mine prized items like diamonds, which are the source of the best weapons and armor in Minecraft.
Dreamer played Minecraft for nine days straight, with Google researchers resetting the game world every 30 minutes so that the AI had to constantly adjust to a new environment.
Collecting a diamond in Minecraft is no easy feat: Users have to find trees, cut them down, build a crafting table, build a wooden pickaxe, and dig deep underground to find a diamond. It can take a human player 20 to 30 minutes to first mine a diamond.
After nine days, Dreamer was able to learn enough about Minecraft to mine a diamond in less than 30 minutes, just as quickly as a human player.
Throughout the experiment, Dreamer built up an understanding of its environment in Minecraft and improved at the game without receiving step-by-step instructions from a human on how to get better. Previous research trained AI to play Minecraft by exposing it to hours of videos of players; Dreamer had no such prior training before it was introduced to the game.
“Dreamer is, to our knowledge, the first algorithm to collect diamonds in Minecraft from scratch without human data or curricula,” the Google researchers claimed in the study.
Instead of watching videos, Dreamer used a technique called “reinforcement learning” to get better at Minecraft. Through trial and error, the AI identified actions that earned rewards and repeated them, while discarding actions that failed to produce rewards.
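Dreamer's actual training is far more sophisticated, but the core reward loop of reinforcement learning can be sketched with tabular Q-learning on a toy task. Everything here is illustrative and not from the study; the "chain" environment simply stands in for Minecraft's sparse reward, where nothing pays off until the diamond is reached:

```python
import random

# Toy "chain" task: start at cell 0, reach the goal at cell 4.
# Reward 1.0 only at the goal, nothing anywhere else -- a stand-in for
# Minecraft's sparse diamond reward. (Illustrative only; not Dreamer's
# actual algorithm.)
N_STATES = 5
ACTIONS = [-1, +1]                    # step left or step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

def step(state, action):
    next_state = max(0, min(N_STATES - 1, state + action))
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

def greedy(q, state):
    # Pick the highest-value action, breaking ties at random.
    best = max(q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if q[(state, a)] == best])

def train(episodes=500, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Trial and error: mostly exploit known rewards, sometimes explore.
            if random.random() < EPSILON:
                action = random.choice(ACTIONS)
            else:
                action = greedy(q, state)
            next_state, reward, done = step(state, action)
            # Reinforce actions that led toward reward; unrewarded ones stay low.
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
            state = next_state
    return q

q = train()
# The learned policy: which action each non-goal state prefers.
policy = {s: greedy(q, s) for s in range(N_STATES - 1)}
print(policy)
```

After training, the policy prefers "move right" from every state, purely because those actions were the ones that eventually produced reward.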
Dreamer was also able to mine diamonds quickly by building a mental model of its Minecraft surroundings. That model grew more detailed each time the AI played, allowing it to test scenarios mentally without actually acting them out.
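The idea of testing scenarios inside a learned model, rather than in the world itself, can be sketched in a few lines. This is a deliberately crude illustration with made-up names: the "world model" here is a lookup table built from experience, whereas Dreamer learns a neural model; the five-cell track stands in for the game world:

```python
import itertools

# Assumed toy dynamics (a stand-in for the real environment): a 5-cell
# track where the "diamond" sits at cell 4.
def real_step(state, action):
    next_state = max(0, min(4, state + action))
    reward = 1.0 if next_state == 4 else 0.0
    return next_state, reward

# Phase 1: record how the world responds to actions -- the "world model".
# (Here we enumerate every transition; a real agent learns from samples.)
world_model = {}
for state in range(5):
    for action in (-1, +1):
        world_model[(state, action)] = real_step(state, action)

# Phase 2: "imagine" each candidate plan inside the model, never touching
# the real environment, and keep the plan with the highest imagined reward.
def imagine(plan, state=0):
    total = 0.0
    for action in plan:
        state, reward = world_model[(state, action)]
        total += reward
    return total

best_plan = max(itertools.product((-1, +1), repeat=4), key=imagine)
print(best_plan)  # only the all-right plan reaches the diamond
```

The point of the design is cost: evaluating all sixteen four-step plans happens entirely in the model, so only the single winning plan would ever need to be executed for real.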
Hafner told Nature that Dreamer had “the ability to imagine the future” and suggested that this mental testing capability could one day help develop robots that interact more intelligently with the real world.
Minecraft was released in 2009; its mobile version alone generated more than $98 million in revenue in 2024. The average Minecraft player is 24 years old.