In a recent development, OpenAI claims its new AI model can play Minecraft, having been trained on 70,000 hours of video of people playing the game. Unlike many earlier Minecraft algorithms, which operate in far simpler "sandbox" versions of the game, the new AI uses standard keyboard-and-mouse inputs to play in the same world as humans.
In recent years, numerous neural networks, like DeepMind's MuZero for chess, have triumphed at various games through reinforcement learning. Bowen Baker and his team at OpenAI sought to build a neural network that could handle the far more complicated "open-world" environment of Minecraft.
They broke new ground with the release of "Video PreTraining (VPT): Learning to Act by Watching Unlabeled Online Videos," in which a neural network is trained on a large video dataset to mimic the keystrokes humans use to solve different tasks in the game.
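The core supervised idea, learning to imitate recorded human actions, can be sketched in miniature. The toy below is entirely hypothetical and is not OpenAI's actual VPT code: a simple linear softmax policy is fit by gradient descent on a cross-entropy imitation loss over synthetic (observation, action) pairs, the same behavioral-cloning objective that VPT applies at vastly larger scale to keyboard-and-mouse actions.

```python
import numpy as np

# Hypothetical behavior-cloning sketch (not OpenAI's implementation):
# a linear softmax policy learns to reproduce an "expert's" recorded actions.

rng = np.random.default_rng(0)
N_OBS, N_ACTIONS, N_SAMPLES = 8, 4, 500

# Synthetic "demonstrations": the expert picks actions via a hidden linear rule.
true_w = rng.normal(size=(N_OBS, N_ACTIONS))
obs = rng.normal(size=(N_SAMPLES, N_OBS))
actions = np.argmax(obs @ true_w, axis=1)  # the expert's recorded choices

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Gradient descent on the cross-entropy imitation loss.
w = np.zeros((N_OBS, N_ACTIONS))
for _ in range(300):
    grad_logits = softmax(obs @ w)
    grad_logits[np.arange(N_SAMPLES), actions] -= 1.0  # d(loss)/d(logits)
    w -= 0.1 * (obs.T @ grad_logits) / N_SAMPLES

accuracy = np.mean(np.argmax(obs @ w, axis=1) == actions)
print(f"imitation accuracy: {accuracy:.2f}")
```

After training, the cloned policy reproduces most of the expert's action choices on the demonstration data; VPT's contribution is making this work when the videos come without action labels at all.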
After some tweaking, OpenAI discovered that the model could carry out a wide range of complex tasks, from swimming to tracking down prey and eating it. The AI also mastered the "pillar jump," in which the player deposits a block of material beneath itself mid-jump to gain height. After further fine-tuning with reinforcement learning, it learned to craft a diamond pickaxe, a feat that typically requires human gamers 24,000 actions and over 20 minutes to complete.
Baker and his team said, “While we only experiment in Minecraft, we believe that VPT provides a general recipe for training behavioral priors in hard, yet generic, action spaces in any domain that has a large amount of freely available unlabeled data, such as computer usage.”
OpenAI has been doing remarkable work training AI models on massive datasets since its GPT-3 success in 2020, when a model fed billions of words stunned observers with its well-crafted sentences. VPT is yet another addition to the company's outstanding AI portfolio.