Google’s DeepMind AI Is Now Learning to Play With Physical Objects

Researchers from Google’s DeepMind announced that some of their new artificial intelligence (AI) projects are learning “how” the world works, much as a child experiments to figure out how the world works. This marks a breakthrough in the realm of machine learning. Misha Denil and his colleagues at Google DeepMind announced that they have trained an AI to learn the “physical properties” of objects by interacting with them virtually. This covers numerous aspects of the world, including questions such as “Can I sit on this?” or “Is it squishy?” In their paper, the AI systems were tested in two environments. The first introduced five blocks arranged in a tower; some of the blocks were stuck together to form larger blocks, while others were not. The AI had…
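
To make the idea concrete, here is a minimal toy sketch of that kind of task: an agent that discovers which blocks are secretly glued together by poking them and observing what moves. This is a hypothetical illustration only, not DeepMind’s environment or training code; the actual work uses a physics simulator and deep reinforcement learning, whereas this sketch uses a hand-written probing policy just to show the “learn by interacting” loop. All class and function names are invented for the example.

```python
import random


class BlockTowerEnv:
    """Toy environment loosely inspired by the block-tower task described
    above: five blocks, some of which are secretly glued to a neighbour.
    The agent pokes blocks, observes which ones move together, and then
    guesses the hidden grouping. (Hypothetical sketch, not DeepMind's code.)"""

    def __init__(self, n_blocks=5, seed=None):
        self.rng = random.Random(seed)
        self.n_blocks = n_blocks
        self.reset()

    def reset(self):
        # Hidden state: each block gets a "group" label; blocks that share
        # a label are glued together and always move as one unit.
        groups, g = [], 0
        for _ in range(self.n_blocks):
            groups.append(g)
            if self.rng.random() < 0.5:  # start a new group with prob 0.5
                g += 1
        self.groups = groups
        return self.n_blocks

    def poke(self, block_id):
        # Poking a block returns the set of blocks that moved with it,
        # i.e. every block in the same glued group.
        gid = self.groups[block_id]
        return {i for i, g in enumerate(self.groups) if g == gid}

    def guess(self, proposed_groups):
        # Reward +1 if the proposed grouping matches the hidden one
        # (up to relabelling of the group ids), else -1.
        def canon(gs):
            seen, out = {}, []
            for g in gs:
                out.append(seen.setdefault(g, len(seen)))
            return out
        return 1 if canon(proposed_groups) == canon(self.groups) else -1


def probe_policy(env):
    """A hand-written 'experimenting' policy: poke every block once and
    reconstruct the grouping from what moved together."""
    labels = [None] * env.n_blocks
    next_label = 0
    for b in range(env.n_blocks):
        if labels[b] is None:
            for member in env.poke(b):
                labels[member] = next_label
            next_label += 1
    return labels


if __name__ == "__main__":
    env = BlockTowerEnv(seed=0)
    proposal = probe_policy(env)
    print("proposed grouping:", proposal, "reward:", env.guess(proposal))
```

In the published work the agent has to trade off the cost of extra interactions against the information they provide, which is what makes the problem a reinforcement learning task rather than the fixed poke-everything routine used here.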


Link to Full Article: Google’s DeepMind AI Is Now Learning to Play With Physical Objects
