Alphabet X Wants to Develop Robots That Can Learn From Their Surroundings
Sat, April 17, 2021


The robots at Alphabet X segregate trash to improve their grasp / Photo Credit: kung_tom (via Shutterstock)


Alphabet’s moonshot division X, formerly known as Google X, launched the Everyday Robot Project, whose objective is to create a “general-purpose learning robot,” according to Jay Peters of The Verge, an American technology news and media website. The project’s robots use cameras and complex machine-learning algorithms to see and learn from their surroundings without a human coding every single movement.

Alphabet X’s team is testing robots that can lend assistance in workplace environments, but for now, the robots are learning how to segregate trash. One of them looks like a tall, one-armed WALL-E. Grasping comes easily to humans, but teaching it to a robot is genuinely challenging.

Despite that, Everyday Robots’ robots practice grasping objects in both the virtual and physical world. Tom Simonite of American magazine Wired described seeing a “playpen” of nearly 30 robots segregating trash into trays for landfill, compost, and recycling during the day, as quoted by Peters.

Every night, Everyday Robots has virtual robots practice grabbing objects in simulated buildings. That simulated data is integrated with real-world data, and every week or two the combined data is pushed to the physical robots in a system update. The more the robots practice, the better they get at sorting: X’s human workers put 20% of trash in the wrong pile, while the robots misplace less than 5%.
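Peters’ description of this cycle suggests a familiar sim-to-real pattern: blend a large pool of cheap simulated experience with a smaller pool of real-world experience before each training pass. A minimal sketch of that idea follows; the function name, the sample format, and the 75/25 sim-to-real split are illustrative assumptions, not Everyday Robots’ actual pipeline.

```python
import random

def mix_training_batch(sim_samples, real_samples, batch_size=8, sim_fraction=0.75):
    """Blend simulated and real grasp attempts into one training batch.

    Hypothetical sketch: draws mostly from the large simulated pool,
    topped up with real-world samples, then shuffles so the learner
    sees them interleaved.
    """
    n_sim = int(batch_size * sim_fraction)
    n_real = batch_size - n_sim
    batch = random.sample(sim_samples, n_sim) + random.sample(real_samples, n_real)
    random.shuffle(batch)
    return batch

# Toy data: each sample is (source, grasp_succeeded)
sim = [("sim", i % 2 == 0) for i in range(100)]
real = [("real", i % 3 == 0) for i in range(40)]

batch = mix_training_batch(sim, real)
print(len(batch))  # 8 samples: 6 simulated, 2 real
```

In practice the weekly “system update” Peters mentions would correspond to shipping the retrained model back to the physical robots, not the raw data itself.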

This doesn’t mean these robots will replace janitors anytime soon. Simonite saw one robot grasp at thin air instead of the bowl right in front of it, then attempt to put the imaginary bowl down. Another robot lost a “finger” during the demo. Engineers told Simonite that some robots stopped moving through the building because certain types of light caused their sensors to perceive illusory holes in the floor.

There are also startups dedicated to the problem of teaching robots how to grasp, such as Embodied Intelligence and OpenAI.