Researchers tested a four-legged robot on sand, oil, rocks and other tricky surfaces.
The four-legged robot, which resembles a dog, walks over sand, rocks and other difficult surfaces. As it moves forward, the black robot adjusts its stride and adapts without falling.
Teaming up with Carnegie Mellon University’s School of Computer Science and the University of California, Berkeley, Facebook’s AI research team taught the unnamed robot how to adjust to different conditions in real time, the team said Friday. In a video, the robot, which was created by Chinese startup Unitree, adjusts how it walks as it moves over stones, down stairs, through a construction site and around outdoor terrain.
Inside a living room, researchers poured oil on plastic to create a slick surface. They piled up planks and other obstacles. They dropped weights on the robot's back. Each time, the robot recovered its balance and continued forward.
Jitendra Malik, a UC Berkeley professor who works on Facebook's AI research team, said the robot learned to adapt quickly through trial and error and through the information it gathers from its surroundings. The robot, which doesn't have computer vision, learns from how its body reacts on different surfaces, a process similar to the way humans learn. When people move from a hard surface to sand, for example, they adjust their steps once they realize their feet are sinking. "The challenges of robotics are that there is a lot of this real-world variability," Malik said.
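To picture what "learning from how its body reacts" means in practice, here is a minimal sketch of a proprioception-only control loop, assuming a hypothetical robot interface and policy function; the names and sensor list are illustrative, not the team's actual code.

```python
# Hypothetical sketch: the controller sees only internal body signals
# (joint angles, joint velocities, body orientation), never camera images.
import numpy as np

def read_proprioception(robot):
    # Internal state only; no vision input of any kind.
    return np.concatenate([robot.joint_positions(),   # e.g. 12 joints on a quadruped
                           robot.joint_velocities(),  # 12 values
                           robot.imu_orientation()])  # roll, pitch, yaw

def control_loop(robot, policy, steps=1000):
    last_action = np.zeros(12)
    for _ in range(steps):
        state = read_proprioception(robot)
        # The policy infers how the ground is behaving (slippery, soft, uneven)
        # from how the body responded to its previous commands, much as a person
        # adjusts their steps after feeling their feet sink into sand.
        last_action = policy(state, last_action)
        robot.apply_joint_targets(last_action)
```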
Using a combination of two techniques, researchers trained the AI-controlled robot in a computer simulation, exposing the machine to a variety of surfaces and increasingly grueling conditions before testing it in the real world. The team calls this AI breakthrough Rapid Motor Adaptation, or RMA, noting it's the "first entirely learning-based system to enable a legged robot to adapt to its environment from scratch by exploring and interacting with the world."
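The RMA paper describes two pieces trained in simulation: a base walking policy conditioned on a latent vector that summarizes environment factors such as friction and payload, and an adaptation module that learns to estimate that vector from the robot's recent states and actions, so the real robot never needs the simulator's privileged information. The sketch below shows that two-part structure in outline; the network sizes, layer choices and class names are illustrative assumptions, not the released implementation.

```python
# Illustrative sketch of the two components described in the RMA paper.
import torch
import torch.nn as nn

class EnvFactorEncoder(nn.Module):
    """Simulation-only: compress privileged environment parameters into a latent z."""
    def __init__(self, num_env_factors=17, z_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(num_env_factors, 128), nn.ReLU(),
                                 nn.Linear(128, z_dim))
    def forward(self, env_factors):
        return self.net(env_factors)

class BasePolicy(nn.Module):
    """Maps proprioceptive state plus latent z to joint targets."""
    def __init__(self, state_dim=30, z_dim=8, action_dim=12):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim + z_dim, 128), nn.ReLU(),
                                 nn.Linear(128, action_dim))
    def forward(self, state, z):
        return self.net(torch.cat([state, z], dim=-1))

class AdaptationModule(nn.Module):
    """Estimates z from a short window of past states and actions, so the
    deployed robot can adapt without access to simulator parameters."""
    def __init__(self, state_dim=30, action_dim=12, history=50, z_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear((state_dim + action_dim) * history, 256),
                                 nn.ReLU(), nn.Linear(256, z_dim))
    def forward(self, state_action_history):
        # state_action_history: (batch, history, state_dim + action_dim)
        return self.net(state_action_history.flatten(start_dim=-2))
```

At deployment, per the paper's description, the adaptation module's estimate stands in for the encoder's output, and the same base policy runs on the physical robot without real-world fine-tuning.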
The AI advancement, Facebook says, could improve the performance of robots used in search and rescue operations, as well as at home, where machines have to navigate stairs and other obstacles. The research could also be applied to smart cities that use real-time data to mitigate traffic congestion and other conditions that could diminish residents' quality of life.
Robots can be preprogrammed to navigate some environments, but it’s tough for a programmer to predict every obstacle a machine could encounter. Teaching a robot how to adapt in real time could also work on cheaper hardware, possibly helping to drive down costs in the future.
Because the lab was closed during the coronavirus pandemic, the AI researchers had to change the way they conducted their experiments. Ashish Kumar, a graduate student at UC Berkeley, said he tested the robot in his home, on hiking trails in the Bay Area and at a nearby construction site.
“It was whatever I could find in some sense,” Kumar said. The robot also broke multiple times during the testing.
The RMA-enabled robot outperformed alternative systems and was able to walk on sand, mud, hiking trails, tall grass and a dirt pile without falling, according to the team's paper. It was successful in 70% of the trials in which it had to walk down stairs along a hiking trail. It didn't fall in 80% of the tests that had it walk across a pile of cement and a pile of pebbles, the paper said.