Engineers at UT Austin demonstrate new approach to human-like balance in bipedal robots.
Humans have evolved to be excellent bipedal walkers. But a lot of what goes into walking is automatic, such as balance and avoiding obstacles. For example, when you’re walking in a crowded place, you typically aren’t thinking directly about how to avoid bumping into other people and objects. Humans are built to use a gamut of complex skill sets that enable us to execute these types of seemingly simple motions.
Human-like balance has long been the ambition of engineers and researchers, and a team at the Cockrell School of Engineering at The University of Texas at Austin has now developed a method that could soon give bipedal robots similar capabilities. Luis Sentis, associate professor in the Department of Aerospace Engineering and Engineering Mechanics, and his team in the Human Centered Robotics Laboratory have successfully demonstrated a novel approach to human-like balance in a biped robot.
Their approach relies on translating a key human physical skill, maintaining whole-body balance, into a mathematical equation. They then used the formula to program their robot, named Mercury, which was built and tested over the course of six years. They calculated how far the average person's balance can deviate while walking before a fall becomes unavoidable, arriving at a simple figure: 2 centimeters.
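The article does not give the team's actual equation, but a standard way to reduce whole-body balance to a single number is the "capture point" of the linear inverted pendulum model: the spot where the foot must land to bring the body to rest. A hypothetical sketch (the function names, parameters, and the use of the capture point here are illustrative assumptions, not the lab's code) shows how a 2 cm margin could be checked against it:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def capture_point(com_pos, com_vel, com_height):
    """Horizontal point where the foot must land to bring the body to rest.

    Based on the linear inverted pendulum model: cp = x + v / omega,
    where omega = sqrt(g / h) is the pendulum's natural frequency.
    """
    omega = math.sqrt(G / com_height)
    return com_pos + com_vel / omega

def is_balanced(com_pos, com_vel, com_height, support_center, margin=0.02):
    """True if the capture point stays within `margin` meters of the support.

    The 0.02 m default mirrors the 2 cm figure quoted in the article.
    """
    cp = capture_point(com_pos, com_vel, com_height)
    return abs(cp - support_center) <= margin

# Center of mass 1 m high, directly over the foot, drifting at 5 cm/s:
# the capture point (about 1.6 cm away) is still inside the 2 cm margin.
print(is_balanced(com_pos=0.0, com_vel=0.05, com_height=1.0, support_center=0.0))
```

Doubling the drift speed to 10 cm/s pushes the capture point past 2 cm, which under this model would demand a recovery step.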
“Essentially, we have developed a technique to teach autonomous robots how to maintain balance even when they are hit unexpectedly, or a force is applied without warning,” Sentis said. “This is a particularly valuable skill that we as humans frequently use when navigating through large crowds.”
Mercury, a biped robot developed in Cockrell School of Engineering professor Luis Sentis’ Human Centered Robotics Lab, is able to maintain balance when hit unexpectedly or when force is applied without warning. (Image courtesy University of Texas at Austin.)
Their technique has successfully achieved dynamic balance both in biped robots without ankle control and in fully humanoid robots. It is much harder to achieve dynamic, human-like movement in a robot without ankle control than in one equipped with actuated ankles. The UT Austin team used an efficient whole-body controller built around contact-consistent torques, which inform the robot of the best possible move to make next in response to a disturbance. They also applied inverse kinematics, a mathematical technique often used in 3D animation to achieve realistic-looking movements from animated characters, along with low-level motor position controllers.
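The article names inverse kinematics but does not show it. In its simplest form, the technique works a joint chain backward: given where a foot should land, solve for the joint angles that put it there. A minimal sketch for a planar two-link leg (the function name, link lengths, and angle conventions are illustrative assumptions, not the lab's implementation) uses the law of cosines:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Return (hip, knee) angles in radians placing the foot at (x, y).

    (x, y) is the foot target relative to the hip; l1 and l2 are the
    thigh and shank lengths. Raises ValueError if the target is out of reach.
    This is the "knee-forward" of the two possible solutions.
    """
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Knee bend from the law of cosines on the hip-knee-foot triangle.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to the target, minus the offset the knee bend adds.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Place the foot 0.3 m forward and 0.8 m below the hip (0.5 m thigh and shank).
hip, knee = two_link_ik(0.3, -0.8, 0.5, 0.5)
```

Running the forward kinematics on the returned angles reproduces the target position, which is the usual sanity check for an IK solver; 3D animators use the same trick to pin a character's foot to the ground.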
Though the Mercury robot is tailored to its creators' specific needs, the fundamental equations of human locomotion underpinning this technique are, in theory, applicable to any comparable embodied artificial intelligence (AI) and robotics research.
Like all the robots developed in Sentis’ lab, the biped is anthropomorphic—designed to mimic the movement and characteristics of humans.
“We choose to mimic human movement and physical form in our lab because I believe AI designed to be similar to humans gives the technology greater familiarity,” Sentis said. “This, in turn, will make us more comfortable with robotic behavior, and the more we can relate, the easier it will be to recognize just how much potential AI has to enhance our lives.”
Their approach has implications for robots that would be used in everything from emergency response to defense to entertainment. The team presented their work at the 2018 International Conference on Intelligent Robots and Systems (IROS2018), the flagship conference in the field of robotics.
The research was funded by the Office of Naval Research and UT, in partnership with Apptronik Systems, a company of which Sentis is co-founder.
Source: University of Texas at Austin Newsroom