NASA’s floating space robots test new sensing algorithms for true autonomy

Wildcat SLAM (simultaneous localization and mapping) and stereo depth fusion are two promising sensing algorithms that could lead to truly autonomous movement.

NASA astronaut Megan McArthur poses with the Astrobee robotic free-flyers in support of the Kibo Robot Programming Challenge, or Robo-Pro Challenge. Image: NASA

Nearly 250 miles above Earth’s surface, the International Space Station (ISS) orbits at 17,500 miles per hour. And onboard buzz three of its arguably cutest passengers: the Astrobees. Named Honey, Bumble, and Queen, these cube-shaped free-flying robots are pushing the boundaries of automation and sensor research in space.

Launched to the station in 2019, the Astrobees are a platform for researchers to test automation and robotic technologies in space. The robots primarily reside in the space station’s Japanese Experiment Module (aka JEM or the Kibo module), ready to be called upon for experiments or student coding competitions. They have been used to test everything from control algorithms to robotic arms.

“There’s a reason so many science fiction stories have a beloved robot alongside the human heroes – we know we can’t explore space alone,” said Jose Benavides, the Astrobee Facilities Project Manager at NASA’s Ames Research Center in California’s Silicon Valley, in a NASA article. “We’re showing that humans and robotic systems can collaborate and support powerful science and engineering beyond Earth.”

The most recent Astrobee experiment, launched last month aboard SpaceX’s 30th Commercial Resupply Mission, is pushing sensor and automation research to the next level with a test of new in-space scanning algorithms.

From Earth to Space

Known as the Multi-Resolution Scanning (MRS) Payload for Astrobee, the project is a collaboration between Boeing and the Commonwealth Scientific and Industrial Research Organisation (CSIRO), an Australian government scientific research agency.

CSIRO and Boeing have worked together for many years on several different sensing technologies, focusing on Earth-based uses. About five years ago, Boeing approached the CSIRO team about how those technologies might be transferred into orbit.

The MRS study is the outcome of that collaboration. It tests the ability of a unique sensor package, working with robotics, to support automated 3D sensing, mapping, and situational awareness functions for future autonomous robots. The package will be mounted on the Astrobees and begin scanning the Kibo module in June 2024.

“What we are trying to prove here is this kind of sensor payload can provide sufficient resolution and situational awareness to start to mitigate the need for humans in the loop. This impacts all aspects of spacecraft operations, whether it be internal maintenance, docking, off world exploration, exterior maintenance, or those kinds of things,” Dr. Marc Elmouttie, CSIRO’s project leader for MRS, told Engineering.com. “Broadly speaking, we are trying to improve the technology suite that’s available for those in charge of managing automated situational awareness in space environments.”

The multi-resolution scanning payload, bound for the International Space Station. Image: CSIRO

How It Works

The MRS payload contains two computer vision cameras, three laser time-of-flight sensors, and an inertial measurement unit (IMU) within a 3D-printed housing. Although the sensing hardware is off-the-shelf, the team developed new algorithms and handled the hardware and software integration itself.

The team developed two primary algorithms. The first, the stereo depth fusion (SDF) algorithm, integrates stereo vision with the LIDAR time-of-flight sensors to produce high-quality 3D reconstructions of the surrounding scene. The second, the Wildcat simultaneous localization and mapping algorithm (aka Wildcat SLAM), produces a lower-resolution point cloud and trajectory map focused on localizing the robot within the spacecraft.
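To make the depth-fusion idea concrete, here is a minimal Python sketch. It illustrates the general technique, not CSIRO’s actual SDF algorithm: it assumes stereo depth errors behave like a single global scale factor and uses a few trusted time-of-flight ranges to correct them (the function name and the median-scale scheme are assumptions for the example).

```python
import numpy as np

def fuse_stereo_with_tof(stereo_depth, tof_samples):
    """Hypothetical sketch: refine a dense stereo depth map using
    sparse, accurate time-of-flight (ToF) range samples.

    stereo_depth : (H, W) array of depths from stereo matching, in meters
    tof_samples  : iterable of ((row, col), range_m) ToF readings
    """
    ratios = []
    for (r, c), rng in tof_samples:
        d = stereo_depth[r, c]
        if d > 0:                         # valid stereo pixel
            ratios.append(rng / d)        # trusted range vs. stereo depth
    if not ratios:
        return stereo_depth               # no overlap, nothing to fuse
    # Apply the median scale correction globally; a real system would
    # interpolate corrections spatially and weight by sensor confidence.
    return stereo_depth * np.median(ratios)

# Example: stereo underestimates range by ~5%; one ToF sample corrects it.
depth = np.full((480, 640), 1.90)
corrected = fuse_stereo_with_tof(depth, [((240, 320), 2.00)])
```

The appeal of this kind of fusion is complementary error profiles: stereo gives dense coverage but noisier absolute range, while ToF gives sparse but trustworthy range measurements.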

A 3D model of a CSIRO office, with desks and walls shown in grey. Colored lines show the device's trajectory through the office space. This data was captured with Wildcat Simultaneous Localization and Mapping onboard the multi-resolution scanning payload. Credit: CSIRO

This payload marks the first time these algorithms have been paired with this suite of sensors. Launching to space added a further challenge: the hardware had to meet NASA requirements for things like vibration and electromagnetic interference, and remain compatible with the already-launched Astrobee platform.

Although the Astrobees already carry some LIDAR, time-of-flight, and camera systems, they rely primarily on a visual localization system, navigating off coded targets placed throughout the Kibo module along with some environmental features.
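Localizing against coded targets is a standard computer-vision problem, usually posed as perspective-n-point (PnP): given the known 3D positions of a marker’s corners and where they appear in the image, solve for the camera’s pose. The sketch below is illustrative only; the marker geometry, pixel coordinates, and camera intrinsics are made-up assumptions, and the use of OpenCV’s solvePnP is not a detail drawn from Astrobee’s actual navigation code.

```python
import numpy as np
import cv2

# Assumed 3D corner positions of one coded target in the module frame (m).
marker_corners_3d = np.array([
    [0.00, 0.00, 0.0],
    [0.10, 0.00, 0.0],
    [0.10, 0.10, 0.0],
    [0.00, 0.10, 0.0],
], dtype=np.float64)

# Pixel coordinates of the same corners, as found by a fiducial detector.
detected_corners_px = np.array([
    [312.0, 208.0],
    [401.0, 210.0],
    [399.0, 298.0],
    [310.0, 296.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (focal lengths and principal point).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Solve PnP: where must the camera be for the known 3D corners to
# project onto the detected pixel locations?
ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, detected_corners_px, K, None)
if ok:
    print("camera rotation (Rodrigues vector):", rvec.ravel())
    print("camera translation (m):", tvec.ravel())
```

In practice a detector library finds the marker corners automatically, and the resulting pose estimate is fused with the robot’s other sensing.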

“The novel aspect of this is that we’re using the LIDAR, we’re using the vision, and we’re using the IMU in an integrated sense to provide more precise trajectory information for the robot,” Elmouttie said.
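As a rough intuition for why that tight integration matters, consider a toy fusion loop: inertial dead reckoning is smooth and always available but drifts quickly, while vision and LIDAR can supply occasional absolute fixes that cancel the accumulated error. The blending scheme below is a deliberate simplification for illustration, nothing like the actual Wildcat SLAM implementation:

```python
import numpy as np

def fuse_trajectory(imu_accel, dt, pose_fixes, blend=0.2):
    """Toy sketch: integrate IMU accelerations into a smooth but
    drifting position estimate, and blend in occasional absolute
    position fixes from vision/LIDAR to rein in the drift.

    imu_accel  : (N, 3) linear accelerations in the world frame (m/s^2)
    dt         : sample period in seconds
    pose_fixes : dict mapping sample index -> (3,) absolute position (m)
    """
    pos, vel = np.zeros(3), np.zeros(3)
    trajectory = []
    for i, a in enumerate(imu_accel):
        vel = vel + a * dt            # integrate acceleration -> velocity
        pos = pos + vel * dt          # integrate velocity -> position (drifts)
        if i in pose_fixes:           # absolute fix from vision/LIDAR
            pos = (1 - blend) * pos + blend * np.asarray(pose_fixes[i])
        trajectory.append(pos.copy())
    return np.array(trajectory)
```

Production SLAM systems typically solve for the whole trajectory jointly rather than blending point estimates like this, but the underlying trade-off, drift from dead reckoning versus corrections from absolute sensing, is the same.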

The results the team acquires will be more than just a proof of concept. They could also extend the capabilities of space robot assistants.

“Given the quality of the data, we want to investigate other use cases. They include logistic use cases like keeping track of the environmental components or equipment in a very cluttered spacecraft environment, and essentially that sort of keeping track of movement or changes in that environment over time,” Elmouttie said.

The scans could even serve as a training asset for astronauts before launch. Highly detailed scans of the inside of the ISS could be transferred into virtual reality, giving crews a better idea of what space outposts look like before traveling to them.

The multi-resolution scanning team prepares to send off the multi-resolution scanning payload for launch in the United States. Left to right: Marc Elmouttie (CSIRO Project Lead), Tea Molnar, Lauren Hanson, Matt van de Werken, David Haddon, Ross Dungavell, Anna Campbell, Paul Flick. Inset: Connie Miller (Boeing Space & Launch Principal Investigator) and Leighton Carr (Boeing Australia Principal Investigator). Absent: Peter Dean, Michael Lofgren, Rosie Attwell. Image: CSIRO

An Automated Space Future

As astronauts travel farther into deep space and look to establish long-term bases on the Moon or Mars, automated helpers become even more essential. Orbiting only 250 miles above Earth, the ISS can maintain a constant human crew relatively easily, but outposts like the Artemis program’s planned Lunar Gateway would be much farther away and harder to resupply, meaning long periods without human habitation. If robotic crewmates like the Astrobees can deftly fly through a spacecraft, scan for issues, and interact with its systems and payloads, they can help ensure these stations stay up and running when humans are not around.

To move toward this future, the CSIRO team would also be interested in integrating its sensors with the robots’ navigation systems.

“We are generating six-degree-of-freedom trajectories in this mission, but they are not being piped back to the control system for safety reasons. It is a proof of concept,” Elmouttie said. “So, in an actual implementation once this is demonstrated, you would want this to be integrated as part of that control and navigation system.”

A previous version of the MRS payload, with its cameras revealed, is tested at NASA Ames Research Center. Credit: CSIRO

Automation and scanning do not stop at the space station airlock. The MRS team sees potential applications of its algorithms out in the harsh environment of space, in areas much more difficult for humans to explore.

“We’re very interested in using this new sensor payload to monitor the effects on the exterior of spacecraft, and we are envisioning some form of a payload that would be suitable to add to the end effector of a robotic arm, for example, like Canadarm, [a predecessor of the ISS’s Canadarm2 robotic arm],” Elmouttie said.

Technologies like this could also give lunar or Martian rovers new freedom, allowing them to better process their surroundings and move through them with less human intervention.

“We are hoping this is a steppingstone in demonstrating the value of this particular new sensor system. And then the next steps are sort of moving outside of the interior of a spacecraft, which are, to be sure, more challenging,” Elmouttie said. “But hopefully after demonstrating the interior spacecraft use case that will build a bit of momentum in that area as well.”

All this work in space also has the potential to bring benefits back to Earth. These sensing and scanning technologies can be particularly helpful in remote or dangerous terrestrial areas, and CSIRO has already been working on several sensor projects focused on mining.

“There is already interest in exploring the application of this for supporting automated underground mining equipment. Similar kinds of use cases to perhaps what you might imagine in an off-world exploration, lava tube exploration kind of use case,” Elmouttie said. “There’s applications terrestrially on ground in the mining space and probably more generally, it will translate to any industrial use case where situational awareness or high-resolution situational awareness, particularly in the exploration phase of work, is required.”

Written by

Erin Winick Anthony

Erin Winick Anthony is the founder of STEAM Power Media, a science communication company focused on digital storytelling. She holds a mechanical engineering degree from the University of Florida and uses her technical background to serve as a translator between scientists, engineers, and the public. She previously worked as a science communication specialist for the International Space Station at NASA’s Johnson Space Center, where she was awarded NASA’s Silver Snoopy, and as a reporter for MIT Technology Review. You can find her on social media @erinwinick sharing space, science, and pinball content.