At the In-Orbit Servicing and Manufacturing Facility in Westcott, simulation and robotics are the brightest stars.
Some 60 miles northwest of London, near the tiny English village of Westcott, once home to a British rocket and ballistic missile development center, a new lab is transforming how space technology is tested.
Using car manufacturing robots and computer simulations, the lab lets engineers recreate aspects of the space environment that are otherwise difficult to study on Earth. Data generated in the experiments could help spacecraft designers revolutionize how satellites are put together and make it easier for future repair, removal or refueling robots to service them. That matters at a time when the growing number of orbiting spacecraft and pieces of debris is raising concerns about the sustainability of humankind’s use of near-Earth space.
Run by the Satellite Applications Catapult, the In-Orbit Servicing and Manufacturing (IOSM) facility at Westcott has a 27-meter-long hall at its heart. Inside, amid pitch-black walls, a duo of Kuka robots is surrounded by 34 motion capture cameras. The cameras track the scene from various angles, some focusing in detail on the robots’ dexterous end-effectors and whatever these hands clutch in any given experiment. While one of the robots is fixed to the floor, the other is mounted on a slider that moves up and down a 16-meter-long track in the middle of the room. This is space on Earth.
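The article doesn’t describe the tracking software, but motion-capture rigs of this kind conventionally recover a rigid body’s position and orientation by least-squares fitting the tracked marker cloud to the body’s known marker layout, the Kabsch algorithm. A minimal sketch in Python, with all names hypothetical:

```python
import numpy as np

def fit_rigid_pose(markers_body, markers_world):
    """Least-squares rigid-body fit (Kabsch algorithm).

    markers_body:  (N, 3) marker positions in the object's own frame
    markers_world: (N, 3) the same markers as seen by the camera system
    Returns (R, t) such that markers_world ~= markers_body @ R.T + t
    """
    cb = markers_body.mean(axis=0)                     # body-frame centroid
    cw = markers_world.mean(axis=0)                    # camera-frame centroid
    H = (markers_body - cb).T @ (markers_world - cw)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cb
    return R, t
```

With 34 cameras supplying redundant marker views, a fit like this can be run every frame to stream each end-effector’s pose into the experiment logs.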
The IOSM facility is one of only a few such labs in the world, Jeremy Hadall, the robotics development lead at the U.K. Satellite Applications Catapult who manages the experiments, told engineering.com. Unlike its counterparts elsewhere, the Westcott lab is open to smaller commercial companies developing next-generation space technologies such as in-orbit servicing and active space debris removal.
“The other facilities that are in existence are either run by government agencies or commercial entities, which both limit their availability for wider use,” Hadall said. “We saw a need to offer this service as an open platform for the industry to come and use when they need.”
Space is hard
Space is hard, as every operator of a failed space mission will attest, especially when you are developing something that has never been done before. Manufacturers of satellite technologies always subject their contraptions to rigorous testing before putting them on top of a rocket and shooting them into space. Shaker tables capable of creating vibrations as strong as an earthquake simulate the tremors of a rocket launch. Vacuum chambers and thermal cycling test chambers put the devices through their paces to ensure they can survive the harsh environment of space, battered by cosmic radiation and exposed to temperatures that swing from a frigid -85° F (-65° C) to a boiling 257° F (125° C).
Westcott’s IOSM lab doesn’t aim to replace existing testing protocols. Instead, it adds a new layer that enables engineers to recreate how their satellites will move in the weightless environment of Earth orbit.
“We can’t really simulate microgravity, but we can do a lot with the robots to negate that constraint,” Hadall said. “We use those robots to manipulate spacecraft and instruments around the room and by doing that, we can replicate trajectories and the motion patterns that they might come across or they might experience.”
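The article doesn’t publish the lab’s trajectory format, but the idea Hadall describes can be made concrete with a small sketch: compute the pose history a free-floating, un-actuated object would trace (straight-line drift plus a constant tumble), then hand those timestamped waypoints to a robot controller to replay. All names and numbers below are hypothetical:

```python
import numpy as np

def free_float_waypoints(p0, v, omega, duration, dt=0.1):
    """Pose history of an un-actuated object: constant linear drift
    plus a constant tumble rate (torque-free, single-axis for simplicity).

    p0:    initial position (3,), meters
    v:     drift velocity (3,), m/s
    omega: tumble rate about the z axis, rad/s
    Returns a list of (t, position, yaw) waypoints for a robot to track.
    """
    waypoints = []
    for t in np.arange(0.0, duration, dt):
        position = p0 + v * t        # no forces -> straight-line drift
        yaw = omega * t              # no torques -> constant spin
        waypoints.append((t, position, yaw))
    return waypoints

# E.g. a target drifting 2 cm/s sideways while spinning at 3 deg/s:
wps = free_float_waypoints(np.array([0.0, 0.0, 1.5]),
                           np.array([0.02, 0.0, 0.0]),
                           np.radians(3.0), duration=60.0)
```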
In-orbit servicing robots that will refuel and repair satellites in space to extend their lives, or even build future space stations or space-based solar farms, are part of humankind’s current ambition to make near-Earth space a bigger part of everyday reality. But while these interactive technologies seem to work effortlessly in visualizations, making them perform for real is a challenge that has not yet been completely solved.
“Although we’ve been docking with the space station for decades, it’s still quite a tricky operation, particularly tricky if you’re trying to do it with something that has no real docking interface or no real place to grab hold of it,” Hadall explained. “If you try to do that with a satellite, you have to realize that the last time that satellite was seen by anyone was before it was launched. Since then, its solar panels, its antennas were deployed. They all stick out around in random places. It’s really hard to get up close to it without damaging any of those features, and then be able to grab hold of it and dock with it.”
The simulations are conducted at a snail’s pace rather than at orbital velocities of roughly 18,600 miles per hour (30,000 kilometers per hour). Guided by the manufacturing simulation software Visual Components, the robots attempt to bring the two spacecraft together completely autonomously, whether that pairing is a piece of debris and a garbage collector or a servicer and the spacecraft it services.
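Slowing the reenactment down is, in the simplest reading, just a change of timebase: stretch the timestamps so the robots traverse the same geometric path at a small fraction of the real relative speed. A minimal sketch, reusing the hypothetical waypoint format from above:

```python
def slow_motion(waypoints, scale):
    """Stretch a trajectory's timeline so the robots trace the same
    geometric path at a fraction of the real relative speed.

    waypoints: list of (t, position, yaw) tuples (see the earlier sketch)
    scale:     e.g. 100.0 -> one simulated second takes 100 lab seconds
    """
    return [(t * scale, position, yaw) for t, position, yaw in waypoints]

lab_wps = slow_motion(wps, scale=100.0)  # same path, 100x slower
```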
“Our facility allows people to bring their technologies, bring their vision-based navigation systems or their laser-based guidance systems or whatever they might be using, and attach them to one of the robots,” said Hadall. “Then the other robot acts as a target that we’re trying to capture or get close to. We get that robot moving around it, it can follow a random path, it can be a prescribed, known trajectory. And then we use the second robot to carry the sensor equipment, which is trying to find and navigate the spacecraft towards the target.”
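Hadall’s description maps onto a classic chaser-and-target guidance loop. The sketch below shows only the skeleton of such a loop, a naive proportional approach controller holding a standoff distance; the lab’s actual vision- and laser-based navigation stacks are not described in the article, and every gain and limit here is an assumption:

```python
import numpy as np

def approach_step(chaser_pos, target_pos, standoff=0.5, gain=0.2,
                  v_max=0.05, dt=0.1):
    """One control tick of a naive proportional approach.

    Drives the chaser toward a hold point `standoff` meters short of
    the target along the line of sight, with speed capped at v_max m/s.
    A real guidance system would add attitude control, sensor filtering
    and collision constraints; this shows only the control structure.
    """
    los = target_pos - chaser_pos                 # line of sight
    dist = np.linalg.norm(los)
    if dist < 1e-9:
        return chaser_pos                         # already at the target
    error = dist - standoff                       # range error to hold point
    v = np.clip(gain * error, -v_max, v_max)      # proportional speed command
    return chaser_pos + (los / dist) * v * dt

# Crude usage: step the chaser toward the (moving) target waypoints above.
chaser = np.array([0.0, 0.0, 0.0])
for _, target, _ in wps:
    chaser = approach_step(chaser, target)
```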
Avoiding the worst-case scenario
Even in the well-understood environment of the lab, the operations are so challenging that engineers don’t dare perform this reenactment of the orbital dance without first simulating it on a computer.
“The worst-case scenario for me is that we have a trajectory given to us by a supplier, we put it straight on the robots, and the robots do not follow the intended route,” said Hadall. “If someone crashed into the walls, damaged the equipment. That’s the worst-case scenario. So, as the first step, we will run [the whole experiment] through our simulation software, and check that it’s all working and that we know what we can expect to see with the robots.”
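The article doesn’t detail what the pre-run simulation checks, but the core of the idea can be shown in a few lines: replay a supplier’s waypoints against the cell’s boundaries and a speed cap before any robot moves. The limits below are invented for illustration:

```python
import numpy as np

# Hypothetical cell limits, in meters, roughly matching a long, narrow hall.
CELL_MIN = np.array([-1.0, -3.0, 0.2])
CELL_MAX = np.array([15.0,  3.0, 3.0])
V_LIMIT = 0.25   # assumed maximum allowed end-effector speed, m/s

def validate_trajectory(waypoints):
    """Reject a trajectory before it reaches the robots if any waypoint
    leaves the cell or any segment implies an excessive speed."""
    for t, p, _ in waypoints:
        if np.any(p < CELL_MIN) or np.any(p > CELL_MAX):
            return False, f"waypoint at t={t:.1f}s leaves the cell"
    for (t0, p0, _), (t1, p1, _) in zip(waypoints, waypoints[1:]):
        speed = np.linalg.norm(p1 - p0) / (t1 - t0)
        if speed > V_LIMIT:
            return False, f"segment ending at t={t1:.1f}s too fast ({speed:.2f} m/s)"
    return True, "ok"
```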
The simulations run on the Unity game engine or Nvidia’s Isaac Sim robotics simulator, built on the Omniverse platform, which lets the engineers add parameters that would otherwise be impossible to reproduce. For example, in the digital world, the researchers can analyze the consequences of unintended physical contact between the servicing spacecraft and its target in the microgravity of space.
“If I touch something in space, it’s going to go off in a different direction,” said Hadall. “If you don’t capture an object properly, you may just change its trajectory and potentially create more debris. This type of reaction is very difficult to simulate physically down here on Earth, but in the simulation model, I can do that.”
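Hadall’s warning follows directly from conservation of momentum: in free fall, a contact impulse sends both bodies drifting, the target one way and the servicer the other. A back-of-the-envelope sketch with made-up masses:

```python
import numpy as np

def push_off_velocities(m_servicer, m_target, impulse):
    """Velocity change of two free-floating bodies after a brief contact.

    impulse: contact impulse vector applied to the target, in N*s.
    Momentum is conserved, so the servicer recoils with the opposite
    impulse.  Returns (dv_servicer, dv_target) in m/s.
    """
    impulse = np.asarray(impulse, dtype=float)
    return -impulse / m_servicer, impulse / m_target

# A 5 N*s nudge from a 500 kg servicer on a 1,000 kg defunct satellite:
dv_s, dv_t = push_off_velocities(500.0, 1000.0, [5.0, 0.0, 0.0])
# The target drifts off at 5 mm/s while the servicer recoils at 10 mm/s.
```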
The simulation, Hadall added, generates a wealth of data that helps fine-tune the experiments before the robots get involved.
Hadall said that although both the robots and the digital environment are based on commercially available technologies, the Westcott engineers build bespoke plugins to create the most space-like experience possible.
Through virtual reality headsets, the makers of the tested satellite system can examine in detail how their spacecraft interact or perform their assigned tasks, as if they were standing right next to them.
“In industrial robotics, the best place to see how things are working is when you stand next to them,” said Hadall. “That’s quite difficult to do even in a terrestrial manufacturing environment. In the space environment, it’s impossible. But having virtual reality or augmented reality means I can be there without physically being there. I can see what the different things are doing as the robots are carrying out the tasks.”
Lessons in sustainable space design
Lessons learned in the experiments are not only helping to fine-tune the tested systems but also producing insights that could help satellite manufacturers make future spacecraft more serviceable and reduce the risk of servicing operations failing.
“In-orbit servicing is still a very nascent area and servicing technologies are very much in development,” said Hadall. “We have to rethink how we design satellites and I think it’s going to be quite a big change in the market in the next 10 years.”
So far, the only spacecraft designed to be serviced in orbit has been the Hubble Space Telescope, Hadall added. The Westcott team has studied available information about how the venerable telescope was put together in the 1980s to make replacing parts in orbit easier. But there is only so much that engineers can learn from the five astronaut missions that flew to Hubble in the 1990s and 2000s. Simulations can fill some of the gaps in engineering knowledge and spark new ideas about the challenges future space robots will face.
“It gives users a really interesting and unique perspective on how the system is working, and how the hardware is interacting with each other,” Hadall added. “At the moment, very little in space is designed to be disassembled and recycled, so we need to learn how we are going to do that if we want to have satellites more sustainable. It’s an interesting technology challenge, but we still have a few years to solve it.”
—
Update February 16, 2024: An earlier version of this story misquoted Jeremy Hadall and misspelled his name as Hadal.