How Cloud Computing Hardware Is Powering the Next Era of Space Exploration

What engineers can learn from cloud experiments aboard the ISS.

Just as cloud computing enables engineers to bring computing resources where they are needed on Earth, it is now powering technology in an even more remote location: space.

NASA is using the International Space Station (ISS) as a testing ground for many technologies that could power the future of space exploration, including cloud computing.

In space, resources are scarce, and seconds can matter when it comes to safety. Cloud computing can help usher in a new era of exploration, one in which onboard computing delivers rapid test results and expands research capacity.

Spaceborne Computer-2 on the ISS. (Image courtesy of NASA.)

Numerous companies from Hewlett Packard Enterprise (HPE) to Amazon have already launched cloud computing hardware studies to the ISS. Here’s what they have learned and how they are laying the groundwork for cloud computing on future commercial space stations.

Off-the-Shelf Cloud Computing in Space

As more complex scientific work is conducted in space, more computing power is needed to support it. The SG100 Cloud Computing Payload was one of the first tests of how advanced computers stand up to the intense radiation of space. Created by Business Integra, this computer was based on the data computers within the Alpha Magnetic Spectrometer-02, a physics experiment that has resided on the outside of the space station for more than a decade.

SG100 was launched to enable engineers, scientists and researchers to perform significant data analysis aboard the space station before sending results back to Earth. But the first question was whether this type of cloud hardware was radiation hardened, or at least radiation tolerant.

“Radiation hardened means no matter what, radiation will not affect this processor,” said Trent Martin, the SG100 Cloud Computing Payload’s principal investigator, in the NASA article “Beyond the Cloud: Data Processing from Space.” “Whereas radiation tolerant means that most likely, nothing will happen—but if it does, it won’t be detrimental. The processor won’t die.”

During the two years the system was tested in orbit, no data was lost to radiation damage. This demonstration paved the way for more affordable data processing beyond Earth.

Cloud Computing at the Edge—of Space

HPE installed the Spaceborne Computer-2 in the ISS with the goal of pushing the boundaries of how scientists could use artificial intelligence and cloud computing in space.

“We want to have thousands of proofs of concept so that onboard data processing can be shown to seriously benefit the scientists and engineers back on Earth,” said principal investigator Mark Fernandez, solutions architect for converged edge systems at HPE, in the NASA article “New Research Launching to Space Station Aboard Northrop Grumman’s 15th Resupply Mission.” “I want to get our brilliant minds throughout the world working on the insights rather than the number crunching.”

As the name implies, Spaceborne Computer-2 is a follow-up to a successful first Spaceborne Computer study, which launched a commercial off-the-shelf computer to orbit around the same time as SG100. The goal was similar—to see how it would survive launch and space radiation. After 8 months in Earth’s orbit, NASA reported that “the Spaceborne Computer was still demonstrating teraflop performance rates while showing only a 0.03% difference to the ground computers running in parallel.”

The success led to the second experiment in 2021, which launched a computer with twice the computing power aboard a Northrop Grumman commercial resupply mission to the space station. Because the ISS has very limited bandwidth, this experiment was designed to perform the majority of the processing in space and only send back to Earth what was truly necessary.

“Many experiments that run on the space station primarily collect data and send it back to Earth,” Fernandez said in the NASA article “Technology Tested in Space is Preparing Us for the Moon and Mars.” “We want to move computing to where data are generated or collected, whether that is in space, or on your oil rig or aircraft, to turn a sample into insight as fast as possible. You process at the edge and get the go or no-go or safe-unsafe answers you need.”
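The pattern Fernandez describes can be sketched in a few lines. This is a purely illustrative example, not code from any real ISS payload: the function name, data and threshold are hypothetical, but it shows the core idea of crunching raw data locally and downlinking only a tiny go/no-go verdict.

```python
# Hypothetical sketch of edge processing: heavy computation stays where the
# data is generated; only the compact answer crosses the slow downlink.

def process_at_edge(raw_samples, threshold=0.8):
    """Reduce a large raw dataset to a single go/no-go decision."""
    # The number crunching happens locally (in orbit, on a rig, on an aircraft).
    score = sum(raw_samples) / len(raw_samples)
    # Only this small dictionary, not the raw samples, needs to reach Earth.
    return {"score": round(score, 3),
            "verdict": "go" if score >= threshold else "no-go"}

result = process_at_edge([0.9, 0.85, 0.95, 0.7])
print(result["verdict"])  # prints "go"
```

However large the raw sample set grows, the downlinked result stays a few bytes, which is exactly the economy that matters on a bandwidth-limited station.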

As engineering.com reported in 2021, Microsoft and HPE completed a 200 GB genomics experiment aboard the orbiting laboratory using the HPE Spaceborne Computer-2 and Azure, Microsoft’s cloud computing platform. To process this data, Microsoft developed a technique that enabled the system to automatically “burst” from the station down to the vast network of Azure computers whenever it ran out of on-station computing capacity. In 2022, Azure and Spaceborne Computer-2 teamed up once again, this time to help NASA’s spacewalk team use artificial intelligence to scan spacesuit gloves for damage—a task previously performed by humans on Earth over much longer timescales. Read more about this project in the engineering.com article “Keeping Astronauts Safe with Cloud-based AI in Space.”
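The “burst” behavior amounts to an overflow scheduler: fill onboard capacity first, then route the remainder to the ground cloud. The sketch below is a minimal illustration under assumed names and capacities; it is not Microsoft’s actual implementation.

```python
# Illustrative overflow scheduler: jobs run onboard until capacity is
# exhausted, then the remainder "bursts" to a ground-based cloud.
# Job sizes and the capacity figure are made up for the example.

def schedule_jobs(jobs_gb, onboard_capacity_gb):
    """Split a workload between onboard compute and a cloud burst."""
    onboard, cloud = [], []
    used = 0.0
    for job in jobs_gb:
        if used + job <= onboard_capacity_gb:
            onboard.append(job)   # stays on the station computer
            used += job
        else:
            cloud.append(job)     # overflows to the ground cloud
    return onboard, cloud

onboard, cloud = schedule_jobs([50, 80, 40, 60], onboard_capacity_gb=150)
# onboard -> [50, 80]; cloud -> [40, 60]
```

The design choice is the same one any hybrid edge/cloud system makes: local capacity is fast but fixed, so the scheduler treats the cloud as elastic spillover rather than the default destination.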

Spaceborne Computer-2 and Azure working together showcase the increased speeds and insights that can be accomplished by pairing edge and cloud technology.

Azure was also recently tested by Ball Aerospace and Microsoft for processing satellite data back on Earth with successful results. The findings indicate that with the right technology, even if it’s off the shelf, information gathered in space can be quickly and easily utilized in a range of industries—from space exploration to agriculture and disaster response.

Equipping Future Orbital Outposts

Axiom is one of the companies demonstrating what the future of a low-Earth orbit economy could look like. The company is currently developing its own commercially owned space station to be launched into Earth orbit.

In preparation, Axiom is first constructing a commercial module that will attach to the ISS and leading several private crewed missions to the existing orbital outpost. Aboard its first mission, Axiom-1, the company launched an Amazon Web Services (AWS) Snowcone SSD, a compact edge computing device. The Snowcone’s small size and weight—it measures 9 in x 6 in x 3 in and weighs just 4.5 pounds—made it a good choice for the space industry, where every inch and ounce counts.

An AWS Snowcone SSD aboard the ISS. (Image courtesy of Amazon.)

The Snowcone was launched unmodified but was wrapped in orange Kapton tape (a commonly used tool in space) to provide extra electrical and thermal protection. The crew used this cloud device in orbit to analyze photographs taken during science experiments with machine learning. Research activity aboard the ISS can produce terabytes of data each day, meaning any increase in data processing speed can make a major impact for researchers.

For this trial, the photos were first stored on Network Attached Storage (NAS) aboard the space station, then transferred to the Snowcone over the local network for analysis. Once the files were on the Snowcone, Amazon said in the blog post “How We Sent an AWS Snowcone into Orbit,” the machine learning model, which scanned and identified all equipment in frame, took only 3 seconds to run. By eliminating the need to send all the information back to Earth, the full photo analysis process shrank from a typical 20 hours to about 20 minutes.
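The workflow above has three stages: stage photos on the NAS, copy them to the Snowcone over the station network, and run the model locally so only a compact report needs to reach Earth. The sketch below mirrors that flow with stubs; the file names, the stand-in model and its labels are all hypothetical, and the real AWS software stack is not shown.

```python
# Hypothetical sketch of the station photo-analysis flow:
# NAS -> local network copy -> on-device ML scan -> small report downlinked.

NAS = {"photo_001.jpg": b"...", "photo_002.jpg": b"..."}  # station storage (stub)

def transfer_to_snowcone(nas):
    """Stand-in for the network file copy from NAS to the Snowcone."""
    return dict(nas)

def identify_equipment(image_bytes):
    """Stand-in for the ML model that tags equipment in each frame."""
    return ["tool", "handrail"]  # placeholder labels

snowcone = transfer_to_snowcone(NAS)
report = {name: identify_equipment(img) for name, img in snowcone.items()}
# Only this compact report, not the raw photos, would be sent to Earth.
```

The reported speedup is easy to sanity-check: 20 hours is 1,200 minutes, so cutting the process to roughly 20 minutes is about a 60x improvement, driven almost entirely by removing the downlink from the loop.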

This is just the first step for Axiom, though. As Axiom moves to constructing its own station, it plans to partner with Microsoft and LEOcloud to move in-space cloud computing from the experiment stage to the execution stage.

The ISS has been constructed over more than two decades, meaning its computer capabilities have received only incremental improvements over time. By creating a new station from scratch, Axiom has a real opportunity to offer a big leap in computing to the space community.