Microsoft and HPE are bringing AI and big data analytics to space, and more specifically to the International Space Station (ISS), which has hosted the HPE Spaceborne Computer-2 since February. Microsoft describes the platform in a blog post as “a supercomputer the size of a small microwave oven, connected to the cloud from space.” This extreme-environment machine consists of an HPE Edgeline EL4000 Converged Edge system and an HPE ProLiant DL360 Gen10 server, enough to process blocks of data and run AI algorithms directly, as close as possible to the source; capabilities that will prove most useful for space exploration missions.
The ability to run algorithms in space saves bandwidth, in particular when data must be transmitted to teams on the ground. A few months ago, artificial intelligence processed data aboard an orbiting satellite for the first time, using a chip provided by Intel; the task was to sort surveillance images before sending them back to Earth.
Astronaut DNA sequencing
On the International Space Station, HPE’s Spaceborne Computer-2 is used, among other things, for an experiment led by Microsoft’s Azure Space teams that monitors the health of astronauts. The goal is to sequence their DNA regularly in order to detect potentially harmful genetic mutations, a risk that arises on long spaceflights, which expose astronauts to increased radiation. Sequencing results must be compared against a large, constantly updated clinical database, an approach that therefore requires sending data back to Earth.
The catch? The amount of data (about 200 gigabytes of raw data) is too large for the downlink NASA provides to Spaceborne Computer-2: a maximum download speed of 250 kilobytes per second, available only two hours per week. “It’s like going back to the ’90s phone modem,” says David Weinstein, head of software engineering for Microsoft’s Azure Space division.
Selective filtering of genomic data
To overcome the bandwidth limitations, Spaceborne Computer-2 performs the initial comparison of extracted genetic sequences against reference DNA segments on board, and captures only the variations or mutations, which are then uploaded to the cloud. These genome fragments showing anomalies are then compared against the National Institutes of Health’s database to determine their impact on health.
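The edge-side filtering step described above can be sketched in a few lines. This is a simplified, hypothetical illustration, not HPE’s or Microsoft’s actual code: it compares a sequencing read against a reference segment and keeps only the positions that differ, which is all that would need to be uplinked to the cloud.

```python
# Hypothetical sketch of edge-side variant filtering: instead of
# downlinking every raw read, compare each read against a reference
# segment and keep only the mismatching positions (candidate variants).
# Function name, data, and output format are illustrative.

def extract_variants(read: str, reference: str, offset: int = 0):
    """Return (position, ref_base, read_base) tuples where the read
    differs from the reference; matching positions are discarded."""
    return [
        (offset + i, ref_base, read_base)
        for i, (ref_base, read_base) in enumerate(zip(reference, read))
        if ref_base != read_base
    ]

reference = "ACGTACGTACGT"
read      = "ACGTTCGTACGA"

variants = extract_variants(read, reference)
print(variants)  # → [(4, 'A', 'T'), (11, 'T', 'A')]
```

Only the handful of tuples describing variants leaves the station; the vast majority of the read, which matches the reference, never consumes downlink bandwidth.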
The experiment is a critical one, and it has demonstrated the added value of the edge-to-cloud workflow for the International Space Station, explains Gabe Monroy, Director of Developer Experience for Azure, in a blog post. Without pre-processing the sequencing results on board, more than 150 GB of data would have to be sent back to Earth. With the dedicated bandwidth, transferring 220 GB of raw data would take more than two years, whereas the filtered data can be transferred in just over two hours.