Award-winning space AI

Story by: Kurtis Eichler and Eddie Major

Sofia McLeod spends a great deal of time thinking about space exploration.


An Atlas V rocket carrying NASA's Perseverance rover and Ingenuity helicopter launches from Cape Canaveral Air Force Station in Florida, July 2020. Photo: NASA.

The Australian Institute for Machine Learning (AIML) PhD candidate is researching ways to build an AI system that can safely land an autonomous spacecraft on a distant planetary or asteroid surface, guided by visual input from a single event camera.

Inspired by the workings of the human eye, an event camera is a dynamic vision sensor in which each pixel works independently, reporting changes in brightness as they occur. A typical camera sensor, like the one in your smartphone, instead records numerous whole image frames every second, even if there's nothing new happening in view.

Because it only sends new data when conditions change, an event camera system is more data efficient, lightweight, and may use less power: all vital qualities in space missions, where resources are precious and limited.
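The per-pixel behaviour described above can be sketched in a few lines of code. This is an illustrative model only, not any real sensor's interface: each pixel fires an event when its log-brightness change exceeds a threshold, so a static scene produces no data at all.

```python
import math

def events_between_frames(prev, curr, threshold=0.2):
    """Compare two grayscale frames (2D lists of positive intensities)
    and return (row, col, polarity) events wherever log-brightness
    changed by more than `threshold`. Polarity is +1 for brighter,
    -1 for darker. Unchanged pixels produce no output at all."""
    events = []
    for r, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            delta = math.log(q) - math.log(p)
            if abs(delta) > threshold:
                events.append((r, c, 1 if delta > 0 else -1))
    return events

# Only the one changed pixel reports; the rest of the frame is silent.
frame_a = [[100.0, 100.0], [100.0, 100.0]]
frame_b = [[100.0, 100.0], [100.0, 180.0]]
print(events_between_frames(frame_a, frame_b))  # [(1, 1, 1)]
```

This silence on unchanged pixels is what makes the data efficiency possible: a conventional sensor would transmit every pixel of every frame regardless.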

For her work researching the AI navigation tool, McLeod was last month named as one of three students to win $10,000 each as part of the Andy Thomas Space Foundation's EOS Space Systems Research awards, also known as the Jupiter program. 

The Jupiter program was launched last year with the aim of fostering a new generation of space scientists and engineers. As part of their entry, finalists submitted a short research project outlining how their work would benefit Australia's space network.


Sofia is a PhD candidate at AIML, focusing on the research and application of computer vision for spacecraft guidance and navigation.

Space exploration is an area currently seeing rapid development in autonomous technology. In April 2021, NASA completed the first powered controlled extraterrestrial aircraft flight as part of its Mars 2020 rover mission. The long distance from Earth meant the craft had to operate with a high degree of autonomy. 

"If you're looking at Ingenuity, which is the drone that's now on Mars, the delay is approximately 20 minutes for human interaction," McLeod says, referring to the maximum round-trip radio signal transmission time over the 50 to 200 million kilometre distance between Earth and Mars.
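The delay McLeod quotes follows directly from the speed of light. A quick sanity check, using the article's approximate distance figures (the speed of light is exact; the distances are rough):

```python
SPEED_OF_LIGHT_KM_S = 299_792.458  # exact, by definition of the metre

def round_trip_minutes(distance_km):
    """A command sent up plus its acknowledgement back: 2 * distance / c."""
    return 2 * distance_km / SPEED_OF_LIGHT_KM_S / 60

# At ~200 million km, a command-and-response cycle takes roughly 22 minutes;
# even at the ~50 million km closest approach it is still several minutes.
print(round(round_trip_minutes(200e6)))
print(round(round_trip_minutes(50e6)))
```

Even at Mars's closest approach, the lag is far too long to steer a descending spacecraft or a drone in real time, which is why the navigation must run onboard.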

"We need both the rover and Ingenuity to be able to do this navigation by themselves, so they can avoid obstacles on their own. You have to remember that if a robot gets stuck or breaks down, we can't go to Mars to repair it."

Event-based computer vision technology in space isn't limited to autonomous landing; it has a range of potential applications.

"Ideally we want to design unmanned spacecraft to refuel satellites when they run out of power. To do this you'll need computer vision to know that you're aligned perfectly with the object you're trying to dock with," she says.

And like a future autonomous spacecraft's camera-guided journey, Sofia McLeod's pathway to computer science and machine learning at AIML was a visual one; she initially considered a career in design or visual effects for the entertainment industry.

"I've always been a visual person," she says. "I was thinking of doing VFX or graphic design… but I was definitely better at algorithms."

"I was just really fascinated by the concept of getting computers to see," she says. "It's just so intuitive."

Andy Thomas Space Foundation chief executive, Nicola Sasanelli, says that developing students into future space leaders is a priority for the foundation. 

"This opportunity not only enables students to become immersed in real-world industry experience but also provides innovative perspectives and new ideas."

Tagged in: space, machine learning, computer vision, European Space Agency