You could help the Curiosity rover navigate Mars by flipping through photos of the red planet’s rocky landscape and labeling what you see.
NASA is asking volunteers to help sort through and label thousands of photographs taken by the rover. The labels, gathered through the AI4MARS program, will feed a machine learning project that helps the rover’s path planners pick smooth routes to its next scientific target, after years of sharp terrain wore down the rover’s treads, Elizabeth Howell reports for Space.
As of Tuesday, AI4MARS volunteers had completed about 82 percent of their goal.
The program is similar to the artificial intelligence behind self-driving cars on Earth, which are trained to recognize their surroundings based on photographs. But on Mars, there are no roads, street signs or traffic lights to guide the rover’s path. Curiosity has only its software, NASA’s scientists and engineers, and its own six wheels to trek around Mars’ surface.
Mars is a dangerous place to be a car-sized, roving robot. Spirit, a rover that landed on Mars in early 2004, got stuck in soft soil in May 2009, and its mission was declared over in May 2011. Spirit’s twin rover, Opportunity, also landed in 2004 and stayed active until 2018, when a dust storm blanketed its location. NASA tried to contact the solar-powered rover more than 1,000 times before ending its mission on February 13, 2019.
Curiosity landed on the Red Planet in 2012. In theory, choosing clear, smooth paths could help extend Curiosity’s useful time on Mars. But by 2017, the rover’s zigzagged treads showed damage, threatening their ability to carry its roughly one-ton mass. That’s despite driving only about 14 miles over its entire mission so far. According to a statement, it can take four to five hours for a team of rover planners to figure out where Curiosity should drive and how it should get there.
"It's our job to figure out how to safely get the mission's science," rover planner Stephanie Oij, who is involved in AI4Mars, says in the statement. “Automatically generating terrain labels would save us time and help us be more productive."
Normally, it takes hundreds of thousands of images to train a machine learning algorithm to recognize features. But there aren’t that many photographs of Martian terrain available for the team to use.
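The core idea behind training on labeled photographs can be illustrated with a toy example. The sketch below is purely hypothetical and not NASA's actual system: real tools like SPOC use deep neural networks on raw images, while here each "image" is reduced to two invented features (brightness and roughness) and classified by the nearest class average, just to show how labeled examples become a model.

```python
# A minimal sketch of learning from labeled terrain examples.
# All data and feature names here are invented for illustration;
# NASA's SPOC works on full images with a deep neural network.

from math import dist

# Hypothetical labeled training data: (brightness, roughness) -> terrain label.
labeled = [
    ((0.9, 0.1), "sand"),
    ((0.8, 0.2), "sand"),
    ((0.4, 0.5), "bedrock"),
    ((0.5, 0.4), "bedrock"),
    ((0.3, 0.9), "big rocks"),
    ((0.2, 0.8), "big rocks"),
]

def centroids(data):
    """'Train' by averaging the feature vectors for each label."""
    sums, counts = {}, {}
    for features, label in data:
        total = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            total[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in total)
            for label, total in sums.items()}

def classify(features, model):
    """Label a new sample by its nearest class centroid."""
    return min(model, key=lambda label: dist(features, model[label]))

model = centroids(labeled)
print(classify((0.85, 0.15), model))  # a bright, smooth patch -> "sand"
```

With only a handful of examples per class, a simple model like this is easily fooled by unusual terrain, which is why real image classifiers typically need the hundreds of thousands of labeled photographs mentioned above.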
The rover planning team uses a program called Soil Property and Object Classification, or SPOC, and hopes that training it on more volunteer-labeled photographs of Mars will make it work better and faster than it does now. Improvements to SPOC could also help the next Mars rover, Perseverance, when it arrives on the planet’s surface.
"In the future, we hope this algorithm can become accurate enough to do other useful tasks, like predicting how likely a rover's wheels are to slip on different surfaces," Hiro Ono, an artificial intelligence expert at the Jet Propulsion Laboratory, says in the statement. When Curiosity reached the top of a hill in March (and took a selfie), it had to climb a 30-degree incline and the steep climb sometimes left its wheels spinning in place.
If SPOC can better identify safe terrain that its wheels can navigate, it would help the researchers conduct more research in Curiosity’s primary mission: finding signs that Mars may have once been habitable.