Submission Preview

From OASIS to NOAH, the evolution of AI in modern space rovers

Rejected submission by exec at 2018-07-17 02:29:32
News

Story automatically generated by StoryBot Version 0.2.2 rel Testing.
Storybot ('Arthur T Knackerbracket') has been converted to Python3

Note: This is the complete story and will need further editing. It may also be covered
by Copyright and thus should be acknowledged and quoted rather than printed in its entirety.

FeedSource: [ArsTechnica]

Time: 2018-07-16 14:36:41 UTC

Original URL: https://arstechnica.com/science/2018/07/from-oasis-to-noah-the-evolution-of-ai-in-modern-space-rovers/ [arstechnica.com] using UTF-8 encoding.

Title: From OASIS to NOAH, the evolution of AI in modern space rovers

--- --- --- --- --- --- --- Entire Story Below --- --- --- --- --- --- ---

From OASIS to NOAH, the evolution of AI in modern space rovers

Arthur T Knackerbracket has found the following story [arstechnica.com]:

NASA's Opportunity Mars rover has done many great things in its decade-plus of service, but initially it rolled 600 feet past one of the mission's biggest discoveries: the Block Island meteorite [scientificamerican.com]. Measuring about 67 centimeters across, the meteorite was a telltale sign that Mars' atmosphere had once been much thicker: thick enough to slow a rock arriving at a staggering 2 km/s so that it did not disintegrate on impact. A thicker atmosphere could mean a gentler climate, possibly one capable of supporting liquid water on the surface, maybe even life.

Yet we only know about the Block Island meteorite because someone on the Opportunity science team manually spotted an unusual shape in low-resolution thumbnails of the images and decided it was worth backtracking for several days to examine it further. Rather than the machine heading purposefully toward the rock from the get-go, the team only barely caught perhaps its biggest triumph in the rear-view mirror. "It was almost a miss," says Mark Woods, head of autonomy and robotics at SciSys, a company specializing in IT solutions for space exploration that works for the European Space Agency (ESA), among others.

Opportunity, of course, made this near-miss maneuver all the way back in July 2009. If NASA were to attempt a similar mission in some far-flung corner of the solar system today, as the agency plans to in 2020 with the Mars 2020 rover [arstechnica.com] (ESA has similar ambitions with its ExoMars rover that year), modern scientists would have one particularly noteworthy advantage that has developed since.

"The rover lacked intelligence," as Woods puts it bluntly. "It could totally roll past a Martian, and we wouldn't even know about it."

The Block Island discovery was such a close call because, back in 2009, the software and hardware responsible for a rover's mobility were developing much faster than communications bandwidth. The Sojourner rover had traveled around 100 meters over its roughly three-month mission back in 1997; by contrast, Opportunity set a traverse-distance record [arstechnica.com] of around 220 meters in a single Martian day. Unfortunately, the amount of data both rovers could send back to Earth remained roughly the same. The more ground rovers were able to cover, the more science they were likely to miss.

Hence, not long after Sojourner, the team at NASA's Jet Propulsion Laboratory began to think about making its rovers autonomous to some degree. The idea was to design algorithms that would recognize interesting phenomena encountered in the rover's surroundings during traverses and either notify the science team on Earth to ask for instructions or examine those phenomena straightaway. At the time, that was quite a challenge because of the limited computing power the rovers could muster: for Opportunity, for instance, the JPL team had to figure out how to run advanced artificial intelligence software on the rover's BAE RAD6000 processor clocked at 25 MHz.

“[It was] way slower than a typical smartphone. We're talking pre-Pentium levels of processor performance,” Woods tells Ars. Yet, JPL pulled it off. After years of development, the Autonomous Exploration for Gathering Increased Science (AEGIS [nasa.gov]) software was successfully uploaded to the Opportunity rover in 2010. Its successor, Curiosity [arstechnica.com], got the AEGIS update a few years later.

AEGIS is a relatively simple system. At its most basic level, its purpose is to aim a camera at an interesting rock and take measurements. An algorithm called Rockster finds rocks in images by looking for shapes predefined by the science team on the ground. The team remotely turns AEGIS on and off depending on how much energy the rover has left and how computationally intensive its other tasks are.
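
To make the idea concrete, here is a rough, illustrative sketch in Python of the kind of choice AEGIS has to make once rocks have been segmented out of a navigation image: score each candidate against a target profile supplied by the ground team and point the camera at the best match. The RockCandidate fields, the weights, and the scoring formula are assumptions made up for this sketch; they are not the actual Rockster or AEGIS code.

    # Illustrative sketch only; not flight software.
    from dataclasses import dataclass

    @dataclass
    class RockCandidate:
        size_px: int         # apparent size in pixels
        albedo: float        # mean brightness, 0.0-1.0
        eccentricity: float  # 0.0 = circular outline, 1.0 = elongated

    def score(rock: RockCandidate, signature: dict) -> float:
        """Weighted match against the ground team's target profile."""
        return (signature["w_size"] * rock.size_px
                + signature["w_albedo"] * rock.albedo
                + signature["w_shape"] * (1.0 - abs(rock.eccentricity - signature["target_ecc"])))

    def pick_target(rocks, signature, enabled=True):
        """Return the best-scoring rock, or None if AEGIS is switched off."""
        if not enabled or not rocks:
            return None
        return max(rocks, key=lambda r: score(r, signature))

    # Example: the team favours large, bright, roughly round rocks.
    signature = {"w_size": 0.01, "w_albedo": 2.0, "w_shape": 1.0, "target_ecc": 0.2}
    rocks = [RockCandidate(120, 0.35, 0.6), RockCandidate(400, 0.55, 0.25)]
    print(pick_target(rocks, signature))  # picks the larger, brighter candidate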

This software is still in use today and has been effective enough to merit inclusion on NASA's upcoming Mars 2020 rover [nasa.gov]. But working as it does now, AEGIS is primarily a time-saving tool. Normally, there is roughly a 20-minute delay in communications between Earth and Mars. With AEGIS turned off, whenever the team on Earth sends a command for the rover to move a few feet forward, it has to wait 20 minutes before the command gets through. Once the rover has moved those few feet, it sends images of its surroundings back to Earth. That's another 20 minutes. When the images reach Earth, the science team looks for promising targets and sends commands back to the rover to take high-resolution pictures. Twenty minutes. It all seems painfully slow, because it is. But with AEGIS turned on, a rover analyzes rocks on its own, takes pictures, and sends them back to Earth immediately after reaching its destination. That's roughly an hour saved each time it happens. Coupled with automatic obstacle-avoidance systems, that's about all the autonomy planetary rovers have right now.
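
That "roughly an hour" is easy to sanity-check. The back-of-the-envelope sketch below assumes the 20-minute one-way delay quoted above plus a nominal 20 minutes for the ground team to scan the thumbnails and build the follow-up command; the real figures vary with planetary geometry and team workflow.

    # Back-of-the-envelope estimate; the delay actually varies from roughly
    # 3 to 22 minutes one way depending on where Earth and Mars are.
    ONE_WAY_DELAY_MIN = 20
    GROUND_ANALYSIS_MIN = 20  # assumed time to pick targets and re-plan

    # Steps skipped when AEGIS picks its own targets:
    #   1) thumbnails downlinked to Earth          (~20 min)
    #   2) ground team selects follow-up targets   (~20 min, assumed)
    #   3) imaging command uplinked to the rover   (~20 min)
    saved = 2 * ONE_WAY_DELAY_MIN + GROUND_ANALYSIS_MIN
    print(f"~{saved} minutes saved per autonomously imaged target")  # ~60 minutes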

But future planetary rovers are about to get way smarter. AEGIS was initially designed as part of a more advanced system called OASIS [nasa.gov] (the Onboard Autonomous Science Investigation System). In OASIS, the intelligence arises from the interplay between two main modules: an automatic science-gathering component and a scheduling system called CASPER [nasa.gov] (Continuous Activity Scheduling Planning Execution and Replanning), which is also responsible for allocating available resources to all tasks on the agenda. This autonomous ecosystem all begins with an image.
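
As a toy illustration of that division of labor (and nothing like the real flight software), the science side can be thought of as proposing observations, each with a value and a resource cost, while a CASPER-like scheduler keeps only what fits the remaining time and energy budget. All of the numbers and activity names below are invented for the example.

    # Toy accept/reject decision in the spirit of CASPER; the real system
    # continuously replans rather than making a single greedy pass.
    from dataclasses import dataclass

    @dataclass
    class Activity:
        name: str
        value: float         # how novel/promising the target looks
        energy_wh: float     # estimated energy cost
        duration_min: float  # estimated time cost

    def schedule(candidates, energy_budget_wh, time_budget_min):
        """Greedily keep the most valuable activities that still fit the budgets."""
        plan = []
        for act in sorted(candidates, key=lambda a: a.value, reverse=True):
            if act.energy_wh <= energy_budget_wh and act.duration_min <= time_budget_min:
                plan.append(act)
                energy_budget_wh -= act.energy_wh
                time_budget_min -= act.duration_min
            # otherwise the activity is simply dropped, as described above
        return plan

    candidates = [
        Activity("close-up of odd rock", value=0.9, energy_wh=15, duration_min=40),
        Activity("routine panorama", value=0.3, energy_wh=25, duration_min=60),
    ]
    print([a.name for a in schedule(candidates, energy_budget_wh=30, time_budget_min=80)])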

"In building rovers' AI, we were mainly focused on vision,” says Woods. "When you think about energy budget, it's the cheapest way of gathering data because rovers use their cameras all the time for navigation.” Every image first goes to the feature detection component. Algorithms next recognize the line of horizon and then proceed to identifying rocks on the ground (that’s what AEGIS is responsible for) and clouds in the sky. If rocks or clouds are of unusual size or shape, the system marks them as promising and plans additional data gathering steps like getting closer to the target or using more precise instruments. Those steps go into a scheduling module which determines how much time and energy it would take to complete them. If a target seems really novel, CASPER is more likely to give it a go, if it doesn't, the scheduler simply kills the project and goes on with pursuing more important goals.

-- submitted from IRC


Original Submission