
Submission Preview

Link to Story

NASA Astronauts on Artemis Could Talk to a Spaceship Computer

Accepted submission by upstart at 2022-08-22 09:32:05
News


Bad Feeling About This. HAL?

NASA astronauts on Artemis could talk to a spaceship computer [mashable.com]:

Captain Kirk, Spock, and the rest of the Star Trek gang were in constant dialogue with the onboard Enterprise computer, asking it questions about the starship and their alien environments.

With NASA reviving its human space exploration program [mashable.com] in a matter of days through Artemis, it seems only natural that the real astronauts of the 2020s crewing the forthcoming missions would do the same. After all, boldly going where no one has gone before could be lonely, and having an A.I. sidekick might help on those long voyages.

When Lockheed Martin, the company that built the new Orion spacecraft [nasa.gov] for NASA, first dreamed up the talking computer, engineers figured they'd just throw an Amazon Echo Dot on the dashboard with a laptop and call it a day. But it wasn't nearly that simple, said Rob Chambers, Lockheed's director of commercial civil space strategy.

Beyond technical constraints, they had to overcome the menacing representations of an inflight space computer, in the vein of Stanley Kubrick's 2001: A Space Odyssey. Unlike the collegial computer in Star Trek, "HAL" starts to glitch [mashable.com], takes control of the spacecraft, and then fights the crew's attempts to shut it down.

That's not merely a concern raised through science fiction. This summer A.I. developer Blake Lemoine, formerly of Google [mashable.com], went public with his belief that a chatbot he helped build had become sentient. The story sparked a global conversation about whether some artificial intelligence is — or could be — conscious.

[Image: William Shatner as Capt. James T. Kirk on Star Trek talks to the starship Enterprise computer.]

Such claims reinforce fears long embedded in popular culture: that one day the advanced technology enabling humans to achieve extraordinary things could become too smart, perhaps leading to machines that are self-aware and want to hurt people.

"We don't want the HAL 9000, 'I'm sorry, Dave. I can't open the pod bay doors,'" Chambers told Mashable. "That's the first thing that everybody said when we first suggested this."


Rather, Lockheed Martin and its collaborators believe having a voice-activated virtual assistant and video calls in the spacecraft would be more convenient for astronauts, affording them access to information away from the crew console. That flexibility might even keep them safer, engineers say.

An experiment to test the technology will ride along with Artemis [nasa.gov] on its first spaceflight, which could launch as early as Aug. 29. The project, named Callisto after one of Artemis' favorite hunting companions in Greek mythology, is programmed to give crew live answers about the spacecraft's flight status and other data, such as water supply and battery levels. The technology is being paid for by the companies — not NASA.

A custom Alexa system built specifically for the spacecraft will have access to some 120,000 data readouts — more than astronauts have had before, with some bonus information previously only available within Houston's mission control.

[Image: Howard Hu, NASA's Orion deputy program manager, and Brian Jones, Lockheed Martin's chief engineer for the Callisto project, observe signals from the Orion spacecraft at NASA's Kennedy Space Center in Florida during a connectivity test.]

No astronaut will actually be onboard Orion for this first mission — unless the dummy in the cockpit [tumblr.com] counts. But the inaugural 42-day spaceflight [nasa.gov], testing various orbits and atmosphere reentry, will clear the way for NASA to send a crew on subsequent missions. Whether a virtual assistant is integrated into the spacecraft for those expeditions depends on a successful demonstration during Artemis I.

To test their Alexa, mission control will use video-conferencing software provided by Cisco Webex [webex.com] to ask questions and give verbal commands inside the spacecraft. Cisco will run its software on an iPad in the capsule. Cameras mounted all over Orion will monitor how it's working.


For the most part, the virtual assistant will be answering queries, like "Alexa, how fast is Orion traveling?" and "Alexa, what's the temperature in the cabin?" The only thing the system can actually control are the lights, said Justin Nikolaus, an Alexa voice designer on the project.

"As far as control of the vehicle, we don't have access to any critical components or mission critical software onboard," Nikolaus told Mashable. "We're safely sandboxed in Orion."
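The sandboxing Nikolaus describes — the assistant can read telemetry freely but actuate only the lights — amounts to a whitelist between the voice system and the vehicle. A minimal sketch of that idea, with entirely hypothetical names and values (nothing here reflects Lockheed's actual software):

```python
# Hypothetical sketch of "safely sandboxed" command handling: the assistant
# may read telemetry, but the only subsystem it can actuate is the lights.
READABLE_TELEMETRY = {"velocity_mph": 24500, "cabin_temp_f": 72, "battery_pct": 98}
ACTUATABLE = {"lights"}  # whitelist of controllable subsystems

def handle(request: str):
    """Requests look like 'query:<reading>' or 'command:<subsystem>'."""
    kind, _, target = request.partition(":")
    if kind == "query":
        return READABLE_TELEMETRY.get(target, "unknown reading")
    if kind == "command":
        return f"{target} toggled" if target in ACTUATABLE else "denied: outside sandbox"
    return "unrecognized request"

print(handle("query:velocity_mph"))     # reads telemetry freely
print(handle("command:lights"))         # the one permitted actuation
print(handle("command:pod_bay_doors"))  # everything else is refused
```

The point of the whitelist shape is that mission-critical software never appears in the assistant's reachable set at all, rather than being guarded by per-request checks.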

The space-faring Alexa might not seem so advanced. But engineers had to figure out how to get the device to recognize a voice in a tin can. The acoustics of Orion, with mostly metal surfaces, were unlike anything developers have encountered before. What they learned from the project is now being applied to other challenging sound environments on Earth, like detecting speech in a moving car with the windows rolled down, Nikolaus said.

The most significant change from off-the-shelf Amazon devices is that the system will debut a new technology the company calls "local voice control," which allows Alexa to work without an internet connection. Back on Earth, Alexa operates on the cloud, which runs on the internet and uses computer servers warehoused in data centers.

In deep space, when Orion is hundreds of thousands of miles away, the time delays to reach the cloud would be, shall we say, astronomical. Looking toward the future, that lag could stretch from seconds to an hour to transmit messages back and forth to a spacecraft on its way to Mars [nasa.gov], about 96 million miles from Earth.
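The lag is simple light-travel arithmetic. A quick sketch, using the distances cited above (the Mars figure is the article's; actual Earth–Mars distance varies widely over the orbital cycle):

```python
# One-way light-travel delay over a straight-line distance, illustrating
# why a cloud round trip is impractical in deep space.
SPEED_OF_LIGHT_MPS = 299_792_458   # meters per second
MILES_TO_METERS = 1609.344

def one_way_delay_seconds(miles: float) -> float:
    """Light-travel time in seconds across the given distance in miles."""
    return miles * MILES_TO_METERS / SPEED_OF_LIGHT_MPS

moon = one_way_delay_seconds(239_000)     # average Earth-Moon distance
mars = one_way_delay_seconds(96_000_000)  # the Mars figure cited above

print(f"Moon: {moon:.1f} s one way")        # about 1.3 s
print(f"Mars: {mars / 60:.1f} min one way") # about 8.6 min
```

Double that for a round trip, and a cloud-backed assistant near the Moon would already pause several seconds per answer; near Mars, a quarter of an hour or more.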

That's why engineers built a spacecraft computer to handle the data processing, Chambers said.

"It's not canned things. It's actual real-time processing," he said. "All that smarts has to be on the spacecraft because we didn't want to suffer the time lag of going back up to the spacecraft, back down to Earth, back up, and back down again."

[Image: NASA added a new 111-foot beam waveguide antenna to the Deep Space Network ground station in Madrid in February 2022.]

For the questions that Alexa can't handle offline, Callisto will tap into the Deep Space Network [nasa.gov], the radio dish system NASA uses to communicate with its farthest spacecraft, and route the signals to the cloud on Earth. This could allow Callisto to support a wider range of requests, like reading the news or reporting sports scores.
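One way to picture this hybrid design is a simple router: intents about the vehicle are answered by the onboard "local voice control" model with no network delay, while general requests are relayed over the Deep Space Network to the cloud. The intent names and categories below are hypothetical, chosen only to mirror the examples in the article:

```python
# Hypothetical routing sketch: vehicle-state intents resolve onboard;
# everything else is deferred to the (slow) deep-space cloud link.
LOCAL_INTENTS = {"flight_status", "water_supply", "battery_level", "cabin_temp"}

def route(intent: str) -> str:
    if intent in LOCAL_INTENTS:
        return "local: answered onboard, no network delay"
    return "cloud: relayed via the Deep Space Network, expect lag"

print(route("battery_level"))  # handled by local voice control
print(route("sports_scores"))  # routed to the cloud on Earth
```

The design choice mirrors edge computing generally: keep latency-sensitive, safety-relevant lookups local, and tolerate lag only for requests where freshness off the vehicle matters more than speed.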

Or ordering more toilet paper and trash bags — seriously.

The designers built in the capability for astronauts to buy things from Amazon. Overnight delivery to the moon wouldn't be an option, but sending flowers to a spouse on Earth for a special occasion would.

Cisco also will use the Deep Space Network to provide video-conferencing calls. Engineers say astronauts would be able to use this tool for "whiteboarding" meetings with their Houston colleagues. Imagine how handy that would have been for the Apollo 13 [nasa.gov] crew as NASA tried to talk them through making a square air filter fit into a round hole [nasa.gov] with no visual aids.


Original Submission