Alexa's chief scientist thinks the assistant needs a robot body to understand the world
Amazon's Rohit Prasad, head scientist of the Alexa division, says the company's personal software assistant would be far smarter if it had a robot body with cameras so it could move around in the real world. Speaking at MIT Technology Review's EmTech Digital AI conference in San Francisco yesterday, Prasad said, "The only way to make smart assistants really smart is to give it eyes and let it explore the world."
Some Alexa-enabled smart devices already have cameras, but a robot body would be new. Prasad's comments suggest Amazon's work could be in service of one day giving Alexa a body, although he wouldn't confirm this directly. Since Prasad works on natural language processing and other machine learning capabilities for Alexa, he is likely one of the few Amazon employees who could easily test such features.
Someday, we may truly be able to have sex with Alexa.
Related: Amazon Plans to Add Alexa Voice Support to Microwaves, Amplifiers, Subwoofers, and "In-Car Gadgets"
(Score: 0) by Anonymous Coward on Thursday March 28 2019, @07:29AM (2 children)
If a machine runs into a wall, it can learn to map around it.
But how can it tell the difference between a wall of wood, concrete, water, or fire?
Do these machines need a nervous system like ours?
(Score: 2) by c0lo on Thursday March 28 2019, @10:15AM (1 child)
Senses like ours at the very least.
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 0) by Anonymous Coward on Thursday March 28 2019, @01:44PM
Stick it in a Quake game with Frogbots and let it learn?