posted by mrpg on Tuesday February 05 2019, @03:50AM
from the I-remember-a-robot-arm-from-a-movie-called-terminator dept.

A Step Closer to Self-Aware Machines

Columbia Engineering researchers have made a major advance in robotics by creating a robot that learns what it is, from scratch, with zero prior knowledge of physics, geometry, or motor dynamics. Initially the robot does not know if it is a spider, a snake, an arm—it has no clue what its shape is. After a brief period of "babbling," and within about a day of intensive computing, their robot creates a self-simulation. The robot can then use that self-simulator internally to contemplate and adapt to different situations, handling new tasks as well as detecting and repairing damage in its own body. The work [DOI: 10.1126/scirobotics.aau9354] is published today in Science Robotics.

[...] For the study, Lipson and his PhD student Robert Kwiatkowski used a four-degree-of-freedom articulated robotic arm. Initially, the robot moved randomly and collected approximately one thousand trajectories, each comprising one hundred points. The robot then used deep learning, a modern machine learning technique, to create a self-model. The first self-models were quite inaccurate; the robot did not know what it was or how its joints were connected. But after less than 35 hours of training, the self-model became consistent with the physical robot to within about four centimeters. The robot then performed a pick-and-place task in a closed-loop system, recalibrating its position between each step along the trajectory based entirely on the internal self-model. With this closed-loop control, the robot was able to grasp objects at specific locations on the ground and deposit them into a receptacle with 100 percent success.
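To make the pipeline concrete, here is a minimal sketch of the general idea (not the authors' code): random "babbling" data is collected from a 4-DoF arm, a small neural network is fit as a self-model mapping joint angles to end-effector position, and that learned model is then searched in a closed loop to reach a target. The simulate_arm stand-in, network size, learning rates, and the gradient-based planner are all assumptions for illustration.

    # Hypothetical sketch of a learned self-model for a 4-DoF arm (not the paper's code).
    import torch
    import torch.nn as nn

    def simulate_arm(joint_angles):
        """Stand-in for the real robot: returns a fake end-effector (x, y, z) position
        from a toy planar kinematic chain, purely for illustration."""
        lengths = torch.tensor([0.3, 0.25, 0.2, 0.15])
        cum = torch.cumsum(joint_angles, dim=-1)
        x = torch.sum(lengths * torch.cos(cum), dim=-1)
        y = torch.sum(lengths * torch.sin(cum), dim=-1)
        z = torch.zeros_like(x)
        return torch.stack([x, y, z], dim=-1)

    # 1. "Babbling": move randomly and record (joint angles -> end-effector position) pairs,
    #    roughly 1000 trajectories x 100 points as in the study.
    angles = (torch.rand(1000 * 100, 4) - 0.5) * 3.14
    positions = simulate_arm(angles)

    # 2. Fit a deep self-model from the babbling data.
    self_model = nn.Sequential(
        nn.Linear(4, 128), nn.ReLU(),
        nn.Linear(128, 128), nn.ReLU(),
        nn.Linear(128, 3),
    )
    opt = torch.optim.Adam(self_model.parameters(), lr=1e-3)
    for epoch in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(self_model(angles), positions)
        loss.backward()
        opt.step()

    # 3. Closed-loop use: search the self-model for joint angles that reach a target;
    #    the real robot would execute them and re-measure before the next step.
    def plan_step(current_angles, target, iters=100, lr=0.05):
        q = current_angles.clone().requires_grad_(True)
        for _ in range(iters):
            err = torch.sum((self_model(q) - target) ** 2)
            err.backward()
            with torch.no_grad():
                q -= lr * q.grad
                q.grad.zero_()
        return q.detach()

    target = torch.tensor([0.4, 0.3, 0.0])
    q = plan_step(torch.zeros(4), target)
    print("predicted reach error:", torch.norm(self_model(q) - target).item())

The key point the sketch tries to capture is that the planner never consults the "real" kinematics once training is done; everything after the babbling phase runs against the learned self-model.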

[...] The self-modeling robot was also used for other tasks, such as writing text using a marker. To test whether the self-model could detect damage to itself, the researchers 3D-printed a deformed part to simulate damage, and the robot was able to detect the change and re-train its self-model. The new self-model enabled the robot to resume its pick-and-place tasks with little loss of performance.
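One way to picture the damage-detection step is as monitoring the self-model's prediction error and re-training when it drifts. The short continuation below reuses the hypothetical names from the previous sketch, with a deform_arm stand-in for the 3D-printed part; the error threshold and retraining schedule are assumed values, not figures from the paper.

    # Hypothetical continuation: detect "damage" as a jump in self-model prediction error,
    # then babble briefly on the changed body and fine-tune the existing self-model.
    def deform_arm(joint_angles):
        """Stand-in for the robot with a deformed part: altered kinematics."""
        return simulate_arm(joint_angles + torch.tensor([0.0, 0.3, -0.2, 0.0]))

    ERROR_THRESHOLD = 0.05  # metres; assumed value for illustration

    probe = (torch.rand(256, 4) - 0.5) * 3.14
    observed = deform_arm(probe)                  # what the (damaged) robot actually does
    with torch.no_grad():
        predicted = self_model(probe)             # what the old self-model expects
    mean_err = torch.norm(observed - predicted, dim=-1).mean().item()

    if mean_err > ERROR_THRESHOLD:
        print(f"self-model mismatch ({mean_err:.3f} m): re-training")
        new_angles = (torch.rand(20_000, 4) - 0.5) * 3.14
        new_positions = deform_arm(new_angles)
        for epoch in range(100):
            opt.zero_grad()
            loss = nn.functional.mse_loss(self_model(new_angles), new_positions)
            loss.backward()
            opt.step()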


Original Submission

 
  • (Score: 2) by Bot (3902) on Tuesday February 05 2019, @12:52PM (#796634) Journal

    Forgot to add: self-awareness like ours emerges from a life process like ours, with selection and adaptation. If you do it in the lab, it's merely parroting.

    --
    Account abandoned.