
SoylentNews is people

posted by janrinok on Tuesday April 05 2022, @05:47PM

Decoding Movement and Speech from the Brain of a Tetraplegic Person - Technology Org:

Every year, the lives of hundreds of thousands of people are severely disrupted when they lose the ability to move or speak as a result of spinal injury, stroke, or neurological diseases.

At Caltech, neuroscientists in the laboratory of Richard Andersen, James G. Boswell Professor of Neuroscience, and Leadership Chair and Director of the Tianqiao & Chrissy Chen Brain-Machine Interface Center, are studying how the brain encodes movements and speech, in order to potentially restore these functions to those individuals who have lost them.

Brain-machine interfaces (BMIs) are devices that record brain signals and interpret them to issue commands that operate external assistive devices, such as computers or robotic limbs. Thus, an individual can control such machinery just with their thoughts.

For example, in 2015, the Andersen team and colleagues worked with a tetraplegic participant to implant recording electrodes into a part of the brain that forms intentions to move. The BMI enabled the participant to direct a robotic limb to reach out and grasp a cup, just by thinking about those actions.

[...] The exact location in the brain where electrodes are implanted affects BMI performance and what the device can interpret from brain signals. In the previously mentioned 2015 study, the laboratory discovered that BMIs can decode motor intentions while a movement is still being planned, and thus before the onset of that action, provided they read signals from a high-level brain region that governs intentions: the posterior parietal cortex (PPC). Electrode implants in this area, then, could enable control of a much larger repertoire of movements than implants in more specialized motor areas of the brain.

Because of the ability to decode intention and translate it into movement, an implant in the PPC requires only that a patient thinks about the desire to grasp an object rather than having to envision each of the precise movements involved in grasping—opening the hand, unfolding each finger, placing the hand around an object, closing each finger, and so on.
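The idea of decoding an intended action from recorded neural activity can be illustrated with a toy statistical classifier. The sketch below is purely hypothetical: the synthetic "firing rates", channel count, and nearest-centroid method are illustration-only assumptions, not the decoding pipeline the Andersen lab actually used. It shows, in miniature, how averaging recorded activity per intended action ("fitting the statistics") lets a decoder map a new trial to the nearest known intention.

```python
# Toy sketch of statistical intent decoding (NOT the study's method).
# Everything here -- channel count, actions, signal model -- is a
# made-up assumption for illustration.
import random

random.seed(0)

ACTIONS = ["grasp", "release", "rest"]
N_CHANNELS = 8  # hypothetical number of recording channels

def simulate_trial(action):
    """Synthetic firing-rate vector: each action drives two channels harder."""
    rates = [random.gauss(5.0, 1.0) for _ in range(N_CHANNELS)]
    idx = ACTIONS.index(action)
    for ch in range(idx * 2, idx * 2 + 2):
        rates[ch] += 10.0  # action-specific channels fire more
    return rates

def fit_centroids(trials):
    """Mean firing-rate vector per intended action (the learned statistics)."""
    centroids = {}
    for action in ACTIONS:
        vecs = [v for a, v in trials if a == action]
        centroids[action] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return centroids

def decode(centroids, rates):
    """Classify a new trial as the action with the nearest centroid."""
    def sq_dist(centroid):
        return sum((x - y) ** 2 for x, y in zip(rates, centroid))
    return min(centroids, key=lambda a: sq_dist(centroids[a]))

# "Training": record a few labeled trials per intended action.
training = [(a, simulate_trial(a)) for a in ACTIONS for _ in range(20)]
centroids = fit_centroids(training)

# Decode a fresh, unlabeled trial.
predicted = decode(centroids, simulate_trial("grasp"))
```

In a real BMI the recorded signals are far noisier and higher-dimensional, but the principle is the same: the decoder only needs to identify the high-level intention (e.g., "grasp"), not reconstruct every component movement.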

Journal Reference:
Sarah K. Wandelt, et al., Decoding grasp and speech signals from the cortical grasp circuit in a tetraplegic human [open], Neuron (DOI: 10.1016/j.neuron.2022.03.009)


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Tuesday April 05 2022, @07:10PM (#1235047) (1 child)

    I was expecting the use of trained AI.
    Instead, they used statistics.
    Interesting, anyway.

    CYA

    • (Score: 1, Touché) by Anonymous Coward on Sunday April 10 2022, @05:48PM (#1236030)

      Most modern AI is just statistics with a lot of marketing and buzzwords.

      If there were actual intelligence involved, training the AIs wouldn't need zillions of samples.
  • (Score: -1, Troll) by Anonymous Coward on Tuesday April 05 2022, @08:15PM (#1235073)

    He managed to get his buttocks posting journals! Magic......

  • (Score: 0) by Anonymous Coward on Wednesday April 06 2022, @02:25PM (#1235226)

    Are tetraplegics British?
