

posted by hubie on Tuesday December 02, @11:40AM
from the is-this-your-card? dept.

Ethicists say AI-powered advances will threaten the privacy and autonomy of people who use neurotechnology:

Before a car crash in 2008 left her paralysed from the neck down, Nancy Smith enjoyed playing the piano. Years later, Smith started making music again, thanks to an implant that recorded and analysed her brain activity. When she imagined playing an on-screen keyboard, her brain–computer interface (BCI) translated her thoughts into keystrokes — and simple melodies, such as 'Twinkle, Twinkle, Little Star', rang out.

But there was a twist. For Smith, it seemed as if the piano played itself. "It felt like the keys just automatically hit themselves without me thinking about it," she said at the time. "It just seemed like it knew the tune, and it just did it on its own."

Smith's BCI system, implanted as part of a clinical trial, trained on her brain signals as she imagined playing the keyboard. That learning enabled the system to detect her intention to play hundreds of milliseconds before she consciously attempted to do so, says trial leader Richard Andersen, a neuroscientist at the California Institute of Technology in Pasadena.

[...] Andersen's research also illustrates the potential of BCIs that access areas outside the motor cortex. "The surprise was that when we go into the posterior parietal, we can get signals that are mixed together from a large number of areas," says Andersen. "There's a wide variety of things that we can decode."

The ability of these devices to access aspects of a person's innermost life, including preconscious thought, raises the stakes on concerns about how to keep neural data private. It also poses ethical questions about how neurotechnologies might shape people's thoughts and actions — especially when paired with artificial intelligence.

Meanwhile, AI is enhancing the capabilities of wearable consumer products that record signals from outside the brain. Ethicists worry that, left unregulated, these devices could give technology companies access to new and more precise data about people's internal reactions to online and other content.

Ethicists and BCI developers are now asking how previously inaccessible information should be handled and used. "Whole-brain interfacing is going to be the future," says Tom Oxley, chief executive of Synchron, a BCI company in New York City. He predicts that the desire to treat psychiatric conditions and other brain disorders will lead to more brain regions being explored. Along the way, he says, AI will continue to improve decoding capabilities and change how these systems serve their users. "It leads you to the final question: how do we make that safe?"

[...] Although accurate user numbers are hard to gather, many thousands of enthusiasts are already using neurotech headsets. And ethicists say that a big tech company could suddenly catapult the devices to widespread use. In 2023, for example, Apple patented a design for EEG sensors for future use in its AirPods wireless earphones.

Yet unlike BCIs aimed at the clinic, which are governed by medical regulations and privacy protections, the consumer BCI space has little legal oversight, says David Lyreskog, an ethicist at the University of Oxford, UK. "There's a wild west when it comes to the regulatory standards," he says.

In 2018, Marcello Ienca, an ethicist at the Technical University of Munich in Germany, and his colleagues found that most consumer BCIs don't use secure data-sharing channels or implement state-of-the-art privacy technologies. "I believe that has not changed," Ienca says. What's more, a 2024 analysis of the data policies of 30 consumer neurotech companies by the Neurorights Foundation, a non-profit organization in New York City, showed that nearly all of the companies claimed complete control over the data users provided. That means most firms can use the information as they please, including selling it.

Responding to such concerns, the government of Chile and the legislators of four US states have passed laws that give direct recordings of any form of nerve activity protected status. But Ienca and Nita Farahany, an ethicist at Duke University in Durham, North Carolina, fear that such laws are insufficient because they focus on the raw data and not on the inferences that companies can make by combining neural information with parallel streams of digital data. Inferences about a person's mental health, say, or their political allegiances could still be sold to third parties and used to discriminate against or manipulate a person.

"The data economy, in my view, is already quite privacy-violating and cognitive-liberty-violating," Ienca says. Adding neural data, he says, "is like giving steroids to the existing data economy".

Several key international bodies, including the United Nations cultural organization UNESCO and the Organisation for Economic Co-operation and Development, have issued guidelines on these issues. Furthermore, in September, three US senators introduced a bill that would require the Federal Trade Commission to review how data from neurotechnology should be protected.
Heading to the clinic

While their development advances at pace, so far no implanted BCI has been approved for general clinical use. Synchron's device is closest to the clinic. This relatively simple BCI allows users to select on-screen options by imagining moving their foot. Because it is inserted into a blood vessel on the surface of the motor cortex, it doesn't require neurosurgery. It has proved safe, robust and effective in initial trials, and Oxley says Synchron is discussing a pivotal trial with the US Food and Drug Administration that could lead to clinical approval.

Elon Musk's neurotech firm Neuralink in Fremont, California, has surgically implanted its more complex device in the motor cortices of at least 13 volunteers, who are using it to, for example, play computer games and control robotic hands. Company representatives say that more than 10,000 people have joined waiting lists for its clinical trials.

At least five more BCI companies have tested their devices in humans for the first time over the past two years, making short-term recordings (on timescales ranging from minutes to weeks) in people undergoing neurosurgical procedures. Researchers in the field say the first approvals are likely to be for devices in the motor cortex that restore independence to people who have severe paralysis — including BCIs that enable speech through synthetic voice technology.

As for what's next, Farahany says that moving beyond the motor cortex is a widespread goal among BCI developers. "All of them hope to go back further in time in the brain," she says, "and to get to that subconscious precursor to thought."


Original Submission

 
This discussion was created by hubie (1068) for logged-in users only, but has now been archived. No new comments can be posted.
  • (Score: 2) by gnuman (5013) on Friday December 05, @10:15AM (#1425877)
    > Jews are above all nations and Chabad is above the Jews.

    So, if you replace any other group here, it's basically a statement of supremacy. It's kind of the opposite of what I wrote, too. You could just as well put Aryans and Nazis in there and it would fit well.

    It is impossible for me to not radicalize myself against anyone who systematically wants to colonize and enslave the whole planet, for thousands of years in a row. Intentionally destroying other cultures in the process...

    But you have to understand *why* they are doing this. And it's not just "them"; it's how any group finds and fights "enemies". They are doing this not because of kumbaya feelings, or even because they actually feel supreme. This all happens when you allow *fear* to drive you. Someone who says "I'm better than all of you untermenschen" is not saying it because they are happy or actually powerful. They do this because they are scared shitless of the very people they belittle. You may have noticed that it is always "we must defend ourselves from the ...." and then you fill in savages or whatever. This is a mindset of fear. This is how cults indoctrinate members.

    > You have just hardened me more radical.

    No, that I'm definitely not doing. You can't even look into the mirror, after all.

    You can fight fire with fire. You get more fire. Then the cycle repeats. Then, when everyone is tired, they try to find solace and magically rediscover faith. Eventually complacency sets in. Fear wedges itself into mindsets and then it drives hate. Then it's back to fighting fire with fire?

    Personally, I always thought that you have to be radical against radicals. In some ways, that gives satisfaction. But it does not prevent more radicals from radicalizing themselves. And today, that is even easier: for everyone, you can find a way to radicalize them; for everyone, you can find something that will disturb them. There is a reason why you do not want righteous people as members of the police -- you want people to enforce the law, not take it into their own hands.

    The *only* solution we have is faith in each other. That also means de-radicalizing the radicalized. Only then will things get better. More seriously, look around you. Look at your neighbourhood. Walk around and look. Talk to people. Then you will realize that the people living there are better off than almost anyone 100 years ago. Better than kings 200 years ago, that's for sure. We do not have 1 dead child for every 2 born -- that was *normal*, and had been for as long as humans have existed. So the question is: do you want to improve this, or burn it all down? Life is not as scary as you imagine. But without faith, it can be worse than you can imagine.
