posted by martyb on Wednesday September 18 2019, @08:26AM
from the brain-rights dept.

Changes in technology often produce ethical quandaries that did not previously exist. The successful transplantation of human hearts led some to redefine death as "brain death", so as to allow the removal of organs for transplant. Now we may face a similar need for new definitions and limitations as technology moves into neural interfaces. The article is at Vox.

“Nothing was your own except the few cubic centimeters inside your skull.” That’s from George Orwell’s dystopian novel 1984, published in 1949. The comment is meant to highlight what a repressive surveillance state the characters live in, but looked at another way, it shows how lucky they are: At least their brains are still private.

Over the past few weeks, Facebook and Elon Musk’s Neuralink have announced that they’re building tech to read your mind — literally.

Mark Zuckerberg’s company is funding research on brain-computer interfaces (BCIs) that can pick up thoughts directly from your neurons and translate them into words. The researchers say they’ve already built an algorithm that can decode words from brain activity in real time.

And Musk’s company has created flexible “threads” that can be implanted into a brain and could one day allow you to control your smartphone or computer with just your thoughts. Musk wants to start testing in humans by the end of next year.

Of course, with medical technology, one could always make the argument that the point was saving human lives. Somehow we do not suspect that Zuckerberg or Musk are contaminated by such motives.

Your brain, the final privacy frontier, may not be private much longer.

Some neuroethicists argue that the potential for misuse of these technologies is so great that we need revamped human rights laws — a new “jurisprudence of the mind” — to protect us. The technologies have the potential to interfere with rights that are so basic that we may not even think of them as rights, like our ability to determine where our selves end and machines begin. Our current laws are not equipped to address this.
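
To make the decoding claim a bit more concrete, here is a minimal, purely illustrative sketch of what "words out of brain activity" can look like in software: a stock classifier trained on simulated "neural" feature vectors for a tiny made-up vocabulary. This is not the algorithm the Facebook-funded researchers built, nor anything Neuralink uses; the vocabulary, feature counts, and signals below are invented for the example.

# Toy illustration only: decode a tiny, invented vocabulary from simulated
# "neural" feature vectors with an off-the-shelf classifier. Nothing here
# reflects the actual research systems; all data and numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
vocab = ["yes", "no", "stop", "go"]      # hypothetical word set
n_features = 64                          # pretend per-sample electrode features

# Fake training data: each word gets its own noisy "neural signature".
signatures = rng.normal(size=(len(vocab), n_features))
X = np.vstack([sig + 0.5 * rng.normal(size=(200, n_features))
               for sig in signatures])
y = np.repeat(np.arange(len(vocab)), 200)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# "Real-time" decoding: classify one fresh noisy sample per word.
test = signatures + 0.5 * rng.normal(size=signatures.shape)
print([vocab[i] for i in decoder.predict(test)])

Real brain-computer interfaces deal with far noisier, higher-dimensional recordings and much larger vocabularies; the sketch only shows the shape of the problem: features in, words out, with whatever the decoder learns about you along the way being exactly the kind of data the rights below are meant to protect.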

It's an in-depth article; a few highlights:

One of the main people pushing for these new human rights is neuroethicist Marcello Ienca, a researcher at ETH Zurich, one of Europe’s top science and technology universities. In 2017, he released a paper outlining four specific rights for the neurotechnology age he believes we should enshrine in law. I reached out to ask what he thought of the recent revelations from Facebook and Neuralink.

The four rights are:

1. The right to cognitive liberty
You should have the right to freely decide you want to use a given neurotechnology or to refuse it.
. . .
2. The right to mental privacy
You should have the right to seclude your brain data or to publicly share it.
. . .
3. The right to mental integrity
You should have the right not to be harmed physically or psychologically by neurotechnology.
. . .
4. The right to psychological continuity
You should have the right to be protected from alterations to your sense of self that you did not authorize.

Alright, I know what you are thinking... wait, no, I don't! Not really. Let's keep it that way.


Original Submission

 
  • (Score: 2) by FatPhil on Wednesday September 18 2019, @10:51AM (5 children)

    by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Wednesday September 18 2019, @10:51AM (#895560) Homepage
    You won't have a choice - it will be bundled with something that you do want, and eventually need.

    I don't want to run untrusted code from an untrustworthy source in a sandbox known historically to be as tight as Michelle Dugger's furry front bottom, so I use NoScript to protect myself whilst browsing. However, in order to see the timetables for the 23 bus (which passes the street two to the left) and the 43A (which passes a zigzag away to the right) - and therefore decide which one I'm going for, preferably as an informed decision - I apparently now *need* to turn on JavaScript from several rando domains.

    It's the future, but it will be your head not your browser next time.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 2) by c0lo on Wednesday September 18 2019, @11:03AM (4 children)

    by c0lo (156) Subscriber Badge on Wednesday September 18 2019, @11:03AM (#895565) Journal

    You won't have a choice - it will be bundled with something that you do want, and eventually need.

    I'll always have the choice not to want that, and I don't see anything that I absolutely need that won't work without an implant.
    Unless you condition my food/water/air on my accepting an implant - which is to say you try to deny my right to life - and even then I still have a choice.

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0) by Anonymous Coward on Wednesday September 18 2019, @01:41PM (1 child)

      by Anonymous Coward on Wednesday September 18 2019, @01:41PM (#895633)

      I don't see anything that I absolutely need that won't work without an implant.

      Yet.

      • (Score: 2) by c0lo on Wednesday September 18 2019, @01:56PM

        by c0lo (156) Subscriber Badge on Wednesday September 18 2019, @01:56PM (#895639) Journal

        Well, there's a whole list of things I don't see yet, all much simpler than controlling the complexity of a human brain. Things like:

        1. fusion reactors
        2. self-driving cars
        3. the P5000 loader (wadda hell could be so hard to build this one? Do we actually need acid blooded aliens for it?)
        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0) by Anonymous Coward on Wednesday September 18 2019, @03:29PM (1 child)

      by Anonymous Coward on Wednesday September 18 2019, @03:29PM (#895686)

      Don't worry. This kind of shit will be implanted right after birth, when the subject doesn't have the choice of saying no.

      • (Score: 2) by fyngyrz on Wednesday September 18 2019, @11:06PM

        by fyngyrz (6567) on Wednesday September 18 2019, @11:06PM (#895886) Journal

        Here in the good ol' freedumb freedom-loving USA, what'll most likely happen is a lot of talk about "rights", and then the government will find a reason to say "no, we have to read your mind, because [Terrorism, The Children, the Homeland, OMG Sex, Taxes, Drug War, etc.]"

        You can pretty much count on it. If the tech arrives, US citizens will be at its mercy. Our history in this area is very consistent.

        --
        I wouldn't do anything for a Klondyke bar,
        but I'd do some sketchy stuff for pizza.