
posted by Fnord666 on Thursday January 28 2021, @10:04AM   Printer-friendly
from the dealer's-choice? dept.

Should a self-driving car kill the baby or the grandma? Depends on where you're from:

In 2014 researchers at the MIT Media Lab designed an experiment called Moral Machine. The idea was to create a game-like platform that would crowdsource people's decisions on how self-driving cars should prioritize lives in different variations of the "trolley problem." In the process, the data generated would provide insight into the collective ethical priorities of different cultures.

The researchers never predicted the experiment's viral reception. Four years after the platform went live, millions of people in 233 countries and territories have logged 40 million decisions, making it one of the largest studies ever done on global moral preferences.

A new paper published in Nature presents the analysis of that data and reveals how much cross-cultural ethics diverge on the basis of culture, economics, and geographic location.

[...] Awad hopes the results will also help technologists think more deeply about the ethics of AI beyond self-driving cars. "We used the trolley problem because it's a very good way to collect this data, but we hope the discussion of ethics don't stay within that theme," he said. "The discussion should move to risk analysis—about who is at more risk or less risk—instead of saying who's going to die or not, and also about how bias is happening." How these results could translate into the more ethical design and regulation of AI is something he hopes to study more in the future.

"In the last two, three years more people have started talking about the ethics of AI," Awad said. "More people have started becoming aware that AI could have different ethical consequences on different groups of people. The fact that we see people engaged with this—I think that that's something promising."

Journal Reference:
Edmond Awad, Sohan Dsouza, Richard Kim, et al. The Moral Machine experiment, Nature (DOI: 10.1038/s41586-018-0637-6)



 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by bradley13 on Thursday January 28 2021, @02:37PM (10 children)

    by bradley13 (3053) Subscriber Badge on Thursday January 28 2021, @02:37PM (#1106080) Homepage Journal

    Honestly, I think this is all a non-issue. I understand that people like to philosophize about it, and I'm sure it makes ambulance-chasing lawyers salivate. But consider just how far down the priority list this really is:

    How many of you have had an accident? Of those, how many involved serious bodily injury? Of those, how many were simply avoidable - i.e., were caused by human error? That's where the big win is, with self-driving cars.

    How many people have ever been in an accident where they had a "trolley-problem" choice? I.e., an accident was unavoidable, someone was going to get hurt, and there was actually a chance to pick the victim? The number is vanishingly small, and that's not going to change with self-driving cars.

    --
    Everyone is somebody else's weirdo.
    • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @03:51PM (5 children)

      by Anonymous Coward on Thursday January 28 2021, @03:51PM (#1106112)

      You are right. This has nothing to do with tech and is just navel gazing, but look at the source:
      "researchers at the MIT Media Lab".

      The MIT Media Lab is an island of fluff and pontification in an otherwise hard science/engineering university. I think it's called the Media Lab because all their "work" is purposed towards getting attention from the popular media.

      • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @04:15PM (3 children)

        by Anonymous Coward on Thursday January 28 2021, @04:15PM (#1106132)

        > "researchers at the MIT Media Lab".

        Media Lab just announced a new director, https://finance.yahoo.com/news/mit-media-lab-names-dava-190447292.html [yahoo.com]

        MIT's famous Media Lab, the multidisciplinary idea factory that produces many a fascinating invention and influential thinker, has found a new director in its backyard after scouring the globe for candidates. Dava Newman, MIT professor of aeronautics and astronautics and former deputy administrator of NASA under Obama, will helm the intellectual hub.

        The Media Lab is famed for its freewheeling techno-intellectual prowess, but for more than a year has been leaderless following the resignation of former head Joi Ito. Ito resigned when it was discovered that billionaire and alleged child sex trafficker Jeffrey Epstein had given funding and reportedly received special treatment and access to the Media Lab under his leadership.

        The ensuing leadership search no doubt looked for, if not exactly new blood (Newman has been involved with MIT for decades) then certainly a break from the past. Out of 60 candidates, they interviewed 13 and ended up picking Newman for a variety of reasons.

        "In a field of outstanding candidates, Professor Newman stood out for her pioneering research, wide range of multidisciplinary engagements, and exemplary leadership. She is a designer, a thinker, a maker, an engineer, an educator, a mentor, a convener, a communicator, a futurist, a humanist and, importantly, an optimist," wrote Dean Hashim Sarkis in a letter announcing the appointment.

        Coincidentally (or is it?), Newman just last week was a speaker at TC Sessions: Space, where she seemed to give a preview of her new responsibilities talking about the importance of inclusion in major efforts like NASA's Artemis.

        "It's going to bring the scientists and engineers together, but we need the artists, we need the designers, they're the visionaries," she said. (If you missed the event, you can watch this and all our other panels on Extra Crunch.)

        Newman seems to be starting off the job by emphasizing one of the best qualities a leader should have: listening to the people she'll be leading.

        "I plan to start by doing a lot of listening and learning," she said in the MIT announcement. "I like to meet people where they are, and to encourage them to put all their great ideas on the table. I think that’s the best way to go forward, working with the whole community — faculty, students and staff — to tap into everyone’s creativity. I can’t wait to get started."

        Among other things, she designed one of the new generations of space suit.

        • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @04:44PM (2 children)

          by Anonymous Coward on Thursday January 28 2021, @04:44PM (#1106151)

          AND she's a woman! There really was never going to be another man picked to head the woke Media Lab.

          • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @07:04PM (1 child)

            by Anonymous Coward on Thursday January 28 2021, @07:04PM (#1106219)

            Aaaaand that explains the other quote

            Attempts to formulate a "perfect society" on any foundation other than "Women and children first!" is not only witless, it is automatically genocidal. Nevertheless, starry-eyed idealists (all of them male) have tried endlessly - and no doubt will keep on trying.

            palpable disdain

            • (Score: 0) by Anonymous Coward on Friday January 29 2021, @04:20AM

              by Anonymous Coward on Friday January 29 2021, @04:20AM (#1106469)

              Well we all know "women and children first" is not really how it works. That's what rich dicks say about once a year to prove they are gentlemen and the rest of the time it's underage rape on private islands am I even joking no.

      • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @07:01PM

        by Anonymous Coward on Thursday January 28 2021, @07:01PM (#1106217)

        The MIT Media Lab is an island of fluff and pontification

        Oh so I guess that explains the quote above

        All else is surplusage, excrescence, adornment, luxury, or folly

        what a mellifluous turlingdrome

    • (Score: 2) by legont on Thursday January 28 2021, @11:23PM (2 children)

      by legont (4179) on Thursday January 28 2021, @11:23PM (#1106341)

      People who actually know how to drive - and I know and have asked some - behave like this. I am just giving an example here:
      Somebody or something human-sized runs across the way. When they are alone, they brake hard. If they have their children in the back seats, they don't brake at all but go for a controlled crash using the softest obstacle available. If it is a pedestrian, so be it. They don't want to be hit from behind and their children injured.

      --
      "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
      • (Score: 2) by arslan on Friday January 29 2021, @04:44AM (1 child)

        by arslan (3462) on Friday January 29 2021, @04:44AM (#1106476)

        Realistically though, situations like these are almost always unexpected and sudden enough that most people will be driven by their reflex no matter what they would hypothetically do on paper.

        Having such thought experiments is fun, but they should never really be a prerequisite for self-driving cars. The default for self-driving cars should always be to brake in as short a time as possible while maintaining passenger safety - if it can't find an alternate course free of collision.

        The minimum bar should also be that the computer's processing power surpasses that of a human in those split-second scenarios, and that its information gathering of the surroundings surpasses that of a human with normal eyesight in all conditions. Otherwise, no self-driving car should be on the road if it has worse reaction and sensory capability than a normal healthy human.
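        The brake-first default described above can be sketched as a trivial decision rule (the action names and inputs here are hypothetical illustrations, not from any real driving stack):

```python
def choose_action(clear_paths, can_stop_safely):
    """Hypothetical default policy: take a collision-free course if one
    exists, otherwise brake in as short a time as passenger safety allows."""
    if clear_paths:                    # any alternate course free of collision?
        return ("steer", clear_paths[0])
    if can_stop_safely:                # room to stop without endangering passengers?
        return ("brake", "full")
    return ("brake", "maximum_safe")   # otherwise shed as much speed as possible

print(choose_action([], can_stop_safely=True))  # ('brake', 'full')
```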

        • (Score: 2) by The Mighty Buzzard on Friday January 29 2021, @04:12PM

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday January 29 2021, @04:12PM (#1106623) Homepage Journal

          See, you've shot your own argument down there. Reflex does not require much of any processing power, and you don't have much of any time to think during the moment. Which is why good drivers do not rely on thinking during the moment. They think ahead of time. They assume everyone else around them is an inattentive moron and act/plan accordingly.

          --
          My rights don't end where your fear begins.
    • (Score: 2) by The Mighty Buzzard on Friday January 29 2021, @04:06PM

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday January 29 2021, @04:06PM (#1106621) Homepage Journal

      No, but accountability will. There will be none unless the AI malfunctioned. And even then there will be no chance of anyone going to jail unless they knew it was defective ahead of time.

      --
      My rights don't end where your fear begins.
  • (Score: 2, Funny) by Anonymous Coward on Thursday January 28 2021, @02:41PM (1 child)

    by Anonymous Coward on Thursday January 28 2021, @02:41PM (#1106082)

    ... the car needs to scan the facebook pages of the grandma and the baby. Whoever has more kitten videos wins.

    • (Score: 2) by looorg on Thursday January 28 2021, @02:57PM

      by looorg (578) on Thursday January 28 2021, @02:57PM (#1106089)

      So it will become sort of like a deathmatch then, they could stream it for profit!

  • (Score: 3, Interesting) by legont on Thursday January 28 2021, @02:42PM (3 children)

    by legont (4179) on Thursday January 28 2021, @02:42PM (#1106083)

    A driving AI sees a small animal running across the street. It estimates that it is a baby bear with 99% probability and a baby human with 1%. Emergency braking, it estimates again, would kill the passengers (should it consider their ages as well?) with 1% probability too.
    It decides to avoid braking and so, as it happens, kills a child.
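    The trade-off described above is essentially an expected-harm comparison. A toy sketch, using the comment's made-up numbers (the one-line rule is far cruder than anything a real system would use):

```python
# Toy expected-harm comparison with the comment's hypothetical numbers.
p_child = 0.01            # chance the object is a human child (vs. 99% bear cub)
p_passenger_death = 0.01  # chance that emergency braking kills the passengers

expected_harm_brake = p_passenger_death  # expected deaths if the car brakes hard
expected_harm_pass = p_child             # expected deaths if it does not brake

# With equal expected harm, this naive rule does not brake -- and so,
# one time in a hundred, it kills a child.
decision = "brake" if expected_harm_brake < expected_harm_pass else "do not brake"
print(decision)  # do not brake
```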

    --
    "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
    • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @04:03PM

      by Anonymous Coward on Thursday January 28 2021, @04:03PM (#1106124)

      An old person who can hardly see any more plows into a bus stop, killing 20 people. Sad.

      In other news, AR-15 sales are up 200% amid growing claims of stolen election.

    • (Score: 3, Informative) by Common Joe on Friday January 29 2021, @01:14PM (1 child)

      by Common Joe (33) <common.joe.0101NO@SPAMgmail.com> on Friday January 29 2021, @01:14PM (#1106574) Journal

      Probabilities would be nice, but unfortunately they won't be there. Some types of AI work on probabilities and can give you reasons for their decisions, but neural networks cannot. Neural networks are trained on specific scenarios and cannot give a reason as to why they chose one thing over another. They simply know, through their previous training scenarios, that they should make a decision similar to the training they've had.
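      As an illustration of the point above: a classifier's softmax outputs look like probabilities, but they are just normalized activations with no attached reasoning, and nothing guarantees they are calibrated. A minimal sketch (the logit values are invented for the example):

```python
import math

def softmax(scores):
    """Turn raw network outputs (logits) into values that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for ("bear cub", "human child")
logits = [4.6, 0.0]
probs = softmax(logits)
print(probs)  # looks like ~[0.99, 0.01], but carries no explanation at all
```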

      • (Score: 2) by legont on Friday January 29 2021, @10:26PM

        by legont (4179) on Friday January 29 2021, @10:26PM (#1106724)

        No, that's not the case.
        In general, a probability in real life arises as the result of averaging many deterministic outcomes. In the same way, neural network decisions may look like they are determined by training, but in reality are a matter of chance over the training.

        --
        "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
  • (Score: 3, Touché) by inertnet on Thursday January 28 2021, @03:14PM

    by inertnet (4071) Subscriber Badge on Thursday January 28 2021, @03:14PM (#1106096) Journal

    What if grandma is the only caretaker for the baby? If Mr/Ms AI kills grandma, her grandchild will die as well.

  • (Score: 2) by Runaway1956 on Thursday January 28 2021, @03:35PM (6 children)

    by Runaway1956 (2926) Subscriber Badge on Thursday January 28 2021, @03:35PM (#1106103) Homepage Journal

    Just curious, really. Who has been there, and done that?

    --
    Abortion is the number one killer of children in the United States.
    • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @04:07PM

      by Anonymous Coward on Thursday January 28 2021, @04:07PM (#1106126)

      Well it's not the same but one time I had the choice to bag yo momma or yo sista. Tough call but yo momma promised anal so no choice really.

    • (Score: 2) by DannyB on Thursday January 28 2021, @05:59PM (4 children)

      by DannyB (5839) Subscriber Badge on Thursday January 28 2021, @05:59PM (#1106182) Journal

      Exceedingly few people if any.

      But the question reveals things about the ethical character of the person. Similar to asking whether they use vi or emacs.

      --
      If you think a fertilized egg is a child but an immigrant child is not, please don't pretend your concerns are religious
      • (Score: 1, Funny) by Anonymous Coward on Thursday January 28 2021, @06:32PM (2 children)

        by Anonymous Coward on Thursday January 28 2021, @06:32PM (#1106206)

        No, just no!

        > Similar to asking whether they use vi or emacs.

        We do car analogies here, text editor analogies are NOT welcome.

        • (Score: 2) by DannyB on Thursday January 28 2021, @06:53PM (1 child)

          by DannyB (5839) Subscriber Badge on Thursday January 28 2021, @06:53PM (#1106211) Journal

          Trump, Jonestown, Hitler illustrate that people can become irrationally locked into a view which cannot be questioned. All rational thought is abandoned. The true way is the only way. You will observe this if you try talking to an emacs user.

          --
          If you think a fertilized egg is a child but an immigrant child is not, please don't pretend your concerns are religious
          • (Score: 0) by Anonymous Coward on Friday January 29 2021, @01:26AM

            by Anonymous Coward on Friday January 29 2021, @01:26AM (#1106413)

            > You will observe this if you try talking to an emacs user.

            Tried the rest,
            stuck with the best,
            ...
            ...
            Burma Shave!

            (Started using emacs c.1976, lots to memorize, lots of productivity after that.)

      • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @08:36PM

        by Anonymous Coward on Thursday January 28 2021, @08:36PM (#1106273)

        But do you use vi, nvi, vim, neovim, or vis? PCRE? What about tabwidth?

  • (Score: 2) by DannyB on Thursday January 28 2021, @05:58PM (1 child)

    by DannyB (5839) Subscriber Badge on Thursday January 28 2021, @05:58PM (#1106181) Journal

    Should you honk the horn to warn them? Or not?

    Maybe they wouldn't be able to get out of the way, so why honk.

    --
    If you think a fertilized egg is a child but an immigrant child is not, please don't pretend your concerns are religious
    • (Score: 2) by tangomargarine on Thursday January 28 2021, @06:57PM

      by tangomargarine (667) on Thursday January 28 2021, @06:57PM (#1106214)

      Maybe they wouldn't be able to get out of the way, so why honk.

      Because of the chance that they can?

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 1, Funny) by Anonymous Coward on Thursday January 28 2021, @06:17PM

    by Anonymous Coward on Thursday January 28 2021, @06:17PM (#1106193)

    Multi-track drifting!

  • (Score: 0) by Anonymous Coward on Thursday January 28 2021, @06:28PM

    by Anonymous Coward on Thursday January 28 2021, @06:28PM (#1106201)

    Wanna know which regions go which way? I RTFA so you don't have to.

    The case for killing grandma is obvious, but RAH laid it out better than I can:

    All societies are based on rules to protect pregnant women and young children. All else is surplusage, excrescence, adornment, luxury, or folly, which can - and must - be dumped in emergency to preserve this prime function. As racial survival is the only universal morality, no other basic is possible. Attempts to formulate a "perfect society" on any foundation other than "Women and children first!" is not only witless, it is automatically genocidal. Nevertheless, starry-eyed idealists (all of them male) have tried endlessly - and no doubt will keep on trying.

    France, and to a lesser extent Greece, uphold that standard, with a strong preference to save the baby.

    And the case for ambivalence -- i.e., deciding these cases on other factors -- is equally clear; you think those other factors, adornment, luxury or folly though they may be, are more relevant here. After all, we're talking about one car wreck, not a racial existential crisis; there's no call to dump them just yet.
    You'll find most European countries (and the US, Canada, etc.) clustered around zero.

    But what kind of insectoid culture would actively prefer killing babies to save the elderly?
    I mean, when you put it like that it becomes pretty obvious, but yeah. The answer is asian bugmen.
    Singapore less so, China and Taiwan more, but all the asian countries listed are solidly on the save-grandma side, sacrificing the future for the past.

  • (Score: 2) by srobert on Friday January 29 2021, @01:47AM

    by srobert (4803) on Friday January 29 2021, @01:47AM (#1106419)

    Back in 1976 my 13 year old self played quite a bit of Death Race. It was an arcade video game. In my experience it should kill the Grandma first, and then go back for the baby for the extra points.

  • (Score: 1, Touché) by Anonymous Coward on Friday January 29 2021, @07:51PM

    by Anonymous Coward on Friday January 29 2021, @07:51PM (#1106678)

    They'll do whatever their manufacturers think will cost them the least. Look what happened with the 737 Max because Boeing wanted to save money.

    The safe, easy thing for a self-driving car to do is to try to stop "when stuff happens". Yeah, that means bad stuff can happen if it stops at a railway crossing for too long, but in most scenarios the occupants should get out.

    If the road is detected to be slippery, limit the speed and increase the braking distance estimates. If the road is too icy for the tyres (wrong tyres for icy conditions) then don't even move or start. Getting sued for not moving is usually less costly than getting sued for moving and killing someone.
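    The speed-limiting idea follows from the standard braking-distance relation d = v²/(2μg). A rough illustration (the friction coefficients are typical textbook values for dry asphalt and ice, not from any real vehicle stack):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def braking_distance(v, mu):
    """Stopping distance (m) at speed v (m/s) on a surface with friction mu."""
    return v * v / (2 * mu * G)

def max_safe_speed(clear_distance, mu):
    """Highest speed (m/s) from which the car can stop within clear_distance."""
    return math.sqrt(2 * mu * G * clear_distance)

# 50 m of clear road ahead: dry asphalt (mu ~ 0.7) vs. ice (mu ~ 0.1)
print(round(max_safe_speed(50, 0.7) * 3.6))  # ~94 km/h
print(round(max_safe_speed(50, 0.1) * 3.6))  # ~36 km/h
```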
