
SoylentNews is people

posted by takyon on Tuesday May 08 2018, @03:11PM   Printer-friendly
from the false-negative dept.

The first machine to kill a human entirely on its own initiative was "Likely Caused By Software Set to Ignore Objects On Road," according to a new report on the collision, which happened this past March:

The car's sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber's software decided it didn't need to react right away. That's a result of how the software was tuned. Like other autonomous vehicle systems, Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company's system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn't react fast enough, one of these people said.

Fast enough? She walked across three and a half lanes in what should have been plain view of the car's LIDAR the entire time.
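For readers wondering what "tuned to ignore false positives" means in practice, here is a minimal sketch of how a perception pipeline might discard low-confidence detections. All names and threshold values are hypothetical illustrations, not Uber's actual software; the point is only that raising the filter threshold trades fewer phantom-braking events for slower reactions to real obstacles.

```python
# Hypothetical sketch of false-positive filtering in a perception pipeline.
# Names and threshold values are illustrative, not Uber's actual software.

def should_brake(detections, threat_threshold):
    """Brake only for detections whose threat score clears the threshold.

    Each detection is a (label, threat_score) pair; threat_score in [0, 1]
    might combine size, solidity, and trajectory estimates.
    """
    for label, threat_score in detections:
        if threat_score >= threat_threshold:
            return True  # treated as a real obstacle: brake
    return False         # everything below the threshold is a "false positive"

# A pedestrian walking a bicycle may score lower than expected if the
# classifier is unsure whether it sees a person, a bike, or road debris.
frame = [("plastic_bag", 0.10), ("pedestrian_with_bicycle", 0.55)]

assert should_brake(frame, threat_threshold=0.40) is True   # cautious tuning
assert should_brake(frame, threat_threshold=0.70) is False  # tuned "too far"
```

Under this (assumed) model, "the tuning went too far" simply means the threshold was set high enough that a borderline-but-real detection was filtered out along with the plastic bags.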

takyon: Also at Reuters. Older report at The Drive.

Previously: Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Video Released of Fatal Uber - Pedestrian Accident, and More


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by Justin Case on Tuesday May 08 2018, @03:20PM (39 children)

    by Justin Case (4239) on Tuesday May 08 2018, @03:20PM (#677044) Journal

    Suppose I make a machine that has the ability to kill people. Not a gun, where the kill decision is made by a human, but a machine that can make that decision on its own initiative.

    Now I add the ability for this machine to wander around in public, so fast, in fact, that you can't possibly run away.

    Let me give it the capacity to kill not just enemy troops, nor even members of some experimental team who have consented to accept the risk -- no -- I will deliberately include in my design the ability to kill people who are entirely uninvolved in the project. People who never agreed to accept the risk (you know, except for the generic Planetary Occupancy EULA) or were even warned that they would be in the crosshairs (you know, except for the generic Planetary Occupancy EULA).

    Does this sound like a good idea to anyone??? Wouldn't I be severely prosecuted for even attempting to produce such a weapon? Is there any justification I could offer that would suffice?

    • (A) I plan to make a lot of money from this. Nope.
    • (B) But other people are doing it. Haha.
    • (C) It will be so super cool. Seriously? That's a defense now?
    • (D) I "promise" (trust me!) it will do some good things too. OK so we've reached the point where we can calculate the conditions under which it is OK for me to design and build a killbot???

    Assuming this goes to trial (and it should) and a jury finds people guilty (and I suppose they should), I fully support the death penalty for those responsible. Moreover, I will be happy to personally swing a sledge hammer, on live TV, into the heads of any

    • programmer
    • project "manager" barking "is it done yet?"
    • actual manager, all the way up to CEO
    • negligent QA tester
    • marketdroid

    or any other person who knowingly and with utter disregard for human life participated in the design, manufacture, promotion, or rollout of this killbot technology. Yes I'm serious. Premeditated murder of random bystanders is inexcusable.

    No, I do not buy the argument that "other lives will be saved". For one thing, all the claims about what SDCs "will be" are pure speculation. You don't suddenly get the ability to predict the future just because you add "on a computer".

    I have a modest proposal for you. Let's give Donald Trump permission to kill people.

    We'll debate it in Congress, all proper like, then hold a vote. Even though SDC deployment has not been similarly vetted.

    Here's the plan: Donald Trump can kill people any time he wants. No advance notice, no trial, no opportunity to defend yourself. You won't even be put on notice that you're in his crosshairs, other than the general notice given to the entire world that he now has this power and can use it with no checks or limits.

    Your *only* possible defense will be to stay at least 50 feet away from any expanse of pavement. Forever.

    We'll mutter some vague platitudes about how giving Trump authority to murder on any whim will cost fewer than 30,000* lives per year, so it will be OK. No actual guarantee, of course, that the limit will not be exceeded, or penalties when it is.

    * substitute whatever number you think should be here

    Meanwhile, nobody seems to be considering the potential for mass exploits. Ever heard of a software zero-day? They are discovered all the time, thanks to careless software development practices combined with management haste to get it out the door. Is there any reason at all to expect SDC software will be different? I mean, besides platitudes and empty promises? Before you answer, consider that the present case involves "mis-tuned" software. Someone made that decision. Someone who is now a murderer.

    Anyway, consider yourself lucky if you have never experienced a software hack that takes down thousands of machines at once. It is entirely possible, and devastating. But so far this has been limited to loss of data and compute power, not loss of life. Picture the hack that sends a million cars into 100-mile-per-hour chaos mode. What will this do to your "lives saved" argument?

    Software development practices are nowhere near reliable enough to bet our lives on them, and likely will not get there for decades, if ever. And even if the software is perfect (it never is) you can't even trust your hardware!

    Starting Score:    1  point
    Moderation   +2  
       Flamebait=1, Insightful=1, Interesting=2, Disagree=2, Total=6
    Extra 'Interesting' Modifier   0  
    Karma-Bonus Modifier   +1  

    Total Score:   4  
  • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @03:30PM (8 children)

    by Anonymous Coward on Tuesday May 08 2018, @03:30PM (#677046)

    Anyway, consider yourself lucky if you have never experienced a software hack that takes down thousands of machines at once. It is entirely possible, and devastating. But so far this has been limited to loss of data and compute power, not loss of life. Picture the hack that sends a million cars into 100-mile-per-hour chaos mode. What will this do to your "lives saved" argument?

    To be fair, killing people by remotely controlling the computers in cars is almost certainly possible already but doesn't seem to happen much. I guess most people just aren't really into mass murder.

    • (Score: 1, Funny) by Anonymous Coward on Tuesday May 08 2018, @03:36PM (6 children)

      by Anonymous Coward on Tuesday May 08 2018, @03:36PM (#677049)

      Isn't that where you show up at church, and start shooting people? Or drowning them in the baptismal?

      • (Score: 2, Flamebait) by realDonaldTrump on Tuesday May 08 2018, @05:29PM (4 children)

        by realDonaldTrump (6614) on Tuesday May 08 2018, @05:29PM (#677102) Homepage Journal

        May God be w/ the people of Charleston, South Carolina and Sutherland Springs, Texas. The tragedies there are incomprehensible. My deepest condolences to all.

        • (Score: 2) by SomeGuy on Wednesday May 09 2018, @05:31PM (3 children)

          by SomeGuy (5632) on Wednesday May 09 2018, @05:31PM (#677514)

          Just so everyone is clear, there is no such thing as "God".

          A certain president needs to learn this.

          • (Score: 0, Touché) by Anonymous Coward on Wednesday May 09 2018, @07:14PM (2 children)

            by Anonymous Coward on Wednesday May 09 2018, @07:14PM (#677573)

            If he wants to believe in God that's his right, just like it's your right to not believe. Otherwise you are no better than the thought police.

            • (Score: 2) by SomeGuy on Wednesday May 09 2018, @09:05PM (1 child)

              by SomeGuy (5632) on Wednesday May 09 2018, @09:05PM (#677616)

              Wow that is a retarded response. The president is supposed to be representing all of the people in the USA, and as such should not be publicly promoting a particular religion (or social media platform). That is the issue.

              So all the idiots who tell me I should learn about Gerd are also no better than the thought police? Riiight, imaginary sky fairy worshipers are ALWAYS the good guys and everyone else is BAAAAD.

              • (Score: 0) by Anonymous Coward on Thursday May 10 2018, @02:33PM

                by Anonymous Coward on Thursday May 10 2018, @02:33PM (#677881)

                > Wow that is a retarded response.

                This is usually a tell of cognitive dissonance, where the other person starts with an insult. Let's continue.

                > The president is supposed to be representing all of the people in the USA

                The USA is largely a Christian nation, and it's a message most wish to hear. If it offends you, just don't listen. No one is forcing you to listen; you choose to. So what if he did? Should I be offended when someone wishes me well? Nonsense. I just accept it regardless of what the person says, religious or not, and take away the main message: that they care. I don't know why you are so hung up on this; you're one of the few atheists I've ever seen get this bent out of shape over it.

                > So all the idiots who tell me I should learn about Gerd are also no better than the thought police?

                Again a personal attack. People who believe are idiots? Do you have factual studies to back this up? There have been plenty of brilliant people who have believed in God (the Big Bang theory was conceived by a Catholic priest), and many who have not, such as Stephen Hawking, or many others who escape me at this moment.

                I never said you should learn about God either; I simply said you should respect their right, otherwise why should the other side respect your right to not believe? At the end of the day, I'd rather we just respect each other's beliefs, but you seem not to be willing to do that. You wish to foist your non-belief on me. I don't know why you feel the need to do this, or get personal like this, but I hope you realize that not everyone is out to force this on you.

      • (Score: 2) by DannyB on Tuesday May 08 2018, @06:39PM

        by DannyB (5839) Subscriber Badge on Tuesday May 08 2018, @06:39PM (#677130) Journal

        Spike the communion grape juice.

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 2, Insightful) by nitehawk214 on Tuesday May 08 2018, @08:05PM

      by nitehawk214 (1304) on Tuesday May 08 2018, @08:05PM (#677163)

      The only way it will be useful is if you can program it to kill a specific person. Will the car loiter around the building where the person was last seen? Will the car need to go refuel if the person is difficult to locate or stays inside the building? Is it ok for the car to kill other people in order to kill the target? Does the car need programming to evade law enforcement? What if the target is in another autonomous vehicle? Can the cars negotiate with each other on who should do the killing? Will the car sacrifice itself, such as driving off a cliff or getting into a collision where the car will not survive, in order to achieve its goal?

      Wait, are we not talking about how to best turn these things into weapons?

      --
      "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
  • (Score: 4, Insightful) by Runaway1956 on Tuesday May 08 2018, @03:56PM (11 children)

    by Runaway1956 (2926) Subscriber Badge on Tuesday May 08 2018, @03:56PM (#677065) Journal

    Good post. I wonder if we might introduce a couple more considerations?

    For one thing, all the claims about what SDCs "will be" is pure speculation.

    At the present time, some asshole can stand on his brakes, just for the fun of it. He knows that the law will be on his side if he is rear-ended, and he can be an asshat if he wants to be. The people programming today's "autonomous" vehicles know, just as well as the asshat, that most vehicles are operated by some halfwit with a cell phone. They know that needlessly spiking the brake may well result in the halfwit driving up their non-existent smoke holes. Programmers want to avoid accidents, so they try to balance the margins of error.

    Let's step forward 25 years. Now, about half the people on the road are riding in autonomous vehicles. The chance of having your smoke hole violated has dropped significantly. Shouldn't that mean the programmers have made adjustments to their margins of error?

    Step forward another 25 years. Almost no one drives for himself anymore. Spiking the brakes for the shadow of a bird passing overhead should be fine. Every car around you is networked in, and they know why you are braking madly. No one is going to violate your nonsensical anatomy.

    You're right, I can't know what these vehicles will become in the future. But, we can extrapolate some reasonable expectations. I think that ultimately, people will probably be safer because the car can think and react faster than humans do. But, there WILL BE a lot of lifeless bodies along the way.

    I mean, we didn't just tame horses overnight, all those millennia ago. We aren't going to tame computers overnight either.

    • (Score: 4, Insightful) by The Mighty Buzzard on Tuesday May 08 2018, @05:50PM (10 children)

      SDCs cannot be held accountable for their actions. Human beings can. Therein lies the fundamental difference.

      --
      My rights don't end where your fear begins.
      • (Score: 2) by Runaway1956 on Tuesday May 08 2018, @06:08PM (9 children)

        by Runaway1956 (2926) Subscriber Badge on Tuesday May 08 2018, @06:08PM (#677113) Journal

        Yes, but, a corporation MIGHT be held accountable for the actions of their products. The typical manager wants to decrease the possibility that he or his company might have to answer for the actions of those self-driving cars. Buying off congress critters and judges has its limits. And, the peasants with torches and pitchforks don't care how much you've paid the judge.

        • (Score: 2) by The Mighty Buzzard on Tuesday May 08 2018, @07:15PM (2 children)

          Which is still offering up a less viable solution to a problem that's been laid to rest longer than you've been alive.

          --
          My rights don't end where your fear begins.
          • (Score: 2) by arslan on Wednesday May 09 2018, @04:19AM (1 child)

            by arslan (3462) on Wednesday May 09 2018, @04:19AM (#677335)

            True, but if the upsides on the new one are more than the old one that should be taken into account when just comparing the downsides which as you've pointed out are not equal.

            If a half-drunk-most-of-the-time-potentially-brain-damaged Runaway can stitch together some lucid upsides, I'm sure most folks here can too with better quality at that.. if they're not too biased one way or another of course.

            • (Score: 2) by The Mighty Buzzard on Wednesday May 09 2018, @07:52AM

              Nah, positive results are not what matters. If they were, we'd be sticking everyone in the slums in gas chambers. They do commit most of the crimes after all. I do not want decisions made based purely on results, humanity needs to be involved even if the same or worse decisions are reached.

              --
              My rights don't end where your fear begins.
        • (Score: 1, Insightful) by Anonymous Coward on Tuesday May 08 2018, @07:33PM (1 child)

          by Anonymous Coward on Tuesday May 08 2018, @07:33PM (#677150)

          How do you hold a corporation responsible? Corporate death penalty? Massive fine?

          Either way it's a bankruptcy proceeding, and the actual people start a new corporation with all the money that wasn't improperly shielded...

          The 'corporation' is barely a thing, it has no skin in the game because it has no skin.

          • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @11:14PM

            by Anonymous Coward on Tuesday May 08 2018, @11:14PM (#677226)

            Hmm, corporations are skinwalkers!?

        • (Score: 2) by number11 on Wednesday May 09 2018, @04:28AM (2 children)

          by number11 (1170) Subscriber Badge on Wednesday May 09 2018, @04:28AM (#677336)

          Yes, but, a corporation MIGHT be held accountable for the actions of their products.

          But they hardly ever are. What would be a 10 year sentence for an individual is a $10K fine for a corporation. Maybe if you expanded jail time to corporations (6 months means the marshals padlock your premises and freeze your bank accounts for 6 months). Yes, the investors and workers would whine that it hurt them, but criminals usually hurt innocent people. We don't let bank robbers off because it'll hurt their family.

          • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @07:25AM

            by Anonymous Coward on Wednesday May 09 2018, @07:25AM (#677358)

            Someone I know very well was working in a fintech corporation where his manager asked him to deliberately fudge the numbers and basically commit fraud. He raised an objection, so he was given the lowest possible rating and then asked to leave. So he hired a lawyer and decided to whistleblow. Guess what: since he had gotten a new job after being fired, it was NOT established in the court of law that any harm was done by the termination, and since he had full freedom to choose not to commit fraud, the court held that nothing bad happened.

            Fortunately he got one month's full pay from the company's HR, but this example is all one needs to see how corporations are judged.

          • (Score: 2) by tangomargarine on Wednesday May 09 2018, @03:06PM

            by tangomargarine (667) on Wednesday May 09 2018, @03:06PM (#677458)

            Maybe if you expanded jail time to corporations (6 months means the marshals padlock your premises and freeze your bank accounts for 6 months). Yes, the investors and workers would whine that it hurt them, but criminals usually hurt innocent people. We don't let bank robbers off because it'll hurt their family.

            I would think a better idea would be to just raise their corporate tax rate to 100% of profits for the same length of time. That way the workers are still getting paid.

            Although of course you'd need some oversight to make sure they don't fudge their numbers too hard and claim they're not making any profits like the movie studios.

            --
            "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @09:09AM

          by Anonymous Coward on Wednesday May 09 2018, @09:09AM (#677379)
  • (Score: 2) by insanumingenium on Tuesday May 08 2018, @03:58PM (6 children)

    by insanumingenium (4824) on Tuesday May 08 2018, @03:58PM (#677067) Journal
    Intention would be the difference you are totally ignoring here. Accidents happen. By your logic ALL vehicles are murder machines, replacing the computer with a human doesn't resolve your issues.

    In fact, it makes them a whole lot worse, the accident rate for autonomous vehicles is far lower than the human rate according to literally everything I have ever seen, and this isn't supposition about future cars, this is the cars already on the road which you are so upset about. If you have other numbers, I would love to see them. I agree, finding good numbers is hard.

    We don't ban cars from school zones, not because they can't kill, but because their benefits are perceived to outweigh the risks. What we are dealing with here is risk tolerance. And yes, there is always socialized risk to just about everything people do or don't do.

    Should we do more, better testing? Absolutely.
    Should we place more focus on security? Absolutely.

    Also, your understanding of the word murder is just plain wrong: murder requires intent, period. The fact that you also called it premeditated makes me think you won't understand that distinction either, but I had to say it.
    • (Score: 3, Insightful) by The Mighty Buzzard on Tuesday May 08 2018, @05:51PM (5 children)

      See above. You're taking something that we already deal with via severe punishment and replacing it with saying "oh well, computers don't know any better".

      --
      My rights don't end where your fear begins.
      • (Score: 2) by arslan on Wednesday May 09 2018, @04:30AM (4 children)

        by arslan (3462) on Wednesday May 09 2018, @04:30AM (#677338)

        But that's not what anyone is saying though, at least my read of it - and the op has a point about intent. If you're driving on a highway with no ped-x and some nut decides to play chicken and run across it in poor visibility, you don't get the "severe punishment" you mentioned - at least not where I'm sitting.

        On the other hand, if it can be shown that the developers/companies clearly had intent to ship flawed software that kills, they can be severely punished. Take the whole VW debacle as an example and project it onto this scenario: if management knowingly approved poorly written software for autonomous vehicles so they could hit their market deadlines and it ends up killing folks, you can punish them.

        It just so happens that with software, the likelihood of being able to assign blame is a lot lower than with a manned solution, but that doesn't necessarily mean the overall mortality rate is worse off - and that needs to be taken into account instead of just this debate of how to attribute blame in a bubble.

        • (Score: 3, Insightful) by The Mighty Buzzard on Wednesday May 09 2018, @07:55AM (3 children)

          No, the overall mortality rate is not as important as human beings being the ones making life and death decisions. That kind of thinking leads to SkyNet.

          --
          My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @03:02PM

            by Anonymous Coward on Wednesday May 09 2018, @03:02PM (#677457)

            So you're saying writing self-driving car code is a Trolley Problem?

          • (Score: 2) by arslan on Wednesday May 09 2018, @10:59PM (1 child)

            by arslan (3462) on Wednesday May 09 2018, @10:59PM (#677657)

            That's a far stretch from self-driving cars to Skynet... I want my shag-mobile dammit!!

  • (Score: 5, Touché) by tangomargarine on Tuesday May 08 2018, @04:06PM (3 children)

    by tangomargarine (667) on Tuesday May 08 2018, @04:06PM (#677071)

    Premeditated murder of random bystanders is inexcusable.

    The hell? This is absolutely not premeditated murder.

    pre·med·i·tate
    verb
    past tense: premeditated; past participle: premeditated
    think out or plan (an action, especially a crime) beforehand.

    The programmers never planned to kill anybody. What's the thing they say in law enforcement, motive, means, and opportunity? What is the programmer's motive to kill people with self-driving cars?

    And your comparison of a self-driving car to a killbot is stupid. Yes, a car is a 2-ton weapon. But if I could drop an apartment building on somebody, that would make it a weapon, too, wouldn't it?

    You typed a bunch of other words too but were frothing so hard I didn't bother to do more than skim.

    Moreover, I will be happy to personally swing a sledge hammer, on live TV, up to the heads of any

    programmer
    project "manager" barking "is it done yet?"
    actual manager, all the way up to CEO
    negligent QA tester
    marketdroid

    participated in the design, manufacture, promotion, or rollout of this [analogous self-driving car] technology.

    Seriously: get some help.

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 4, Interesting) by DeathMonkey on Tuesday May 08 2018, @06:56PM (2 children)

      by DeathMonkey (1380) on Tuesday May 08 2018, @06:56PM (#677134) Journal

      I don't know why the so-called-nerds on sites like this have such a hard time understanding that intent matters to the law.

      • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @11:16PM (1 child)

        by Anonymous Coward on Tuesday May 08 2018, @11:16PM (#677228)

        Because that involves squishy soft science stuff that doesn't easily fit into a switch statement. Throw in some emotional "logic" from the human side of the nerd and you have a recipe for insanity.

        • (Score: 1, Interesting) by Anonymous Coward on Wednesday May 09 2018, @03:18AM

          by Anonymous Coward on Wednesday May 09 2018, @03:18AM (#677317)

          Intent = mind reading if the individual has not written or spoken about their intent

  • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @04:24PM

    by Anonymous Coward on Tuesday May 08 2018, @04:24PM (#677079)

    Does this sound like a good idea to anyone???

    You're wasting your breath when the answer is obviously "yes": somebody already thought it was a good idea, or you wouldn't be here ranting.

    Moreover, I will be happy to personally swing a sledge hammer, on live TV, up to the heads of any other person who knowingly and with utter disregard for human life participated in the design, manufacture, promotion, or rollout of this killbot technology. Yes I'm serious. Premeditated murder of random bystanders is inexcusable.

    Oh good, sanity. Premeditated murder is bad, yet here you are declaring that you would happily murder people with full premeditation.

    No, I do not buy the argument that "other lives will be saved".

    Yeah, a lot of people find math hard and history boring.

    It's well-established that meatbags behind the wheel kill a lot of people. There's loads of documentation confirming that. Get the cars scanning the environment and intercommunicating and, as meatbags are removed from guiding their killing machines, it's inevitable that there will be fewer casualties. It won't happen tomorrow -- people won't want to give up control -- but give it time.

    I have a modest proposal for you. Let's give Donald Trump permission to kill people.

    This must come as a surprise, but the American people already did that. Believe it or not, they elected him president. Insane, right?

    Here's the plan: Donald Trump can kill people any time he wants. No advance notice, no trial, no opportunity to defend yourself. You won't even be put on notice that you're in his crosshairs, other than the general notice given to the entire world that he now has this power and can use it with no checks or limits.

    Already done.

    Your *only* possible defense will be to stay at least 50 feet away from any expanse of pavement. Forever.

    That doesn't work if you factor in the blue-uniformed gun-slingers.

    Meanwhile, nobody seems to be considering the potential for mass exploits. Ever heard of a software zero-day? They are discovered all the time, thanks to careless software development practices combined with management haste to get it out the door.

    Do you have evidence of that? Just because it's not in the news -- it's not exactly glamorous, so hardly surprising that it wouldn't make it to the news -- doesn't mean that nobody is considering the potential. I would be shocked if nobody was considering the potential. Not to say they won't handle it as poorly as is done with the many IoT devices out there, but people likely are considering it.

    Picture the hack that sends a million cars into 100-mile-per-hour chaos mode. What will this do to your "lives saved" argument?

    Depends on how they're configured, doesn't it? If they're configured to pull over and stop when things go wonky, there goes your "chaos mode".

    Software development practices are nowhere near reliable enough to bet our lives on them,

    People are nowhere near reliable enough to bet our lives on, yet we've done so for decades. Software doesn't have to be perfect (though it would be nice if it was), it just needs to be less unreliable than people for a net positive result.

  • (Score: 0, Flamebait) by realDonaldTrump on Tuesday May 08 2018, @05:43PM

    by realDonaldTrump (6614) on Tuesday May 08 2018, @05:43PM (#677106) Homepage Journal

    Thank you for your loyal support, Justin. But I already have the power to kill whoever I want -- without the Due Process. President Obama gave it to himself. He gave it a fancy name, I call it the kill list. Because that's what it is. The great American people didn't tell him "no." And they elected me, overwhelmingly. As everybody knows. I miss my old life, I sacrificed a lot to become President. But it comes with some nice perks!!

  • (Score: 2) by DeathMonkey on Tuesday May 08 2018, @06:58PM (2 children)

    by DeathMonkey (1380) on Tuesday May 08 2018, @06:58PM (#677135) Journal

    Not a gun, where the kill decision is made by a human,

    There are enough accidental gun deaths that you could continue with that metaphor if you wanted to.

    • (Score: 2, Disagree) by number11 on Wednesday May 09 2018, @04:30AM (1 child)

      by number11 (1170) Subscriber Badge on Wednesday May 09 2018, @04:30AM (#677337)

      Most gun deaths may involve not intent, but stupidity or negligence, on the part of the killer.

  • (Score: 2) by c0lo on Tuesday May 08 2018, @11:35PM

    by c0lo (156) Subscriber Badge on Tuesday May 08 2018, @11:35PM (#677238) Journal

    Suppose I make a machine that has the ability to kill people. Not a gun, where the kill decision is made by a human, but a machine that can make that decision on its own initiative.

    If you make this machine using the software industry way (need to release, the management says so. And don't forget the costs), expect that kill machine to pass the child-care certification on the basis of how many bugs the firmware will have.

    My point? Hanlon's razor likely applies in Uber's case; their management is just in damage control now (trying to save their skin while still keeping the 'Uber self-driving car' idea afloat).

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 2) by Mykl on Wednesday May 09 2018, @01:08AM

    by Mykl (1112) on Wednesday May 09 2018, @01:08AM (#677277)

    Wow, that's an overreaction.

    What's your view on the (extremely rare) death due to vaccination? Should all of the people involved in the supply chain of vaccines, which save millions of lives, be lined up and shot because one person died of an adverse reaction?