posted by takyon on Tuesday January 01 2019, @04:50PM
from the defenseless-car dept.

The Old Gray Lady reports that the people of Chandler, AZ, a popular testing location for self-driving cars, are fighting back. Here are a couple of snippets from the longer article:

The [tire] slashing was one of nearly two dozen attacks on driverless vehicles over the past two years in Chandler, a city near Phoenix where Waymo started testing its vans in 2017. In ways large and small, the city has had an early look at public misgivings over the rise of artificial intelligence, with city officials hearing complaints about everything from safety to possible job losses.

Some people have pelted Waymo vans with rocks, according to police reports. Others have repeatedly tried to run the vehicles off the road. One woman screamed at one of the vans, telling it to get out of her suburban neighborhood. A man pulled up alongside a Waymo vehicle and threatened the employee riding inside with a piece of PVC pipe.

[...] "There are other places they can test," said Erik O'Polka, 37, who was issued a warning by the police in November after multiple reports that his Jeep Wrangler had tried to run Waymo vans off the road — in one case, driving head-on toward one of the self-driving vehicles until it was forced to come to an abrupt stop.

His wife, Elizabeth, 35, admitted in an interview that her husband "finds it entertaining to brake hard" in front of the self-driving vans, and that she herself "may have forced them to pull over" so she could yell at them to get out of their neighborhood. The trouble started, the couple said, when their 10-year-old son was nearly hit by one of the vehicles while he was playing in a nearby cul-de-sac.

"They said they need real-world examples, but I don't want to be their real-world mistake," said Mr. O'Polka, who runs his own company providing information technology to small businesses. "They didn't ask us if we wanted to be part of their beta test," added his wife, who helps run the business.

It looks like The New York Times used this article from December 11 as part of their story:

A slashed tire, a pointed gun, bullies on the road: Why do Waymo self-driving vans get so much hate?

This seems to be happening everywhere Waymo is testing, not just Chandler.

Lots of comments about this article on other sites; SoylentNews should get in on the fun too! A quote from a "media analyst" suggests that driverless cars are like scabs, hired to break a union strike.

Also at The Hill.


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by The Mighty Buzzard on Tuesday January 01 2019, @07:34PM (61 children)

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @07:34PM (#780679) Homepage Journal

    I mean, you're the media, right? Isn't it your job to tell people what to think? That means unless they're misbehaving and thinking for themselves anyway, you're qualified to speak for them.

    What self-driving vehicle folks can't seem to understand is that it is primarily an accountability issue, even if most folks can't articulate it well. Listen to their arguments. You really should, being as it's the majority opinion. People want someone to exist who can be held accountable in a severe way if a dangerous machine operating in heavily peopled areas kills or injures anyone. Many of us will simply never be willing to allow this level of caused harm to go unpunished, and you can't punish software.

    --
    My rights don't end where your fear begins.
  • (Score: 2, Disagree) by Anonymous Coward on Tuesday January 01 2019, @08:11PM (3 children)

    by Anonymous Coward on Tuesday January 01 2019, @08:11PM (#780699)

    Waymo has done its homework; its vehicles have already been tested over thousands of miles and thousands of hours, and they now have a safety record better than most humans. They should not be lumped in with the Tesla/Uber crashes. In fact, as soon as self-driving cars start becoming a sizable percentage of road traffic, watch traffic fatalities drop like a rock.

    • (Score: 3, Informative) by The Mighty Buzzard on Tuesday January 01 2019, @08:56PM

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @08:56PM (#780714) Homepage Journal

      You don't listen well. It's not about safety.

      --
      My rights don't end where your fear begins.
    • (Score: 2, Informative) by Anonymous Coward on Wednesday January 02 2019, @02:03AM (1 child)

      by Anonymous Coward on Wednesday January 02 2019, @02:03AM (#780831)
      Look at the videos on YouTube, including the AR, and read the comments. One person who did similar following/taping said that Waymo in automatic mode drives a bit better than a stoned driver and a bit worse than a first-time driver. You call that "better than most humans"? Even the AR video shows dangerous driving (dangerous by indecision, a moving obstacle). It is not safe to do what Waymos do on a busy street; this is why they drove their "thousands of miles" literally around the block in residential areas - a feat that a child can do. This is why they are hated - because they dare to test on small streets. I saw a Waymo on Central Expressway in Sunnyvale, and it was hopeless (25 in a 55 mph zone) - a church lady would look like a Formula 1 driver compared to it. (They are equally slow on other roads, but speed limits there are lower.) I do not know how often Waymos drive on freeways; chances are that a Waymo has difficulty entering the freeway because of dense traffic - it needs a large free space to merge into, and good luck finding one.
      • (Score: 5, Informative) by Knowledge Troll on Wednesday January 02 2019, @03:02AM

        by Knowledge Troll (5948) on Wednesday January 02 2019, @03:02AM (#780859) Homepage Journal

        I've seen a video on YouTube of a Waymo car taking a freeway on-ramp, indicating that it needed to merge, crawling along the on-ramp while trying to find a place to fit in, not finding one, and taking the next connected off-ramp.

        I've also seen a video of a Waymo car take a freeway on-ramp and be unable to find a spot to merge. There was no connected off-ramp that time, though, so it had to merge. It just came to a stop.

  • (Score: 3, Informative) by fritsd on Tuesday January 01 2019, @08:33PM (24 children)

    by fritsd (4586) on Tuesday January 01 2019, @08:33PM (#780704) Journal

    Good point. The companies owning those vehicles should be ordered to put their company name or logo, in large visible print, on their vehicles.

    Because near-accidents in this case are *not* caused by a single individual bad driver, but a misbehaving collective company AI. So you *are* allowed to tally them up: "all those vehicles of company X are driving like maniacs, let's sue their owners' pants off *before* they kill one of our kids!"

    I didn't understand your first paragraph at all.

    • (Score: 1, Interesting) by Anonymous Coward on Tuesday January 01 2019, @09:09PM

      by Anonymous Coward on Tuesday January 01 2019, @09:09PM (#780721)

      > put their company name or logo, in large visible print, on their vehicles.

      This. Like the signs on the back of semi trailers from some of the big trucking companies, "How is my driving? Call 1-800-xxx-xxxx to comment".

    • (Score: 2) by The Mighty Buzzard on Tuesday January 01 2019, @09:21PM (22 children)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @09:21PM (#780728) Homepage Journal

      Not enough. Maybe, maybe, if every corporate officer and programmer faced criminal charges for every single accident. Short of that, a lot of people are never going to be willing to share the roads with multi-ton killbots.

      --
      My rights don't end where your fear begins.
      • (Score: 5, Interesting) by bzipitidoo on Tuesday January 01 2019, @10:34PM (20 children)

        by bzipitidoo (4388) on Tuesday January 01 2019, @10:34PM (#780763) Journal

        In this discussion, I haven't read any mention, let alone appreciation, of just how dangerous our road transportation system is. Of all the things people do on a regular basis, traveling by car is by far the most dangerous activity. And then people don't show those dangers proper respect by driving carefully at all times. Instead, people take crazy risks, make things much worse, and think nothing of it. They talk and text on phones while driving, eat while driving, they let their emotions overcome their sense and risk other lives as well as their own to vent their road rage, they take significantly more risks and drive much more aggressively when running late, etc. Not to say there hasn't been progress. At least drunk driving has been slapped hard, and is now much rarer. And cars now have airbags and other safety improvements. Seatbelt use is high. But the highways are still a bloodbath and there's much more that is worth doing.

        What about dangerous intersections? How many "dead man's curves" are people still living with because no one can be bothered to force improvements to the road? One I know is US Highway 175 near its terminus in Dallas. They might have eliminated that sharp curve by now if it weren't in an area where the majority population is brown. How about railroad crossings? As rich as the US is, somehow we can't find the money to eliminate every grade-level crossing in the nation; instead we add more and adopt laws that, for instance, require school buses to stop at railroad crossings. And speaking of school buses, why aren't the kids all wearing seatbelts? When something new comes along, people go berserk and question its safety, unfairly holding the new to way, way higher standards, while year after year, bad, known dangers are grumbled about but left unchanged. People are too busy, and feel too helpless, to do anything about it.

        When there is sufficient motivation to force improvements, the forces of corruption are all too likely to warp it to their own nefarious desires for more profit, safety be damned. Red light cameras are an excellent example of that. Whether there are corrupt motivations behind the push to have more driverless vehicles is a good question. As usual, this story has focused on the drama. Dwelling on rock-throwing incidents distracts from the important issues. I can certainly see the profit motive pushing to replace human drivers with AI and trying to brush off real problems by talking up the rock throwing and other unhinged ranting, trying to lump in the people who are asking good questions with the people who are doing crazy crap. And meanwhile, real, known problems that can be fixed without busting the budget are left to fester another year.

        • (Score: 2) by The Mighty Buzzard on Tuesday January 01 2019, @11:15PM (19 children)

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @11:15PM (#780776) Homepage Journal

          Safety is irrelevant unless they can guarantee perfect safety. Accountability when it falls short of perfection like we have with human drivers is my primary concern.

          --
          My rights don't end where your fear begins.
          • (Score: 1) by fustakrakich on Wednesday January 02 2019, @02:02AM (14 children)

            by fustakrakich (6150) on Wednesday January 02 2019, @02:02AM (#780830) Journal

            Accountability when it falls short of perfection like we have with human drivers is my primary concern.

            *cough* What? Are we talking about accountability or falling short of perfection?

            Accountability should be simple: first is the owner (for maintenance issues), then the manufacturer. When we get to version 1.0, the operator should be as liable as a passenger in an elevator.

            --
            La politica e i criminali sono la stessa cosa..
            • (Score: 2) by The Mighty Buzzard on Wednesday January 02 2019, @12:30PM (13 children)

              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Wednesday January 02 2019, @12:30PM (#780990) Homepage Journal

              You reckon then that devs will be lining up to code software whose bugs could get them charged with vehicular manslaughter? Unless that's a possibility, you have removed human accountability entirely, and that is what people will not stand for.

              --
              My rights don't end where your fear begins.
              • (Score: 1) by fustakrakich on Wednesday January 02 2019, @08:38PM (12 children)

                by fustakrakich (6150) on Wednesday January 02 2019, @08:38PM (#781169) Journal

                It's up to the managers to make sure it's safe. Let's make 'em justify their salaries and bonuses. If a grunt fucks up, he gets the boot and a mark on his *Permanent Record* (watch out for that), unless you can prove malicious intent of course. How does it work in the military?

                --
                La politica e i criminali sono la stessa cosa..
                • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @01:01AM (11 children)

                  by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @01:01AM (#781287) Homepage Journal

                  Exactly, and that is utterly insufficient to anyone who's lost a loved one.

                  --
                  My rights don't end where your fear begins.
                  • (Score: 1) by fustakrakich on Thursday January 03 2019, @04:15AM (10 children)

                    by fustakrakich (6150) on Thursday January 03 2019, @04:15AM (#781381) Journal

                    So, arrest the entire team? How many degrees of separation are needed to ensure innocence? Are we going to let emotion drive the whole issue?

                    --
                    La politica e i criminali sono la stessa cosa..
                    • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @02:47PM (9 children)

                      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @02:47PM (#781478) Homepage Journal

                      When the emotion is overwhelming grief coupled with rage that you will receive nothing anywhere near justice? You bet your ass we are.

                      --
                      My rights don't end where your fear begins.
                      • (Score: 1) by fustakrakich on Thursday January 03 2019, @05:00PM (8 children)

                        by fustakrakich (6150) on Thursday January 03 2019, @05:00PM (#781545) Journal

                        There's a point when it's no longer justice but politics, and we have to push back. There comes a time when we have to tell people to fuck off when they become a mob. Shit happens, sometimes even by accident. You have to prove negligence and maliciousness or get the hell out.

                        --
                        La politica e i criminali sono la stessa cosa..
                        • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @09:04PM (7 children)

                          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @09:04PM (#781689) Homepage Journal

                          Luck with that. This country exists because of that attitude from government.

                          --
                          My rights don't end where your fear begins.
                          • (Score: 1) by fustakrakich on Thursday January 03 2019, @09:14PM (6 children)

                            by fustakrakich (6150) on Thursday January 03 2019, @09:14PM (#781694) Journal

                            Oh stop! We ARE the damn government!

                            --
                            La politica e i criminali sono la stessa cosa..
                            • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @10:50PM (5 children)

                              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @10:50PM (#781740) Homepage Journal

                              You think? That kind of makes me despair for your intelligence.

                              --
                              My rights don't end where your fear begins.
                              • (Score: 1) by fustakrakich on Thursday January 03 2019, @10:53PM (4 children)

                                by fustakrakich (6150) on Thursday January 03 2019, @10:53PM (#781743) Journal

                                You think?

                                I know! 95% of voters voted for exactly what we have. Can it be any more obvious?

                                --
                                La politica e i criminali sono la stessa cosa..
                                • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @11:13PM (3 children)

                                  by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @11:13PM (#781761) Homepage Journal

                                  You're assuming the dog is wagging the tail. That's a foolish assumption in politics.

                                  --
                                  My rights don't end where your fear begins.
                                  • (Score: 1) by fustakrakich on Thursday January 03 2019, @11:39PM (2 children)

                                    by fustakrakich (6150) on Thursday January 03 2019, @11:39PM (#781769) Journal

                                    Passivity is no excuse.

                                    --
                                    La politica e i criminali sono la stessa cosa..
          • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @06:57AM (3 children)

            by Anonymous Coward on Wednesday January 02 2019, @06:57AM (#780923)

            Most likely the car company will pay out settlements if it can be proven the AI was negligent.

            If you're looking for someone to send to jail, there probably won't be one. But that's already true [theguardian.com], and it's also not necessary if you're out for someone to take irrational vengeance on. Those who have killed other people with their cars already take irrational vengeance on themselves.

            Everyone reading this, think about how you would feel if you killed another human being with your car. Even if you don't get sent to jail for it. Even if it wasn't legally your fault but, maybe, with a little extra vigilance, you could have prevented the death. I've read other articles about this topic. Suicidal depression and lifelong guilt are the automatic, inescapable punishments for accidentally killing someone with your car.

            I looked into this topic when I hit a deer and it shook me up. I drive a lot more safely now; I make very few mistakes, and I usually notice my mistakes right after I make them. I don't make many mistakes: I'm a good driver. But I do make mistakes. It could happen. It probably won't, but if a pedestrian crosses against the light, or a cyclist hits a rock, and I'm not at the top of my game ... well probably I'd rather just end it in that case than live the rest of my life reliving that moment, trying to forgive myself for something I know I won't be able to.

            Self-driving cars can't come soon enough.

            • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @11:29AM

              by Anonymous Coward on Wednesday January 02 2019, @11:29AM (#780973)

              > Self-driving cars can't come soon enough.

              Per your description, your driving is well above average in terms of attention (or lack of distraction). Assuming that you also avoid other common hazards like driving drunk/impaired, road rage, and driving when sleepy, you are perhaps an order of magnitude "safer" than an average driver.

              I claim that you shouldn't use self-driving cars until their fleet average is as good as your demographic. That criterion is going to take massive amounts of testing time to establish, much more than Waymo (who I believe is the industry leader) has done to date. Think about it: Waymo has hundreds or perhaps a few thousand cars; the USA has ~250 million cars.

            • (Score: 2) by The Mighty Buzzard on Wednesday January 02 2019, @12:33PM

              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Wednesday January 02 2019, @12:33PM (#780992) Homepage Journal

              Money ain't a fair exchange for a life. That you're willing to allow it to shed your own accountability is fairly well sickening.

              --
              My rights don't end where your fear begins.
            • (Score: 2) by bzipitidoo on Thursday January 03 2019, @03:17AM

              by bzipitidoo (4388) on Thursday January 03 2019, @03:17AM (#781356) Journal

              I've had a near miss that way, so I know what you mean. Kid ran a red light as I was approaching the intersection. It was green my way, had been green long enough for half a dozen cars in front of me to go through. And there was another car waiting at the red, his way, and this inexperienced kid saw none of that. I could not see him until he was at the intersection, thanks to it being an underpass and his direction hidden from my view by an embankment. Nevertheless, I saw him as soon as he appeared, and I didn't like how fast he was approaching the intersection. Didn't look like he could stop in time. So I shifted one lane away, in case he came to a stop with his nose in the intersection. I never dreamed that he would instead punch it and leap out in front of me. Had I considered that possibility, maybe I could have swerved and avoided him. Maybe. Instead, I t-boned him. You know how time seems to slow down when you're in great danger? That's what it was like for me. I saw what to do, saw I might miss him if I swerved hard enough, saw that standing on the brake was not going to be enough, but I just couldn't seem to move fast enough. My arms and feet were in super slow motion. I didn't make it to the brake, and had only just started the swerve when we hit. Was still green my way when we came to a halt.

              Very lucky for him that I was driving a small, light vehicle, or someone could have been maimed or killed. Also, very lucky I had slowed a little upon seeing him approach too fast. I was going 10 under the speed limit of 55 mph. He and his two passengers were only cut up a little. The unoccupied seat in their car was where my car hit. As for me and my passengers, one had a broken ankle, one was only bruised and cut, and I was bruised, the deep kind that take months to heal up. Though the accident was totally his fault, I still would have felt terrible if anyone had been permanently maimed or worse.

              Had another idiot run a light in front of me a few years later. That time I was the lead vehicle. We were already in motion because we'd been waiting at another red light 2 blocks further back. Had I not taken action, I would have t-boned that car too. Instead, I swerved into the left turn lane and braked hard, completely avoiding the other driver. Also helped that we were all doing about 30 mph. And I'd thought about the previous experience, which helped greatly in taking decisive action instantly. Already had a plan, you know.

      • (Score: 1, Interesting) by Anonymous Coward on Wednesday January 02 2019, @02:25AM

        by Anonymous Coward on Wednesday January 02 2019, @02:25AM (#780843)

        Short of that, a lot of people are never going to be willing to share the roads with multi-ton killbots.

        Nobody is going to ask the people. Their lives are bought and sold at the state level. However, the people can tell the mayor or some other suitable politician(s) to prohibit such vehicles in the city.

  • (Score: 1) by khallow on Tuesday January 01 2019, @09:48PM (30 children)

    by khallow (3766) Subscriber Badge on Tuesday January 01 2019, @09:48PM (#780745) Journal

    People want someone to exist who can be held accountable in a severe way if a dangerous machine operating in heavily peopled areas kills or injures anyone.

    The company that made the dangerous machine is the obvious choice, unless someone's negligence elsewhere created the risk. Sounds like a solved problem to me.

    Sorry, but I think it's more the typical aversion to change that happens. Self-driving cars? It's icky until people get accustomed to them driving about.

    • (Score: 3, Insightful) by The Mighty Buzzard on Tuesday January 01 2019, @10:05PM (29 children)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @10:05PM (#780752) Homepage Journal

      When was the last time you saw a CEO or programmer held criminally liable for bad code? So, yeah, not really a solved problem. And how many people would be willing to take the jobs if jail time were a risk for any bug?

      --
      My rights don't end where your fear begins.
      • (Score: 1) by khallow on Tuesday January 01 2019, @10:52PM (28 children)

        by khallow (3766) Subscriber Badge on Tuesday January 01 2019, @10:52PM (#780767) Journal

        When was the last time you saw a CEO or programmer held criminally liable for bad code?

        When's the last time they churned out criminally liable bad code? I don't think that happens so often in the first place because it needs to be a crime first. Same goes for driving. Just because something bad happens or someone dies doesn't mean a crime happened. There's plenty of cases where people die on the road without a crime occurring.

        And how many people would be willing to take the jobs if jail time were a risk for any bug?

        Why would there be jail time? What crime happened? In US law, one has to show a certain level of negligence in order to win a civil case. That wouldn't be any different for cars no matter who drives them. There is such a thing as criminal negligence, but it requires things like a disregard for human life or safety as part of the negligence. Ignoring a bug that kills a thousand people a year? Gross negligence. Turning the business's annual review of the deaths known by the business to be caused by said bug into a drinking game? Criminal negligence.

        As I see it, the same sort of people who'd jail someone for dropping a semicolon in the code are a small subset of the people who will grow accustomed to self-driving cars eventually. Just because I live in a democracy doesn't mean that I should respect the passing hysteria of the public or that private projects should be subordinate to this hysteria.

        • (Score: 2) by The Mighty Buzzard on Tuesday January 01 2019, @11:16PM (2 children)

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday January 01 2019, @11:16PM (#780777) Homepage Journal

          You're not a Microsoft customer, I take it.

          --
          My rights don't end where your fear begins.
          • (Score: 1) by khallow on Wednesday January 02 2019, @03:28AM (1 child)

            by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:28AM (#780867) Journal
            I am, but they have successfully managed my expectations.
            • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @09:55AM

              by Anonymous Coward on Wednesday January 02 2019, @09:55AM (#780959)

              Ecce [nypost.com] a man of little needs.

        • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @02:53AM (3 children)

          by Anonymous Coward on Wednesday January 02 2019, @02:53AM (#780853)

          When's the last time they churned out criminally liable bad code?

          Uber's coders did that. Their code disabled the Volvo's factory automatic-braking function, silenced the alarm (so that the test driver would not be warned), and told the system not to brake for "insignificant" objects. Death resulted. The guilt of the programmers is separate from the guilt of the operator: one did not warn when they could; the other did not look when she was required to.

          In this case the relatives of the victim took the blood money. The guilty persons were not determined; everything was swept under the rug. Perhaps some other incident will result in a trial, and the whole country will be watching it. Many dirty secrets will be revealed. Techies know some of them, but once the whole country knows, Waymo/Uber/etc. will be in trouble.

          • (Score: 1) by khallow on Wednesday January 02 2019, @03:42AM (2 children)

            by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:42AM (#780869) Journal

            Their code disabled the Volvo's factory automatic braking function, silenced the alarm (so that the test driver will not be warned,) and told the system to not brake for "insignificant" objects.

            Ok, what makes that a crime? There's no law that explicitly criminalizes that. And they no doubt had ideas about how to compensate for disabling those systems.

            It's worth keeping in mind that disabling safety systems is not evidence of a crime, because the safety systems can fail to work properly in context. Consider a fire alarm that starts generating false alarms every fifteen minutes. One can't empty a busy office building or hotel and have the fire department conduct a search every time. And prohibiting habitation for the few days while the alarm is repaired can result in huge hardship. So a common approach is to manually patrol the building on a regular basis throughout the day (for example [ua.edu]) until the fire alarm is repaired.

            The alarm has been circumvented, but no crime has occurred because the people responsible have implemented alternate procedures for the alarm system's task.

            The same occurs here. Sure, these systems were disabled by Uber personnel in the accident that killed Elaine Herzberg. But the vehicle wasn't traveling fast and there was a human driver at the wheel. On paper, I'm sure they thought they had covered the dangers that these safety systems were supposed to address - which is particularly innocuous-looking since these systems aren't required for safe driving. There probably is a consistent pattern of taking shortcuts and complacency, but that isn't usually good enough to qualify as a crime in the absence of criminalizing regulations.

            • (Score: 2) by The Mighty Buzzard on Wednesday January 02 2019, @12:36PM (1 child)

              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Wednesday January 02 2019, @12:36PM (#780993) Homepage Journal

              That is precisely the problem. Removing accountability is bad, m'kay.

              --
              My rights don't end where your fear begins.
              • (Score: 1) by khallow on Wednesday January 02 2019, @03:03PM

                by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:03PM (#781032) Journal
                None of the systems mentioned provided accountability. They were optional systems installed by a manufacturer for a human driver not a self-driving car.
        • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @02:58AM (10 children)

          by Anonymous Coward on Wednesday January 02 2019, @02:58AM (#780855)

          > When's the last time they churned out criminally liable bad code?

          How about Boeing and their jet that insisted it needed to crash into the water, even after the pilots managed to keep it up during several previous dives? While the investigation isn't over yet, I'd say there is a good chance that there will be time in court (somewhere, maybe not in USA).

          • (Score: 1) by khallow on Wednesday January 02 2019, @03:43AM (9 children)

            by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:43AM (#780870) Journal

            How about Boeing and their jet that insisted it needed to crash into the water, even after the pilots managed to keep it up during several previous dives?

            What's the crime? Did Boeing personnel deliberately crash the jet?

            • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @08:08AM (8 children)

              by Anonymous Coward on Wednesday January 02 2019, @08:08AM (#780938)

              Only a court can tell what their crime is, if any. [Lion Air case] But they can be accused of a few misdeeds. For example, they didn't mention the new stability system in the pilot manuals. They didn't say a word about it during the differences training. (Pilots at American Airlines and Southwest did not know either. It's already clear that all MAX aircraft were dangerous from day 0 just because of the lack of training on a new system that can override the pilot.) The prosecutor will add several more charges from the book, just for the fact of a crash and multiple deaths. One lawsuit is already in process, per Wikipedia:

              On 31 December, the family of the first officer filed a lawsuit against Boeing, claiming negligence. The lawsuit also claimed that the aircraft's sensors provided inaccurate flight data resulting in the anti-stall system disengaging, as well as Boeing not providing proper instructions to pilots about how to handle the situation.

              • (Score: 1) by khallow on Wednesday January 02 2019, @03:03PM (7 children)

                by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:03PM (#781033) Journal
                Misdeeds != crimes.
                • (Score: 2) by Gaaark on Wednesday January 02 2019, @08:46PM (4 children)

                  by Gaaark (41) on Wednesday January 02 2019, @08:46PM (#781174) Journal

                  So if an Uber/Waymo car hit you and killed you, your family should just shrug and go "meh, whatever"?

                  --
                  --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
                  • (Score: 1) by khallow on Thursday January 03 2019, @02:14AM (3 children)

                    by khallow (3766) Subscriber Badge on Thursday January 03 2019, @02:14AM (#781329) Journal

                    So if an Uber/Waymo car hit you and killed you, your family should just shrug and go "meh, whatever"?

                    They can sue. The act doesn't need to be criminal to generate legal liability.

                    • (Score: 2) by The Mighty Buzzard on Thursday January 03 2019, @02:50PM (2 children)

                      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday January 03 2019, @02:50PM (#781482) Homepage Journal

                      You've obviously never lost anyone close if you think any amount of money can come close to being justice.

                      --
                      My rights don't end where your fear begins.
                      • (Score: 1) by khallow on Friday January 04 2019, @02:15AM (1 child)

                        by khallow (3766) Subscriber Badge on Friday January 04 2019, @02:15AM (#781850) Journal

                        You've obviously never lost anyone close if you think any amount of money can come close to being justice.

                        Back at you. What's supposed to be special about jail time (or perhaps more exotic punishments) that, when added to said amount of money, comes closer to being justice? The dead person is still dead no matter how much you punish someone or some business. Meanwhile, excessive punishment means businesses die, jobs are lost, people's lives aren't bettered, society has to take up unreasonable burdens (jailing people for what should be non-crimes), and so on.

                        • (Score: 1) by khallow on Friday January 04 2019, @02:36AM

                          by khallow (3766) Subscriber Badge on Friday January 04 2019, @02:36AM (#781860) Journal
                          Looking at my post, one can correctly say that it goes too far in the opposite direction. Any sort of punishment is going to be a burden on someone, so why punish at all?

                          To correct this, I have an observation to make. Do we in the developed world have huge troubles with businesses killing people because it's only money? To the contrary, death rates in typical business-related areas, like workplace deaths, are at an all-time low. For example, workplace deaths [osha.gov] in the US are at their lowest point of the last 40 years (and they weren't getting better before that!).

                          Since the passage of the OSH Act, the rate of reported serious workplace injuries and illnesses has declined from 11 per 100 workers in 1972 to 3.6 per 100 workers in 2009.

                          A factor of three improvement despite this lack of justice. Something is working. I think that same something will work with self-driving vehicles as well.

                • (Score: 0) by Anonymous Coward on Thursday January 03 2019, @01:52AM (1 child)

                  by Anonymous Coward on Thursday January 03 2019, @01:52AM (#781313)
                  The prosecutor will decide which misdeeds appear to be crimes according to the laws of the country, and then the court will decide which of those actually are, given the circumstances. As we do not know yet even in what countries the trials will be held, talking about crimes is premature.
                  • (Score: 1) by khallow on Thursday January 03 2019, @02:38AM

                    by khallow (3766) Subscriber Badge on Thursday January 03 2019, @02:38AM (#781339) Journal

                    The prosecutor will decide which misdeeds appear to be crimes according to the laws of the country, and then the court will decide which of those actually are, given the circumstances.

                    And yet, it's odd how no one has mentioned such a misdeed which was actually a crime. The prosecutors tend to decide otherwise.

        • (Score: 2) by Knowledge Troll on Wednesday January 02 2019, @03:44AM (9 children)

          by Knowledge Troll (5948) on Wednesday January 02 2019, @03:44AM (#780871) Homepage Journal

          When's the last time they churned out criminally liable bad code?

          The Toyota Prius unintended-acceleration issue that is commonly attributed to user error or floor mats causing the accelerator pedal to get stuck was neither limited to the Prius nor caused by any user error or mechanical issue at all. Instead, the ECU software was so poorly implemented that a bug existed (among tons of others later discovered during the audit conducted because of the trial) that could corrupt the current throttle-position variable in memory with a value intended for another memory location, and that would also prevent the variable from being properly set again from the throttle-position sensor. The ECU defines what the current power output of the propulsion system will be, and there is no mechanical override.

          That didn't even kill people and that is criminal.
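
          For anyone who wants the shape of that failure, here's a minimal hypothetical C sketch. It is not Toyota's actual code; the names, struct layout, and numbers are invented, and it only illustrates the class of bug described above: one unchecked write corrupts the throttle variable, and a corrupted flag keeps the control loop from ever refreshing it from the pedal sensor.

            #include <stdint.h>
            #include <stdio.h>

            /* Hypothetical ECU state -- names and layout invented for illustration.
             * diag_buf sits directly in front of the throttle fields, so an
             * overrun of diag_buf lands on them.                                   */
            static struct {
                uint8_t  diag_buf[8];      /* unrelated scratch buffer              */
                uint16_t throttle_cmd;     /* commanded throttle, 0..1000           */
                uint8_t  throttle_hold;    /* nonzero: stop re-reading the sensor   */
            } ecu;

            static uint16_t read_pedal_sensor(void) { return 120; }  /* ~12% pedal */

            /* BUG: no bounds check; a long message spills past diag_buf and
             * overwrites throttle_cmd and throttle_hold with message bytes.        */
            static void log_diag(const uint8_t *msg, unsigned len) {
                for (unsigned i = 0; i < len; i++)
                    ecu.diag_buf[i] = msg[i];
            }

            /* BUG: once throttle_hold is (corrupted to) nonzero, the stale,
             * corrupted command is never replaced by the real pedal reading.       */
            static void control_tick(void) {
                if (!ecu.throttle_hold)
                    ecu.throttle_cmd = read_pedal_sensor();
                printf("throttle command: %u\n", (unsigned)ecu.throttle_cmd);
            }

            int main(void) {
                control_tick();                      /* prints 120                     */
                uint8_t msg[11] = { [8] = 0xE8, [9] = 0x03, [10] = 1 };
                log_diag(msg, sizeof msg);           /* overruns diag_buf by 3 bytes   */
                control_tick();                      /* prints 1000 on a little-endian
                                                        build, and stays there         */
                return 0;
            }

          The real findings in the audit were more involved than this toy, but the effect is the same one described above: a corrupted throttle value and no remaining path for the sensor reading to correct it.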

          Ignoring a bug that kills a thousand people a year? Gross negligence. Turning the business's annual review of the deaths known by the business to be caused by said bug into a drinking game? Criminal negligence.

          This is why computer people should not be allowed in the real world where stuff has mass and/or stored energy and consequences are extreme. The very notion of brushing off a design defect that kills thousands of people as not being criminal is utterly fucked. Roller coaster kills 1,000 people per year - is that ignored? Airplane kills 1,000 people a year? Building? Elevators?

          All of those things are engineered by professionals who go to jail if they fuck up. And fucking up is defined as not working hard enough to ensure you've defined exactly how to make it not fuck up, that it won't fuck up outside what you expect it to, that you've proved it mathematically, and also were not wrong in the proof.

          That's how airplanes stay in the sky. Since the robot car is supposed to be so good because airplanes are so good I suggest they start with the parts that make airplanes good. That includes throwing people in jail for mistakes.

          • (Score: 1) by khallow on Wednesday January 02 2019, @04:18AM (3 children)

            by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @04:18AM (#780886) Journal

            The Toyota Prius unintended-acceleration issue that is commonly attributed to user error or floor mats causing the accelerator pedal to get stuck was neither limited to the Prius nor caused by any user error or mechanical issue at all. Instead, the ECU software was so poorly implemented that a bug existed (among tons of others later discovered during the audit conducted because of the trial) that could corrupt the current throttle-position variable in memory with a value intended for another memory location, and that would also prevent the variable from being properly set again from the throttle-position sensor. The ECU defines what the current power output of the propulsion system will be, and there is no mechanical override.

            That didn't even kill people and that is criminal.

            Ok, what's criminal about it?

            This is why computer people should not be allowed in the real world where stuff has mass and/or stored energy and consequences are extreme. The very notion of brushing off a design defect that kills thousands of people as not being criminal is utterly fucked. Roller coaster kills 1,000 people per year - is that ignored? Airplane kills 1,000 people a year? Building? Elevators?

            Why? Brushing off still means liability of around $10 billion per year (at $10 million per life). Autos kill around half a million people a year globally. There's a lot of brushing off happening here.

            • (Score: 2) by Knowledge Troll on Wednesday January 02 2019, @04:43AM (2 children)

              by Knowledge Troll (5948) on Wednesday January 02 2019, @04:43AM (#780892) Homepage Journal

              Ok, what's criminal about it?

              I did some searching, and in 2013 in Oklahoma - nothing. There was a trial, but it was civil; Toyota was found liable for producing software that was utterly unfit for purpose, and lots of money changed hands, just as you say. I recall a trial in Japan that led to a conviction, but I can't find any reference for it. It also looks like I was wrong - at least one person did die. I'd call this criminal negligence. It seems a jury does not disagree that Toyota screwed up really badly. Would a grand jury vote for a trial if one were convened for this? I give it better than a 50/50 chance.

              Also if you are curious this is a nice summary of the defects in the software: https://www.edn.com/design/automotive/4423428/Toyota-s-killer-firmware--Bad-design-and-its-consequences [edn.com]

              Autos kill around half a million people a year

              Automobile fatalities in total, right? Not the number of automobile fatalities that have a root cause in, or were made worse by, a defect in the machine? From what I remember, when GM made an ignition lock that malfunctioned and locked the steering column, and a few hundred people died over a few years, they couldn't cover it up anymore and they fixed it. Tens of thousands of people per year dying because of defects seems like a reasonable guess.

              Though that just makes you right again, because there was no criminal trial. Then again, did anyone ask a grand jury?

              • (Score: 1) by khallow on Wednesday January 02 2019, @06:29AM (1 child)

                by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @06:29AM (#780911) Journal

                Autos kill around half a million people a year

                Automobile fatalities in total right? Not number of automobile fatalities that have a root cause or were made worse by a defect in the machine?

                And that includes regions that are a lot slacker about the quality of cars and road safety. The point is that a lot of death is acceptable in automobile travel, and engineering bugs aren't normally going to respect country borders.

                It's also worth noting that there are serious problems with how the developed world handles the liability of bug fixes. We already have various critics conflating research into a problem with proof of negligence or worse, and some of that occasionally gets [nytimes.com] into the courts. Even mistakes made by a virtuous company that aggressively pursues dangerous bugs and flaws in its products are enough to end up in the courts. And there will be mistakes.

                These are two big reasons why criminalizing car design and construction is a bad idea. Perfection is not possible and people will die due to flaws in design or code. Similarly, detection isn't perfect either and more people will die before a virtuous business can fix the problem. That's why all these examples of dangerous flaws in products mentioned in this thread shouldn't be considered crimes.

                • (Score: 2) by Knowledge Troll on Wednesday January 02 2019, @05:17PM

                  by Knowledge Troll (5948) on Wednesday January 02 2019, @05:17PM (#781089) Homepage Journal

                  You are correct that criminalizing any mistake, or anything that leads to injury, is a mistake. For instance, when the de Havilland Comet started exploding a little bit while flying, people died. Turns out square windows in a pressure vessel aren't a very good idea. Who knew? Well, nobody. We got some new science from that, and oval windows in every airplane that followed. Ignorant? Yes. Criminal? No.

                  Toyota produces a control system for a machine that is intended to have humans inside of it, the control system manages the high power output propulsion system, and there is no mechanical override. It turns out that writing shoddy code that ignores the lessons of the past and further ignores industry practices that were created to avoid those very problems is a bad idea. Who knew? Essentially everyone that's a professional software developer and anyone that is an electrical engineer creating software for control systems. Ignorant? No. Criminal? Yes. At least in my eyes. I don't see how Toyota can get a pass here as if the situation they created was full of unknown results or surprises.

                  In the case of GM, it is possible the engineers were not entirely aware the product had a design flaw. Though there is the pesky issue where GM corporate laid down some rules regarding the adjectives engineers are allowed to use to describe the machines they produce. For instance, the term "rolling sarcophagus" is right out; you can't say that anymore. You have to say "does not work as intended." Hmmmmm. Maybe they do know something isn't right in this process and culture?

                  Another example of a non-crime dangerous machine, I think, is the Corvair. Unsafe at any speed, like Nader wrote, or just a twitchy rear-wheel-drive car that's prone to oversteer? It's the latter. I also drove a car that was rear-wheel drive, twitchy, and prone to oversteer. It's called a sports car. Being the driver of one comes with responsibilities. It's not the car's fault it is a handful.

          • (Score: 2) by linuxrocks123 on Wednesday January 02 2019, @07:49AM (2 children)

            by linuxrocks123 (2557) on Wednesday January 02 2019, @07:49AM (#780930) Journal

            All of those things are engineered by professionals who go to jail if they fuck up. And fucking up is defined as not working hard enough to ensure you've defined exactly how to make it not fuck up, that it won't fuck up outside what you expect it to, that you've proved it mathematically, and also were not wrong in the proof.

            I don't believe that is the law. Yes, you can go to jail if your work is so sloppy it is criminally negligent, but that is a high bar. It also doesn't only apply to professional engineers: anyone can be criminally negligent for their actions.

            Your PE stamp might be evidence against you in the trial, but I think that's it. I'm not a lawyer, nor am I an expert in this area, but I can't find any particular statutes criminalizing actions by negligent PEs. If anyone can find a statute specifically articulating criminal liability for negligent PEs, I'd love for you to reply.

            • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @08:31AM (1 child)

              by Anonymous Coward on Wednesday January 02 2019, @08:31AM (#780945)

              Here is an example of piercing the corporate veil [wbtv.com] of a tiny corporation. One of the owners personally committed a crime. It's very difficult to do with a publicly traded corporation. At best the company, like Microsoft, can be convicted of wrongdoing, fined, and sentenced to something (something involving IE in Microsoft's case - a null punishment in the end).

              This means that when self-driving cars start to kill [more] people, the grieving families will be written a check. Nothing more. The businessmen who manage the self-driving empire are safely isolated from the incident; and the money is just running expenses.

              • (Score: 1) by khallow on Wednesday January 02 2019, @03:12PM

                by khallow (3766) Subscriber Badge on Wednesday January 02 2019, @03:12PM (#781039) Journal

                One of the owners personally committed a crime.

                You don't need to pierce the corporate veil when someone personally commits a crime.

                It's very difficult to do with a publicly traded corporation.

                Only because they usually aren't committing crimes (hint hint).

                (something with IE in MS case - a null punishment in the end.)

                An example of a non-crime.

                This means that when self-driving cars start to kill [more] people, the grieving families will be written a check. Nothing more. The businessmen who manage the self-driving empire are safely isolated from the incident; and the money is just running expenses.

                And that differs from any other situation like it how? If the machine made by my small business or personal hobby kills someone, that's the likely outcome as well (except of course, I might not have the money to pay that check!).

          • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @02:23PM (1 child)

            by Anonymous Coward on Wednesday January 02 2019, @02:23PM (#781018)

            The Toyota Prius unintended acceleration issue

              Good example, but not for your side. There never was any criminal indictment in that case; it was settled out of court. Worse, to this day, Toyota remains publicly "of the opinion" that there never was a software error.

            • (Score: 2) by Knowledge Troll on Wednesday January 02 2019, @05:27PM

              by Knowledge Troll (5948) on Wednesday January 02 2019, @05:27PM (#781093) Homepage Journal

              Yeah I hate it when I prove someone else's point :-) I'd like to find the case I recall in Japan that was criminal but I just can't.

              Toyota remains publically "of the opinion" that there never was a software error.

              Oh boy, when I was at a Toyota dealer somewhat recently and the sales droid claimed that the unintended acceleration was not Toyota's fault, I really let him have it. It was somewhere around where the guy was explaining that the truck had so many computers on it that if I didn't get the warranty I'd be an utter fool, because they will fail and cost a fortune to replace.

              He got an earful about shoddy software, court cases, and people being hurt for no good reason. Then he got an earful of "why the hell would I buy a machine that I expect to break in a small number of years?" and I walked off.

  • (Score: 0) by Anonymous Coward on Wednesday January 02 2019, @01:39AM

    by Anonymous Coward on Wednesday January 02 2019, @01:39AM (#780825)

    Many of us will simply never be willing to allow this level of caused harm to go unpunished

    Well, not locally, at least. What they should do is test these vehicles on the natives in enemy countries. Then, when the targeting system is adequately trained, just reverse the objective on vehicles used domestically: change the "Kill" option to "Avoid".