
posted by cmn32480 on Saturday December 17 2016, @11:24PM   Printer-friendly
from the doesn't-anybody-drive-themself-anymore dept.

Uber, the master of routing around regulations and exploiting legal loopholes, has found a rather big hole undermining a letter recently sent by the California Department of Motor Vehicles demanding that the company obtain a permit to test "self-driving cars" in San Francisco. Uber is arguing that the cars it plans to use in San Francisco are not truly autonomous and thus don't require a permit to operate:

Uber's position is that the semi-autonomous car system it is testing here is really no different from current advanced driver assistance systems available now for owners of Teslas and other cars that help with parking and collision avoidance. In that light, Uber doesn't believe it needs a permit because what it's working on doesn't meet the DMV requirements for a truly autonomous vehicle, which would be one that drives without the active, physical control or monitoring of a human being.

The permitting process "doesn't apply to us" because "you don't need to get belts and suspenders or whatever else if you're wearing a dress," Anthony Levandowski, who runs Uber's autonomous car programs, said in a press call Friday afternoon. "We cannot in good conscience" comply with a regulation that the company doesn't believe applies to it, he said.

The DMV cease-and-desist letter said that under the California Vehicle Code, an autonomous vehicle must have a permit to ensure that "those testing the vehicle have provided an adequate level of financial responsibility, have adequately trained qualified test drivers on the safe operation of the autonomous technology; and will notify the DMV when the vehicles have been involved in a collision." If Uber does not confirm immediately that it will stop its launch and seek a testing permit, DMV will initiate legal action, DMV attorney Brian Soublet wrote in a letter addressed to Levandowski.

The Uber "self-driving cars" will have not one but two people up front who are capable of taking control of the car.

Previously: Uber to Begin Picking Up Passengers With Autonomous Cars Next Month
Former Uber Employee Claims Widespread Privacy Problems
Uber's Self-Driving Cars to be Tested in San Francisco


Original Submission

 
  • (Score: 1) by Francis on Sunday December 18 2016, @01:58AM

    by Francis (5544) on Sunday December 18 2016, @01:58AM (#442584)

    There's a difference between refusing to participate in war crimes and refusing to adhere to the rules of the road. Your whole argument assumes that all regulations are more or less the same.

    I'm sure that once driverless cars' AI improves, having somebody to handle things if something goes wrong won't be necessary, but we're not anywhere near that point. The AI can handle relatively simple things like highway driving, but it isn't yet ready for city streets.

  • (Score: 1) by khallow on Sunday December 18 2016, @06:20AM

    by khallow (3766) Subscriber Badge on Sunday December 18 2016, @06:20AM (#442616) Journal

    There's a difference between refusing to participate in war crimes and refusing to adhere to the rules of the road.

    Sure there is, Francis.

    • (Score: 1) by Francis on Sunday December 18 2016, @06:25AM

      by Francis (5544) on Sunday December 18 2016, @06:25AM (#442619)

      This is the kind of post where having an emoticon to indicate that you're joking would be helpful.

      Only somebody that's batshit insane can't tell the difference between a helpful regulation and one that's a crime against humanity.

      • (Score: 2, Insightful) by khallow on Sunday December 18 2016, @07:30AM

        by khallow (3766) Subscriber Badge on Sunday December 18 2016, @07:30AM (#442631) Journal

        Only somebody that's batshit insane can't tell the difference between a helpful regulation and one that's a crime against humanity.

        Please explain how that difference is relevant. Note that the original poster made a blanket claim that complying with the law is something people with a conscience do. I didn't even mention crimes against humanity. But since you brought them up, they are a standard counterexample showing that following the law isn't necessarily good.

        Second, what is helpful about a regulation requiring that self-driving cars, which aren't really self-driving, obtain a permit in order to be tested? The key problem with permits is that they can be abused to prevent or obstruct an activity. Requiring a permit before the activity can take place gives the government the power to block it. What happens if some future taxi consortium manages to bribe enough California officials to nuke self-driving car development in the state by creating onerous standards? Or if big companies bribe regulators to make the permit standards so onerous that only big companies can compete?

        And who knows? Maybe the whole point of this exercise is that Uber is trolling for a ruling that will disable Tesla's partially automated driving systems in California. Setting back a competitor via regulatory obstruction isn't a game that only works against Uber.

        • (Score: 1) by Francis on Sunday December 18 2016, @04:37PM

          by Francis (5544) on Sunday December 18 2016, @04:37PM (#442703)

          Optimism is the difference here. I thought you had actually learned to reason; apparently, I was wrong. You're still looking at the world like a teenager, and not even a particularly sharp one.

          If the regulation doesn't apply to them, the correct way of handling it is to go to the state and talk with them about that or take it to court. Uber already has a history of just ignoring regulations that they deem to be inconvenient. It's astonishing to me the extent to which regulators have bent over backwards for the illegal taxi services.

          In this case, the technology is extremely dangerous when it doesn't work the way it's supposed to. A car is a deadly weapon, and there have been instances of a car mowing down a crowd of innocent bystanders; permits for these sorts of cars exist in large part to mitigate that risk. We haven't yet had a situation where a self-driving car did anything like that, but that doesn't mean it couldn't happen. These cars are highly regulated because there's a significant risk and the technology is still new.

          Whether it's just handsfree or completely autonomous, there should be additional testing and permitting required. It astonishes me that Tesla has been allowed so much freedom with deploying patches to released cars that add functionality.

          Self-driving cars are the domain of large companies when used on the road for good reason. It takes a lot of money to design the cars and it takes a lot of money to pay for the damages if something goes wrong. If you're a hobbyist, then you shouldn't be messing with these things on public roads where people can be killed.

          • (Score: 1) by khallow on Sunday December 18 2016, @06:12PM

            by khallow (3766) Subscriber Badge on Sunday December 18 2016, @06:12PM (#442730) Journal

            If the regulation doesn't apply to them, the correct way of handling it is to go to the state and talk with them about that or take it to court.

            Or find legal loopholes around the situation as Uber routinely does.

            Uber already has a history of just ignoring regulations that they deem to be inconvenient. It's astonishing to me the extent to which regulators have bent over backwards for the illegal taxi services.

            Welcome to one of the benefits of living in a democracy. When a legal loophole benefits a lot of voters, then it has natural protection from regulators. This is also a more effective way to change regulation.

            In this case, the technology is extremely dangerous when it doesn't work the way it's supposed to. A car is a deadly weapon, and there have been instances of a car mowing down a crowd of innocent bystanders; permits for these sorts of cars exist in large part to mitigate that risk. We haven't yet had a situation where a self-driving car did anything like that, but that doesn't mean it couldn't happen. These cars are highly regulated because there's a significant risk and the technology is still new.

            So what? Even in the complete absence of explicit regulation, we have plenty of regulatory mechanisms such as civil courts to sort this out.

            Whether it's just handsfree or completely autonomous, there should be additional testing and permitting required. It astonishes me that Tesla has been allowed so much freedom with deploying patches to released cars that add functionality.

            Well, better allow that testing then. Else, you've just created a convenient way to block the technology altogether.

            Self-driving cars are the domain of large companies when used on the road for good reason. It takes a lot of money to design the cars and it takes a lot of money to pay for the damages if something goes wrong. If you're a hobbyist, then you shouldn't be messing with these things on public roads where people can be killed.

            That's what insurance or the posting of bonds is for.

      • (Score: 0) by Anonymous Coward on Sunday December 18 2016, @07:42AM

        by Anonymous Coward on Sunday December 18 2016, @07:42AM (#442632)

        This is the kind of post where having an emoticon to indicate that you're joking would be helpful.

        No, it isn't. Wouldn't help at all.