posted by martyb on Wednesday August 26 2015, @04:56PM   Printer-friendly
from the could-get-interesting-on-Halloween dept.

A police officer is directing traffic in the intersection when he sees a self-driving car barreling toward him, its occupant looking down at his smartphone. The officer gestures for the car to stop, and the self-driving vehicle rolls to a halt behind the crosswalk. "This seems like a pretty plausible interaction. Human drivers are required to pull over when a police officer gestures for them to do so. It’s reasonable to expect that self-driving cars would do the same." But Will Oremus writes that while it's clear that police officers should have some power over the movements of self-driving cars, "what’s less clear is where to draw the line." Should an officer be able to "do the same if he suspects the passenger of a crime? And what if the passenger doesn’t want the car to stop—can she override the command, or does the police officer have ultimate control?"

According to a RAND Corp. report on the future of technology and law enforcement “the dark side to all of the emerging access and interconnectivity is the risk to the public’s civil rights, privacy rights, and security.” It added, “One can readily imagine abuses that might occur if, for example, capabilities to control automated vehicles and the disclosure of detailed personal information about their occupants were not tightly controlled and secured.”


Original Submission

 
  • (Score: 3, Insightful) by takyon on Wednesday August 26 2015, @07:50PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday August 26 2015, @07:50PM (#228235) Journal

    A traffic stop requires the driver to stop voluntarily. If the driver doesn't stop, then it becomes a chase.

    With this technology the police could skip that step. There's also almost never a reason to stop a self-driving car other than harassment on the part of the cop.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 1) by tftp on Wednesday August 26 2015, @11:01PM

    by tftp (806) on Wednesday August 26 2015, @11:01PM (#228328) Homepage

    There's also almost never a reason to stop a self-driving car other than harassment on the part of the cop.

    There could be some reasons. For example:

    • The passenger or the vehicle matches the description of one involved in a bank robbery two miles away
    • The vehicle is not functioning correctly (lights burned out, bald tires, etc.)
    • The vehicle is not authorized to be on the road (no plates, expired tags, etc.)
    • The passenger is not using the vehicle safely (no seat belt, unrestrained dogs where prohibited, etc.)
    • The vehicle is being driven (autonomously or under manual control) somewhere it shouldn't be
    • (Score: 3, Interesting) by takyon on Wednesday August 26 2015, @11:23PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday August 26 2015, @11:23PM (#228341) Journal

      Sure, you can find some reasons, but the overwhelming majority of stops are traffic stops, which would presumably never apply to a mature self-driving vehicle, and questionable DUI checkpoints, which also don't apply to self-driving cars.

      Using a self-driving car in a getaway is a terrible idea given how interconnected and tracked they will be.

      So it comes down to police wanting to stop a self-driving vehicle (rarely) in a case where the "passenger" might not even respond to sirens (asleep, headphones). Before giving police as much leeway as stopping the car remotely, consider a remote system that instead blares noise and flashes lights inside the car to alert a sleeper, who can then stop the car himself.

      The law should protect, not burden, what will become the safest vehicles on our roads.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 1) by tftp on Wednesday August 26 2015, @11:43PM

        by tftp (806) on Wednesday August 26 2015, @11:43PM (#228351) Homepage

        This system should cover one important case: when there is no passenger in the car at all :-) For example, the self-driving car is a taxicab, or a private self-driving car was sent to pick up someone or something from somewhere. I'm sure that drive-through grocery shopping will instantly get a new life.

        The police should also be able to stop an empty robot car if it is damaged. I cannot say how much introspection a self-driving car will get; however, the police will have every reason to stop a car that has obscenities spray-painted on its side. Unless that is the "new normal," of course :-(

        • (Score: 2) by takyon on Thursday August 27 2015, @12:03AM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday August 27 2015, @12:03AM (#228360) Journal

          Cyberwar will really make me cry when it kills off the driverless car. Remote stop = remote hack.

          Ashley Madisonmobile

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 1) by tftp on Thursday August 27 2015, @12:40AM

            by tftp (806) on Thursday August 27 2015, @12:40AM (#228374) Homepage

            Remote stop = remote hack.

            Use laser light to convey a message to a black box. The black box decodes the message, checks the signature on the police order, and issues a simple request to the driving computer(s) to pull over (as one of several possible actions; others could be "drive slowly", etc.). The "follow me" order matters when pilot/flag cars are present: I have been through several road construction zones with those cars, and you have to be somewhat sentient to figure out what the worker with the sign is trying to tell you - you may not just proceed. Will a robot car be smart enough on its own to stop, wait for a vehicle with a "Follow me" sign, and then follow it? Guidance messages could be handy.

            Laser, or 60 GHz radio, would be safe because the medium is short-range by definition. Nobody from China will be able to "hack" a car in LA - you would have to be within, say, 100 yards. Second, the only effect of such a hack would be that the car pulls over and reports to the police (and to the owner) that it has been stopped.
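
            Purely as an illustration, here is a minimal sketch of such a black box in Python. The Ed25519 signature check uses the real "cryptography" library; the message format, the action names, and the driving-computer interface are invented for this sketch:

                import json
                import time

                from cryptography.exceptions import InvalidSignature
                from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

                # Only a fixed menu of benign actions is ever forwarded to the car.
                ALLOWED_ACTIONS = {"pull_over", "drive_slowly", "follow_me"}
                MAX_ORDER_AGE_S = 30  # reject stale orders to blunt replay attacks

                class BlackBox:
                    def __init__(self, police_public_key_bytes, driving_computer):
                        # How keys are provisioned and rotated is out of scope here.
                        self._key = Ed25519PublicKey.from_public_bytes(police_public_key_bytes)
                        self._car = driving_computer  # hypothetical interface

                    def handle_message(self, payload: bytes, signature: bytes) -> bool:
                        try:
                            self._key.verify(signature, payload)  # raises on a bad signature
                        except InvalidSignature:
                            return False  # ignore unauthenticated orders
                        order = json.loads(payload)
                        if time.time() - order.get("issued_at", 0) > MAX_ORDER_AGE_S:
                            return False  # too old; possible replay
                        action = order.get("action")
                        if action not in ALLOWED_ACTIONS:
                            return False
                        self._car.request(action)  # e.g. pull over and stop
                        self._car.report(order)    # tell police and owner it was stopped
                        return True

            Restricting the forwarded actions to a fixed, benign menu is what bounds the damage: even with a stolen key, an attacker could make the car pull over and phone home, not drive it somewhere.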

            Airliners have been heavily automated for many years now, and there has been exactly one major hijacking since 9/11 - and who did it? Not the computer - it was the pilot.

        • (Score: 2) by mhajicek on Friday August 28 2015, @03:56AM

          by mhajicek (51) on Friday August 28 2015, @03:56AM (#228840)

          I can see it if the passenger is injured or incapacitated, but there should always be a manual override. Perhaps the police could send a stop request, and a message would appear on the dash with options to accept or refuse. If no choice is made before a timeout, it would default to accept.
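
          Purely as an illustration, a rough sketch of that accept/refuse timeout in Python - the dash-prompt and vehicle interfaces here are made up, not any real car API:

              import threading

              PROMPT_TIMEOUT_S = 15  # illustrative; the comment names no value

              def handle_stop_request(dash, car):
                  """Show an accept/refuse prompt; default to accept on timeout."""
                  done = threading.Event()
                  choice = []

                  def on_choice(option):  # invoked by the (hypothetical) dash UI
                      choice.append(option)
                      done.set()

                  dash.show_prompt("Police stop request",
                                   options=("accept", "refuse"),
                                   callback=on_choice)
                  done.wait(timeout=PROMPT_TIMEOUT_S)
                  if choice and choice[0] == "refuse":
                      car.record_refusal()  # the refusal is honored but logged
                      return
                  car.pull_over()  # explicit accept, or the timeout default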

          --
          The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek