A police officer directing traffic in an intersection sees a self-driving car barreling toward him, its occupant looking down at a smartphone. The officer gestures for the car to stop, and the self-driving vehicle rolls to a halt behind the crosswalk. "This seems like a pretty plausible interaction," writes Will Oremus. "Human drivers are required to pull over when a police officer gestures for them to do so. It’s reasonable to expect that self-driving cars would do the same." But while it's clear that police officers should have some power over the movements of self-driving cars, Oremus adds, "what’s less clear is where to draw the line." Should an officer be able to "do the same if he suspects the passenger of a crime? And what if the passenger doesn’t want the car to stop—can she override the command, or does the police officer have ultimate control?"
According to a RAND Corp. report on the future of technology and law enforcement, "the dark side to all of the emerging access and interconnectivity is the risk to the public’s civil rights, privacy rights, and security." The report adds, "One can readily imagine abuses that might occur if, for example, capabilities to control automated vehicles and the disclosure of detailed personal information about their occupants were not tightly controlled and secured."
(Score: 3, Insightful) by Anal Pumpernickel on Wednesday August 26 2015, @10:28PM
If you don't control your car, your car controls you. If the cops can control your car, that likely means it runs a lot of non-free proprietary software, and therefore the car can't be trusted. I hope no one will be dumb enough to buy a car like that. Who knows what other anti-features it would contain? Privacy-invading features, DRM, and who knows what else. Both companies and the government will be more than willing to screw you over.