[NB: The following article makes reference to oft-cited Trolley problem. Highly recommended.--martyb/Bytram]
Imagine a future with self-driving cars that are fully autonomous. If everything works as intended, the morning commute will be an opportunity to prepare for the day's meetings, catch up on news, or sit back and relax.
But what if things go wrong? The car approaches a traffic light, but suddenly the brakes fail and the computer has to make a split-second decision. It can swerve into a nearby pole and kill the passenger, or keep going and kill the pedestrian ahead.
The computer controlling the car will have access only to limited information collected through its sensors, and will have to make a decision based on that. As dramatic as this may seem, we're only a few years away from potentially facing such dilemmas.
Autonomous cars will generally provide safer driving, but accidents will be inevitable—especially in the foreseeable future, when these cars will be sharing the roads with human drivers and other road users.
Tesla does not yet produce fully autonomous cars, although it plans to. In collision situations, Tesla cars don't automatically operate or deactivate the Automatic Emergency Braking (AEB) system if a human driver is in control.
In other words, the driver's actions are not disrupted—even if they themselves are causing the collision. Instead, if the car detects a potential collision, it sends alerts to the driver to take action.
In "autopilot" mode, however, the car should automatically brake for pedestrians. Some argue that if the car can prevent a collision, it has a moral obligation to override the driver's actions in every scenario. But would we want an autonomous car to make this decision?
(Score: 2, Insightful) by Anonymous Coward on Monday November 29 2021, @03:07PM (1 child)
Indeed!
"By entering this vehicle you indicate that you accept our terms of service [link to TOS TLDR here]...."
Then the obvious solution to the trolley problem is to allow the vehicle's passengers to die in the accident, since, by entering the vehicle, they have agreed to accept the chance of being killed by its operation. The pedestrians and other people outside the vehicle have _not_ agreed to any contract governing the operation of the vehicle, and hence may not be killed by the AI's operation.
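The consent-based rule proposed above can be sketched as a toy decision function. This is purely illustrative: the scenario, the `choose_option` helper, and the consent flags are all hypothetical, not anything a real vehicle controller implements.

```python
def choose_option(options):
    """Pick the collision outcome that harms the fewest people who never
    consented (accepted the ToS); break ties by fewest total victims.

    options: dict mapping action name -> list of (party, consented) tuples.
    """
    def cost(action):
        victims = options[action]
        non_consenting = sum(1 for _, consented in victims if not consented)
        return (non_consenting, len(victims))
    return min(options, key=cost)

# The brake-failure scenario from the article, under the commenter's rule:
scenario = {
    "swerve_into_pole": [("passenger", True)],    # passenger accepted the ToS
    "continue_ahead":   [("pedestrian", False)],  # pedestrian never agreed
}
print(choose_option(scenario))  # -> swerve_into_pole
```

Under this rule the car always sacrifices the consenting passenger over the non-consenting pedestrian, which is exactly the outcome the comment argues for.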
(Score: 2) by DannyB on Monday November 29 2021, @05:22PM
By entering this vehicle, you accept the EULA, which you will be provided at the end of the ride.