Should a self-driving car kill the baby or the grandma? Depends on where you're from.
In 2014 researchers at the MIT Media Lab designed an experiment called Moral Machine. The idea was to create a game-like platform that would crowdsource people's decisions on how self-driving cars should prioritize lives in different variations of the "trolley problem." In the process, the data generated would provide insight into the collective ethical priorities of different cultures.
The researchers never predicted the experiment's viral reception. Four years after the platform went live, millions of people in 233 countries and territories have logged 40 million decisions, making it one of the largest studies ever done on global moral preferences.
A new paper published in Nature presents the analysis of that data and reveals how much ethical preferences diverge across cultures, and how those differences track economics and geography.
[...] Awad hopes the results will also help technologists think more deeply about the ethics of AI beyond self-driving cars. "We used the trolley problem because it's a very good way to collect this data, but we hope the discussion of ethics doesn't stay within that theme," he said. "The discussion should move to risk analysis—about who is at more risk or less risk—instead of saying who's going to die or not, and also about how bias is happening." How these results could translate into the more ethical design and regulation of AI is something he hopes to study more in the future.
"In the last two, three years more people have started talking about the ethics of AI," Awad said. "More people have started becoming aware that AI could have different ethical consequences on different groups of people. The fact that we see people engaged with this—I think that that's something promising."
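A toy sketch of the risk-analysis framing Awad describes (my own illustration, not from the paper): instead of a binary "who dies," each maneuver carries probability-weighted harms, and a planner compares expected risk. All probabilities and harm weights below are made up for illustration.

```python
# Toy expected-risk comparison (illustrative only; the probabilities and
# harm weights are invented, not drawn from the Moral Machine study).

def expected_risk(outcomes):
    """Sum of probability * harm over the possible outcomes of a maneuver."""
    return sum(p * harm for p, harm in outcomes)

# Each maneuver: list of (probability of harm, severity on a 0..1 scale)
maneuvers = {
    "brake_straight": [(0.30, 0.8), (0.70, 0.1)],  # likely, mostly minor harm
    "swerve_left":    [(0.10, 1.0), (0.90, 0.0)],  # small chance of severe harm
}

best = min(maneuvers, key=lambda m: expected_risk(maneuvers[m]))
for name, outs in maneuvers.items():
    print(f"{name}: expected risk {expected_risk(outs):.2f}")
print("lowest-risk maneuver:", best)
```

The point of the framing is that "who is at more or less risk" becomes a continuous comparison rather than a forced binary choice.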
Journal Reference:
Edmond Awad, Sohan Dsouza, Richard Kim, et al. The Moral Machine experiment. Nature (2018). DOI: 10.1038/s41586-018-0637-6
(Score: 2) by bzipitidoo on Thursday January 28 2021, @08:32PM (1 child)
The dilemma is too contrived. Look at it this way. How many times do we face such a dilemma? Once, or many times? That makes a huge difference.
If the Trolley Problem happens repeatedly, even routinely, we should ask why, and what we can do to avoid facing the dilemma in the first place. Bust the criminals who tied people to the tracks so that doesn't happen any more. Make trolleys safer, perhaps by installing an emergency stop system that could be as simple as an anchor under the trolley, to be dropped to hook onto the cross ties for an emergency stop. It'll tear up track, but if it stops the trolley in time, it's worth it. Place cameras to watch the track, etc. There are things that can be done well in advance to ensure that such dilemmas are never routine.
When something of that sort happens with automobiles, it often results in a recall. Or reforms. Same with planes and ships. And with pretty much any industrial accident. The post mortem inevitably turns up a series of mistakes and flaws that all combined to create a tragedy, most of which could easily have been prevented. In many cases, the designs were undermined by management and operator neglect, trying to save a few pennies or a little effort by skimping on safety measures and checks. Some cases are a sort of groupthink, in which everyone is on the same side and there is no adversary to wreck the illusions of safety everyone spun: the infamous "unsinkable" Titanic, for example. More recently, the Boeing 737 Max crashes may have ultimately been a regulatory failure, in which Boeing had become dependent upon safety agencies to keep it from making safety mistakes, and when that pressure was removed, it didn't handle the regulatory relief well.
(Score: 2) by sjames on Thursday January 28 2021, @10:56PM
The solution is to lower the speed limit to 25 MPH, but as a society we have chosen to do that only in some cases where a trolley problem is most likely and most likely to involve children. The AI designers don't have the authority to change that.
The real world gets messy as well. Throw down the anchor and tear up the track, and then 15 passengers die from the injuries caused by the sudden deceleration...
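Back-of-the-envelope arithmetic supports the deceleration worry. A quick sketch (my own numbers; the stopping distances are assumed, not from any real trolley spec) of the average g-force for a 25 MPH stop over different distances, using v² = 2·a·d:

```python
# Average deceleration for a constant-rate stop from a given speed over a
# given distance, via v^2 = 2*a*d. Distances below are assumed examples:
# ~1 m for an anchor that digs in immediately vs. ~20 m for normal braking.

MPH_TO_MS = 0.44704  # metres per second per mph
G = 9.81             # m/s^2

def stopping_deceleration_g(speed_mph: float, stop_distance_m: float) -> float:
    v = speed_mph * MPH_TO_MS
    return (v ** 2 / (2 * stop_distance_m)) / G

for d in (1.0, 5.0, 20.0):
    print(f"25 mph stop over {d:>4.0f} m -> {stopping_deceleration_g(25.0, d):.1f} g")
```

An anchor stop over a metre or so works out to several g, which standing, unrestrained trolley passengers would not take well; a gentler 20 m stop is well under 1 g.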