Accidents involving driverless cars, calculating the probability of recidivism among criminals, and influencing elections by means of news filters—algorithms are involved everywhere. Should governments step in [phys.org]?
Yes, says Markus Ehrenmann of Swisscom.
Current progress in big-data processing and machine learning is not always to our advantage. Some algorithms are already putting people at a disadvantage today, and they will have to be regulated.
For example, if a driverless car recognises an obstacle in the road, the control algorithm has to decide whether to put the lives of its passengers at risk or endanger uninvolved passers-by on the pavement. The on-board computer makes decisions that used to be made by people. It is up to the state to clarify who must take responsibility for the consequences of automated decisions (so-called 'algorithmic accountability'). Otherwise, our legal system would be rendered ineffective.
The computer made him do it.