

posted by on Wednesday January 04 2017, @09:13AM   Printer-friendly
from the the-computer-made-me-do-it dept.

Accidents involving driverless cars, calculating the probability of recidivism among criminals, and influencing elections by means of news filters—algorithms are involved everywhere. Should governments step in?

Yes, says Markus Ehrenmann of Swisscom.

The current progress being made in processing big data and in machine learning is not always to our advantage. Some algorithms are already putting people at a disadvantage today and will have to be regulated.

For example, if a driverless car recognises an obstacle in the road, the control algorithm has to decide whether it will put the life of its passengers at risk or endanger uninvolved passers-by on the pavement. The on-board computer takes decisions that used to be made by people. It's up to the state to clarify who must take responsibility for the consequences of automated decisions (so-called 'algorithmic accountability'). Otherwise, it would render our legal system ineffective.

[...]
No, says Mouloud Dey of SAS.

We need to be able to audit any algorithm potentially open to inappropriate use. But creativity can't be stifled, nor can research be placed under an extra burden. Our response must be measured, not premature. Creative individuals must be allowed the freedom to work, not assumed to have bad intentions a priori. Likewise, before any action is taken, the actual use of an algorithm must be considered: it is generally not the computer program that is at fault but the way it is used.

It's the seemingly mysterious, badly intentioned and quasi-automatic algorithms that are often apportioned blame, but we need to look at the entire chain of production, from the programmer and the user to the managers and their decisions. We can't throw the baby out with the bathwater: an algorithm developed for a debatable use, such as military drones, may also have an evidently useful application which raises no questions.

Two opposing viewpoints are provided in this article; please share yours.


Original Submission

  • (Score: 4, Interesting) by Dr Spin on Wednesday January 04 2017, @10:12AM

    The world is bigger than the USA.

    There are over 100 countries in it, each with at least one government.

    Some of these governments may have perfectly sane methods of deciding on policy. Others
    have methods that are widely agreed to be terminally insane.

    What good governments do, bad governments are likely to do as well, but are unlikely to do
    well.

    All of the above is broadly also true of people and machines.

    Conclusion:

    Whatever you do, bad things will happen. Some bad things are worse than others. The world
    has no completely reliable method of ensuring the causes of bad things are minimized,
    BUT the method applied to civil aerospace operations has a pretty good track record, and might
    be worth pursuing in this case.

    This involves cooperation of manufacturers and governments across the world, and took the best
    part of 100 years to develop. So don't hold your breath (or step out in the road in front of a
    car, with or without driver).

    Above all, do not step out in front of Internet connected cars as they are likely to be piloted by
    ransomware.

    --
    Warning: Opening your mouth may invalidate your brain!
  • (Score: 0) by Anonymous Coward on Wednesday January 04 2017, @10:56AM

    No reason to do ANYTHING special for algorithms: if a car gets in an accident, the police get to charge it with a crime and confiscate it, just like they can do with any other possession.

    Buggy software just means more cars lining LEOs' pockets.

    Self solving problem! :)