posted by janrinok on Friday October 14 2016, @05:28PM   Printer-friendly
from the avoiding-Samaritan dept.

The UK government has been urged to establish an AI ethics board to tackle the creeping influence of machine learning on society.

The call comes from a Robotics and Artificial Intelligence report published yesterday by the House of Commons science and technology select committee. It quotes experts who warned the panel that AI "raises a host of ethical and legal issues".

"We recommend that a standing Commission on Artificial Intelligence be established, based at the Alan Turing Institute, to examine the social, ethical and legal implications of recent and potential developments in AI," the report said.

It highlighted that methods are required to verify that AI systems operate transparently, that their behaviour is not unpredictable, and that any decisions they make can be explained.

Innovate UK – an agency of the UK's Department of Business – said that "no clear paths exist for the verification and validation of autonomous systems whose behaviour changes with time."

They think they can stop Samaritan?

Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by tibman (134) Subscriber Badge on Friday October 14 2016, @07:46PM (#414423)

    Yeah, emergent behavior is probably impossible to audit. It's like auditing a bunch of ants to determine what the exact shape/size/design the colony is going to be. Any prediction is a guess at best.

    SN won't survive on lurkers alone. Write comments.
  • (Score: 0) by Anonymous Coward on Friday October 14 2016, @08:30PM (#414440)

    We are basically going to come back around to the messy, imperfect nature of human behavior so abhorred by literalist geeks: the need for "good" judgment, where "good" is some sort of consensus (i.e. cultural) concept with no fixed, absolute definition.