
posted by martyb on Wednesday July 04 2018, @01:22PM   Printer-friendly
from the No-irrational-AI-for-me...-make-mine-real! dept.

Ibrahim Diallo was allegedly fired by a machine. Recent news reports relayed the escalating frustration he felt as his security pass stopped working, his computer system login was disabled, and finally he was frogmarched from the building by security personnel. His managers were unable to offer an explanation, and powerless to overrule the system.

Some might think this was a taste of things to come as artificial intelligence is given more power over our lives. Personally, I drew the opposite conclusion. Diallo was sacked because a previous manager hadn't renewed his contract on the new computer system and various automated systems then clicked into action. The problems were not caused by AI, but by its absence.

The systems displayed no knowledge-based intelligence, meaning they didn't have a model designed to encapsulate knowledge (such as human resources expertise) in the form of rules, text and logical links. Equally, the systems showed no computational intelligence – the ability to learn from datasets – such as recognising the factors that might lead to dismissal. In fact, it seems that Diallo was fired as a result of an old-fashioned and poorly designed system triggered by a human error. AI is certainly not to blame – and it may be the solution.

This man was fired by a computer

What do you guys think about hiring and firing by AI? Would you agree with the article's premise?


Original Submission

 
  • (Score: 1) by khallow (3766) Subscriber Badge on Thursday July 05 2018, @03:55AM (#702830) Journal

    But no, please tell me how AI will solve all the world's problems in ways that people couldn't.

    Legal blame deflection. Don't sue me, the computer did that.

    I noticed that the problem was triggered because the victim's manager had been laid off (and so never renewed a crucial data record for the victim), and that no one could reverse the computer's decision once such a minor error had set it in motion. Why set up a system that way? Because you want to avoid legal responsibility for the decisions of the computer system. So, given the presence of laid-off employees and a system designed to fire people without recourse, it appears the business in question was probably planning to downsize further and lay off a bunch of people, perhaps on an ongoing, permanent basis. The computer system was probably legal protection for the firings.