
SoylentNews is people

posted by CoolHand on Thursday May 21 2015, @02:02PM   Printer-friendly
from the oh-the-inhumanity-of-it-all dept.

Algorithms tell you how to vote. Algorithms can revoke your driver’s license and terminate your disability benefits. Algorithms predict crimes. Algorithms ensured you didn’t hear about #FreddieGray on Twitter. Algorithms are everywhere, and, to hear critics tell it, they are trouble. What’s the problem? Critics allege that algorithms are opaque, automatic, emotionless, and impersonal, and that they separate decision-makers from the consequences of their actions. Algorithms cannot appreciate the context of structural discrimination, are trained on flawed datasets, and are ruining lives everywhere. There needs to be algorithmic accountability. Otherwise, who is to blame when a computational process suddenly deprives someone of his or her rights and livelihood?

But at heart, criticism of algorithmic decision-making makes an age-old argument about impersonal, automatic corporate and government bureaucracy. The machine-like bureaucracy has simply become the machine. Instead of a quest for accountability, much of the rhetoric and discourse about algorithms amounts to a surrender—an unwillingness to fight the ideas and bureaucratic logic driving the algorithms that critics find so creepy and problematic. Algorithmic transparency and accountability can only be achieved if critics understand that transparency (no modifier is needed) is the issue. If the problem is that a bureaucratic system is impersonal, unaccountable, creepy, and uses flawed or biased decision criteria, then why fetishize and render mysterious the mere mechanical instrument of the system’s will?

http://www.slate.com/articles/technology/future_tense/2015/05/algorithms_aren_t_responsible_for_the_cruelties_of_bureaucracy.single.html

 
  • (Score: 2) by kurenai.tsubasa (5227) on Friday May 22 2015, @04:25PM (#186521) Journal

    The only reason decision-makers get separated from the consequences ... is because we let them throw their hands up and say “It's too technical!”

    What do you mean "we let them"? How can you stop them? Pin their arms to their side?

    I realized I didn't even respond to this. My apologies! Somebody mod my other comment offtopic.

    Programmers and other IT folks have no professional association with ethical oversight that can throw around some weight in these matters. So, yes, I'm essentially saying we should pin their arms to their sides when we're asked to implement systems that may provide voters with disinformation, but we can't, because a paycheck is a nice thing to have. And since getting programmers interested in that kind of professional association would be like herding cats, we let them get away with it.

    Of course, that doesn't stop the political machine at large, and I'm not saying there shouldn't be free speech, but when an information system skews information one way for one user and another way for another user in the background without the users being aware, that's insidious.

    The part where “It's too technical!” comes in is that most folks are not yet used to the reality that if we tell a computer to evaluate a complicated set of rules, it will occasionally come to a bizarre result. In a traditional bureaucracy, there are always ways around the rules, so silly conclusions can be worked around. Involve a computer, and there's no leeway. The decision-makers haven't adjusted to that and will blame the computer instead of revisiting the complicated set of rules.
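    The point about rigid rule evaluation can be sketched in a few lines. This is a hypothetical eligibility check, not any real agency's rules: every rule must pass, and there is no override path, so an applicant who misses a cutoff by one day is denied where a human clerk might have waived it.

    ```python
    def eligible(applicant):
        """Return True only if every rule passes; there is no leeway or appeal path."""
        rules = [
            lambda a: a["income"] <= 20_000,       # income ceiling (made-up threshold)
            lambda a: a["days_unemployed"] >= 30,  # waiting period (made-up threshold)
            lambda a: a["age"] >= 18,              # adult applicants only
        ]
        return all(rule(applicant) for rule in rules)

    # 29 days unemployed instead of 30: a clerk might work around this,
    # the program mechanically denies it.
    applicant = {"income": 12_000, "days_unemployed": 29, "age": 45}
    print(eligible(applicant))  # False
    ```

    The "bizarre result" isn't a bug in the evaluator; it's the rule set faithfully executed. Fixing it means revisiting the rules, which is exactly the step the decision-makers skip when they blame the computer.
    
    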
