
SoylentNews is people

posted by CoolHand on Thursday May 21 2015, @02:02PM   Printer-friendly
from the oh-the-inhumanity-of-it-all dept.

Algorithms tell you how to vote. Algorithms can revoke your driver’s license and terminate your disability benefits. Algorithms predict crimes. Algorithms ensured you didn’t hear about #FreddieGray on Twitter. Algorithms are everywhere, and, to hear critics tell it, they are trouble. What’s the problem? Critics allege that algorithms are opaque, automatic, emotionless, and impersonal, and that they separate decision-makers from the consequences of their actions. Algorithms cannot appreciate the context of structural discrimination, are trained on flawed datasets, and are ruining lives everywhere. There needs to be algorithmic accountability. Otherwise, who is to blame when a computational process suddenly deprives someone of his or her rights and livelihood?

But at heart, criticism of algorithmic decision-making makes an age-old argument about impersonal, automatic corporate and government bureaucracy. The machine-like bureaucracy has simply become the machine. Instead of a quest for accountability, much of the rhetoric and discourse about algorithms amounts to a surrender—an unwillingness to fight the ideas and bureaucratic logic driving the algorithms that critics find so creepy and problematic. Algorithmic transparency and accountability can only be achieved if critics understand that transparency (no modifier is needed) is the issue. If the problem is that a bureaucratic system is impersonal, unaccountable, creepy, and uses flawed or biased decision criteria, then why fetishize and render mysterious the mere mechanical instrument of the system’s will?

http://www.slate.com/articles/technology/future_tense/2015/05/algorithms_aren_t_responsible_for_the_cruelties_of_bureaucracy.single.html

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by kurenai.tsubasa on Thursday May 21 2015, @02:45PM

    by kurenai.tsubasa (5227) on Thursday May 21 2015, @02:45PM (#186029) Journal

    They're complaining that Facebook and Google may present different information to different users that may influence who they vote for.

    As if the deluge of mailings and commercials on TV and streaming services every other fall doesn't tell voters who to vote for.

    Critics allege that algorithms are opaque, automatic, emotionless, and impersonal, and that they separate decision-makers from the consequences of their actions.

    This is the part I had trouble with. The only reason decision-makers get separated from the consequences of the algorithms they approve is because we let them throw their hands up and say “It's too technical! I can't be expected to know how the rules I asked to be programmed actually work! Blame the programmer!”

    Instead of a quest for accountability, much of the rhetoric and discourse about algorithms amounts to a surrender—an unwillingness to fight the ideas and bureaucratic logic driving the algorithms that critics find so creepy and problematic.

    Ah, much better. All computers do is enable the ones calling the shots to hide behind “It's too technical! I'm not a wizard!”

    We can't decide if programmers are misogynist assholes who refuse to let women have the secret sauce that makes programming easy or if they're a bunch of wizards practicing a mysterious art who shout at bugs, “Flame of Udûn! You shall not pass!”

  • (Score: 0) by Anonymous Coward on Thursday May 21 2015, @03:41PM

    by Anonymous Coward on Thursday May 21 2015, @03:41PM (#186046)

    The decision maker was the one who said "let's put the likely race match based on name (or zip code) into the business intelligence regression and see what correlations it comes up with." So a human decision maker (programmer) could remove those heuristics, sure.

  • (Score: 0) by Anonymous Coward on Thursday May 21 2015, @04:21PM

    by Anonymous Coward on Thursday May 21 2015, @04:21PM (#186063)

    We can't decide if programmers are misogynist assholes who refuse to let women have the secret sauce that makes programming easy or if they're a bunch of wizards practicing a mysterious art who shout at bugs, “Flame of Udûn! You shall not pass!”

    The two are not mutually exclusive. Also wizards can be misogynist assholes, and wizardry has always been associated with secrets.

    And remember, there's white magic and black magic.

    • (Score: 5, Funny) by kurenai.tsubasa on Thursday May 21 2015, @05:43PM

      by kurenai.tsubasa (5227) on Thursday May 21 2015, @05:43PM (#186118) Journal

      You might be on to something.

      I remember when I used to visit virtual spaces (MOOs, MUDs, etc) and I took an interest in how I might be able to program one (late 80s). I was never able to find any books at my local library about it. Eventually I noticed something odd about the Dewey Decimal system. I would have expected to find the books I was looking for in the 500s or 600s, but oddly, the system skips over 626 [wikipedia.org].

      It occurred to me that 626, if you square the 6s and add together with the 2, is equal to 74. I knew something was afoot. A martial artist I knew had an extensive collection of obscure techniques, some forbidden, on scrolls that were hidden in his family's dojo. One night during a new moon, I snuck into the dojo and located the 74th scroll. It spoke of a school of indiscriminate programming, or “hacking,” and the method to obtain access to the materials at the library I needed.

      A few days later, I hid myself in the library past closing. There is a certain book in every library in the world.* Each one looks different, but if you know about the school of indiscriminate programming, you will know that book when you see it. Apprehensively, I pulled the book forward and spoke the passcode: “shibboleet.”

      The entire shelf rose to reveal a secret section of the library for programmers only. These books are classified as 000-099, as most people begin counting at 1, but only programmers would begin counting at 0. I studied for what seemed like weeks only sneaking outside to get a drink from the library's water fountain or to get some Cheetos or Doritos from the vending machine.

      At last, I obtained the dark secrets of the school of indiscriminate programming, and I was finally able to access the hidden screens on my home computer that enabled me to program a short little text adventure of my own. I used to have a copy of it, but it's disappeared into the sands of time. It wasn't really that good anyway.

      * These are not to be confused with the Gideon's Bible found in every hotel room, but I've said too much already.

      • (Score: 2) by edIII on Thursday May 21 2015, @11:11PM

        by edIII (791) on Thursday May 21 2015, @11:11PM (#186246)

        May I, please, subscribe to your newsletter?

        --
        Technically, lunchtime is at any moment. It's just a wave function.
  • (Score: 2) by Nuke on Friday May 22 2015, @10:18AM

    by Nuke (3162) on Friday May 22 2015, @10:18AM (#186389)
    kurenai.tsubasa wrote:

    The only reason decision-makers get separated from the consequences .. is because we let them throw their hands up and say “It's too technical!"

    What do you mean "we let them"? How can you stop them? Pin their arms to their side?

    I tried to open a savings account online the other day, an account the UK Post Office runs. The form on the screen wanted my address, and one compulsory field was the road name. But I live on a country road with no name, so I put in the Transport Ministry designation code (for those not familiar with the UK, this is something like A1234 and is never normally used in an address). The application failed because the address did not match any other address they could find associated with my name, I guess like the voting register.

    I phoned the help line and basically they said f#@k off, and I should live on a road with a name.

    I visited a post office to try to sort it, but the clerk was powerless to do anything about it, and metaphorically threw her hands up and said “It's too technical!". What should I have done - hit the woman? I have given up trying to open this account now.
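    The failure mode in this anecdote can be sketched in a few lines. The field names and the register lookup below are hypothetical, not the Post Office's actual system; the point is what happens when a "compulsory" field encodes an assumption about the world that isn't true for everyone:

    ```python
    # Minimal sketch of a rigid address validator. Field names and the
    # register lookup are hypothetical, for illustration only.

    def validate_address(form, known_addresses):
        """Return a list of validation errors; an empty list means the form passes."""
        errors = []
        # 'road_name' is compulsory: the form has no concept of an unnamed road.
        if not form.get("road_name"):
            errors.append("road_name is required")
        # The application is rejected outright unless the address matches a
        # record in some external register (e.g. the voting register).
        key = (form.get("road_name"), form.get("postcode"))
        if key not in known_addresses:
            errors.append("address does not match any record for this name")
        return errors

    # Someone on an unnamed country road can never satisfy both rules: any
    # placeholder they enter for the road name won't match the register.
    register = {("High Street", "AB1 2CD")}
    print(validate_address({"road_name": "A1234", "postcode": "XY9 8ZW"}, register))
    ```

    Nothing in this logic can be overridden at the counter: the leeway a human bureaucrat would have simply isn't expressible in the form, which is exactly the clerk's position in the story above.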

    • (Score: 2) by kurenai.tsubasa on Friday May 22 2015, @04:08PM

      by kurenai.tsubasa (5227) on Friday May 22 2015, @04:08PM (#186511) Journal

      Well, this is the larger problem of bureaucracy, but I think your situation is more hopeful than what TFS mentioned.

      The clerk has no more power than the form on the screen. In fact, I wouldn't be surprised if the clerk was using that same form. The question I would have is why didn't she escalate the issue to her superiors? When we design a form, often we miss some scenario we aren't familiar with. If the issue is escalated, the decision-makers can then get things moving to allow for the scenario. The goal is clear and the problem is technical. It can be fixed.

      Now the rest of this is a bit different from what TFS was talking about.

      However, I've often found myself in a situation where I'm asked to implement some byzantine procedure with loads of unintended consequences by what I can only imagine are committees who are stuck in the “expert system” paradigm (i.e. we can replace experience and expertise with a near-minimum wage worker in a call center navigating a decision tree, and if it doesn't work, it's only because we haven't included enough decision points). By the time it gets to my desk, the decision tree is usually completely divorced from any kind of goal or mission, and it only gets weirder as the decision points are piled on.

      Sometimes I can work with the client and help them design something that will accomplish the mission better. The majority of clients, though, just say “Stop being rude! All you have to do is…!” (Being rude, apparently, to some people, is having the chutzpah to dare question one's betters.)

      Say we need to dispatch a tow truck. Seems easy, right? Well, a call comes in from 911 dispatch and the end user marks the call as a routine matter for office hours and it gets dropped. Now the masters of the universe are (rightly) breathing down the neck of the higher-ups at the towing company. But, but! For some reason, we don't step back and see that the root cause was that the call was marked incorrectly. Now we need another decision point for 911 dispatch calling.

      So, somebody calls in with minor vehicle damage after bumping into a downed tree and wondering if they should contact 911, and the operator selects 911 dispatch and sends out a tow truck. When it gets there, there's nobody to tow! Then we get an actual call from 911 dispatch, but since the truck is on the other side of the county, it takes the driver too long to get there. So, the masters of the universe get rumbling again.

      Wash, rinse, repeat about 50 times, and as this decision tree gets more complicated, things get more byzantine and the frequency of these unscripted things happening increases.

      At this point, I bring up the idea that perhaps the operators need a little training and we should revert our changes.

      Nope, nope, can't be done! Add in more decision points! Why can't you just make the computer do the right thing?! Then I say, “Look, the reason that call was sent to the wrong tow truck was because we went through the decision tree this way, and you wanted, on blue moons where the dew point is below 50 degrees and there's a 50-60% probability of precipitation and the moon is at perigee and the CFO is at his 2nd residence instead of his cottage, to send the call to that tow truck instead of the one you're saying it should have gone to. There must be some broader goal or concern behind all these decision points you've asked me to put in. Can't we set up a meeting with all the stakeholders and simplify this a bit?”

      Nope, I don't have time to understand the decision tree! It's too technical! Just throw in another decision point! I'm not going to tell you why that was the wrong tow truck, only that we can't send that tow truck if it's a 2008 Mercedes, only if it's a 2009 Mercedes, except when there's an amber alert! I'm not going to tell you the logic behind this! Just do it, you illiterate, arrogant nerd!
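      The decision-tree creep described above can be caricatured in a few lines. Every rule, field name, and truck here is hypothetical; the point is that each "fix" bolts on another special case instead of stating the actual goal:

      ```python
      # Toy sketch of ad-hoc dispatch logic accreting patches. All rules and
      # field names are hypothetical, for illustration only.

      def pick_truck(call):
          # Original rule: anything from 911 dispatch is urgent.
          if call.get("source") == "911":
              # Patch 1: a truck once rolled out to an empty scene, so now we
              # check that a vehicle is actually there.
              if not call.get("vehicle_present", True):
                  return "no dispatch"
              # Patch 2: don't send truck 7 if it's a 2008 Mercedes, unless
              # there's an amber alert (nobody remembers why).
              if call.get("truck_model") == "2008 Mercedes" and not call.get("amber_alert"):
                  return "truck 12"
              return "truck 7"
          # Everything else waits for office hours.
          return "office hours queue"
      ```

      No branch ever states the goal (something like "send the nearest available truck to a confirmed breakdown"), so every new incident can only add another branch, and the tree drifts further from the mission with each patch.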

    • (Score: 2) by kurenai.tsubasa on Friday May 22 2015, @04:25PM

      by kurenai.tsubasa (5227) on Friday May 22 2015, @04:25PM (#186521) Journal

      The only reason decision-makers get separated from the consequences .. is because we let them throw their hands up and say “It's too technical!"

      What do you mean "we let them"? How can you stop them? Pin their arms to their side?

      I realized I didn't even respond to this. My apologies! Somebody mod my other comment offtopic.

      Programmers and other IT folks have no professional association with ethical oversight that can throw around some weight in these matters. So, yes, I'm essentially saying we should pin their arms to their side when we get asked to implement systems that may provide voters with disinformation, but we can't because a paycheck is a nice thing to have. Essentially, since getting programmers interested in some kind of professional association like that would be like herding cats, we let them get away with it.

      Of course, that doesn't stop the political machine at large, and I'm not saying there shouldn't be free speech, but when an information system skews information one way for one user and another way for another user in the background without the users being aware, that's insidious.

      The part where “It's too technical!” comes in is that most folks are not yet used to the reality that if we tell a computer to evaluate a complicated set of rules, it will occasionally come to a bizarre result. In a traditional bureaucracy, there are always ways around the rules, so silly conclusions can be worked around. Involve a computer, and there's no leeway. The decision-makers haven't adjusted to that and will blame the computer instead of revisiting the complicated set of rules.