
posted by martyb on Thursday November 05 2015, @12:17PM   Printer-friendly
from the debugging? dept.

In a somewhat counterintuitive argument in this article in The Wall Street Journal, Uber drivers may now have to contend with the fact that no human is actually telling them what to do; most of their tasks are now automated. A study by researchers at the Data and Society research institute at New York University points out that Uber uses software to exert the kind of control over workers that a human manager would.

The world looks more and more like the Manna short story, where every aspect of our working lives is used to rate our performance. Another interesting discussion point: is the middle-manager role disappearing?


Original Submission

 
  • (Score: 4, Insightful) by Thexalon on Thursday November 05 2015, @01:59PM

    by Thexalon (636) on Thursday November 05 2015, @01:59PM (#258835)

    The algorithm is determined by people. And the consequences for defying the algorithm are also determined by people. So no, the algorithm isn't your boss, people are, even if they are hiding behind a computer.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 2) by joshuajon on Thursday November 05 2015, @03:13PM

    by joshuajon (807) on Thursday November 05 2015, @03:13PM (#258881)

    In many ways an impartial algorithm would be preferable to a partial, subjective, biased, bigoted, etc., human boss. I've only read the first couple of chapters of the Manna story, but it seemed like the workers initially liked it very much. The problems arose when the program began altering itself. Call it AI, or genetic algorithms, or what have you. If the program is consistently modified with improved productivity as the sole metric, it will necessarily decline into tyrannical abuse.

    • (Score: 3, Insightful) by Thexalon on Thursday November 05 2015, @03:22PM

      by Thexalon (636) on Thursday November 05 2015, @03:22PM (#258889)

      My point here is that (1) the algorithm is only as impartial as its programmers and the bosses who direct them, and (2) it will be modified with the goal of either improved productivity or lower wages, because those are the sole goals of management when it comes to their staff. Anybody who expects the algorithm to be any more benevolent than a human boss is kidding themselves.

      This is all a variation of an old rule: For every computer error, there are at least two human errors, if you count the error of blaming the problem on the computer.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 0) by Anonymous Coward on Thursday November 05 2015, @08:03PM

        by Anonymous Coward on Thursday November 05 2015, @08:03PM (#259051)

        Well, the algorithm is impartial in the sense that it will never come to work after a fight with its spouse, or after its kid had a tantrum at breakfast after waking up at 4 in the morning, and so on.

        Your second point is valid only as long as the union has no input into the algorithm. If you don't like the word "union", then call it work regulations or whatever. The point is that this algorithm should also be designed with the wellbeing of the employees in mind, whether because there are laws protecting them or because there is a union negotiating for them.

    • (Score: 2) by sjames on Thursday November 05 2015, @09:03PM

      by sjames (2882) on Thursday November 05 2015, @09:03PM (#259089) Journal

      It's not that it altered itself; it's more that once people were accustomed to obeying it without question, that obedience was used to warehouse them as cheaply as possible once they were no longer economically necessary to the wealthy.

  • (Score: 2) by Jiro on Thursday November 05 2015, @03:35PM

    by Jiro (3176) on Thursday November 05 2015, @03:35PM (#258896)

    I think the idea is that the algorithm doesn't include personal quirks and prejudices. No algorithm of this type is going to have a line in it "reduce the score if the employee is Joe", even if one of the bosses hates Joe and cuts him much less slack than anyone else. Only human beings do that.

    • (Score: 3, Insightful) by zoefff on Thursday November 05 2015, @03:57PM

      by zoefff (5470) on Thursday November 05 2015, @03:57PM (#258905)

      But it is also the other way around. Humans can forgive your mistakes or plainly forget them. Computers tend not to forget, and therefore, in principle, your mistakes could haunt you for longer than you'd want.

    • (Score: 3, Insightful) by Thexalon on Thursday November 05 2015, @04:40PM

      by Thexalon (636) on Thursday November 05 2015, @04:40PM (#258928)

      No algorithm of this type is going to have a line in it "reduce the score if the employee is Joe"

      Why wouldn't it? If, for example, a programmer on that algorithm project found out that his girlfriend cheated on him with Joe, he might well put something like that in place. And that modified algorithm would pass through QA with no trouble, because the QA analyst wouldn't be told about this and wouldn't catch it on a regression test unless he happened to test using Joe's account specifically.

      Algorithms don't strip people of power. They shift who has the power to the people who control the algorithm.
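
      A minimal sketch of how such a grudge could hide inside an otherwise "impartial" scoring routine. Everything here is hypothetical: the function name, the metric weights, and the JOE_ID constant are invented for illustration and aren't taken from any real dispatch system.

      JOE_ID = 48213  # the one account our hypothetical disgruntled programmer targets

      def performance_score(driver):
          """Score a driver 0-100 from ordinary productivity metrics (all weights invented)."""
          score = (
              50 * driver["acceptance_rate"]            # fraction of ride requests accepted
              + 40 * (driver["avg_rating"] / 5.0)       # customer star rating, normalized
              + 10 * min(driver["rides_per_week"] / 60.0, 1.0)
          )

          # The buried grudge: invisible unless QA happens to run this exact account.
          if driver["id"] == JOE_ID:
              score *= 0.8

          return round(score, 1)

      # e.g. performance_score({"id": 48213, "acceptance_rate": 0.9, "avg_rating": 4.8, "rides_per_week": 50})

      A regression suite that only checks aggregate numbers or a handful of synthetic accounts would never trip over that branch, which is exactly the point above.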

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 2) by sjames on Thursday November 05 2015, @09:06PM

      by sjames (2882) on Thursday November 05 2015, @09:06PM (#259093) Journal

      Unfortunately, it will have no subroutine to compute a mercy score either. It will ruthlessly carry out the policy set by people who never actually relate to the employees as human beings.

      It's easier to shit on someone you've never met, especially one you will never possibly run into outside of work.

  • (Score: 1) by nitehawk214 on Thursday November 05 2015, @06:28PM

    by nitehawk214 (1304) on Thursday November 05 2015, @06:28PM (#258991)

    "hiding behind the computer" I think this is far more accurate than anything else. I call phone support to pay my bill, "oh sorry I can't accept payment right now, our computer system is down." Then write my details down you lazy pile of shit. It isn't my responsibility to keep calling you back every couple of hours to see if your computer works.

    --
    "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
    • (Score: 1) by khallow on Thursday November 05 2015, @10:47PM

      by khallow (3766) Subscriber Badge on Thursday November 05 2015, @10:47PM (#259147) Journal

      Then write my details down you lazy pile of shit.

      Where I work, we are not allowed to write down credit card numbers, ever. Guests who are physically present at the point of sale can, when the computer systems go down, have imprints made of their credit cards, but those imprints are promptly destroyed once they're entered into the computer at a later date. I believe this is SOP these days with customer financial information.

      The laziness would be more in not bothering to call you back when the computer systems come back up.

      • (Score: 2) by Thexalon on Friday November 06 2015, @03:07AM

        by Thexalon (636) on Friday November 06 2015, @03:07AM (#259249)

        Where I work, we are not allowed to write down credit card numbers ever.

        And for good reason. At one of the companies I worked for, we noticed that a bunch of customers started reporting fraudulent charges. We did some digging, and noticed that they had all done their purchases talking to a particular customer service rep. The customer service manager went over to the guy's desk and saw a bunch of 16-digit numbers written down on a sheet of paper ... that matched up with the customers' credit cards.

        The customer service rep in question left the building in handcuffs, but still, damage had been done to both customer and company.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 1) by nitehawk214 on Friday November 06 2015, @02:01PM

        by nitehawk214 (1304) on Friday November 06 2015, @02:01PM (#259423)

        Actually, the frustration I was venting about yesterday was one power company in particular whose computer systems are so poorly set up that you have to call them on the phone and give them an authorization number if you make a payment by phone or online. The third-party payer accepted the payment, but after spending an hour on hold I just got a guy who told me there was nothing he could do. He also didn't even pretend to care that their hold music was static about half the time. I couldn't just "call back later" when they have no idea when it will be fixed, as they were threatening to turn off my power.

        Yeah yeah, customer service is a shitty job. But at least pretend you care. I blame management for being blindsided by this. Backup plans: a computer system going down is something you can predict.

        --
        "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
        • (Score: 1) by nitehawk214 on Friday November 06 2015, @02:02PM

          by nitehawk214 (1304) on Friday November 06 2015, @02:02PM (#259426)

          Anyhow, that's the benefit of being a government-sponsored monopoly. You get to shit on your customers, and they can't even take their money elsewhere. We can't vote them out either, and they pay the politicians to play nice for them.

          --
          "Don't you ever miss the days when you used to be nostalgic?" -Loiosh