
posted by takyon on Sunday May 03 2015, @09:26AM
from the garbage-in-garbage-out dept.

Tim O'Reilly has advocated for the idea of algorithmic regulation - reducing the role of people and replacing them with automated systems in order to make government policy less biased and more efficient. But the idea has been criticized as utopianism, and actual implementations are likely to make government more opaque and even less responsive to the citizens who have the least say in the operation of society.

Now, as part of New America's annual conference, What Drives Innovation Around the Country?, Virginia Eubanks has written an essay examining such automation in the cases of pre-crime prediction and welfare fraud detection. Is it possible to automate human judgment out of the inherently human task of governance and still achieve humane results? Or are inefficiency and waste an unavoidable part of the process?

 
  • (Score: 5, Insightful) by PizzaRollPlinkett (4512) on Sunday May 03 2015, @04:07PM (#178140)

    The major problem I see is that the individuals whose lives are affected by these impersonal algorithmic processes usually have no recourse when an error, a false positive, or something similar happens. Individuals usually don't even know what data is being collected on them by shadowy companies that gather and sell it to large corporations. Much of life is already dictated by numbers like credit scores and grades, which are generated by impersonal processes. Vast files are compiled about people, and the data in them is of dubious quality. When something goes wrong, where does an affected individual go to appeal?

    The more these impersonal processes crank out "big data" numbers about people, the less any human is going to be involved. Corporations and governments have little incentive to do anything at all for any random individual, unless they are shamed into it by bad media attention or something. As long as individuals end up as collateral damage from these impersonal algorithmic processes, it's hard to trust them. We can't even get credit scores right, and we want to use the same "big data" concepts to predict who will be a criminal? If someone is predicted to be a criminal, what will happen to them?
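
    As a rough back-of-the-envelope illustration (the numbers below are made up for the sake of argument, not taken from any real system), even a predictor that sounds very accurate ends up flagging mostly innocent people when the thing being predicted is rare:

        # Rough sketch with made-up, hypothetical numbers: how many people a
        # "99% accurate" pre-crime predictor would flag wrongly.
        population = 1_000_000      # people screened
        base_rate = 0.001           # assume 0.1% would actually offend
        sensitivity = 0.99          # predictor catches 99% of real future offenders
        false_positive_rate = 0.01  # and wrongly flags 1% of everyone else

        actual_offenders = population * base_rate
        true_positives = actual_offenders * sensitivity
        false_positives = (population - actual_offenders) * false_positive_rate
        flagged = true_positives + false_positives

        print(f"Flagged: {flagged:,.0f}")
        print(f"Wrongly flagged: {false_positives:,.0f} ({false_positives / flagged:.0%})")
        # With these assumptions, roughly 9 out of 10 flagged people are innocent,
        # and every one of them needs somewhere to appeal.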

    Given the track record of IQ tests, lie detector tests, "cyber" defenses, and so on, these new algorithms will probably turn out to be more snake oil sold by consultants to big corporations and governments. That doesn't inspire confidence.

    Another major problem is mission creep. Social Security numbers have become de facto unique identification numbers. Credit scores are demanded for job applications. So once there's a profile of my predicted criminal activity, who is going to want access to it in the future? Will I be denied a job because some algorithm predicted that I was a criminal?

    --
    (E-mail me if you want a pizza roll!)
  • (Score: 3, Interesting) by turgid (4318) on Sunday May 03 2015, @07:56PM (#178210)

    Here in Blighty, about 150 sub-postmasters were accused of fraud because of a new, buggy computer system they were forced to use. Some went to prison, and some had to remortgage their houses to pay back money they were accused of stealing.

    Of course the software couldn't have been wrong, could it? Of course not! Overnight, all of these people just decided to become crooks...

    See here [bbc.co.uk], here [bbc.co.uk], here [accountingweb.co.uk], here [ukcampaign4change.com] and here [ft.com].