
posted by Fnord666 on Sunday January 21 2018, @06:41PM   Printer-friendly
from the crowdsourced-sentencing dept.

Submitted via IRC for AndyTheAbsurd

In February 2013, Eric Loomis was found driving a car that had been used in a shooting. He was arrested, and pleaded guilty to eluding an officer. In determining his sentence, a judge looked not just to his criminal record, but also to a score assigned by a tool called COMPAS.

Developed by a private company called Equivant (formerly Northpointe), COMPAS—or the Correctional Offender Management Profiling for Alternative Sanctions—purports to predict a defendant's risk of committing another crime. It works through a proprietary algorithm that considers some of the answers to a 137-item questionnaire.

COMPAS is one of several such risk-assessment algorithms being used around the country to predict hot spots of violent crime, determine the types of supervision that inmates might need, or—as in Loomis's case—provide information that might be useful in sentencing. COMPAS classified him as high-risk of re-offending, and Loomis was sentenced to six years.

He appealed the ruling on the grounds that the judge, in considering the outcome of an algorithm whose inner workings were secret and could not be examined, had violated due process. The appeal went up to the Wisconsin Supreme Court, which ruled against Loomis, noting that the sentence would have been the same had COMPAS never been consulted. Its ruling, however, urged caution and skepticism in the algorithm's use.

Source: https://www.theatlantic.com/technology/archive/2018/01/equivant-compas-algorithm/550646/

Also at Wired and Gizmodo


Original Submission

 
  • (Score: 2) by bradley13 on Sunday January 21 2018, @07:00PM (2 children)

    by bradley13 (3053) on Sunday January 21 2018, @07:00PM (#625743) Homepage Journal

    Stuff like this absolutely must be open source. It's basically calculating the guy's sentence. Sounds like the judges involved are clueless where software is concerned.

    --
    Everyone is somebody else's weirdo.
  • (Score: 2) by Bot on Sunday January 21 2018, @08:26PM (1 child)

    by Bot (3902) on Sunday January 21 2018, @08:26PM (#625781) Journal

    There is probably some combination of answers, among those the accused guy can influence, that makes the algorithm come up with its least pessimistic prediction.

    Even if no backdoor were involved, anybody who can test the algorithm repeatedly can find a combination of answers that minimizes the damage.
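    The point is easy to demonstrate with a toy sketch. The weights, questions, and scoring function below are entirely made up (the real COMPAS model is proprietary); the only assumption is that a defendant can query the score repeatedly and controls some subset of the answers, in which case brute-force search over that subset finds the lowest achievable score.

    ```python
    # Hypothetical toy risk model: a weighted sum of six yes/no answers.
    # Not the actual COMPAS algorithm, which is secret.
    from itertools import product

    WEIGHTS = [0.9, -0.4, 1.2, 0.1, -0.7, 0.5]

    def risk_score(answers):
        """Pretend black box: higher score = higher predicted risk."""
        return sum(w * a for w, a in zip(WEIGHTS, answers))

    # Answers 0, 2, and 5 are fixed facts of record; the defendant
    # can influence how they answer questions 1, 3, and 4.
    fixed = {0: 1, 2: 1, 5: 0}
    mutable = [i for i in range(len(WEIGHTS)) if i not in fixed]

    best = None
    for combo in product([0, 1], repeat=len(mutable)):
        answers = [0] * len(WEIGHTS)
        for i, v in fixed.items():
            answers[i] = v
        for i, v in zip(mutable, combo):
            answers[i] = v
        score = risk_score(answers)
        if best is None or score < best[0]:
            best = (score, answers)

    # Lowest achievable score given the fixed answers, and the
    # answer combination that produces it.
    print(best)
    ```

    With only a handful of influenceable questions the search is trivial; even with dozens, hill-climbing or simply noticing which answers move the score works fine, which is the comment's point: query access to a secret model is enough to game it.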

    All are equal before the law... after the law, problems arise.

    --
    Account abandoned.
    • (Score: 2) by frojack on Monday January 22 2018, @12:04AM

      by frojack (1554) on Monday January 22 2018, @12:04AM (#625890) Journal

      Well, clearly there is a boatload of questions meant to be asked of a defendant, and not all of them are likely to be meaningful; some might just be there so that, when a critical question comes up, the defendant answers out of habit, their mind numbed by the tiresome questioning.

       

      --
      No, you are mistaken. I've always had this sig.