posted by martyb on Tuesday October 18 2016, @12:42PM   Printer-friendly
from the does-not-add-up dept.

The BBC is reporting on the Compas assessment (Correctional Offender Management Profiling for Alternative Sanctions). This tool is used by a number of agencies to assess whether someone is likely to commit additional crimes, and the resulting score is used in determining bail, sentencing, and parole. The article points out that, while the questions on the assessment do not include race, the resulting score may be correlated with race, though this is disputed by the software's creators. The assessment scores someone on a 10-point scale, but the algorithm used to determine someone's score is kept secret. Because of this, defendants are unable to effectively dispute that the score is incorrect.
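To make the transparency point concrete, here is a sketch of what a disclosed scoring rule might look like. Every factor and weight below is invented for illustration and has nothing to do with Compas's actual (secret) algorithm; the point is only that with a published rule like this, a defendant could check the arithmetic and contest any input.

```python
# Entirely hypothetical factors and weights -- Compas's real inputs and
# algorithm are secret, which is precisely the problem the summary raises.
def risk_score(prior_convictions, age_at_first_offense, failed_appearances):
    """Toy transparent risk score on a 1-10 scale."""
    score = 1
    score += min(prior_convictions, 4)              # priors add up to +4
    score += 2 if age_at_first_offense < 18 else 0  # juvenile onset: +2
    score += min(failed_appearances, 3)             # failures to appear: up to +3
    return min(score, 10)
```

A score produced this way can be disputed factor by factor; a secret model offers no such handle.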


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by Anonymous Coward on Tuesday October 18 2016, @01:05PM

    by Anonymous Coward on Tuesday October 18 2016, @01:05PM (#415641)

    There's another big problem. From the summary:

    The assessment scores someone on a 10-point scale, but the algorithm used to determine someone's score is kept secret.

    So the sentencing is based on a secret. Even if the actual sentencing were perfectly fair and reasonable, this alone would constitute a major problem. Sentencing should only ever be based on facts available to the defendant, or at the very least to his lawyer.

    Also from the summary:

    Because of this, defendants are unable to effectively dispute that the score is incorrect.

    The defendant should not have to prove that the score is incorrect. The judge should have to prove that the score is correct. If that is not possible, the score should not be allowed to be used.

  • (Score: 3, Interesting) by JoeMerchant on Tuesday October 18 2016, @02:16PM

    by JoeMerchant (3937) on Tuesday October 18 2016, @02:16PM (#415672)

    A fundamental flaw I see is that they are sentencing a defendant based upon a questionnaire completed by the defendant or their lawyer... if they make the algorithm public, the lawyer is duty-bound to skew the answers as much as they can get away with, to make sentencing as light as possible for their client.

    Then, regardless of the secrecy of the algorithm, good lawyers will begin to decode the workings of the black-box based on experience, and so, yet again, those defendants with good lawyers will get preferential sentencing results - regardless of the actual intent of the secret algorithm to make a fair, unbiased judgement.

    What I see a need for is an impartial party (judge?) to input the facts of the case to a public algorithm which makes the decision. Then, all that there is to skew the results is which facts are, or are not, included in the input - which seems to be the primary function of judges in our process anyway.

    --
    🌻🌻 [google.com]
    • (Score: 3, Insightful) by Thexalon on Tuesday October 18 2016, @03:42PM

      by Thexalon (636) on Tuesday October 18 2016, @03:42PM (#415711)

      The simple fact is that there's no algorithm available to solve these kinds of problems, because dispensing justice is difficult. I mean, how do you tell the difference, in a court of law, between a genuine loving family man and an abusive father who threatened his kids into saying that he was a genuine loving family man? And you can't simply say "take away all judge discretion in sentencing", because you want something different to happen for, say, a first-time offender who did something stupid and illegal in difficult circumstances, than you do for a hardened criminal who has been able to get away with their crimes up to now due to witness intimidation.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2) by darkfeline on Tuesday October 18 2016, @05:08PM

        by darkfeline (1030) on Tuesday October 18 2016, @05:08PM (#415747) Homepage

        >dispensing justice is difficult

        That's why we hire judges, or at least, that's the theory.

        --
        Join the SDF Public Access UNIX System today!
      • (Score: 2) by JoeMerchant on Wednesday October 19 2016, @04:23AM

        by JoeMerchant (3937) on Wednesday October 19 2016, @04:23AM (#415999)

        I agree - algorithms are for lazy judges, or courts that can't be bothered. I know of a county that uses "algorithmic sentencing" and a videotaped judge to deal with traffic court - the bailiff explains your options, and you plead accordingly. If you don't like any of the options, you get to reschedule a hearing with a real judge. 99% take the "solve it now" route.

        --
        🌻🌻 [google.com]
        • (Score: 2) by AthanasiusKircher on Wednesday October 19 2016, @11:56AM

          by AthanasiusKircher (5291) on Wednesday October 19 2016, @11:56AM (#416087) Journal

          99% take the "solve it now" route.

          Is this sort of like how the majority of cases are resolved by plea bargains outside the courtroom, too? That system is known to cause significant problems, since defendants tend to be convinced by lawyers to "take the deal" even if they are innocent. (And, for those who are guilty, it often circumvents criminal statutes and sentencing guidelines by offering lesser charges or whatever.) And often the ramifications of "taking the deal" don't become apparent to defendants until much later. Anyhow, the stakes are usually lower in traffic court than in criminal court, but a system like this has likely figured out a way to "make a deal" so that most defendants accept one of the automated options, even if it's really not the most just outcome.

          • (Score: 2) by JoeMerchant on Wednesday October 19 2016, @07:09PM

            by JoeMerchant (3937) on Wednesday October 19 2016, @07:09PM (#416295)

            I think it works for traffic court, though it does open traffic court up to potential abuse. One of the options is 2 nights (8 hours total) of traffic school and no points on your license, so you can basically get unlimited traffic tickets and never lose your license. Without traffic school, 3 citations for 15mph over the speed limit within 12 months would lead to a license suspension, and without court, you can only take traffic school to erase points once per 12 months.
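            The point rules described above can be sketched in code (the thresholds are taken from this comment, not from any actual statute): three 15-over citations within any 12-month window means suspension, and traffic school can erase one citation's points per 12-month period.

```python
def license_suspended(citation_months, school_month=None):
    """citation_months: month index of each 15-over citation.
    school_month: month whose citation was erased via traffic school
    (at most one per 12 months, per the rules described above)."""
    months = sorted(citation_months)
    if school_month in months:
        months.remove(school_month)  # traffic school erases that citation's points
    # Suspended if any 3 remaining citations fall within a 12-month window.
    return any(months[i + 2] - months[i] < 12 for i in range(len(months) - 2))
```

            Under these assumed rules, citations in months 1, 5, and 9 trigger a suspension, but erasing the month-5 citation via traffic school avoids it.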

            One would hope that the system would eventually bounce you out to a real judge if you had too many repeat offences - I don't know anybody who went in more than maybe 6 times, so I guess none of us were serious enough repeat offenders to merit a real judge's attention.

            So, I know virtually nothing about what really goes on in criminal court, but if they have a massive case load of relatively minor offences that judges basically deal with by algorithm anyway, I could see this working to handle the front-lines and free up judges to handle the less mundane stuff. But, I would never think this is appropriate for situations that result in serious jail time.

            --
            🌻🌻 [google.com]