SoylentNews is people

posted by LaminatorX on Wednesday February 26 2014, @11:00PM   Printer-friendly
from the studies-show-poverty-causes-cancer dept.

Angry Jesus writes:

"The Chicago Police Department is mis-applying epidemiological science (the study of entire populations) to target individuals in a real-life version of Minority Report. They have decided that it is a good idea to put people on a secret list based on a Big Data analysis of their social networks. But don't worry, it isn't racist or abusive because, Science!"

  • (Score: 4, Insightful) by tynin on Thursday February 27 2014, @12:07AM

    by tynin (2013) on Thursday February 27 2014, @12:07AM (#7654) Journal

    I'll agree the summary could have just presented the facts, without the obvious bias. But this article clearly shows the direction of things to come. It even touched on what I think is a very chilling aspect of this, "are we just closing ourselves off to this small subset of people?"

    Sure, they aren't really running pre-crime units; it's just another arm of this jurisdiction. However, it undeniably focuses on a smaller group of people. In a perfect world, this wouldn't be a problem, but...

    This algorithm is secret. Who is auditing it? How can we be sure that someone in the chain isn't taking money or favors to make sure certain people stay off the list, or even get put on it? How do we even know there is an algorithm, and that it isn't just someone who thinks they are your master compiling these secret lists? So far, all attempts to find out anything about how the list is created have been shielded from FOIA requests.

    All that said, it was obvious and unavoidable that this information would eventually be used in this fashion. It was going to happen. Hopefully some countries will have enough respect for their people not to allow this kind of data mining without a high degree of openness in the process.

  • (Score: 1, Insightful) by The Mighty Buzzard on Thursday February 27 2014, @12:40AM

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday February 27 2014, @12:40AM (#7674) Homepage Journal
    I tend to agree. I just get cheesed off when some race-baiter co-opts a story to expose racism where it doesn't exist. Racists of any color piss me off, but race-baiters are an order of magnitude worse.
    --
    My rights don't end where your fear begins.
  • (Score: 5, Insightful) by mth on Thursday February 27 2014, @01:06PM

    by mth (2848) on Thursday February 27 2014, @01:06PM (#7925) Homepage

    What worries me more than the data mining itself is that after they've identified someone with a statistically increased risk of getting involved in crime, their response is to send an officer to intimidate him. I think it would be much more effective to check whether he has a job (the article says he's a high-school dropout), and if not, try to get him a job or training that prepares him for one.

    • (Score: 5, Insightful) by Anonymous Coward on Thursday February 27 2014, @02:07PM

      by Anonymous Coward on Thursday February 27 2014, @02:07PM (#7952)

      If an officer appears out of the blue to harass you, it certainly doesn't make you respect the law more. Rather, it will probably make your attitude biased against the laws that allow, or even require, an officer to harass you without you doing anything wrong, and thus makes you more likely to break the law later. Which then will be taken as evidence that the harassment was justified, and the program is a success. Self-fulfilling prophecy at its finest.
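      The feedback loop the parent describes is easy to see in a toy simulation (all numbers here are hypothetical, not from the article): give two groups the *same* underlying offense rate, but watch the "listed" group much more closely, and the recorded statistics make that group look several times worse, which then "justifies" the extra scrutiny.

      ```python
      import random

      random.seed(42)

      # Hypothetical toy model: both groups offend at the SAME true rate,
      # but offenses in the "listed" group are far more likely to be
      # observed and recorded because police are watching them.
      OFFENSE_RATE = 0.05            # true per-person offense probability
      DETECTION = {"listed": 0.9,    # chance an offense gets recorded
                   "unlisted": 0.1}

      def recorded_offenses(group, population=10_000):
          """Count offenses that actually make it into the statistics."""
          recorded = 0
          for _ in range(population):
              if random.random() < OFFENSE_RATE:          # offense occurs
                  if random.random() < DETECTION[group]:  # ...and is seen
                      recorded += 1
          return recorded

      listed = recorded_offenses("listed")
      unlisted = recorded_offenses("unlisted")
      print(listed, unlisted)  # the listed group "looks" far more criminal
      ```

      Despite identical behavior, the recorded numbers differ by roughly the ratio of the detection rates, so the list appears to be a success by its own metrics.
      
      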