
posted by martyb on Friday December 16 2016, @12:50PM   Printer-friendly
from the false-positives-are-a-bad-thing-said-one-of-the-twins dept.

If you've watched any sort of spy thriller or action film over the last few years – think Jason Bourne or Mission: Impossible – the chances are you've seen facial recognition software in action. These movie scenes often involve an artist's sketch compared to mug shots, or sometimes even a live CCTV stream, and with the clock ticking, a match is usually found for the culprit in the nick of time.

It seems natural then to assume that what happens in the film world is similar to what happens (most of the time) in the real world. We might think that our faces are constantly being tracked and recognised as we walk past security cameras in city centres – but this is not actually the case.

Not only would such a system require millions of cameras capable of producing high-quality footage, but it would also require the integration of photo-ID databases such as mugshots from every police force, previous passport images, and driving license images for everyone in the country.

And yet even if this level of integration were possible, a more basic problem remains – facial recognition systems are still not 100% accurate.
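To get a feel for why "not 100% accurate" matters so much at national scale, here is a rough back-of-the-envelope sketch in Python. The population size, watchlist size, and error rates below are purely illustrative assumptions, not figures from the article:

    # Back-of-the-envelope: face matching against an entire country's photo-ID records.
    # All numbers are illustrative assumptions, not figures from the article.
    population = 60_000_000      # everyone with a passport, licence or mugshot on file
    watchlist = 1_000            # people the system is actually looking for
    true_positive_rate = 0.99    # matcher correctly flags 99% of watchlisted faces
    false_positive_rate = 0.001  # matcher wrongly flags 0.1% of everyone else

    true_hits = watchlist * true_positive_rate                   # ~990
    false_hits = (population - watchlist) * false_positive_rate  # ~60,000

    precision = true_hits / (true_hits + false_hits)
    print(f"expected true matches:  {true_hits:,.0f}")
    print(f"expected false matches: {false_hits:,.0f}")
    print(f"chance a flagged person is actually wanted: {precision:.1%}")  # ~1.6%

Even with a matcher that is wrong only one time in a thousand, the sheer size of the population means false matches swamp the true ones – the "false positives are a bad thing" problem the department line alludes to.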


Original Submission

 
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by physicsmajor on Friday December 16 2016, @01:51PM

    by physicsmajor (1471) on Friday December 16 2016, @01:51PM (#442038)

    When used in a limited fashion, imperfect sensitivity/specificity can be acceptable or worked around. But if you want to roll that out to the general population, you'd better make sure it doesn't cry wolf very often. As the legal system is - supposedly - designed: better that the guilty go free than that the innocent be maligned.

    This tech is laughably far from that bar.

  • (Score: 2) by mhajicek on Friday December 16 2016, @04:42PM

    by mhajicek (51) on Friday December 16 2016, @04:42PM (#442089)

    And yet a man was arrested based solely on a false facial recognition match and then released; later he was arrested again on another false match and beaten within an inch of his life despite not resisting.

    --
    The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
  • (Score: 4, Interesting) by HiThere on Friday December 16 2016, @06:59PM

    by HiThere (866) Subscriber Badge on Friday December 16 2016, @06:59PM (#442156) Journal

    It may be far from the bar, but what makes you think fingerprints are any more reliable? Accurate fingerprints, fully encoded, probably would be, but that's not what gets used. Similarly, this technology doesn't need to be accurate to serve police purposes, it merely needs to be believed. Much of the judicial system is designed to shape the evidence required, and if you aren't among the powerful, you can't feasibly challenge their rules.

    Lots of people are convicted based on false or knowingly misleading evidence. We don't know how many, because it's often impossible to prove, but it has been proven repeatedly. Given the small budget of those who try to prove it (private donations), the problems they have getting access to the evidence, and the reluctance of the courts to consider that a mistake even MIGHT have been made, a fair evaluation of the evidence suggests that a significant percentage - though probably in the single digits - of those convicted are innocent-in-fact (as opposed to legally).

    We are repeatedly told that our system is designed to protect the innocent, but that doesn't really appear to be true, though there are certainly worse systems. What *is* true is that it's designed to protect the powerful - but that's true of all human systems. There was *some* consideration given in the design to protecting the innocent, which isn't true of all legal systems, but that's not a very strong statement.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 3, Insightful) by Immerman on Friday December 16 2016, @11:14PM

      by Immerman (3985) on Friday December 16 2016, @11:14PM (#442267)

      Even if the police are 100% honest and competent, imperfect fingerprint matching can still be pretty useful - if there are a million people in the city, there are probably only, what, a few thousand who could possibly have left the prints found at the scene? And far fewer for whom the match probability is high. That gives the police one reasonably small set of suspects. If they have any other evidence that can independently generate a set of suspects even remotely as small, then a little Venn diagram magic will almost certainly narrow the credible suspect list down to only a handful of people. It's very important to recognize that it's a probabilistic process - there's no guarantee the list actually includes the real perpetrator - but it can still be a huge help in the majority of cases.
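      That "Venn diagram magic" is just set intersection. A minimal sketch, assuming each piece of evidence independently yields a candidate set of person IDs (the IDs and sets below are made up purely for illustration):

          # Each line of evidence yields its own imperfect candidate set; IDs are made up.
          fingerprint_candidates = {101, 205, 318, 442, 507, 623}  # loose matches to a partial print
          cctv_candidates        = {205, 318, 777, 842}            # faces flagged near the scene
          vehicle_candidates     = {318, 507, 842, 901}            # registered owners of the car model seen

          # Intersecting independently generated sets narrows the list fast.
          credible_suspects = fingerprint_candidates & cctv_candidates & vehicle_candidates
          print(credible_suspects)  # {318}

      The caveat is the same one noted above: if any single candidate set happens to miss the real perpetrator, the intersection misses them too, so the narrowed list is a lead, not proof.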

  • (Score: 0) by Anonymous Coward on Friday December 16 2016, @11:58PM

    by Anonymous Coward on Friday December 16 2016, @11:58PM (#442285)

    Regardless of how accurate facial recognition technology is, conducting mass surveillance on the populace for any reason is unethical and, if it isn't already, should be outright illegal.