posted by martyb on Wednesday January 29 2020, @08:59AM   Printer-friendly
from the dystopian-reality dept.

London to deploy live facial recognition to find wanted faces in a crowd:

Officials at the Metropolitan Police Service of London announced last Friday that the organization will soon begin to use "Live Facial Recognition" (LFR) technology deployed around London to identify people of interest as they appear in surveillance video and alert officers to their location. The system, based on NEC's NeoFace Watch system, will be used to check live footage for faces on a police "watch list," a Metropolitan Police spokesperson said.

[...] In Las Vegas, a number of casinos have used facial-recognition systems for decades—not only to spot potential criminals but also to catch "undesirables" such as card counters and others who have been banned from the gaming floors. (I got a first-hand look at some of those early systems back in 2004.)

[...] private companies' own databases of images have begun to be tapped as well. Amazon's Rekognition system and other facial-recognition services that can process real-time streaming video have been used by police forces in the US as well as for commercial applications.

[...] These systems are not foolproof. They depend heavily on the quality of source data and other aspects of the video being scanned. But Ephgrave said that the Metropolitan Police is confident about the system it's deploying—and that it's balancing its deployment with privacy concerns.

[...] Areas under the surveillance of the system will be marked with signs.

Previously:
America Is Turning Against Facial-Recognition Software
ACLU Demonstrates Flaws in Facial Recognition
Amazon and US Schools Normalize Automatic Facial Recognition and Constant Surveillance
Amazon Selling Facial Recognition Systems to Police in Orlando, FL and Washington County, OR


Original Submission

Related Stories

Amazon Selling Facial Recognition Systems to Police in Orlando, FL and Washington County, OR 18 comments

Amazon is selling police departments a real-time facial recognition system

Documents obtained by the ACLU of Northern California have shed new light on Rekognition, Amazon's little-known facial recognition project. Rekognition is currently used by police in Orlando and Oregon's Washington County, often using nondisclosure agreements to avoid public disclosure. The result is a powerful real-time facial recognition system that can tap into police body cameras and municipal surveillance systems.

According to further reporting by The Washington Post, the Washington County Sheriff pays between $6 and $12 a month for access to Rekognition, which allows the department to scan mug shot photos against real-time footage.

The most significant concerns are raised by the Orlando project, which is capable of running real-time facial recognition on a network of cameras throughout the city. The project was described by Rekognition project director Ranju Das at a recent AWS conference in Seoul. "This is an immediate response use case," Das told the crowd. "There are cameras all over the city [of Orlando]. Authorized cameras are streaming the data to Kinesis video stream.... We analyze that data in real time and search against the collection of faces that they have. Maybe they want to know if the mayor of the city is in a place, or there are persons of interest they want to track."

The price is not a typo. It was described as a "giveaway".

Also at NPR and LA Times (AP).


Original Submission

Amazon and US Schools Normalize Automatic Facial Recognition and Constant Surveillance 30 comments

At the Private Internet Access Blog, Glyn Moody writes about how Amazon and US schools are following in China's footsteps to normalize automatic facial recognition and constant surveillance. Materials gained through Freedom of Information Act requests by the ACLU have documented that Amazon has been marketing its hosted "Rekognition" products to both police forces and schools to facilitate mass surveillance inside the US and to inure the coming generations to it.

Amazon has developed a powerful cloud-based facial recognition system called "Rekognition", which has major implications for privacy. It is already being used by multiple US police forces to carry out surveillance and make arrests, the ACLU has learned.

Amazon claims that Rekognition offers real-time face matching across tens of millions of individuals held in a database, and can detect up to 100 faces in a single photo of a crowd. Rekognition can be used to analyze videos, and to track people even when their faces are not visible, or as they go in and out of the scene.

As a result of these disclosures, a coalition of organizations including the ACLU has sent a letter to Amazon's CEO Jeff Bezos demanding that the company stop providing its facial recognition tool to the government. The ACLU has also launched a petition that calls for the same.

Emails obtained through freedom of information requests submitted by the ACLU show that Amazon has worked with the city of Orlando, Florida, and the Washington County Sheriff's Office in Oregon to roll out Rekognition in those locations. In addition, law enforcement agencies in California, Arizona, and multiple domestic surveillance "fusion centers" have indicated interest in Rekognition, although it is not clear how many of these have gone on to deploy the system. Orlando has used Rekognition to search for people in footage drawn from the city's video surveillance cameras. Washington County, meanwhile, has built a Rekognition-based mobile app that its deputies can use to run any image against the county's database of 300,000 faces.
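The core operation of the Washington County app described above—running a probe image against an indexed face collection—corresponds to Rekognition's SearchFacesByImage API. A minimal sketch using boto3 is below; the collection ID and threshold are illustrative assumptions, not values from the article, and the `best_matches` helper is our own for illustration:

```python
SIMILARITY_THRESHOLD = 80.0  # percent; Rekognition's documented default


def best_matches(face_matches, threshold=SIMILARITY_THRESHOLD):
    """Filter a SearchFacesByImage response's FaceMatches list by
    similarity, returning (face_id, similarity) pairs, best first."""
    hits = [(m["Face"]["FaceId"], m["Similarity"])
            for m in face_matches
            if m["Similarity"] >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)


def search_collection(collection_id, image_bytes):
    """Run one probe image against a previously indexed face collection."""
    # Imported here so the pure helper above is usable without AWS set up.
    import boto3
    client = boto3.client("rekognition")
    resp = client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=SIMILARITY_THRESHOLD,
        MaxFaces=5,
    )
    return best_matches(resp["FaceMatches"])
```

Note that the service only ever returns similarity scores, not identifications; the threshold is a policy choice, which is exactly where the civil-liberties concerns in these stories arise.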


Original Submission

ACLU Demonstrates Flaws in Facial Recognition 36 comments

The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. That test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). They hope that demonstrating that this risk hits close to home will get Congress more interested in regulating the use of this technology.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.


Original Submission

America Is Turning Against Facial-Recognition Software 16 comments

Until recently Americans seemed willing to let police deploy new technologies in the name of public safety as they saw fit. But crime is much rarer than it was in the 1990s, and technological scepticism is growing. On May 14th San Francisco became the first American city to ban its agencies from using facial-recognition systems. That decision was profoundly unpopular at the police conference. Jack Marks, who manages Panasonic’s public-safety products, called it “short-sighted and reactive”. The technology exists, he said; “the best thing you can do is help shape it.” Other cities, including Somerville in Massachusetts, may soon follow San Francisco’s lead all the same.

Companies are under scrutiny, too. On May 22nd Amazon saw off two challenges by activist shareholders. They wanted the board to commission an independent study to determine whether Rekognition, its facial-recognition platform, imperils civil, human and privacy rights. The activists also wanted to ban the firm from selling Rekognition to governments until the company’s board concludes, “after an evaluation using independent evidence”, that it does not erode those rights.

Senior police officers argue that the technology is a useful crime-fighting tool. Daniel Steeves, chief information officer for the Ottawa Police Service, says that a robbery-investigation unit spent six months testing a facial-recognition system. It lowered the average time required for an officer to identify a subject from an image from 30 days to three minutes. The officers could simply run an image through a database of 50,000 mugshot photos rather than leafing through them manually or sending a picture to the entire department and asking if anybody recognised the suspect. Other officers stress that a facial-recognition match never establishes guilt. It is just a lead to be investigated.

Yet officers sense that the technology is in bad odour. A deputy police chief from an American suburb with a security system that uses facial recognition around the local high school says: “We knew that facial recognition wasn’t going to fly, so we called it an Early Warning Detection System.”

AWS Facial Recognition Platform Misidentified Over 100 Politicians as Criminals 26 comments

AWS Facial Recognition Platform Misidentified Over 100 Politicians As Criminals:

Comparitech's Paul Bischoff found that Amazon's facial recognition platform misidentified an alarming number of people, and was racially biased.

Facial recognition technology is still misidentifying people at an alarming rate – even as it's being used by police departments to make arrests. In fact, Paul Bischoff, consumer privacy expert with Comparitech, found that Amazon's face recognition platform misidentified more than 100 photos of US and UK lawmakers as criminals.

Rekognition, Amazon's cloud-based facial recognition platform that was first launched in 2016, has been sold and used by a number of United States government agencies, including ICE and Orlando, Florida police, as well as private entities. In comparing photos of a total of 1,959 US and UK lawmakers to subjects in an arrest database, Bischoff found that Rekognition misidentified an average of 32 members of Congress. That's four more than in a similar experiment conducted by the American Civil Liberties Union (ACLU) two years ago. Bischoff also found that the platform was racially biased, misidentifying non-white people at a higher rate than white people.
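For a rough sense of scale, the figures quoted above can be turned into a false-match rate. This is only a back-of-envelope calculation: the article averages a Congress-only count against the combined US/UK probe pool, so treat the result as indicative rather than exact:

```python
# Back-of-envelope false-match rate from the figures quoted above.
lawmakers = 1959         # US and UK lawmakers in the probe set
avg_false_matches = 32   # average misidentified per run, per the article

rate = avg_false_matches / lawmakers
print(f"{rate:.1%}")  # roughly 1.6%
```

A rate of around 1.6% sounds small until it is multiplied across the millions of faces a live city-wide deployment would scan every day.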

These findings have disturbing real-life implications. Last week, the ACLU shed light on Detroit citizen Robert Julian-Borchak Williams, who was arrested after a facial recognition system falsely matched his photo with security footage of a shoplifter.

This discussion has been archived. No new comments can be posted.
  • (Score: 0, Touché) by Anonymous Coward on Wednesday January 29 2020, @09:11AM (2 children)

    by Anonymous Coward on Wednesday January 29 2020, @09:11AM (#950562)

    London to deploy live facial recognition to find wanted faeces in a crowd:

FTFY, it is subtle, but the fix is there. Trying to identify the Irish, Scots, Welsh, Northumbrians, and the Polacks. You know, undesirable types. Manx gits, Liverpool gents, Cornish game boys, the entire spectrum, in fact. Not racist at all, outside of being British. Turds!

    • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @10:49AM

      by Anonymous Coward on Wednesday January 29 2020, @10:49AM (#950570)

      You can add the French, Germans, Italians and all the others starting Feb 1st [bbc.com]
      Now the limeys can do whatever the fuck they want to their subjects, there's no Europe to temper them down [europa.eu].
      Even their US masters start to become annoyed by the antics of their lapdog [bloombergquint.com].

    • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @01:28PM

      by Anonymous Coward on Wednesday January 29 2020, @01:28PM (#950612)

      As it is Londonistabbistan we're talking about here, the system would be hard put to find a white face to profile in the first instance, I escaped from that cesspit back in the late 90's and I noticed it was changing then, I've been told by friends from back then who are still there that it's worse now, they'd bail, if they could.

      Fugue for a Darkening Island [wikipedia.org]

  • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @12:16PM

    by Anonymous Coward on Wednesday January 29 2020, @12:16PM (#950587)

    Looking for a beard and dark skin shouldn't be that difficult. They could train the neural network using photos of jihadis in Mecca.

  • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @02:56PM (4 children)

    by Anonymous Coward on Wednesday January 29 2020, @02:56PM (#950645)

If you're turning it on and looking at everyone all the time, what is the privacy balancing they're talking about? They just throw up a few signs and call it good? Don't worry, they're not going to put it up everywhere, just where the people are. If you want privacy you are free to go where the people aren't.

    • (Score: 4, Informative) by All Your Lawn Are Belong To Us on Wednesday January 29 2020, @04:20PM

      by All Your Lawn Are Belong To Us (6553) on Wednesday January 29 2020, @04:20PM (#950694) Journal

The key to "balancing" is that you have to know the weight being applied to each side and what feels acceptable. In this case "balanced" does not mean "equal weight" but rather knowing that the fulcrum may be intentionally set off-center. For example, London already had the thumb on the scales with their usage of governmental CCTV all over the city.

      And, gotta say it....
      Pray they do not alter the deal further.

      --
      This sig for rent.
    • (Score: 2) by Freeman on Wednesday January 29 2020, @06:15PM (1 child)

      by Freeman (732) on Wednesday January 29 2020, @06:15PM (#950759) Journal

      Didn't you see the balancing act? I put it right there at the end of the Summary:

      Areas under the surveillance of the system will be marked with signs.

      I mean, that's not really much to balance the scales, if every place you turn, there's a sign.

      --
      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
      • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @10:45PM

        by Anonymous Coward on Wednesday January 29 2020, @10:45PM (#950896)

        Signs, signs, everywhere are signs, blocking up the scenery and wasting my time.

    • (Score: 2) by Runaway1956 on Thursday January 30 2020, @01:53AM

      by Runaway1956 (2926) Subscriber Badge on Thursday January 30 2020, @01:53AM (#950951) Journal

      There is no balance. Gubbermint has all the power, and makes all the decisions. You, poor peasant, have the options of following orders willingly, or following orders after being bludgeoned into compliance.

  • (Score: 0) by Anonymous Coward on Wednesday January 29 2020, @03:47PM (1 child)

    by Anonymous Coward on Wednesday January 29 2020, @03:47PM (#950673)

    Step1) decry Totalitarianism in China!!!!!!!
    https://www.npr.org/2019/12/16/788597818/how-china-is-using-facial-recognition-technology [npr.org]
    Step 2) copy China

    • (Score: 1, Touché) by Anonymous Coward on Wednesday January 29 2020, @05:21PM

      by Anonymous Coward on Wednesday January 29 2020, @05:21PM (#950725)

      Thanks for bringing up China...the joke's on them.

      Seen any recent news photos from China or Hong Kong? Nearly everyone is wearing anti-coronavirus masks. So much for their big investment in facial recognition, defeated by a cheap bit of filter paper and some elastic/cord.

Next time I'm headed downtown (where there are lots of cameras), I'm going to wear a mask too. If anyone asks, it's because I think I was exposed and might be contagious.

  • (Score: 2, Insightful) by fustakrakich on Wednesday January 29 2020, @04:09PM (2 children)

    by fustakrakich (6150) on Wednesday January 29 2020, @04:09PM (#950687) Journal

    What a grotesque world we are making!

    For the sake of the galaxy, let's hope this species never gets off the planet, aside from the usual permanent fashion.

    --
    La politica e i criminali sono la stessa cosa..
    • (Score: 2) by DannyB on Wednesday January 29 2020, @05:35PM (1 child)

      by DannyB (5839) Subscriber Badge on Wednesday January 29 2020, @05:35PM (#950730) Journal

      What a grotesque world we are making!

      The title said Facial recognition, not Fecal recognition.

      . . . except maybe in countries that have open defecation.

      --
      People today are educated enough to repeat what they are taught but not to question what they are taught.
      • (Score: 1, Interesting) by Anonymous Coward on Wednesday January 29 2020, @08:00PM

        by Anonymous Coward on Wednesday January 29 2020, @08:00PM (#950809)

        The title said Facial recognition, not Fecal recognition.

        Ah, but remember, this is the UK we're talking about, the place where they once planned to build up a dog shit DNA database to try catch those people who allow their dogs to defecate in public places....a looney Tory idea back then, but as them buggers (literally and figuratively) are back in power for the next decade (thank you very much, you fucking sassenach idiots....), I'd not rule fecal recognition out....
