posted by martyb on Saturday August 14 2021, @06:24PM   Printer-friendly
from the see-csam-run dept.

Exclusive: Apple's child protection features spark concern within its own ranks -sources

A backlash over Apple's move to scan U.S. customer phones and computers for child sex abuse images has grown to include employees speaking out internally, a notable turn for a company famed for its secretive culture, and has provoked intensified protests from leading technology policy groups.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate are surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Apple says it will refuse gov't demands to expand photo-scanning beyond CSAM:

Apple does not seem to have anticipated the level of criticism its decision to scan user photos would receive. On Thursday night, Apple distributed an internal memo that acknowledged criticism but dismissed it as "screeching voices of the minority."

That portion of the memo was written by NCMEC Executive Director of Strategic Partnerships Marita Rodriguez. "I know it's been a long day and that many of you probably haven't slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder. Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger," Rodriguez wrote.

The memo was obtained and published by 9to5Mac. The Apple-written portion of the memo said, "We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built."

The call is coming from within the building.

Previously:
(2021-08-06) Apple Plans to Scan US iPhones for Child Abuse Imagery.

Related:
(2021-07-20) Apple Employees Threaten to Quit as Company Takes Hard Line Stance on Remote Work.


Original Submission

Related Stories

Apple Employees Threaten to Quit as Company Takes Hard Line Stance on Remote Work 83 comments

Apple employees threaten to quit as company takes hard line stance on remote work:

Apple employees claim the company is not budging on plans to institute a hybrid work model for corporate workers and is in some cases denying work-from-home exceptions, including one accommodation covered by the Americans with Disabilities Act.

In June, Apple announced a hybrid work schedule that will see employees return to the office for three days a week starting in September, a shift toward normal corporate operations after the pandemic forced a lengthy work-from-home period. Days later, participants of what is assumed to be the same remote work advocacy Slack channel cited by The Verge asked for more flexibility, saying that working from home brings a number of benefits: greater diversity and inclusion in retention and hiring, the removal of previously existing communication barriers, better work-life balance, better integration of existing remote and location-flexible workers, and reduced spread of pathogens.

That request was flatly denied. In a video to employees late last month, SVP of retail and people Deirdre O'Brien toed the company line on remote work policies, saying, "We believe that in-person collaboration is essential to our culture and our future. If we take a moment to reflect on our unbelievable product launches this past year, the products and the launch execution were built upon the base of years of work that we did when we were all together in-person."


Original Submission

Apple Plans to Scan US iPhones for Child Abuse Imagery 89 comments

Apple plans to scan US iPhones for child abuse imagery:

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices.

Apple detailed its proposed system—known as "neuralMatch"—to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicized more widely as soon as this week, they said.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected; the reviewers would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.

[...] Security researchers, while supportive of efforts to combat child abuse, are concerned that Apple risks enabling governments around the world to seek access to their citizens' personal data, potentially far beyond its original intent.

"It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops," said Ross Anderson, professor of security engineering at the University of Cambridge.

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Interesting) by SomeGuy on Saturday August 14 2021, @07:24PM (6 children)

    by SomeGuy (5632) on Saturday August 14 2021, @07:24PM (#1166927)

    Apple is not law enforcement. They have no business doing this. Let them search for this, and next thing you know they will be searching for anything else that they don't like today. Or perhaps they already do. Of course, this is a nice little reminder that your cell phone is not actually "yours".

    Of course, the bigger problem is their use of bullshit "AI". Oh, AI! Terabytes of ape porn! Yea, it could identify ANYTHING as illegal material, and all it takes is a single false allegation to destroy someone's life. Do you want your life flushed down the toilet because their magic "AI" flagged some random oddball picture that didn't even have nudity or people in it?

    • (Score: 0, Informative) by Anonymous Coward on Saturday August 14 2021, @09:10PM (4 children)

      by Anonymous Coward on Saturday August 14 2021, @09:10PM (#1166945)
      If you're stupid enough to upload shit to their cloud, you get what you deserve. Just don't allow photos to be automatically backed up to the cloud (Apple or otherwise); problem solved. Stop being such whiners.
      • (Score: 2) by EJ on Saturday August 14 2021, @09:23PM (3 children)

        by EJ (2452) on Saturday August 14 2021, @09:23PM (#1166949)

        Sure. Just check this little box that super-seriously does what it says it does, totes pinky-swear.

        (Note: I would never own an Apple phone)

        • (Score: 0) by Anonymous Coward on Sunday August 15 2021, @02:54AM (2 children)

          by Anonymous Coward on Sunday August 15 2021, @02:54AM (#1167056)

          Do you honestly think Google isn't doing something similar on Android phones?

          • (Score: 3, Interesting) by EJ on Sunday August 15 2021, @04:20AM (1 child)

            by EJ (2452) on Sunday August 15 2021, @04:20AM (#1167086)

            Google doesn't own all of Android. It's open source, and each phone manufacturer makes its own version of the OS. I have a Samsung phone, and I don't expect the Koreans are so much into that sort of surveillance.

            I'm sure they could do whatever they wanted IF I uploaded anything to their cloud, but I don't. Apple does what Apple wants to do, and you WILL like it if you own their devices.

            • (Score: 4, Insightful) by Mykl on Sunday August 15 2021, @10:13AM

              by Mykl (1112) on Sunday August 15 2021, @10:13AM (#1167132)

              There was an article not too long ago here about just how much information Android sends back to Google. It's a lot.

    • (Score: 1) by fakefuck39 on Sunday August 15 2021, @06:30PM

      by fakefuck39 (6620) on Sunday August 15 2021, @06:30PM (#1167248)

      Apple is not law enforcement, but they are under zero obligation to store your photos on their storage. In order to upload photos to their storage, they want to make sure those photos are legal. The way they used to do it was to scan your photos after they were on their storage. Now they want a guarantee from you that the photos are legal, before they accept even temporary storage. They give you a tool to do that, and you must use this tool before they accept your files.

      If you don't want your files scanned by their verification tool, don't have them synced to their storage, and they won't be scanned.

      All that other shit you said is completely irrelevant. It's their disk, and they can make up any rule they want for its usage. They are not scanning everything on your phone. They're only scanning files you selected to upload to their cloud, before uploading them.
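
      A minimal sketch of the gate described here, under the assumption that the check runs client-side just before upload; every function name below is a hypothetical stand-in, not Apple's API:

      ```python
      # Sketch of a client-side pre-upload gate: files chosen for cloud
      # sync are checked locally first; only clean files are uploaded.
      # All names are hypothetical stand-ins, not Apple's API.

      def matches_known_database(path: str) -> bool:
          """Stub for the local hash check; True would mean a database hit."""
          return False

      def cloud_upload(path: str) -> None:
          print(f"uploading {path}")

      def sync_selected(paths: list[str]) -> None:
          for path in paths:
              if matches_known_database(path):  # scan happens before acceptance
                  print(f"withheld for review: {path}")
              else:
                  cloud_upload(path)

      # Files never selected for sync never reach this loop at all, which
      # is the point above: disable cloud sync and nothing gets scanned.
      sync_selected(["vacation.jpg", "receipt.png"])
      ```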

  • (Score: 5, Insightful) by AnonTechie on Saturday August 14 2021, @07:27PM

    by AnonTechie (2275) on Saturday August 14 2021, @07:27PM (#1166928) Journal

    Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

    Quite sure that this will be exploited by governments in the guise of "think of the children" and will lead to a privacy nightmare!! It is almost impossible to put the genie back in the bottle ...

    --
    Albert Einstein - "Only two things are infinite, the universe and human stupidity, and I'm not sure about the former."
  • (Score: 2) by takyon on Saturday August 14 2021, @07:33PM (3 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday August 14 2021, @07:33PM (#1166930) Journal

    Apple Faces Internal Opposition Over Child Protection Features; "Screeching Voices" Found?

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 4, Insightful) by MostCynical on Saturday August 14 2021, @10:03PM (2 children)

      by MostCynical (2589) on Saturday August 14 2021, @10:03PM (#1166955) Journal

      I know a doctor who has pictures of skin conditions sent to her for urgent diagnosis.
      Some of these are of 'sensitive areas' of small children.
      Will these be flagged?

      If I have photos of my kids and their cousin in the bath when they were very small, will they be flagged?

      Is being concerned about this and expressing those concerns "screeching"?

      argue the content, not the volume or the tone.

      --
      "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
      • (Score: 2) by takyon on Saturday August 14 2021, @10:36PM (1 child)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday August 14 2021, @10:36PM (#1166959) Journal

        It has to be already known images uploaded to NCMEC's databases. And something like 30 of them before you get vanned. That's assuming they are not overly optimistic about the false positive rate, and don't alter the deal further.
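
        A minimal sketch of that reporting rule, assuming matches are simply counted per account and human review triggers only past a threshold; the figure of 30 comes from this comment, and all names are made up for illustration:

        ```python
        # Threshold-before-review sketch: single matches stay sealed; a
        # human is alerted only once an account crosses the threshold.
        # THRESHOLD = 30 is the rough figure cited above; names are made up.
        THRESHOLD = 30

        class AccountMatchCounter:
            def __init__(self) -> None:
                self.matches = 0

            def record_upload(self, matched_known_hash: bool) -> bool:
                """Record one uploaded photo; True once review should trigger."""
                if matched_known_hash:
                    self.matches += 1
                return self.matches >= THRESHOLD
        ```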

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 1, Touché) by Anonymous Coward on Sunday August 15 2021, @04:00AM

          by Anonymous Coward on Sunday August 15 2021, @04:00AM (#1167080)

          They always alter the deal further.

  • (Score: 1, Disagree) by fustakrakich on Saturday August 14 2021, @07:48PM (1 child)

    by fustakrakich (6150) on Saturday August 14 2021, @07:48PM (#1166932) Journal

    If not, there's nothing more to say. Arguing about it won't make a difference. Probably best to just roll your own encryption on files you want to upload
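
    A minimal sketch of that, using the Fernet recipe from the Python cryptography package (an assumption; any authenticated encryption would do). Encrypt locally, upload only the ciphertext, and keep the key somewhere the cloud never sees:

    ```python
    # Client-side encryption before upload: the provider only ever stores
    # ciphertext it cannot scan. Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # keep this OFF any synced device
    fernet = Fernet(key)

    with open("photo.jpg", "rb") as src:
        ciphertext = fernet.encrypt(src.read())

    with open("photo.jpg.enc", "wb") as dst:
        dst.write(ciphertext)     # sync this file, not the original

    # Later, on a machine holding the key:
    # original = Fernet(key).decrypt(ciphertext)
    ```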

    --
    La politica e i criminali sono la stessa cosa..
    • (Score: 1, Touché) by Anonymous Coward on Saturday August 14 2021, @07:58PM

      by Anonymous Coward on Saturday August 14 2021, @07:58PM (#1166934)

      *upload to local device

  • (Score: 5, Insightful) by sjames on Saturday August 14 2021, @08:08PM (5 children)

    by sjames (2882) on Saturday August 14 2021, @08:08PM (#1166936) Journal

    It's not child protection. Protection would mean tracking down the people making new images and making them stop.

    • (Score: 3, Informative) by takyon on Saturday August 14 2021, @08:53PM (4 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday August 14 2021, @08:53PM (#1166943) Journal

      They announced three features at the same time. One of them detects and blocks incoming nudes using a machine learning algorithm or something like that.

      https://www.theverge.com/2021/8/10/22613225/apple-csam-scanning-messages-child-safety-features-privacy-controversy-explained [theverge.com]

      The first change affects Apple’s Search app and Siri. If a user searches for topics related to child sexual abuse, Apple will direct them to resources for reporting it or getting help with an attraction to it. That’s rolling out later this year on iOS 15, watchOS 8, iPadOS 15, and macOS Monterey, and it’s largely uncontroversial.

      The other updates, however, have generated far more backlash. One of them adds a parental control option to Messages, obscuring sexually explicit pictures for users under 18 and sending parents an alert if a child 12 or under views or sends these pictures.

      The final new feature scans iCloud Photos images to find child sexual abuse material, or CSAM, and reports it to Apple moderators — who can pass it on to the National Center for Missing and Exploited Children, or NCMEC. Apple says it's designed this feature specifically to protect user privacy while finding illegal content. Critics say that same design amounts to a security backdoor.

      [...] The update — coming to accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey — also includes an additional option. If a user taps through that warning and they’re under 13, Messages will be able to notify a parent that they’ve done it. Children will see a caption warning that their parents will receive the notification, and the parents won’t see the actual message. The system doesn’t report anything to Apple moderators or other parties.

      The images are detected on-device, which Apple says protects privacy. And parents are notified if children actually confirm they want to see or send adult content, not if they simply receive it. At the same time, critics like Harvard Cyberlaw Clinic instructor Kendra Albert have raised concerns about the notifications — saying they could end up outing queer or transgender kids, for instance, by encouraging their parents to snoop on them.
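
      The Messages feature quoted above boils down to a small decision tree. A sketch of that logic, with every name hypothetical and the behavior taken only from the article text:

      ```python
      # Decision flow for the Messages feature, as described in the quote
      # above. Purely illustrative; all names are hypothetical.
      def image_actions(age: int, family_account: bool,
                        looks_explicit: bool, taps_through: bool) -> list[str]:
          if not family_account or age >= 18:
              return ["display"]             # feature applies to minors only
          if not looks_explicit:             # on-device check, nothing uploaded
              return ["display"]
          actions = ["blur", "warn"]
          if taps_through:
              actions.append("display")
              if age <= 12:
                  # Parent is notified but never sees the message itself.
                  actions.append("notify_parent")
          return actions

      # Example: an 11-year-old on a family account taps through the warning.
      print(image_actions(11, True, True, True))
      # -> ['blur', 'warn', 'display', 'notify_parent']
      ```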

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 3, Interesting) by sjames on Saturday August 14 2021, @09:53PM (3 children)

        by sjames (2882) on Saturday August 14 2021, @09:53PM (#1166953) Journal

        All based on hashes that detect existing and known objectionable material. As I said, they need to detect NEW material being made by child abusers.

        If they want to do that, they need to not burn resources looking at the 99.99% of people who want nothing to do with child abuse, and focus on where it comes from. They'll need warrants and shoe leather.

        Tattling on children looking at naughty legal porn won't help with that, but may be helpful to parents. Depending on the situation, it may contribute to harming the children as Kendra Albert pointed out.

        • (Score: 2) by takyon on Saturday August 14 2021, @10:34PM (2 children)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday August 14 2021, @10:34PM (#1166957) Journal

          All based on hashes that detect existing and known objectionable material.

          No, the feature that detects and blurs incoming nude pics works on new material. And even that feature has drawn objections.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by sjames on Saturday August 14 2021, @10:55PM (1 child)

            by sjames (2882) on Saturday August 14 2021, @10:55PM (#1166961) Journal

            I doubt very much the phone is computing that for itself unless it just blurs anything that could possibly be flesh colored.

            It is objectionable if the phone is burning most of its CPU power and at least half of its battery life to hang fig leaves on fine art photos.

            • (Score: 2) by takyon on Saturday August 14 2021, @11:07PM

              by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday August 14 2021, @11:07PM (#1166963) Journal

              All modern flagship phones now have dedicated "AI" chips, usually capable of trillions of operations per second (TOPS) at low power. The iPhone 8 introduced a "2-core Neural Engine" in 2017; that went to "8-core" with the iPhone XS in 2018, and "16-core" with the iPhone 12 in 2020. They use this for things like the camera.

              It doesn't matter much if the algorithm is overzealous, since you can just tap to ignore the warning and show the image. Parents will get a notification if they have turned that on, I assume with the received image.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 1, Interesting) by Anonymous Coward on Saturday August 14 2021, @10:38PM

    by Anonymous Coward on Saturday August 14 2021, @10:38PM (#1166960)

    This will also be a feature of Windows 11 (hence the requirement for TPM)

  • (Score: 5, Insightful) by darkfeline on Sunday August 15 2021, @03:42AM (1 child)

    by darkfeline (1030) on Sunday August 15 2021, @03:42AM (#1167071) Homepage

    It's interesting that the headline says "Child Protection Features" and not "Privacy Invading Features". Never forget that there's always a political agenda, no matter what you believe and where you get your news.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 0) by Anonymous Coward on Monday August 16 2021, @07:22PM

      by Anonymous Coward on Monday August 16 2021, @07:22PM (#1167604)

      That's what some people say about race and racism too.

      The most important thing in effective communication is to pick apart and read intent into every use of every word, and even more importantly, every word that WASN'T used. You can tell an awful lot about a person by analyzing all the words that you can think of that the author didn't use.
