
posted by janrinok on Monday March 03, @09:17PM

Google didn't tell Android users much about Android System SafetyCore before it hit their phones, and people are unhappy. Fortunately, you're not stuck with it.

On Nov. 7, 2024, Google released a system update for Android 9 and later, which included a new service, Android System SafetyCore. Most of these patches were the usual security fixes, but SafetyCore was new and different. Google said in a developer note that the release was an "Android system component that provides privacy-preserving on-device user protection infrastructure for apps."

The update said nothing else, which left ordinary users in the dark and, frankly, did little for programmers either.

After the release, Google described the service's functionality in a listing of new Google Messages security features, without mentioning SafetyCore by name: "Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing and then prompts with a 'speed bump' that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares."

Google assured users in the note: "Sensitive Content Warnings doesn't allow Google access to the contents of your images, nor does Google know that nudity may have been detected."

However, we now know SafetyCore does more than detect nude images. Its built-in machine-learning functionality can also target, detect, and filter images for sensitive content.

Google told ZDNET: "SafetyCore is a new Google system service for Android 9+ devices that provides the on-device infrastructure for securely and privately performing classification to help users detect unwanted content. Users control SafetyCore, and SafetyCore only classifies specific content when an app requests it through an optionally enabled feature."

According to GrapheneOS, a security-oriented Android Open Source Project (AOSP)-based distro: "The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine-learning models that are usable by applications to classify content as spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users."
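
GrapheneOS's description reduces to a simple pattern: the model runs on the device, only a verdict leaves the classifier, and the calling app decides what to show. SafetyCore's actual API is not public, so the following is only a minimal Python sketch of that pattern, with stand-in names and a placeholder heuristic throughout:

    from dataclasses import dataclass

    @dataclass
    class Verdict:
        label: str        # e.g. "nudity", "spam", "ok" -- labels are stand-ins
        confidence: float

    def classify_locally(image_bytes: bytes) -> Verdict:
        # Stand-in for an on-device ML model. The point of the pattern is
        # that the image bytes never leave this function, let alone the
        # device -- no network call, no report to anyone.
        suspicious = len(image_bytes) > 1_000_000  # placeholder heuristic
        return Verdict("nudity" if suspicious else "ok", 0.9)

    def should_blur(image_bytes: bytes) -> bool:
        """True if the app should hide the image behind a 'speed bump'."""
        verdict = classify_locally(image_bytes)
        return verdict.label == "nudity" and verdict.confidence >= 0.8

    if __name__ == "__main__":
        print(should_blur(b"\x00" * 2_000_000))  # True: would be blurred

The privacy argument hinges on that boundary: the classifier returns a label, not the content, and what happens next is up to the app.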

[...] So, if you wish to uninstall or disable SafetyCore, take these steps (a scripted alternative follows the list):

  • Open Settings: Go to your device's Settings app

  • Access Apps: Tap on 'Apps' or 'Apps & Notifications'

  • Show System Apps: Select 'See all apps' and then tap on the three-dot menu in the top-right corner to choose 'Show system apps'

  • Locate SafetyCore: Scroll through the list or search for 'SafetyCore' to find the app

  • Uninstall or Disable: Tap on Android System SafetyCore, then select 'Uninstall' if available. If the uninstall option is grayed out, you may only be able to disable it

  • Manage Permissions: If you choose not to uninstall the service, you can also check and try to revoke any SafetyCore permissions, especially internet access
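
If you would rather script the removal from a PC, the same steps can be driven over adb, with developer options and USB debugging enabled on the phone. A sketch, assuming the widely reported package name com.google.android.safetycore; verify it on your own device first with adb shell pm list packages safetycore:

    import subprocess

    # Widely reported package name for Android System SafetyCore; confirm it
    # on your own device before acting.
    PKG = "com.google.android.safetycore"

    def adb_shell(*args: str) -> str:
        result = subprocess.run(["adb", "shell", *args],
                                capture_output=True, text=True)
        return result.stdout.strip()

    def remove_safetycore() -> None:
        if PKG not in adb_shell("pm", "list", "packages", PKG):
            print("SafetyCore not installed; nothing to do.")
            return
        # Uninstall for the current user only (no root required). If the
        # uninstall is blocked, fall back to disabling the package, which
        # mirrors the grayed-out 'Uninstall' case in Settings.
        out = adb_shell("pm", "uninstall", "--user", "0", PKG)
        if "Success" not in out:
            out = adb_shell("pm", "disable-user", "--user", "0", PKG)
        print(out)

    if __name__ == "__main__":
        remove_safetycore()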

However, some users have reported that SafetyCore reinstalls itself during system updates or through Google Play Services, even after they had removed it. If this happens, you'll need to uninstall SafetyCore again, which is annoying.
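
Because the reinstall reportedly happens silently during updates, the practical countermeasure is to re-check periodically. A short follow-up to the sketch above, reusing its remove_safetycore() helper while the phone is connected over adb:

    import time

    def watch_for_reinstall(interval_s: int = 24 * 60 * 60) -> None:
        # Re-run the removal once a day; remove_safetycore() is a no-op
        # while the package is still absent.
        while True:
            remove_safetycore()
            time.sleep(interval_s)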


Original Submission

 
  • (Score: 4, Interesting) by looorg on Monday March 03, @09:41PM (11 children)

    by looorg (578) on Monday March 03, @09:41PM (#1395140)

So does it look for a lot of flesh-coloured pixels or "banana" shapes, or is it "AI" again? Because a lot of the older and other scanners have been weird in that way, and the amount of "sensitive content" they picked up that turned out to be false positives was, and is, staggering.

One would imagine that if I or a significant other wanted to send each other "sensitive content" images, I wouldn't want Pervy Uncle Google to have a look at them first and decide whether that is OK for me and the other party before it shows said images.

  • (Score: 4, Funny) by turgid on Monday March 03, @10:05PM (7 children)

    by turgid (4318) Subscriber Badge on Monday March 03, @10:05PM (#1395141) Journal

    Well, you see this sort of thing has "applications." There will be humans overseeing it. Obviously, they'll be able to do "quality control checks" on the data. We are the dead.

    • (Score: 3, Insightful) by janrinok on Tuesday March 04, @12:59AM (4 children)

      by janrinok (52) Subscriber Badge on Tuesday March 04, @12:59AM (#1395145) Journal

      How am I supposed to moderate that? I read it with the appropriate sarcastic mental voice, I laughed wryly as I did so, and then was brought to a sudden stop by the insightful message!

      So I'm giving it a Funny, but it deserves several other moderations too.

      --
      [nostyle RIP 06 May 2025]
      • (Score: 5, Funny) by aafcac on Tuesday March 04, @01:34AM

        by aafcac (17646) on Tuesday March 04, @01:34AM (#1395148)

        Moderation is easy, there's a dropdown followed by a button to press.

      • (Score: 1, Funny) by Anonymous Coward on Tuesday March 04, @03:15AM (1 child)

        by Anonymous Coward on Tuesday March 04, @03:15AM (#1395159)

        I found that comment funny but the last statement was Grave. Pun intended.

        • (Score: 2) by Tork on Wednesday March 05, @01:32AM

          by Tork (3914) Subscriber Badge on Wednesday March 05, @01:32AM (#1395281) Journal
          Thanks for the referral, I'm dying to see this post!!
          --
          🏳️‍🌈 Proud Ally 🏳️‍🌈
      • (Score: 3, Interesting) by turgid on Tuesday March 04, @08:13AM

        by turgid (4318) Subscriber Badge on Tuesday March 04, @08:13AM (#1395177) Journal

        I only read 1984 once when I was 16, at school. I believe it was when Winston and Julia were in their love nest that Julia says of the indoctrinated people, "They are the dead." At that point, the people who have been spying on them reveal themselves saying, "You are the dead." For them, it's all downhill from that point, culminating in the Room 101 treatment and the bullet to the back of the head. I must read it again, when I'm in a suitable frame of mind. It's not an easy read.

    • (Score: 1, Funny) by Anonymous Coward on Tuesday March 04, @07:54AM (1 child)

      by Anonymous Coward on Tuesday March 04, @07:54AM (#1395175)

      Only modern appy app apps can detect nudity and spam accurately by integrating AI and machine learning for complete accuracy. Instead, people will want to force modern app appers to install LUDDITE spyware that claims to protect them while just sending all their data to companies that sell ads instead of appy apps. Why would app appers want to trust their privacy to some off-device scanner instead of appy app apps? Any self-respecting appy app apper knows better and that an appy app like SafetyCore will keep them safe from non-appy spyware. We should take revenge on these LUDDITES by reformatting their computers and installing Appdows 11, then they'll be forced to only app apps that app other apps!

      Apps!

      • (Score: 2, Insightful) by Anonymous Coward on Tuesday March 04, @09:08AM

        by Anonymous Coward on Tuesday March 04, @09:08AM (#1395183)

        My fear is not ads.

        It is extortion and blackmail.

        There are lots of businesses out there that would love to find something on you which would lend credibility to their claim of being an agent of someone with the authority to give you a really bad day.

        You get creamed with demands from those who found something incriminating on your phone, say copyright infringement or some site you visited. If they can correlate your phone to your real name, physical address, and financial credentials (Google account, online banking, Amazon, eBay, Experian, Equifax, etc.), they can come up with credible threat letters apparently from some legitimate authority with the power to condemn you for even questioning it (a TLA, lawyers, law enforcement).

        I flat out do not trust modern technology. To me, it's an instrument for acquiring "dirt" on people for the express purpose of compelling obedience to business models.

  • (Score: 2) by stormreaver on Wednesday March 05, @01:47AM (2 children)

    by stormreaver (5101) on Wednesday March 05, @01:47AM (#1395287)

    I remember when Apple started with this shit. The pictures parents took of their babies' growth milestones were labeled as child porn. No good comes from these misfeatures, so I disabled it on my phones. Now it's just a question of whether Android honors the setting.

    • (Score: 3, Insightful) by janrinok on Wednesday March 05, @02:07AM (1 child)

      by janrinok (52) Subscriber Badge on Wednesday March 05, @02:07AM (#1395290) Journal

      It was happening long before smartphones became commonplace.

      I can vaguely recall several instances of photographic film being submitted to various companies for processing and printing, and of arrests being made (and eventually thrown out) because parents had photographed their babies and children being bathed, or playing naked in a paddling pool in their own garden. To the parents, these were simply memories of a moment in the child's life that they would later treasure.

      To the people working in the processing labs, they were 'child pornography'. They apparently thought that the images were in some way sexual.

      Which of those two groups of people needed the most watching?

      I do not recall a laboratory worker ever being questioned, let alone arrested.

      --
      [nostyle RIP 06 May 2025]
      • (Score: 2) by stormreaver on Wednesday March 05, @03:00AM

        by stormreaver (5101) on Wednesday March 05, @03:00AM (#1395298)

        You are absolutely right. I remember those as well, and I completely agree about who needed to be watched.