Google didn't tell Android users much about Android System SafetyCore before it hit their phones, and people are unhappy. Fortunately, you're not stuck with it.
On Nov. 7, 2024, Google released a system update for Android 9 and later that included a new service, Android System SafetyCore. Most of these patches were the usual security fixes, but SafetyCore was new and different. Google said in a developer note that the release was an "Android system component that provides privacy-preserving on-device user protection infrastructure for apps."
That was all the update said. The description left ordinary users in the dark and, frankly, did little for programmers, either.
After the release, Google described the service's functionality in a listing of new Google Messages security features, without mentioning SafetyCore by name: "Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing and then prompts with a 'speed bump' that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares."
Google assured users in the note: "Sensitive Content Warnings doesn't allow Google access to the contents of your images, nor does Google know that nudity may have been detected."
However, we now know SafetyCore does more than detect nude images. Its built-in machine-learning functionality can also detect and filter images for other kinds of sensitive content.
Google told ZDNET: "SafetyCore is a new Google system service for Android 9+ devices that provides the on-device infrastructure for securely and privately performing classification to help users detect unwanted content. Users control SafetyCore, and SafetyCore only classifies specific content when an app requests it through an optionally enabled feature."
According to GrapheneOS, a security-oriented Android Open Source Project (AOSP)-based distro: "The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine-learning models that are usable by applications to classify content as spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users."
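SafetyCore's client-side API is not publicly documented, so the following Kotlin sketch is only an illustration of the pattern GrapheneOS describes: an app hands content to a local classifier, nothing leaves the device, and the app alone decides how to react (for example, by blurring an image). Every name in it (ContentClassifier, Classification, SensitiveContentGate) is hypothetical and is not part of SafetyCore.

```kotlin
// Hypothetical sketch of on-device content classification. None of these types
// are SafetyCore APIs; they only illustrate the local-classification pattern.
import android.graphics.Bitmap

// Result of running a local ML model over one piece of content.
data class Classification(val label: String, val confidence: Float)

// An on-device classifier: content goes in, a label comes out, and no network
// call appears anywhere in the interface.
interface ContentClassifier {
    fun classify(image: Bitmap): Classification
}

// How a messaging app could use such a classifier to decide, locally, whether
// to blur an incoming image -- analogous to the Sensitive Content Warnings
// behavior quoted earlier.
class SensitiveContentGate(
    private val classifier: ContentClassifier,
    private val threshold: Float = 0.8f  // arbitrary example cutoff
) {
    fun shouldBlur(image: Bitmap): Boolean {
        val result = classifier.classify(image) // runs entirely on this device
        return result.label == "nudity" && result.confidence >= threshold
    }
}
```

The point of the pattern is that the image itself never leaves the app's process; the only thing that comes out of the classifier is the app's own UI decision.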
[...] So, if you wish to uninstall or disable SafetyCore, take these steps:
Open Settings: Go to your device's Settings app
Access Apps: Tap on 'Apps' or 'Apps & Notifications'
Show System Apps: Select 'See all apps' and then tap on the three-dot menu in the top-right corner to choose 'Show system apps'
Locate SafetyCore: Scroll through the list or search for 'SafetyCore' to find the app
Uninstall or Disable: Tap on Android System SafetyCore, then select 'Uninstall' if available. If the uninstall option is grayed out, you may only be able to disable it
Manage Permissions: If you choose not to uninstall the service, you can also check and try to revoke any SafetyCore permissions, especially internet access
However, some users have reported that SafetyCore reinstalls itself during system updates or through Google Play Services, even after being uninstalled. If this happens, you'll need to uninstall SafetyCore again, which is annoying.
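If you want to confirm whether SafetyCore has quietly returned after an update, the standard Android PackageManager APIs can report whether the package is installed and enabled. This is a minimal sketch for anyone comfortable putting it in a small personal app; the package name is the one commonly reported for SafetyCore and is treated here as an assumption, so verify it against the app's details screen on your own device. Everyone else can simply repeat the Settings steps above.

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Commonly reported package name for Android System SafetyCore; treat this as
// an assumption and confirm it in Settings > Apps on your own device.
const val SAFETYCORE_PACKAGE = "com.google.android.safetycore"

// True if the package is present at all, whether enabled or disabled.
fun isSafetyCoreInstalled(context: Context): Boolean =
    try {
        context.packageManager.getPackageInfo(SAFETYCORE_PACKAGE, 0)
        true
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }

// True if the package is present and not currently disabled.
fun isSafetyCoreEnabled(context: Context): Boolean =
    try {
        context.packageManager.getApplicationInfo(SAFETYCORE_PACKAGE, 0).enabled
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }
```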
(Score: 3, Interesting) by Username on Tuesday March 04, @03:49PM (4 children)
About two weeks ago a coworker, the mother of a nine-year-old boy, found nude photos of a girl in his class in his texts. The girl was just sending them to all the boys for attention. This led to awkward conversations between the parents, who are no longer friends, and it was only resolved by blocking the girl.
This software would have been a good preventive measure against child porn. The issue is identifying that a phone is used by a child, especially one with shitty parents.
(Score: 2) by aafcac on Tuesday March 04, @09:31PM (2 children)
From a legal POV does that matter?
(Score: 1, Informative) by Anonymous Coward on Wednesday March 05, @01:46AM (1 child)
Yeah, in some countries possession or distribution of child porn is illegal, so if Google is ever sending those nudes to some server somewhere (even if only to decide whether stuff needs to be blurred), then someone at Google should go to prison according to those laws (e.g., if Google is lying that the stuff never leaves the device).
BTW I suspect in some places it's still illegal even if the distributor is the child sending their own nudes...
As for blaming the parents, yes, they are responsible, but I'm not surprised if the girl got the idea from others (media, etc.). Plus it could be partly instinct for some girls to flaunt it (they might not be around if their ancestors hadn't done it).
Still, I wouldn't want to go to prison just because some kid sent me nudes and I didn't notice or something (it went to a spam folder, etc.). Also, some kids can make up stories (at their age they don't know the full consequences, and some won't care even if someone told them), more so if encouraged to do so. https://pmc.ncbi.nlm.nih.gov/articles/PMC6818307/ [nih.gov]
(Score: 2) by aafcac on Wednesday March 05, @06:29AM
IMHO, possession laws are pretty screwed up. If you can't at least establish that they were aware the materials were there or intended to have them, it shouldn't be a crime. Otherwise, it just leads to a really secure money-making scheme involving planting evidence and threatening to go to the cops with that knowledge.
(Score: 2) by VLM on Tuesday March 04, @11:18PM
It's been interesting watching my kids' school district slowly lock down the iPads they gave all the kids.
It was the total Wild West at first, but more gets locked down or locked out every year, and the kids pretty much cannot use them for any form of communication or socialization between students except for school-district-logged email. Everything involving socialization or communication is blacklisted on the iPads.
I would assume that we're not more than a couple of years away from whitelisted URLs only, or maybe no web browsing at all, being permitted on school district property.
Both extremes are probably pretty bad. Also remember that school districts are in the official business of risk minimization, but in theory they're supposed to prepare students for the real world, or some similar goal long since abandoned. From a strict risk-minimization standpoint the iPads should do as little as humanly possible; from a 'prepare for the real world' standpoint they should do as much as possible. And that's the conflict the IT dept is stuck in: everyone in management who pays their paychecks asking how dare it have a web browser, everyone in the served district asking how dare it not have a web browser, and it's a mess.
Likewise, the iPads are the school district's property when they do something people don't like, but they're suddenly the kids' property when they break... What an amazing form of magic.