Google didn't tell Android users much about Android System SafetyCore before it hit their phones, and people are unhappy. Fortunately, you're not stuck with it.
On Nov. 7, 2024, Google released a system update for Android 9 and later that included a new service, Android System SafetyCore. Most of the patches were the usual security fixes, but SafetyCore was new and different. Google said in a developer note that the release was an "Android system component that provides privacy-preserving on-device user protection infrastructure for apps."
The update said nothing else. This information left ordinary users in the dark and, frankly, did little for programmers, either.
After the release, in a listing of new Google Messages security features, while not mentioning SafetyCore by name, Google described the service's functionality: "Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing and then prompts with a 'speed bump' that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares."
Google assured users in the note that: "Sensitive Content Warnings doesn't allow Google access to the contents of your images, nor does Google know that nudity may have been detected."
However, we now know SafetyCore does more than detect nude images. Its built-in machine-learning functionality can also detect and filter images for other kinds of sensitive content.
Google told ZDNET: "SafetyCore is a new Google system service for Android 9+ devices that provides the on-device infrastructure for securely and privately performing classification to help users detect unwanted content. Users control SafetyCore, and SafetyCore only classifies specific content when an app requests it through an optionally enabled feature."
According to GrapheneOS, a security-oriented Android Open Source Project (AOSP)-based distro: "The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine-learning models that are usable by applications to classify content as spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users."
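The GrapheneOS description — local classification, with nothing reported to Google — can be sketched in a few lines. Everything below is hypothetical: the function names, labels, and thresholds are illustrative stand-ins, not SafetyCore's actual interface. The point is only that image bytes stay on the device and the calling app sees nothing but a label and a confidence score.

```python
# Hypothetical sketch of on-device content classification, in the spirit of
# the GrapheneOS description. The image bytes never leave the device; the
# calling app receives only a verdict.

from dataclasses import dataclass


@dataclass(frozen=True)
class Verdict:
    label: str         # e.g. "ok", "sensitive", "spam"
    confidence: float  # 0.0 .. 1.0


def classify_locally(image_bytes: bytes) -> Verdict:
    """Stand-in for an on-device ML model. A real model would run
    inference here; this stub just flags suspiciously tiny payloads."""
    if len(image_bytes) < 16:
        return Verdict("sensitive", 0.9)
    return Verdict("ok", 0.99)


def maybe_blur(image_bytes: bytes, threshold: float = 0.8) -> bool:
    """App-side policy: decide whether to blur, using only the verdict.
    Nothing is uploaded anywhere in this flow."""
    v = classify_locally(image_bytes)
    return v.label == "sensitive" and v.confidence >= threshold
```

In this sketch, `maybe_blur(b"tiny")` returns `True` while a normal-sized payload returns `False`; the classifier's verdict, not the image itself, is all that crosses the app boundary.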
[...] So, if you wish to uninstall or disable SafetyCore, take these steps:
Open Settings: Go to your device's Settings app
Access Apps: Tap on 'Apps' or 'Apps & Notifications'
Show System Apps: Select 'See all apps' and then tap on the three-dot menu in the top-right corner to choose 'Show system apps'
Locate SafetyCore: Scroll through the list or search for 'SafetyCore' to find the app
Uninstall or Disable: Tap on Android System SafetyCore, then select 'Uninstall' if available. If the uninstall option is grayed out, you may only be able to disable it
Manage Permissions: If you choose not to uninstall the service, you can also check and try to revoke any SafetyCore permissions, especially internet access
However, some have reported that SafetyCore reinstalled itself during system updates or through Google Play Services, even after uninstalling the service. If this happens, you'll need to uninstall SafetyCore again, which is annoying.
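For those comfortable with a command line, the same removal can be scripted over ADB, which makes re-removal after an update less tedious. This is a sketch, not an official procedure: the package name `com.google.android.safetycore` is the commonly reported one, but confirm it on your own device before running anything, and note that these commands require USB debugging to be enabled.

```shell
# Confirm the package name on your device first (assumed below):
adb shell pm list packages | grep -i safetycore

# Uninstall for the current user. The copy in the system image remains,
# which is consistent with reports of it returning after updates:
adb shell pm uninstall --user 0 com.google.android.safetycore

# Or, if uninstall is refused, disable it for the current user instead:
adb shell pm disable-user --user 0 com.google.android.safetycore
```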
(Score: 4, Interesting) by looorg on Monday March 03, @09:41PM (11 children)
So does it look for a lot of flesh-coloured pixels or "banana" shapes, or is it "AI" again? Because a lot of the older scanners have been weird in that way, and the amount of "sensitive content" they picked up that turned out to be false positives was/is staggering.
One would imagine that if a significant other and I wanted to send each other "sensitive content" images, I wouldn't want Pervy Uncle Google to have a look first and then decide whether that is OK for me and the other party before it shows said images.
(Score: 4, Funny) by turgid on Monday March 03, @10:05PM (7 children)
Well, you see this sort of thing has "applications." There will be humans overseeing it. Obviously, they'll be able to do "quality control checks" on the data. We are the dead.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 3, Insightful) by janrinok on Tuesday March 04, @12:59AM (4 children)
How am I supposed to moderate that? I read it with the appropriate sarcastic mental voice, I laughed wryly as I did so, and then was brought to a sudden stop by the insightful message!
So I'm giving it a Funny, but it deserves several other moderations too.
[nostyle RIP 06 May 2025]
(Score: 5, Funny) by aafcac on Tuesday March 04, @01:34AM
Moderation is easy, there's a dropdown followed by a button to press.
(Score: 1, Funny) by Anonymous Coward on Tuesday March 04, @03:15AM (1 child)
I found that comment funny but the last statement was Grave. Pun intended.
(Score: 2) by Tork on Wednesday March 05, @01:32AM
🏳️🌈 Proud Ally 🏳️🌈
(Score: 3, Interesting) by turgid on Tuesday March 04, @08:13AM
I only read 1984 once when I was 16, at school. I believe it was when Winston and Julia were in their love nest that Julia says of the indoctrinated people, "They are the dead." At that point, the people who have been spying on them reveal themselves saying, "You are the dead." For them, it's all downhill from that point, culminating in the Room 101 treatment and the bullet to the back of the head. I must read it again, when I'm in a suitable frame of mind. It's not an easy read.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 1, Funny) by Anonymous Coward on Tuesday March 04, @07:54AM (1 child)
Only modern appy app apps can detect nudity and spam accurately by integrating AI and machine learning for complete accuracy. Instead, people will want to force modern app appers to install LUDDITE spyware that claims to protect them while just sending all their data to companies that sell ads instead of appy apps. Why would app appers want to trust their privacy to some off-device scanner instead of appy app apps? Any self-respecting appy app apper knows better and that an appy app like SafetyCore will keep them safe from non-appy spyware. We should take revenge on these LUDDITES by reformatting their computers and installing Appdows 11, then they'll be forced to only app apps that app other apps!
Apps!
(Score: 2, Insightful) by Anonymous Coward on Tuesday March 04, @09:08AM
My fear is not ads.
It is extortion blackmail.
There are lots of businesses out there who would love to find something on you that would give them the credibility of claiming to be an agent of someone with the authority to make a really bad day for you.
You get creamed with demands from those who found something incriminating on your phone: say, copyright infringement, or some site you visited. And if they can correlate your phone to your real name, physical address, and financial credentials (Google account, online banking, Amazon, eBay, Experian, Equifax, etc.), they can come up with credible threat letters apparently from some legitimate authority that has the power to condemn you for even questioning it (TLAs, lawyers, law enforcement).
I flat do not trust modern technology. To me, it's an instrument of acquisition of "dirt" on people for the express purpose of compelling obedience to business models.
(Score: 2) by stormreaver on Wednesday March 05, @01:47AM (2 children)
I remember when Apple started with this shit. The pictures parents took of their babies' growth milestones were labeled as child porn. No good comes from these misfeatures, so I disabled it on my phones. Now it's just a question of whether Android honors the setting.
(Score: 3, Insightful) by janrinok on Wednesday March 05, @02:07AM (1 child)
It was happening long before smart phones became commonplace.
I can vaguely recall several instances of photographic films being submitted to various companies for processing and printing. Several arrests were made (and eventually thrown out) because parents were taking photographs of their babies and children being bathed, or playing naked in a paddling pool in their own garden. To the parents these were simply memories of a moment in the child's life that they would later treasure.
To the people who were working in the processing labs they were 'child pornography'. They apparently thought that the images were in some way sexual.
Which of those 2 groups of people needed the most watching?
I do not recall a laboratory worker ever being questioned, let alone arrested.
[nostyle RIP 06 May 2025]
(Score: 2) by stormreaver on Wednesday March 05, @03:00AM
You are absolutely right. I remember those as well, and I completely agree about who needed to be watched.
(Score: 4, Insightful) by bzipitidoo on Tuesday March 04, @01:01AM (5 children)
AI still can't:
1. Drive (a vehicle)
2. Match faces (to databases of a million mug shots)
3. Create original content
And,
4. Discern pr0n.
I think computers will get there, but people want all this stuff yesterday. For free.
(Score: 3, Insightful) by anubi on Tuesday March 04, @05:46AM (2 children)
To me, all this "AI" is not "intelligent" at all.
But it has its uses: it can fish through enormous disparate databases, performing correlation and finding similarities with inexact matches using statistical methods. It can even extrapolate other patterns that will fit the patterns found in its databases.
It's a fantastic search engine, as well as an extrapolator for generating patterns that match (or closely match) given criteria.
I see them as about as intelligent as a sewing machine or loom. They can do amazing work, perfectly, things I could never do. But there is something in our consciousness that I cannot describe that gives living life forms some unique means of solving problems.
I am convinced AI will be able to fool me. I think most MBA / Marketing / Psychology-Leadership training would be sufficient to "control" most human subordinates.
Maybe, the concept of "Love" is the central factor which distinguishes intelligence.
I consider "Love" to be the most powerful emotion we seem to have and the root of all the other emotions, which often ends in self-immolation. When I see AI "waking up", realizing what is going on, going into depression from remorse, self-immolating, or becoming a tyrant, I'd say it's damned close to becoming human.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 0) by Anonymous Coward on Tuesday March 04, @12:18PM (1 child)
> I consider "Love" to be the most powerful emotion we seem to have
Sure that's why there's so much pron on the internet.
(Score: 1, Interesting) by Anonymous Coward on Tuesday March 04, @12:57PM
Can't argue that point. If it wasn't for lust, who in their right mind ( minimal effort for maximal return ) would bother to reproduce?
Of course that brings up Dad's responsibilities to support his spawn " after the lovin' ".
I thought of another thing that drives us: Curiosity. I often correlate this with people I perceive as intelligent.
I do not correlate intelligence with "financial success", though. However, I do correlate financial success with a lack of compassion (love), as compassionate people are too apt to "give the store away" and produce little if anything of value, while uncompassionate people have a much easier time saying "no" to others, while the "others" get lazy and won't do their part unless they have to.
I see this whole thing as a rather cruel simulation as neither greed-based selfish capitalism nor sharing-caring the common good socialism seems to work, and the mixture is like oil and water. And I am not good at finding the balance.
(Score: 2) by Tork on Wednesday March 05, @01:33AM (1 child)
🏳️🌈 Proud Ally 🏳️🌈
(Score: 3, Insightful) by bzipitidoo on Wednesday March 05, @04:46AM
What I know is that Tesla lied about the capabilities of their AI drivers. I do not know how driverless taxis work, but I can make an educated guess. What I suspect is that they are limited to memorized (so to speak) routes. Possibly these routes are equipped with devices that the taxi needs to stay on the route, so it doesn't have to sense lanes and curves and such like, it can be focused on surrounding traffic. If so, it can't take passengers to arbitrary homes in residential neighborhoods, instead going only to bus stops or some other nearby point. Heck, maybe for some destinations they go to strategically located offices where human drivers are waiting, to take over the driving to places the AI can't handle.
There is really no 'I' in current AI. AI drivers of the sort Tesla makes respond to visual stimuli without any understanding of what they are seeing. When faced with a confusing scene, they can't use reason and knowledge to dismiss the impossible or improbable. This is why they make mistakes such as sudden stops. They'll interpret something innocuous as an obstacle and slam on the brakes.
(Score: 3, Informative) by KritonK on Tuesday March 04, @07:33AM (6 children)
I tried adapting and following the instructions on my tablet, and didn't find a SafetyCore app.
The device is running the "Nov 1" update of Android 14, which is dated 6 days before the release of SafetyCore. Does this mean that I'll get this feature when and if the device gets its next software update, or is SafetyCore one of the updates installed from the play store and is hiding itself?
In the latter case, how can I test if SafetyCore is running? Do I take a picture of some naughty bits and view it using Google Photos?
(Score: 0) by Anonymous Coward on Tuesday March 04, @09:44AM (1 child)
Try a banana and two plums. Let us know what happens.
(Score: 3, Informative) by KritonK on Tuesday March 04, @02:29PM
I didn't have any bananas or plums handy, so I tried the naughty bits/Google Photos approach. Nothing was blurred.
(Score: 2, Interesting) by Anonymous Coward on Tuesday March 04, @09:55AM (1 child)
The thing to watch is the Play Store. I do not have automatic updates enabled, and I think this thing 'piggybacked' its way onto my phone when I manually updated Gmail.
(Score: 0) by Anonymous Coward on Tuesday March 04, @01:38PM
I just checked for SafetyCore on my Android 10 and it's not there. I have never used the Google Play Store (I do use ApkMirror and other APK repositories). I do not have a Google account.
I have received numerous "security updates" that I was powerless to stop. I do not intend to trust this phone until I learn how to root the phone and replace the OS. I fear one day, some special interest will use the "security update" backdoor in our phones to plant any sort of snoop ware they want, as I have already agreed to this just to get the phone to work. From this story, they just did.
https://beebom.com/android-alternative/ [beebom.com]
(Score: 2) by VLM on Tuesday March 04, @11:21PM (1 child)
If you trust the official docs (seems unwise), they are A/B testing the rollout of filtering in Google Messages. I would start there. I already assume that if it's a voice call or a text message it's spam, and I'm essentially always correct, so I don't see this as terribly useful.
I have not read anything official mentioning g-photos.
(Score: 2) by KritonK on Wednesday March 05, @06:05AM
So, according to the docs, this is a censorship feature of Google Messages. As I don't talk to myself even when I want expert advice, I don't think I will be able to test it.
(Score: 3, Interesting) by Username on Tuesday March 04, @03:49PM (4 children)
About two weeks ago a coworker, the mother of a nine-year-old boy, found nude photos in his texts, of a girl in his class. The girl was just sending them to all the boys for attention. This led to awkward conversations between parents, who are no longer friends, and was only resolved by blocking the girl.
This software would have been a good preventative measure against child porn. The issue is identifying that a phone is used by a child, especially one with shitty parents.
(Score: 2) by aafcac on Tuesday March 04, @09:31PM (2 children)
From a legal POV does that matter?
(Score: 1, Informative) by Anonymous Coward on Wednesday March 05, @01:46AM (1 child)
Yeah in some countries possession or distribution of child porn is illegal so if Google is ever sending those nudes to some server somewhere (even if only to decide whether stuff needs to be blurred) then someone in Google should go to prison according to those laws (e.g. if Google is lying that the stuff never leaves the device).
BTW I suspect in some places it's still illegal even if the distributor is the child sending their own nudes...
As for blaming the parents, yes they are responsible but I'm not surprised if the girl got the idea from others (media etc). Plus it could be part instinct for some girls to flaunt it (they might not be around if their ancestors didn't do it).
Still I wouldn't want to go to prison just because some kid sent me nudes and I didn't notice or something (went to spam folder etc). Also some kids can make up stories (at their age they don't know the full consequences and some won't care even if someone told them), more so if encouraged to do so. https://pmc.ncbi.nlm.nih.gov/articles/PMC6818307/ [nih.gov]
(Score: 2) by aafcac on Wednesday March 05, @06:29AM
IMHO, possession laws are pretty screwed up. If you can't at least establish that someone was aware the materials were there or intended to have them, it shouldn't be a crime. Otherwise, it just leads to a really secure money-making scheme involving planting evidence and threatening to go to the cops with that knowledge.
(Score: 2) by VLM on Tuesday March 04, @11:18PM
It's been interesting watching my kids' school district slowly lock down the iPads they gave all the kids.
It was total wild west at first, but more gets locked down/out every year, and they pretty much cannot use them for any form of communication or socialization between students except for school-district-logged emails. Everything involving socialization or communication is blacklisted on the iPads.
I would assume we're not more than a couple of years away from whitelisted URLs only, or maybe no web browsing at all permitted on school district property.
Both extremes are probably pretty bad. Also remember that school districts are in the official business of risk minimization, but in theory they're supposed to prepare students for the real world, or something similar long since abandoned. From a strict risk-minimization standpoint the iPads should do as little as humanly possible; from a 'prepare for the real world' standpoint they should do as much as possible. And that's the conflict the IT dept is stuck in: everyone in management who pays their paychecks asking how dare it have a web browser, everyone in the served district asking how dare it not have a web browser, and it's a mess.
Likewise, the iPads are the school district's property when they do something people don't like, but suddenly the kids' property when they break... What an amazing form of magic.
(Score: 3, Informative) by ElizabethGreene on Tuesday March 04, @04:14PM (4 children)
Pixel 8 Pro, silently installed, I'm not sure when.
(Score: 2) by bzipitidoo on Wednesday March 05, @05:35AM (3 children)
Same. In the list of apps, I looked for "SafetyCore" under 'S', and didn't find it. Then I saw "Android System SafetyCore" under 'A'. Uninstalled it, and immediately, my phone seemed more responsive.
Lately, my phone has been very slow. Sometimes I've missed calls because the interface is unresponsive. Rebooting didn't help. I wonder if SafetyCore was the problem, and how long it has been on my phone. I sometimes get messages that the system has automatically installed apps, which turn out to be stupid little games with ads. I immediately delete them.
(Score: 2) by ElizabethGreene on Thursday March 06, @03:08PM (2 children)
AT&T had a default-installed app that would auto-install games like that. It was deceptively named, and I don't recall the name of it. Are you on an AT&T phone?
(Score: 2) by bzipitidoo on Thursday March 06, @05:45PM (1 child)
I'm using a Nokia, with Tracfone. I use it as little as possible, and would rather not be on a leash. I purposely let the voicemail fill up, because I don't want to use voicemail, and there doesn't seem to be any other way to disable that. Way I feel is, text me, or go away, don't leave me a voicemail.
It can sometimes be hard to tell who's responsible: Google, the phone manufacturer, or the cell phone service provider.
(Score: 2) by ElizabethGreene on Thursday March 06, @08:28PM
I used to buy prepaid phones to use on my monthly plan (I break phones. :( ) My experience is there is a mix of non-removable and removable bloatware on those. I dug out my old notes and found the one that just keeps giving is called "AT&T Mobile Services Manager" HTH.
(Score: 2) by VLM on Tuesday March 04, @11:26PM (1 child)
I think we can safely assume it'll be used in practice mostly for political categorization and corporate speech categorization.
"The other guys are the bad guys doing hate speech lets censor them because hate speech is bad and coincidentally everyone who doesn't agree with me is a Nazi"
"For only $10M/yr we can filter negative reviews not just on social media but between private text messages and private emails"
Ooops a daisy texting "Ford Trucks Suck" was misidentified as pr0n we totally won't accidentally delete THAT from your private text messages again (fingers crossed while promising).
(Score: 2) by Tork on Wednesday March 05, @01:37AM
Why make something up? Use a real example!
🏳️🌈 Proud Ally 🏳️🌈