You remember #Apple scanning all images on your #mobile device?

If you have an #Android #phone, a new app that doesn't appear in your app menu has been (or soon will be) automatically and silently installed by #Google. It is called #AndroidSystemSafetyCore, and it does exactly the same thing: it scans all images on your device as well as all incoming ones (via messaging). The new spin is that it does so "to protect your #privacy".

You can safely uninstall this app via Settings -> Apps.

developers.google.com/android/…

This post was edited. (3 months ago)
In reply to jack

You remember Apple succumbing to public pressure and quietly never implementing it in the first place?

Edit: I’m gonna make a big addendum to this comment. While Apple *did* scrap plans for CSAM detection due to public pressure, they did implement an on-device mechanism for blurring “sensitive content” — a feature much like the one on Android in the original post here, although I’ve yet to find much transparency on the Android app’s workings. This on-device functionality works like face detection, so if you’re okay with that, there’s little reason to be alarmed.

This post was edited. (3 months ago)
In reply to Cat Herder 💔 🗽 🇺🇲

@CatHerder @DelilahTech @blitzen

for anyone, not just people i am replying to:

what i did was go to the Google Play store and search for Android SafetyCore

when it found the app in the store, i was given the "Uninstall" option

that was how i located and removed it (on-device search did not help)
i'm sure there are other ways! but this was easiest for me and maybe others too


In reply to Court Cantrell prefers not to

@courtcan @CatHerder @DelilahTech @blitzen

this happened to my friend - no sign of it in the search or in the app store, as of about 3pm CST today. (i did my search and removal several hours ago..)

i have no clue what's up with this.

anyway, i hope people are able to find it and remove it

In reply to Court Cantrell prefers not to

@courtcan @paelse @rustoleumlove @DelilahTech @blitzen

I'm not techy to the point of knowing or noticing those things. When someone posts about these creepy tactics, I take their advice!
So few people protect their privacy these days, but it's a big deal to me; I appreciate the people looking out for it.

In reply to Al

@mral It seems that #Murena and thus #Fairphone is available in the USA:

murena.com/america/products/sm…

I can't say that much about Murena and their (degoogled) OS, but I've been a happy Fairphone user for almost a decade now (both #degoogled and stock).

#CalyxOS is a good OS choice, check out their supported devices:

calyxos.org/docs/guide/device-…

#GrapheneOS looks pretty good, too - but it only runs on Google's Pixel (which, admittedly, is one of the most open phones around).

grapheneos.org/


In reply to Flexi Bell

@flexi The vanilla, standard Android version by Google. Pixel comes with it, as does Fairphone. Some others, like Asus or Motorola, too, IIRC.

androidauthority.com/what-is-s…

Deacon Jericho shared this.

In reply to Flexi Bell

You don't need to root: #Shizuku plus f-droid.org/packages/org.samo_…

Takes 5 minutes to set up after downloading relevant apps. Delete anything tagged "recommended" for a start.

This post was edited. (3 months ago)

Flexi Bell shared this.

In reply to Deborah Hartmann Preuss, pcc

@deborahh @mayintoronto Likewise.
BTW: This guide explains more about it, and where to disable "Android System Intelligence":
androidauthority.com/android-s…
#Android

jack

@tritol128 The documentation has already been mentioned twice in this thread. But here's a link to a somewhat less technical summary:

androidauthority.com/google-me…

In reply to jack

Hi @jack, not loving #Google at all (see my bio) but to be fair, there seems to be a substantial difference between this and what #Apple wanted to do: Google says none of the scanning results will ever leave your device, even positive ones. No law enforcement notification. No server-side collection. It seems to basically be an advanced local #spam filter for #RCS text messages.
In reply to Jan Penfrat

@ilumium I decompiled the apk to have a quick look and the app does have internet access.

It includes a binary library called "libtartarus", which seems to be an ML component (there are references to TensorFlow).

It also includes telemetry (OpenCensus), so *some* data is transmitted to Google. There is also some logging to Google in the code, but I can't say whether it's enabled.

The code also links the app to your Google Account…

I'm not an Android specialist so I can't tell what is sent exactly.

In reply to Jan Penfrat

@ilumium Apple’s proposal was two parts:

1. Scan images for CSAM as they are encrypted for upload to iCloud. If CSAM is detected, also send Apple a fragment of the encryption key.

At the time, photos at rest in iCloud (and other cloud services) were stored in the clear, and the NSA and various companies’ support departments had been caught saving and distributing nudes non-consensually. The plan was intrinsically tied to end-to-end encryption, so it addressed a real privacy risk. If you didn’t send images to iCloud, the CSAM scanning would never run. It was scrapped because tech media spread ridiculous misinformation about it, creating a PR disaster.

2. Scan images in incoming messages in the Messages application on minors’ phones for nudity. If detected, obscure the image. If the minor taps to unobscure the image, notify the associated parent account.

The parent notification was dropped, and the rest of the feature was implemented in iOS 17, in late 2023. Any account can enable it, and the service is available to other messaging applications (so WhatsApp or a Mastodon client could be written to use it if the user has enabled it). This is exactly what Google has implemented.

support.apple.com/en-us/105071

In reply to jack

I'm not a big Google fan, but this doesn't look the same. It is a feature released in October to (for example) mask unsolicited dick pics in RCS chats, and it is disabled by default. Obligatory: your threat model != my threat model.

security.googleblog.com/2024/1…
"...doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected"

This post was edited. (3 months ago)
In reply to jack

Welcome to my FUD list.

To cite Google (via the page you linked):

Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares.

All of this happens on-device to protect your privacy and keep end-to-end encrypted message content private to only sender and recipient. Sensitive Content Warnings doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected. This feature is opt-in for adults, managed via Android Settings, and is opt-out for users under 18 years of age. Sensitive Content Warnings will be rolling out to Android 9+ devices including Android Go devices with Google Messages in the coming months.

In reply to jack

To those who are curious about the app and can't find the proper documentation: I found this video explainer. youtube.com/watch?v=1rdlTveD8F…

I understand the pic blurring may feel useful to some people, but in the privacy community Google does not have a good rep and is known for sneaking new features into Android to get more of your private info. It may not seem bad now, but new capabilities could be added later without warning.

In reply to jack

@jack

You are a gentleman & a scholar for disseminating the dark arts of New Nerd Order as set forth in its foundation text 'Malware Malefecarum'.

THANK YOU!

Happily, Google's CovertApps boobytrap is no match for my obsolete Android, which doesn't support any of this Cambridge Five nonsense.

Knackered tech gazumps corporate cannibal, HUZZAH!

In reply to jack

See grapheneos.social/@GrapheneOS/….


The functionality provided by Google's new Android System SafetyCore app available through the Play Store is covered here:

security.googleblog.com/2024/1…

Neither this app nor the Google Messages app that uses it is part of GrapheneOS, and neither will be, but GrapheneOS users can choose to install and use both. Google Messages still works without the new app.


In reply to jack

What this does: a locally run AI detector for spam and nudity.
grapheneos.social/@GrapheneOS/…


The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

In reply to jack

That is actually a rather poor description of what it does. Basically, it's an app that prevents dick pics.

It scans incoming images, and if it thinks they're nudes, then it shows a content warning message along with the blurred image. Additionally, it scans outgoing images and if it thinks they're nudes, it gives you the option to not send them.

However, all this scanning is done entirely on your own device; it doesn't send anything anywhere.
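The flow described above can be sketched in a few lines. Everything here (the function name, the 0.8 threshold, the dict keys) is made up for illustration; the point is simply that the classification decision, the blur, and the send prompt all happen locally, and nothing is uploaded on either path.

```python
def handle_image(nudity_score: float, direction: str, threshold: float = 0.8) -> dict:
    """Hypothetical sketch of the on-device flow: a local ML model scores
    an image; flagged incoming images get blurred behind a warning, and
    flagged outgoing images trigger an "are you sure?" prompt. Nothing
    leaves the device in either case."""
    flagged = nudity_score >= threshold
    if direction == "incoming":
        return {"blurred": flagged, "warning_shown": flagged, "uploaded": False}
    # Outgoing: the user can still choose to send after the prompt.
    return {"send_confirmation_required": flagged, "uploaded": False}
```

Note that `uploaded` is always `False`: the privacy claim being debated in this thread is precisely that the score never leaves the device.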

In reply to jack

As long as it happens on the device only (and that’s what Google claims), that’s OK. However, I have little trust that Google will never change its mind and start sending the photos to its servers for analysis. I also don’t trust them never to send metadata (something they didn’t mention at all – I don’t know whether they do it or not). Why should they know how often I send and receive nudes?
In reply to jack

While the incorrect explanation of what the app does, plus the framing of the post, raises my skepticism quite a bit, I'm surprised that you didn't provide a Google Play link to the app, which would give most people an easily accessible uninstall button.

Anyways, here's the app on the Google Play store if somebody wants to see if the app is installed on their device and if they want to uninstall it:

play.google.com/store/apps/det…

In reply to jack

LOL at the sheeple still using stock Android, I use LineageOS and--WTF??

Kidding, of course, but yeah, I can confirm that the app installs itself on LineageOS too, at least when you have the Google apps/Play Store installed. A decision I'm coming to regret.

Uninstalled it. Looking forward to having to check every week in perpetuity to see whether it has reinstalled itself.

In reply to jack

Where did you get this information? The only info I could find was this: androidauthority.com/google-me…

…where the explanation isn't really about privacy.

In reply to jack

OMG, what a mess of a post…
DO NOT uninstall this app unless you want to lose signature verification of APKs and open yourself up to malicious app sideloading.
SystemSafetyCore has NOTHING to do with CSAM scanning or anything similar to Apple's photo-verification proposal. The proposed image scanning (on-device only) is being added to the Google Messages app itself, not to any system service apps.
Just a heads-up: do not uninstall apps just because someone on the internet told you to.