
Apple talking nonce sense

Started by touchingcloth, August 06, 2021, 09:53:53 AM


touchingcloth

Apple are introducing a system which will check files stored in people's iCloud accounts for matches with known images of child sexual abuse:

Quote
Before an image is stored onto iCloud Photos, the technology will search for matches of already known CSAM.

Apple said that if a match is found a human reviewer will then assess and report the user to law enforcement.

I've always wondered why your Google and Dropboxes didn't do this, so I think I'm broadly in favour of it, even though it seems pretty obvious that, as the article mentions, this is likely to set governments clamouring for similar uses of the technology in ways which would severely damage free speech.

Feel free to discuss, but I mainly wanted to do the Phil Collins thing. Ok now, cheers.

Sebastian Cobb

Quote from: touchingcloth on August 06, 2021, 09:53:53 AM
I've always wondered why your Google and Dropboxes didn't do this,

Everyone else already does, using a server-side technology called PhotoDNA. The main difference with Apple is that they'll be doing the scanning client-side, before things are uploaded.
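
Roughly, the server-side flow everyone else uses looks like the sketch below. To be clear, this isn't actual PhotoDNA (that's a proprietary perceptual hash that survives resizing and re-encoding); I've used a plain cryptographic hash, which only catches byte-identical copies, but the shape of the pipeline is the same: hash the upload, check it against a vetted database, escalate matches for human review.

Code
import hashlib

# Stand-in for the vetted database of known-image hashes (in reality supplied
# by organisations like NCMEC rather than assembled by the provider).
KNOWN_IMAGE_HASHES = set()

def file_hash(path):
    # Hash the raw bytes of the upload. A real system uses a perceptual hash
    # so that re-encoded or resized copies still match; SHA-256 is only here
    # to show the flow.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def scan_upload(path):
    # True means "matches a known image, queue it for human review".
    return file_hash(path) in KNOWN_IMAGE_HASHES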

This thread is interesting:
https://twitter.com/petersterne/status/1423389183750574091

Key takeaway point:
Quote
Apple thinks photo scanning is non-negotiable — that for legal and PR reasons, you can't be a major consumer tech company and not scan users' photos — so the only way to encrypt photos on-device was to develop & implement client-side scanning.

I hope they're wrong, but idk




Zetetic

Technical summary document here:
https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

Different "visual hashing" setup to PhotoDNA.

A significant question is whether its specificity holds up in the face of deliberate attempts to trigger a false-positive.
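
To make that concrete: a visual hash is built so that near-duplicates land near each other, which means matching is a distance check against a threshold rather than an exact comparison, and the threshold is exactly where a deliberate false-positive attack would aim. Toy sketch only; nothing like NeuralHash internally, and the hashes and threshold below are made up.

Code
def hamming_distance(a, b):
    # Number of differing bits between two fixed-length integer hashes.
    return bin(a ^ b).count("1")

def is_match(image_hash, known_hashes, max_distance=8):
    # A visual-hash match is a nearness check. The max_distance threshold is
    # the specificity trade-off: too loose and crafted images can collide with
    # the database, too tight and slightly edited copies slip through.
    return any(hamming_distance(image_hash, h) <= max_distance for h in known_hashes)

# e.g. is_match(0b10110011, [0b10110001, 0b01001100]) -> True (one bit apart)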

Sebastian Cobb

Quote from: Zetetic on August 06, 2021, 10:02:02 AM
A significant question is whether its specificity holds up in the face of deliberate attempts to trigger a false-positive.

Seeing as it's client-side, tricking the algorithm may not be necessary if people can reverse-engineer the API used to report images.

Mr_Simnock

Oh god, I need to remove old pictures of myself in the bath aged 5 from my phone. I don't want five Steve Jobs lookalike paedo hunters round my house after me.

Jerzy Bondov

You know when you have to do a Captcha to prove you're not a robot, and it says 'Click all the images containing child sexual abuse', that's how they trained the AI for this. Fascinating

Butchers Blind

Look what happened to Julia Somerville when she went to Boots. Tread carefully.

Paul Calf

Agree to having all your images deep-scanned on the off-chance they might catch a few clumsy nonces?

No. This is not acceptable. Stand up for yourselves.

Sebastian Cobb

Quote from: Paul Calf on August 06, 2021, 11:32:28 AM
Agree to having all your images deep-scanned on the off-chance they might catch a few clumsy nonces?

No. This is not acceptable. Stand up for yourselves.

I can sort of see the case for it when you push things into a vendor's cloud: it's their garden and they have a right to protect themselves from hosting CP. In the case of Google, one of the world's biggest ad brokers, they're almost certainly doing all sorts of other analytics for data gathering with no real oversight, which is precisely why I don't use their photo cloud service.

Doing it on the device itself is a bit different. They do claim they're only doing it for images destined for iCloud, but it seems like something that could easily become ripe for scope-creep.

In fact, according to the EFF there's already scope creep, in the form of scanning iMessage images and telling your mum and dad:
https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life

QuoteThere are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.

Somewhat related, Snowden's come out as quite critical of Apple's security theatre as a marketing tactic after they refused to help the FBI with that phone.
https://edwardsnowden.substack.com/p/ns-oh-god-how-is-this-legal

Dex Sawash


Apples are traditional paedo bait. Only fair to turn it back again.

Paul Calf

Quote from: Dex Sawash on August 06, 2021, 12:13:18 PM
Apples are traditional paedo bait. Only fair to turn it back again.

I don't think the wicked stepmother was a paedo. You might want to withdraw that remark.

Zero Gravitas

Quote
the technology will search for matches of already known CSAM

All the more reason to support artisanal small-batch CSAM producers; trying to get your jollies from that flash drive you found taped to your uncle's cistern just isn't worth the risk.

Noodle Lizard

It's disappointing, considering how Apple generally maintained a good reputation for privacy (how much of that was "theater" is debatable, of course). It seems odd to throw that all into jeopardy for something like this, but it's been clear for decades that a lot of encroaching measures which violate privacy (or personal choice, expression, rights etc.) tend to begin under the guise of "protecting the children". After all, who could possibly be against that? It might not significantly affect us in the West (yet), but it could certainly pose immediate issues in more authoritarian countries.

Granted, I don't know enough about programming to fully comprehend how the technology works for myself, but the EFF summarised it fairly well, I think:

Quote from: https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.
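
Even without knowing the internals, the "configuration flags" bit is easy to picture as something like the sketch below. This is completely made up, with no relation to anything Apple has published; it just illustrates that once the scanner ships on every device, widening who and what gets scanned is a settings change rather than a new system.

Code
# Hypothetical policy knobs for an on-device scanner.
SCAN_CONFIG = {
    "hash_databases": ["ncmec_csam"],  # nothing technical stops a later build adding other lists
    "accounts": "child_only",          # flipping this to "all" is a one-word change
    "scope": "icloud_uploads",         # versus "all_local_photos"
}

def should_scan(account_type, destination):
    # Decide whether a given photo gets run through the scanner at all.
    if SCAN_CONFIG["accounts"] == "child_only" and account_type != "child":
        return False
    if SCAN_CONFIG["scope"] == "icloud_uploads" and destination != "icloud":
        return False
    return True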

idunnosomename

yes. I found this most troubling, and I don't like child pornography at all!!!!!

touchingcloth

Quote from: Sebastian Cobb on August 06, 2021, 11:48:50 AM
Doing it on the device itself is a bit different. They do claim they're only doing it for images destined for iCloud, but it seems like something that could easily become ripe for scope-creep.

For a lot of users, this will be equivalent to all of the photos on their device. iPhones have a feature that lets you store full resolution images in the cloud and lower res locally. I used to use it, but the grain and lag on my lolly porn wasn't tolerable.

Cloud

Bizarre U-turn from Apple, who up until now had managed to stay immune to the "but what about kiddie porn" logic of mass surveillance. Maybe it just proves Snowden right: it was all just PR and they're firmly under the thumb of the three-letter agencies.

They're also rolling out their own VPN-that's-not-a-VPN where "even Apple can't see" - yeah, until your DNS query matches some list or other, no doubt...

Paul Calf

Quote from: touchingcloth on August 06, 2021, 10:29:05 PM
For a lot of users, this will be equivalent to all of the photos on their device. iPhones have a feature that lets you store full resolution images in the cloud and lower res locally. I used to use it, but the grain and lag on my lolly porn wasn't tolerable.



Whatever gets you off.

evilcommiedictator

I like how they're sneaking in the parental "your child is sharing naked[nb]According to our own algorithm, so we'll flag any black skin, swimsuit, or poodle as naked[/nb] images" functionality alongside the "known pedo images checking" functionality.

And of course, all those hashes of pedo images are run by the US government (but most US-friendly countries cooperate with them too)

Cloud

Not that there's anywhere to flounce to. Android? Google have been doing this since like 2009. It's more the sense of betrayal, after Apple has been hyping up "privacy above all" for so many years. And it's actually really stupid: they do have expensive devices and a bit of a walled garden, but privacy was always the big feature that made it all worthwhile. Now, if you can't trust them with your data, why not just switch to Android if you find it better in other ways?

And now they want people to trust them with all the websites they visit, via a closed source "relay" feature that's enabled by default?  Hmmmmmmm.

Sebastian Cobb

I don't trust Android either, and in many ways it's less secure because vendors fork it, make their own versions and then stop updating them, etc. But there are security-conscious versions, and I think people have reverse-engineered the Google services and made a copycat suite that mocks a lot of them, so apps that wouldn't work on base Android without all the Google shit can work.

edit: what I'm talking about is microG: https://microg.org

Brundle-Fly

Android used to be such a lovely word. Apple too.

Paul Calf

There are alternatives. Linux smartphones are taking off one way or another.

https://www.youtube.com/watch?v=qTtgzNGRAfA

CalyxOS is an open-source mobile OS developed with the goal of giving people 'privacy and agency'.

Cloud

Quote from: Sebastian Cobb on August 08, 2021, 03:17:26 AM
I don't trust Android either, and in many ways it's less secure because vendors fork it, make their own versions and then stop updating them, etc. But there are security-conscious versions, and I think people have reverse-engineered the Google services and made a copycat suite that mocks a lot of them, so apps that wouldn't work on base Android without all the Google shit can work.

edit: what I'm talking about is microG: https://microg.org

Yeah, I still have a OnePlus 7 lying around and had a go with LineageOS and microG. However, I couldn't for the life of me, with all the fiddling in the world with Magisk and its plugins, get SafetyNet passing, which cuts out the ability to use banking apps, certain games, etc. If it wasn't for that, it'd be quite a usable alternative.

I don't mind playing something like Pokemon Go knowing that the data is used in various ways because that's my choice.  It's scanning and data collecting without choice that is rather insidious.

Sebastian Cobb

Quote from: Cloud on August 08, 2021, 02:50:22 PM
Yeah, I still have a OnePlus 7 lying around and had a go with LineageOS and microG. However, I couldn't for the life of me, with all the fiddling in the world with Magisk and its plugins, get SafetyNet passing, which cuts out the ability to use banking apps, certain games, etc. If it wasn't for that, it'd be quite a usable alternative.

I don't mind playing something like Pokemon Go knowing that the data is used in various ways because that's my choice.  It's scanning and data collecting without choice that is rather insidious.

Yeah, I've struggled with Magisk and SafetyNet when trying to set it up myself. I've got a Moto 5, and the 'mint' fork of Lineage seems to come with SafetyNet already set up; that's been pretty flawless though.

Although there's now been an official Lineage release of Android 11 for my handset so I'm kind of tempted to try that.

evilcommiedictator

Quote from: Brundle-Fly on August 08, 2021, 09:22:00 AM
Android used to be such a lovely word. Apple too.

Google and other big companies do the hash comparison for files stored on their clouds, Google Drive etc.
Apple are doing this on your phone, a separate OS process accessing your photos and sending off information about them.
As well as that, they'll run an ML algorithm on iMessage to determine if your child is sending "porn". So that's two ways the device is accessing information on your phone.
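
Put crudely, the two paths look something like the sketch below: the same hash-matching idea as upthread for iCloud-bound photos, plus a separate classifier for iMessage on child accounts. Rough illustration only; it's not Apple's implementation, and the hash function, classifier score and threshold are stand-ins.

Code
import hashlib

KNOWN_HASH_DB = set()  # stand-in for the on-device copy of the known-image database

def photo_hash(photo_bytes):
    # Stand-in for a perceptual hash like NeuralHash; an exact hash keeps the sketch short.
    return hashlib.sha256(photo_bytes).hexdigest()

def check_icloud_upload(photo_bytes):
    # Path 1: compare against the known-image database before the photo leaves
    # the device; a match would be escalated for human review.
    return photo_hash(photo_bytes) in KNOWN_HASH_DB

def check_imessage_image(explicit_score, account_is_child, threshold=0.9):
    # Path 2: a local ML classifier scores iMessage images on child accounts;
    # over the threshold, the device warns the child and may notify a parent.
    return account_is_child and explicit_score > threshold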

Dex Sawash

Accidentally hit the camera search button on my Android while sat on the toilet and it found IKEA bath mats. I was impressed, and did it again with my foot and it found where I could buy more Puma socks. Took a picture of my knob and it didn't find anything :(

chveik

Quote from: Dex Sawash on August 09, 2021, 02:58:51 AM
Accidentally hit the camera search button on my Android while sat on the toilet and it found IKEA bath mats. I was impressed, and did it again with my foot and it found where I could buy more Puma socks. Took a picture of my knob and it didn't find anything :(

what were you expecting to find?

Cloud

In part you think "what nonce would actually be daft enough to save that shit to their photo album anyway?" but of course iOS being iOS, that's where all images go.  You can sometimes force them to save to the "Files" app instead but it has a tendency to put things into Photos at the first opportunity it gets. It's quite annoying actually as mine ends up chock full of memes. 

The first step down the slippery slope would probably be when the definition of CP in their eyes matches that of the UK, which actually includes drawings - and then the debate ensues of whether some character in some hentai appears subjectively over 18, stuff gets added to the database, and a bunch of weebs end up with the police kicking their doors down.

Saw the comment of "we'd catch loads of predators if we installed surveillance cameras in everyone's homes but we don't do that", but that's the thing: we actually do. Your phone is often sat on a stand facing your bed. Siri is always listening and has a habit of mishearing random conversation and piping up with some nonsense. What happens if you say "I need to take Timmy to the baths" and it interprets it as "show me kiddies in baths" and triggers that new warning they're putting in that you just asked it something noncey? Do they then start auto-reporting that? Alexa is interesting, as many of their devices have a camera that any "family member" can "drop in" on at any time.

----

In terms of Android stuff, it seems that when it comes to SafetyNet you are SOL without using GApps; it'll probably never pass any more if you're using microG or similar. So far the things that need it are Pokemon Go, Ingress and, for some reason, McDonald's. Meanwhile First Direct, PayPal and Authy, which you'd expect to want more security, don't seem arsed.

Maybe just have a separate phone that you only use for Pogo, Ingress and Maccies and nothing else.  Or, of course, don't feed into those Bad Things. I still rather enjoy Ingress despite the obvious data mine.

Dex Sawash

Quote from: chveik on August 09, 2021, 03:35:28 AM
what were you expecting to find?

Expected it to at least ID it as something. Probably have to go remove it from some search record now.

Poobum

Quote from: Dex Sawash on August 09, 2021, 11:52:45 AM
Expected it to at least ID it as something...

A child's cock probably, be worried.