Apple talking nonce sense

Started by touchingcloth, August 06, 2021, 09:53:53 AM

Dex Sawash


Look forward to defending myself in front of a jury of peers

dissolute ocelot

Presumably this could easily be repurposed into a tool for finding out if your child exploitation images are on the database or not[nb]Whether by hacking the database of signatures it uses, or else by intercepting it when it phones home to say it's found a perv.[/nb], thus allowing you to only share stuff that won't get you caught. Hurray for Apple!

Sebastian Cobb

Quote from: Cloud on August 09, 2021, 09:50:41 AM
In part you think "what nonce would actually be daft enough to save that shit to their photo album anyway?" but of course iOS being iOS, that's where all images go.  You can sometimes force them to save to the "Files" app instead but it has a tendency to put things into Photos at the first opportunity it gets. It's quite annoying actually as mine ends up chock full of memes. 

The first step down the slippery slope would probably be when the definition of CP in their eyes matches that of the UK, which actually includes drawings - and then the debate ensues of whether some character in some hentai appears subjectively over 18, stuff gets added to the database, and a bunch of weebs end up with the police kicking their doors down.

Saw the comment of "we'd catch loads of predators if we installed surveillance cameras in everyone's homes but we don't do that", but that's the thing: we actually do. Your phone often sits on a stand facing your bed. Siri is always listening and has a habit of mishearing random conversation and piping up with some nonsense - what happens if you say "I need to take Timmy to the baths" and it interprets it as "show me kiddies in baths", triggering that new warning they're putting in that you just asked it something noncey? Do they then start auto-reporting that? Alexa is interesting too, as many of those devices have a camera that any "family member" can "drop in" on at any time.

----

In terms of Android stuff, it seems that when it comes to SafetyNet you are SOL without using GApps - it'll probably never pass again if you're using microG or similar. So far the things that need it are Pokémon Go, Ingress and, for some reason, McDonald's. Meanwhile First Direct, PayPal and Authy, which you'd expect to want more security, don't seem arsed.

Maybe just have a separate phone that you only use for Pogo, Ingress and Maccies and nothing else.  Or, of course, don't feed into those Bad Things. I still rather enjoy Ingress despite the obvious data mine.

Yeah, my dad was trying to find a photo on my mum's iPhone that didn't appear to have been saved. He managed to poke around in the phone, but it seemed images weren't really stored in directories, which they typically are in Android, with different albums essentially pointing to different directories full of photos the system has managed to find.

Can you not turn Siri off then? I disable 'OK Google' on Android, which is possible but unnecessarily difficult to do, and I always have to google how to do it.

touchingcloth

Quote from: Sebastian Cobb on August 09, 2021, 06:04:47 PM
Yeah, my dad was trying to find a photo on my mum's iPhone that didn't appear to have been saved. He managed to poke around in the phone, but it seemed images weren't really stored in directories, which they typically are in Android, with different albums essentially pointing to different directories full of photos the system has managed to find.

Can you not turn Siri off then? I disable 'OK Google' on Android, which is possible but unnecessarily difficult to do, and I always have to google how to do it.

You can turn "Hey Siri" off, which is the equivalent of "OK Google". Whether that means Siri stops listening, rather than just stops responding, is another matter.

Photos on iOS are stored either in the Photos app or the iCloud Drive app. The drive has a fairly normal directory structure and you can browse it from a Mac or online, à la Dropbox, but I don't have a clue whether the Photos app stores them in anything resembling a desktop directory structure. If you sync your photos to iCloud and from iCloud to your Mac then they end up in normal folders, but on an iOS device the file system outside of the Files app is essentially a black box for end users.

Zetetic

Quote from: dissolute ocelot on August 09, 2021, 05:37:33 PM
Presumably this could easily be repurposed into a tool for finding out if your child exploitation images are on the database or not
I don't think so, because only "blinded" hashes are provided to devices and only Apple's servers can ultimately determine if an image('s NeuralHash) matches a database entry or not. See page 7 (mostly) of the technical summary.

The question of where the "matching" takes place is actually a bit more complicated than "on your device" vs. "on the server". The aim seems to be to ensure minimal knowledge of the CSAM database on the device and minimal knowledge of non-matching images on Apple's servers.

Technically, it's quite neat.
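
(To unpack "blinded" a bit: below is a minimal sketch of a Diffie-Hellman-style private set intersection, the textbook flavour of the idea. It is not Apple's actual protocol - per the technical summary, their construction is arranged so the device learns nothing and the server learns only about matches, and only past a threshold, whereas in this textbook variant it's the client that learns. All names and values are made up for illustration.)

[code]
import hashlib
import secrets

# Toy DH-based private set intersection. NOT Apple's protocol and NOT
# real cryptography - just the "blinded hash" idea in miniature.
P = 2**521 - 1  # a Mersenne prime; fine for a toy

def hash_to_group(item: bytes) -> int:
    # Hash into Z_p* and square, so every value lands in the same subgroup.
    h = int.from_bytes(hashlib.sha512(item).digest(), "big") % P
    return pow(h, 2, P)

server_secret = 2 + secrets.randbelow(P - 3)  # server's blinding exponent
client_secret = 2 + secrets.randbelow(P - 3)  # client's blinding exponent

server_db = [b"known-bad-hash-1", b"known-bad-hash-2"]  # hypothetical
client_items = [b"holiday-pic", b"known-bad-hash-2"]    # hypothetical

# The server only ever publishes H(s)^a. Without its secret exponent,
# nobody can test an arbitrary image against this list offline.
blinded_db = [pow(hash_to_group(s), server_secret, P) for s in server_db]

# The client blinds its items (H(c)^b); the server raises them to its
# own secret and returns H(c)^(ab).
client_blinded = [pow(hash_to_group(c), client_secret, P) for c in client_items]
double_blinded = [pow(x, server_secret, P) for x in client_blinded]

# Raising the blinded database to the client's secret gives H(s)^(ab),
# so matches can be found without either side revealing raw hashes.
matchable = {pow(e, client_secret, P) for e in blinded_db}
for item, x in zip(client_items, double_blinded):
    print(item, "matches:", x in matchable)
[/code]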

Sebastian Cobb

I haven't bothered to read the technical description, but does that mean they're using some form of Bloom filter?

Chedney Honks

Not at all surprised to see people here kicking off about this for the obvious reasons.

Zetetic

Quote from: Sebastian Cobb on August 09, 2021, 06:34:34 PM
I haven't bothered to read the technical description, but does that mean they're using some form of Bloom filter?
Nope.
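
(For anyone wondering what that would have meant: a Bloom filter is a compact bit array that answers "possibly in the set" or "definitely not in the set". A minimal sketch, with nothing Apple-specific about it:)

[code]
import hashlib

class BloomFilter:
    """Toy Bloom filter: k positions per item in an m-bit array.
    Lookups can give false positives, never false negatives."""

    def __init__(self, m_bits: int = 1 << 16, k: int = 4):
        self.m = m_bits
        self.k = k
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        # Derive k array positions by slicing one SHA-256 digest.
        digest = hashlib.sha256(item).digest()
        for i in range(self.k):
            yield int.from_bytes(digest[i * 4:(i + 1) * 4], "big") % self.m

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add(b"known-bad-hash-1")       # hypothetical database entry
print(b"known-bad-hash-1" in bf)  # True
print(b"innocent-photo" in bf)    # almost certainly False
[/code]

One reason it would be a poor fit here: a filter shipped to devices can be probed offline by anyone, which is exactly the repurposing dissolute ocelot was worried about upthread.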

JesusAndYourBush

You'd hope the algorithm for checking photos will be better than the audio algorithm YouTube and others use, which can identify a cover version as the real thing, or even misidentify a totally different song, presumably because of a similar chord sequence or something.

Ah, first post...
Quote
Apple said that if a match is found a human reviewer will then assess and report the user to law enforcement.

Ah ok, so presumably a bunch of humans will end up looking at some mundane false identifications as well as any real nasty stuff.
And maybe the rejected photos will help the algorithm learn.

Zetetic

Notably you also need multiple matches before Apple can look at the actual content of any of the matches.
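
(The threshold bit is, roughly speaking, threshold secret sharing: the key that unlocks the match details only becomes recoverable once enough matching vouchers exist. A minimal sketch of the general technique - Shamir's scheme over a prime field - rather than Apple's actual implementation:)

[code]
import secrets

PRIME = 2**127 - 1  # a Mersenne prime, big enough for a toy secret

def make_shares(secret: int, threshold: int, n_shares: int):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for a decryption key
shares = make_shares(key, threshold=3, n_shares=10)
assert recover(shares[:3]) == key  # three "matches": key recovered
assert recover(shares[:2]) != key  # two: still locked (overwhelmingly likely)
[/code]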

Zetetic

Some interesting stuff on accidental and deliberate collisions:
https://blog.roboflow.com/neuralhash-collision/
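
(For a feel for why collisions happen at all: NeuralHash is a learned perceptual hash, but the simplest member of the family is a difference hash, sketched below. File names are hypothetical. The whole point of a perceptual hash is that near-duplicates land on nearby values, and that same property is what makes accidental - and, per the link above, deliberate - collisions possible.)

[code]
from PIL import Image  # pip install pillow

def dhash(path: str, hash_size: int = 8) -> int:
    # Shrink, greyscale, then record whether each pixel is brighter than
    # its right-hand neighbour: 64 bits that survive resizing/recompression.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical files, for illustration:
# hamming(dhash("original.jpg"), dhash("recompressed.jpg"))  # small distance
# hamming(dhash("original.jpg"), dhash("unrelated.jpg"))     # large-ish distance
[/code]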

Sebastian Cobb

They're backpedalling now.

Quote from: evacide
BREAKING: Apple pauses its plan to do client-side scanning for CSAM.

https://cnbc.com/2021/09/03/apple-delays-controversial-plan-to-scan-iphones-for-child-exploitation-images.html

This is a direct response to the outcry from users and civil society. We're not done, but this is a reminder that collective action moves the needle.
https://mobile.twitter.com/evacide/status/1433844210331910145


shiftwork2

It's just going to come in midway through iOS 15 once the hullaballoo has died down.  Anyway good news for PAEDOS.

Sebastian Cobb

Quote from: shiftwork2 on September 04, 2021, 06:13:20 PM
It's just going to come in midway through iOS 15 once the hullaballoo has died down.  Anyway good news for PAEDOS.

I've seen this opined but it wouldn't be the first time a bad idea got mothballed and swept under the carpet. Admittedly that's more a Microsoft and Google trick.

Zetetic

The slightly ridiculous thing being that if they'd done the boring but basically worse thing of scanning everything server-side (like practically every other major provider), there would have been no news story at all.

Zetetic

A more serious discussion would probably look beyond Apple anyway, who, like Microsoft and Google, are doing this in response to political demands, in a way that they hope keeps them ahead of regulation and that governments hope keeps the wider topic sufficiently far away from real public interest.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 06:23:17 PM
The slightly ridiculous thing being that if they'd done the boring but basically worse thing of scanning everything server-side (like practically every other major provider), there would have been no news story at all.

It's not 'basically worse'. The boundary is at the cloud: the contract between a technology company "minding its business" and "stopping people hosting bad things on their stuff" belongs at that boundary.

Sebastian Cobb

Also define 'boring' in terms of engineering.

Because to me it would largely mean needless boilerplate and useless ceremony that gets in the way (this is an implementation detail, but see: stop making classes). The simplest solution isn't inherently 'boring' but it's usually the sanest implementation and the easiest to test, maintain and manage.

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 06:43:22 PM
It's not 'basically worse', the boundary is at the cloud, the contract between a technology company "minding its business" and "stopping people hosting bad things on their stuff" belongs at that boundary.
The upshot of that perspective - absent any political shift - is going to be ending encrypted remote storage.

It's basically worse, here-and-now, because the practice of simply going through your stuff is entirely normalised, and then justified on the basis that so much infrastructure is privately-owned.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:08:12 PM
The upshot of that perspective - absent any political shift - is going to be ending encrypted remote storage.

I don't think it will end it; it will (already has) split it between convenient, free/cheap storage that big tech give you so they can analyse it (and I assume they do that because you are the product in this scenario), and storage providers that allow you to manage your own keys, which for most people is too much effort. I think that effort means the latter will largely be ignored. My general feeling about this is that governments want a near-panopticon but don't care much about edge cases - hence the attention to things like WhatsApp and Telegram.

Quote from: Zetetic on September 04, 2021, 07:08:12 PM
It's basically worse, here-and-now, because the practice of simply going through your stuff is entirely normalised, and then justified on the basis that so much infrastructure is privately-owned.

But what you seem to be advocating as a 'better' solution is big tech raking through not just privately owned, but personally owned property.

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:16:09 PM
I don't think it will end it; it will (already has) split it between convenient, free/cheap storage that big tech give you so they can analyse it (and I assume they do that because you are the product in this scenario), and storage providers that allow you to manage your own keys, which for most people is too much effort.
iCloud is broadly neither of these and is one of the biggest cloud photo and other personal data storage providers in this reality.

Quote
hence the attention to things like WhatsApp and Telegram.
There are other reasons for that, presumably, relating to network effects and interest in communications data.

Quote
But what you seem to be advocating as a 'better' solution is big tech raking through not just privately owned, but personally owned property.
That's already happening on any reasonable understanding - your photos are still personal property, even if you're storing them on privately-owned infrastructure.

I do appreciate that for you the distinction is code running on the device that you own[nb]Setting aside the fragility of that distinction for the many people whose phones are effectively subject to leases.[/nb] vs. the device you don't.

Zetetic

Would you feel any differently about Apple's proposal if, instead of scanning images on your phone prior to upload, it re-fetched the image after upload and scanned it?

Sebastian Cobb

iCloud sounds terrible honestly. I don't use Google Photos for precisely the same reason I wouldn't use iCloud if I were an iPhone user.

But the overriding point here, and why people like the EFF think it is terrible, is that once this is in place, unlike a photo cloud, the scope can be changed pretty easily.

And it will - look at how North American social media companies bow to oppressive regimes. And, adjacent to that, consider electronic colonialism.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:27:24 PM
Would you feel any differently about Apple's proposal if, instead of scanning images on your phone prior to upload, it re-fetched the image after upload and scanned it?

I would be against it re-fetching it (I have no idea why it would need to). Why not stick to scanning the thing I have *willingly* and *knowingly* uploaded (assuming I have lost my mind and opted into this)?

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:31:20 PM
is that the scope can be changed pretty easily
More easily than including any code you like in the next software update? (The next software update to a specific device, for that matter?)

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:32:29 PM
I would be against it re-fetching it (I have no idea why it would need to).
Technically no reason at all. You could even scan it just prior to upload, and upload both the photo and encrypted scanning result at the same time...

Quote
Why not stick to scanning the thing I have *willingly* and *knowingly* uploaded (assuming I have lost my mind and opted into this)?
How does this differ from Apple's proposal which is to scan photos that you're uploading to iCloud?
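
(In code terms the flow being described is tiny - a sketch with hypothetical names, a plain SHA-256 standing in for NeuralHash, and none of the encryption that makes the real safety voucher unreadable to the device:)

[code]
import hashlib
import json

def make_voucher(photo: bytes) -> dict:
    # Stand-in scan: the real system computes a perceptual hash and
    # encrypts the result so the device can't tell whether it matched.
    return {"scan_result": hashlib.sha256(photo).hexdigest()}

def build_upload_payload(photo: bytes) -> bytes:
    # One request carries both photo and voucher; whether the scan ran
    # "on device" or "server-side after upload" is invisible at this boundary.
    return json.dumps({"photo": photo.hex(),
                       "voucher": make_voucher(photo)}).encode()

payload = build_upload_payload(b"...photo bytes...")
print(len(payload), "bytes ready to send to the storage service")
[/code]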

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:34:42 PM
More easily than including any code you like in the next software update? (The next software update to a specific device, for that matter?)

If only there were an ecosystem that released source code that could be reviewed by security professionals.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:36:55 PM
Technically no reason at all. You could even scan it just prior to upload, and upload both the photo and encrypted scanning result at the same time...
How does this differ from Apple's proposal which is to scan photos that you're uploading to iCloud?

This is just apologia now.


Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:39:48 PM
If only there were an ecosystem that released source code that could be reviewed by security professionals.
I worry that you think there is. But regardless, if we're talking about Apple users, how does this move make it substantially easier for them to scan arbitrary content on their phone in arbitrary ways in the future?

Quote from: Sebastian Cobb on September 04, 2021, 07:40:30 PM
This is just apologia now.
How so? As it stands, the method as described only applies to photos that are being uploaded to iCloud.

Quote from: Sebastian Cobb on September 04, 2021, 07:31:20 PM
And it will - look at how North American social media companies bow to oppressive regimes. And, adjacent to that, consider electronic colonialism.
Hang on - do you want electronic infrastructure to respond to the demands of local and national governments when it comes to operating in their countries ("bow to oppressive regimes") or not ("electronic colonialism")?

I appreciate that question is obviously a wind-up, but it does bring us back to the point here, which is that Apple, in this case, is basically only responding to political will in countries like the UK. The determination that society should look through our stuff - on the basis that stuff might be CSAM - is the thing that needs tackling here, if you think that the consequences of that are a problem.

Sebastian Cobb

My question to you is: why do you think you know better than the likes of the EFF?