Apple talking nonce sense

Started by touchingcloth, August 06, 2021, 09:53:53 AM


Zetetic

I note that the EFF's position pretty much starts with the assumption that iCloud is the only major service usable by most people with any degree of privacy, and that is why this move is of any interest. I agree with them that "this is a decrease in privacy for all iCloud Photos users".

But I'm not sure why you're so taken with their view, given that you can't see any distinction between iCloud and services that don't bother with end-to-end encryption and already routinely go through your information - hosted on remote servers - with whatever methods they like.

The EFF's position isn't "Apple should dump end-to-end encryption for iCloud Photos".

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:42:18 PM
I appreciate that question is obviously a wind-up, but it does bring us back to the point here which is that Apple, in this case, is basically only responding to political will in countries like the UK. The determination that society should look through our stuff - on the basis that stuff might be CSAM - is the thing that needs tackling here, if you think that the consequences of that are a problem.

Actually, the political will isn't quite there (yet). Apple are doing this off their own bat.
I've posted this already, but still:
Quote
Apple thinks photo scanning is non-negotiable - that for legal and PR reasons, you can't be a major consumer tech company and not scan users' photos - so the only way to encrypt photos on-device was to develop & implement client-side scanning.

I hope they're wrong, but idk
https://twitter.com/petersterne/status/1423389183750574091
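
For illustration, a minimal sketch of what "client-side scanning" means mechanically. To hedge: this uses an exact SHA-256 blocklist as a stand-in, whereas Apple's actual system uses a perceptual "NeuralHash" and private set intersection, and the directory and digest below are made up.

Code
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad digests. Exact SHA-256 is a
# stand-in here, just to show the shape of the idea.
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_before_upload(photo_dir):
    """Hash every photo on-device and flag matches before anything uploads."""
    flagged = []
    for path in Path(photo_dir).glob("*.jpg"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in BLOCKLIST:
            flagged.append(path)  # Apple's design attaches a "safety voucher" here
    return flagged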

History has told us again and again that seemingly benevolent surveillance will be abused (good examples are Thatcher and the NUM, or America and civil rights activists, but I'm sure you can choose some examples yourself). There's a fucking reason that privacy campaigners oppose building this stuff in the first place.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 07:47:50 PM
But I'm not sure why you're so taken with their view, given that you can't see any distinction between iCloud and services that don't bother with end-to-end encryption and already routinely go through your data.

The distinction for me is how much it blurs the lines between Apple's 'hardware', 'software' and 'services', and how that will be mirrored in other pieces of 'hardware', 'software' and 'services'. In other words, I fear this decision will shrink my scope of consent even though I do not use Apple hardware, software or services.

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:48:06 PM
Actually, the political will isn't quite there (yet). Apple are doing this off their own bat.
I think this is simplistic at best - Apple is doing this to avoid regulation (as a matter of instinct) and to preserve a near-unique selling point (a little bit more privacy than anyone else has bothered to offer), [edit] because they recognise what is coming.

Sebastian Cobb

They're not adding more privacy by scanning personally owned devices.

If they cared about privacy foremost, they would not be putting in a system that scans personally owned devices for anything, because once in place that functionality can be widened to scan for other things.

Zetetic

But I believe this is an attempt not to have to simply give up end-to-end encryption, which they think is rapidly becoming the only other option.


Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 07:58:44 PM
because once in place that functionality can be widened to scan for other things.
They (and Google and Microsoft) have systems for running arbitrary new code pretty much at will, though. (But I'm not completely immune to arguments that it's socially easier for all sorts of actors to expand the scope of something like Apple's proposal than to install something new overnight.)

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 08:13:51 PM
But I believe this is an attempt not to have to simply give up end-to-end encryption, which they think is rapidly becoming the only other option.


End-to-end encryption is somewhat meaningless if you implement a woolly "bad stuff" scanner with configurable scope.

They've removed boundaries.

Zetetic

Quote from: Sebastian Cobb on September 04, 2021, 08:17:08 PM
End-to-end encryption is somewhat meaningless if you implement a woolly "bad stuff" scanner with configurable scope.
I don't entirely disagree. The major advantage (of the system as described), as far as I can see, is that you can log out of your iCloud Photos account and that prevents novel scanning.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 08:16:23 PM
They (and Google and Microsoft) have systems for running arbitrary new code pretty much at will, though. (But I'm not completely immune to arguments that it's socially easier for all sorts of actors to expand the scope of something like Apple's proposal than to install something new overnight.)

I'm not sure that's entirely true - we're getting a bit theoretical, but AOSP is open source and reviewed by security professionals.

You don't have to engage with Google services to use AOSP - of course most people do, but there's still a level of opt-in there, and that's before we get to microG etc. This is largely useless to the average Android user, but cutting through the shit, there is some element of a boundary currently.

Sebastian Cobb

And I think there's a conflation here. What is the point of E2E between the likes of Apple, MS and Google?

The point of E2E is to stop Eve seeing conversations between Alice and Bob. It does that well. But the questions around E2E are not "should Alice trust Bob not to blabber her secrets after she's told them", and they're certainly not "should Bob be allowed to break into Alice's house and search through her drawers, as an alternative to letting Eve see their conversations".
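
To make the Alice/Bob/Eve point concrete, here's a minimal sketch using the PyNaCl library (assuming the pynacl package is installed; the names and message are made up, and this shows only the shape of E2E, not anyone's actual implementation):

Code
from nacl.public import PrivateKey, Box

# Alice and Bob each generate a keypair; only the public halves are exchanged.
alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts to Bob. Eve - the carrier in the middle, e.g. Apple's
# servers - relays only this ciphertext and cannot read it. That is the
# whole of what E2E promises.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# Bob decrypts at his end. Whether he then blabs, or lets someone search
# his house, is outside E2E's threat model entirely.
assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"meet at noon"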

Sebastian Cobb

Actually, thinking about it, the Apple system of allowing Bob to search Alice's house for bad stuff, notionally under specific circumstances (but let's face it, this will be abused), could be streamlined. Maybe give Bob some sort of way of opening Alice's door - let's say a 'magic door opener' - so he doesn't have to break a window and get it refitted.

Oh wait, that's technically completely different to, but spiritually exactly the same as, key escrow, isn't it?
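
A toy sketch of that "magic door opener", again with PyNaCl: the message key is simply wrapped a second time for the escrow holder. Everything here is hypothetical - it illustrates key escrow generally, not Apple's actual design (which, as said, is technically different).

Code
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

bob_sk, escrow_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts her data once with a fresh symmetric session key...
session_key = random(SecretBox.KEY_SIZE)
ciphertext = SecretBox(session_key).encrypt(b"private photo bytes")

# ...then wraps that key for Bob AND for the escrow authority. The
# "magic door opener" is just this second copy of the key.
key_for_bob = SealedBox(bob_sk.public_key).encrypt(session_key)
key_for_escrow = SealedBox(escrow_sk.public_key).encrypt(session_key)

# Whoever holds escrow_sk can open the door whenever policy (or abuse)
# says so - Alice's consent never comes into it again.
recovered_key = SealedBox(escrow_sk).decrypt(key_for_escrow)
assert SecretBox(recovered_key).decrypt(ciphertext) == b"private photo bytes"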

Cloud

Good stuff - they must have detected the number of us saying "well, if that's what they think about privacy after all, then why not just use an Android phone and at least get something out of handing over my data" :)

I have no issue with them scanning things at their end after uploading to the cloud (I think they do already?) and catching noncery that way. It's all to do with that insidious "first step of invading your phone" that Google and co would undoubtedly have followed (because what Apple does, they all copy). It's kind of symbolic.

In terms of encryption there is a philosophical discussion to be had there. If someone is uploading end-to-end encrypted files that happen to be decryptable with their own key into CSAM, then is Apple really technically hosting CSAM, or are they just hosting random encrypted gobbledegook that only that one guy can "convert" back to CSAM? Does it really matter to them if only that one person, who will probably get caught some other way, is uploading stuff they can't see?

Personally I think just don't worry about it and allow E2E. It's not their fault if someone uploads something bad that they (or anyone else other than the uploader) can't see. So don't blame Apple, and allow them the view that it's comparable to having it on their own device, where if someone does see it they'll probably report it and the police will pop round with a warrant.


Judging from the article though, it's only "delayed" - a bit of PR. I have my doubts it'll just be brushed under the carpet.

Sebastian Cobb

Quote from: Cloud on September 04, 2021, 09:08:14 PM
If someone is uploading end-to-end encrypted files that happen to be decryptable with their own key into CSAM, then is Apple really technically hosting CSAM, or are they just hosting random encrypted gobbledegook that only that one guy can "convert" back to CSAM?

This is a really unclear thing that needs pulling apart, I guess.

If you said yes, then basically everyone would be fucked, because steganographic techniques mean any picture or other file you load on any web page could in theory contain something bad you don't know about, and it could be widely distributed with nearly all the people "possessing" it (as in having it in their browser cache) being entirely oblivious.

Of course, at the less dystopian end of the road you could have an encrypted archive that just contains bad stuff, and search for the hash of that. (The hash of an archive changes if you change the password/key - not hugely meaningful now, but it was a trivial way to thwart "deep packet inspection" for copyrighted content before everyone just used SSL.) I doubt anyone would mind if a cloud provider reported on that sort of thing.
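
A quick stdlib sketch of why matching hashes of encrypted archives breaks down: the same payload under two different keys gives two unrelated digests. (The cipher here is a toy keystream for illustration, not real crypto.)

Code
import hashlib, itertools

def toy_encrypt(key, data):
    # Toy keystream cipher (NOT real crypto) - just to show the effect.
    stream = itertools.cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, stream))

payload = b"the same archive contents every time"

# Same plaintext, two passwords: completely different ciphertexts, so a
# blocklist of ciphertext hashes matches neither.
for key in (b"password1", b"password2"):
    print(hashlib.sha256(toy_encrypt(key, payload)).hexdigest()[:16])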

touchingcloth

What if you have a file which is just a string of zeroes, and the key turns it into child porn? Have they thought about that? Have they fucking thought about that?

Sebastian Cobb

Quote from: touchingcloth on September 04, 2021, 09:22:40 PM
What if you have a file which is just a string of zeroes, and the key turns it into child porn? Have they thought about that? Have they fucking thought about that?

I'd lose that key if I were you.

Sebastian Cobb

I guess the probability is infinitesimally small given hashing and redundancy checks, but there is a chance that, in theory, your decryption key actually decrypts some other data and arranges the bits into something entirely horrific, even though the encryptor never actually encrypted that.
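
With a one-time pad this isn't even improbable: decryption is just XOR, so for an all-zeroes "ciphertext" there exists a key turning it into any file of the same length - the key is the target itself. Real ciphers with integrity checks make an accidental decryption like this vanishingly unlikely, which is the point above. A tiny demonstration:

Code
# For any target plaintext, XOR gives a key that "decrypts" pure zeroes
# into it. Integrity checks (MACs) in real systems reject such keys.
target = b"anything you like"
zeroes = bytes(len(target))

malicious_key = bytes(t ^ z for t, z in zip(target, zeroes))  # equals target
decrypted = bytes(z ^ k for z, k in zip(zeroes, malicious_key))
assert decrypted == target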

Zetetic

Quote from: Cloud on September 04, 2021, 09:08:14 PM
I have no issue with them scanning things at their end after uploading to the cloud (I think they do already?) and catching noncery that way. It's all to do with that insidious "first step of invading your phone" that Google and co would undoubtedly have followed (because what Apple does, they all copy).
The point is that Apple don't scan stuff in the cloud, because they're technically unable to without violating the end-to-end encryption - and they're unique precisely because everyone else didn't copy them.

Zetetic

Quote from: Cloud on September 04, 2021, 09:08:14 PM
If someone is uploading end-to-end encrypted files that happen to be decryptable with their own key into CSAM
In this reality, however, they don't just "happen" to be decryptable to CSAM - that property is a consequence of their being produced by encrypting CSAM, so the question is unlikely to come up.

Sebastian Cobb

Quote from: Zetetic on September 04, 2021, 10:07:19 PM
The point is that Apple don't scan stuff in the cloud, because they're technically unable to without violating the end-to-end encryption - and they're unique precisely because everyone else didn't copy them.

Nope. End-to-end encryption means "end-to-end in transit"; what you are describing is something different - "end-to-end storage and retrieval" (although by definition that would also be encrypted in transit).

I don't think most people believe their stuff, unless explicitly told otherwise, is "unreadable" in the cloud. Framing what they're doing as preserving privacy, based on a common expectation that cloud storage is unreadable in this way, is misleading, I think.

The privacy people actually need - in relation to photos that aren't CSAM - is one of access: leaked nudes don't come from being stolen "in transit", and they don't usually get stolen by a rogue Apple employee with root access leaking them; they get stolen through social engineering and the weak security practices of end users - nothing this tech will prevent.
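
To sketch the transit/storage distinction with a pair of hypothetical upload functions: in the first, the provider can read (and scan, and leak) everything at rest; in the second it holds only ciphertext. (Toy cipher again, not real crypto, and all names are made up.)

Code
import hashlib

server_store = {}  # stand-in for the provider's disks

def upload_over_tls(photo):
    # "Encrypted in transit": the wire is protected, but the provider
    # receives - and can read, scan, or hand over - the plaintext.
    server_store["photo"] = photo

def upload_client_encrypted(photo, user_key):
    # "End-to-end storage and retrieval": encrypted before upload, so
    # the provider holds only ciphertext (toy XOR keystream, NOT real
    # crypto - a real client would use an AEAD cipher).
    stream = hashlib.sha256(user_key).digest() * (len(photo) // 32 + 1)
    server_store["photo"] = bytes(b ^ k for b, k in zip(photo, stream))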

touchingcloth