A system on my phone that has a list of bad files and a threshold for how many of those files are allowed. If the threshold is reached, Apple can read them. Both the bad-files list and the threshold are controlled by Apple, and the system is explicitly designed to be un-auditable...
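As a rough illustration of why that combination is alarming, the on-device logic amounts to something like the sketch below. This is hypothetical and deliberately simplified (the real system uses perceptual hashing and cryptographic voucher schemes, not plain hash counting); the hashes and threshold here are invented for the example:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Vendor-controlled inputs: the user can audit neither the list nor the threshold.
BAD_HASHES = {sha256_hex(b"known-bad-file-1"), sha256_hex(b"known-bad-file-2")}
THRESHOLD = 2

def scan(files: list[bytes]) -> bool:
    """Return True once the number of matches reaches the vendor's threshold."""
    matches = sum(1 for f in files if sha256_hex(f) in BAD_HASHES)
    return matches >= THRESHOLD

# A device holding two listed files crosses the threshold:
scan([b"known-bad-file-1", b"known-bad-file-2", b"vacation.jpg"])  # True
```

The point of the sketch: both `BAD_HASHES` and `THRESHOLD` are opaque parameters shipped by the vendor, so nothing in the mechanism itself constrains what they contain or how low the threshold is set.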
Honestly, I would have been fine with Apple scanning all my photos after they are uploaded to iCloud, but this is deeply disturbing.
This is how I feel as well. I don’t use iCloud Photos, but if I did, sure, scan them for CP, I don’t care. Maybe you will catch some bad guys who willingly gave you their photos. But scanning everyone’s phones is beyond creepy and feels like exactly what the Fourth Amendment is about. It’s British soldiers suddenly having the ability to search EVERY home in colonial America as often as they want.
> Amendment 4. The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
I have moved away from the Apple ecosystem; I already purchased another phone yesterday. The Constitution is holy to me: it’s all we have protecting us from technological dystopia.
The Constitution applies between the government and you. Apple is not (knowingly) a government agency; they're a private business. Therefore the amendment doesn't apply to their actions. Vote with your wallet instead.
Yes, that’s true, but I assume this program is working closely with the government. I couldn’t care less about the child-protection angle, because history shows us that it is a very short, maybe non-existent, slope from “protecting the kids” to whatever overstep a government organization wants to take.
I wouldn’t care nearly as much about Apple scanning my phone; what matters is who is pulling the strings they are working with, and in America they are (or should be!) bound by the Constitution. I can only imagine the obliteration of rights that will happen in other places. Imagine a Hong Kong protestor with this program on their phone.
We’ve got to do everything we can to keep George Orwell’s quote from coming true:
“If you want a picture of the future, imagine a boot stamping on a human face—for ever.”
I’m a cheap bastard, so I got a used Pixel 2 for ~$55. I’m sure people will tell me Android is no better, and I’d be open to hearing that, but they don’t scan your phone as far as I know. I also have the option to install things like CalyxOS or GrapheneOS, I believe.
> Honestly I would have been fine with Apple scanning all my photos after they are uploaded to iCloud
Apple has already been doing exactly this for years. Now they are going to check the photos right before they are uploaded to iCloud, which is actually more privacy-friendly than what they do now. It also lets Apple turn on end-to-end encryption for iCloud Photos if they want.
I understand the 'what if' and slippery slope arguments, but wow, there is so much misunderstanding in this thread. Apple makes the OS and doesn't need a fancy system to scan all the files on the device if that's what they want to do.
I highly suggest reading this: https://www.apple.com/child-safety/ and the associated PDFs. Apple PR hosed this up by putting 3 distinct features on one page that people seem to be conflating.
> Apple has already been doing exactly this for years.
Is that an assumption based on the idea that "everyone is doing it," or is there some evidence?
> Apple makes the OS and doesn't need a fancy system to scan all the files on the device if that's what they want to do.
If the goal is to scan all the files on everyone's device, this system is exactly what they need. It's not like they could continuously upload hashes of every file on every user's phone.
Now that a fully built system for breaking end-to-end encryption is shipped directly in the OS, we're one configuration change away from massive scope creep.
First terrorist content, then "misinformation", then political speech. Apple will be unable to resist government demands to use this preexisting backdoor with a different set of perceptual hashes.
Disabling end to end encryption is always “one configuration change” away, presumably by substituting keys. How closely do you keep track of which keys are used to encrypt Health data - which is end to end encrypted - versus photos - which are not?
If Apple substitutes keys, then that is a detectable event (by jailbroken devices) and that would make news.
This is Apple explicitly announcing they are actively backdooring all iOS and Mac devices, and using your CPU cycles to determine whether you should be reported to the government.
Not really. Key management is done by the SEP, which can’t be introspected. And again, database updates take an iOS update so the back door threat is the exact same.
Then ask yourself why they shipped this scanning on the client-side. This is the first step towards normalizing client-side scanning of encrypted content across the entire device.
Why would Apple want to do this? It doesn’t benefit them at all. Their competitive advantage is having people trust their devices, if not their values.
People are making an extreme claim that Apple went out of their way to implement a fancy system to ruin their own value proposition, and the evidence they have to offer is mere speculation.
Ambiguous retorts like this make you sound intelligent but offer little to the discussion. If the adversary is the USA or China, I have bad news for you: every major democracy has planned encryption regulation which is unimaginably worse than what was announced here.
Actually, the discussion benefits from everyone stating their arguments clearly, even if you don’t benefit. Most of the discussion of this change has been FUD, making it difficult to tease apart actual privacy regression from imagined ones.
Hopefully so they can remove their current ability to decrypt user photos for whatever reason they want. The current state is that they can decrypt any user's photos on iCloud. Doing client-side scanning with this CSAM-detection implementation could allow them to remove their ability to decrypt EXCEPT in very specific situations.
It's not true end-to-end encryption, since in some cases the content can be decrypted without the user's key, but it's significantly closer than what they have today.
That being said, I don't know whether that is their plan, but it is a plausible reason to make this change.
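The "decrypt only in very specific situations" property is typically built from threshold secret sharing: each matching photo carries one share of a key, and the server can reconstruct the key only after collecting t shares. A toy Shamir-style sketch over a prime field (illustrative only; the protocol Apple described is far more involved, with safety vouchers and private set intersection on top):

```python
import random

P = 2**127 - 1  # a large prime; all arithmetic is done in this field

def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any t of them reconstruct it."""
    # Random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

# Five photos each carry a share; any three matches reconstruct the key,
# while two or fewer reveal nothing about it.
shares = make_shares(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234
```

Under this construction, a server holding fewer than t shares learns statistically nothing about the key, which is what would let Apple keep sub-threshold accounts undecryptable.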
If they can decrypt in a “specialized” situation, then they can decrypt in any situation. All that has to be done is to broaden the classifier step by step. Or someone else gets access to the back door. That’s why there can be zero allowed back doors.
The database ships with iOS. Apple can do anything they want in iOS updates. In fact, this was exactly the “back door” the FBI requested Apple use half a decade ago. Per this standard, all Apple end to end products are already backdoored, and nothing new was announced.
Yes, but previously Apple’s stance was, “no, we won’t do that,” and so they earned many people’s trust. Now they are planning to do exactly that, and so they have broken the trust they earned.
Where specifically did Apple say they will never try to detect for the presence of CSAM in your iCloud Photo Library? In fact, people mistakenly assume that they do already.
I disagree. I think it's the first step towards enabling E2EE on iCloud Photos. This system will replace the server-side CSAM scanning they have done for years.
Many foreign countries have also clearly stated that they do not want this (E2EE) to happen and would legislate against it (the UK comes to mind first).
I do believe you are correct that this technology was initially developed as a compromise for E2EE. But while E2EE on iCloud was indefinitely shelved, somehow this was not.
And someone at Apple thought this could be repurposed as a privacy win anyway?
The other way I can think of it is if the ultimate goal is to add those checks to iMessage. One could argue the tech would make a lot more sense there (it's mostly E2EE with caveats), and it would certainly catch many more positive hashes.
I think someone at Apple massively misjudged the global implications of this and opened the company to a (literal) world of upcoming legislative hurt.
I read that article and see this new method as a workaround for the FBI's complaints, once again allowing E2EE to move forward.
Technology doesn't live in a vacuum. Given the calls from the government for backdoors to encryption, I think it's safe to assume this is Apple getting out in front of what would likely be heavy-handed legislation to add actual backdoors like master keys.
But, we'll have to wait and see if Apple starts adding more services to E2EE again. It also may all be moot if legislation gets passed that forces companies to be able to break the encryption for warrants.
> Technology doesn't live in a vacuum. Given the calls from the government for backdoors to encryption, I think it's safe to assume this is Apple getting out in front of what would likely be heavy-handed legislation to add actual backdoors like master keys.
I broadly agree, but I cannot foresee a scenario where limiting it to this particular issue (CSAM) would be seen by legislators as a sufficient compromise to allow E2EE to be expanded.
And other countries will have very different interpretations, much less palatable to Apple's values, of what should be checked for, and they will have no qualms about legislating to require it.
Quoting the NY Times (via Daring Fireball):
> Mr. Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government.
> “We will inform them that we did not build the thing they’re thinking of,” he said.
They can tell themselves that, but it doesn't matter: they precisely did.
It's disturbing because of how effective it could be.
It's at the OS level, so anything you have on your phone can be scanned. Even if an app tries to circumvent it by keeping files encrypted at rest, the system can scan them in memory. And since it's all done client-side, you'd never know it was happening until it found a match and sent it to Apple.