The All-Seeing “i”: Apple Just Declared War on Your Privacy (edwardsnowden.substack.com)
861 points by ttctciyf on Aug 26, 2021 | 380 comments



This distinction of changing the de facto ownership of your device and data is the real inflection point. The surveillance technology itself is not really that novel, as functionally it's applying established anti-virus techniques to data instead of code. Ask any AV company how their detection works, and it will include a variation on this.

This same tech can (and likely will) be used to find the owners of bitcoin and other cryptocurrency wallets, honeypot tokens, community identities, and to provide profiling information to the company's political masters. The collisions in the hashing scheme mean that you can insert anything you want onto people's devices and get them pulled into the legal system once it is flagged, where the process itself is the punishment. The whole scheme is too stupid to ever have been about reason; it's just pretexts and narrative, and this is as good a time as any to exit their ecosystem.

Apple really picked the wrong time to attempt this, as I do not see anyone who understands how evil this is ever forgiving them. The most charitable thing I can say about it is that they're probably just doing it as part of a deal to avoid anti-trust plays, where Apple plays ball with the feds and its parties, and the storm just magically passes them over. The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.


> The surveillance technology itself is not really that novel

What's novel is that the tech reports you to the authorities. Imagine your AV reporting you to the authorities for digital piracy; it's something the RIAA could only dream of back in the day. Now it's becoming a reality.


"We just scan every song on your iPod to make sure the neural hash isn't copyrighted content. If it is blah blah blah tokens blah blah report you to the RIAA."

Just bought a System76 laptop. Happy birthday Linux!


Or they could just scan the songs themselves. Why bother with hashing when you have full control? This is a red herring.

Since they have had the ability and incentive to stop music piracy that way for at least a decade, and haven't done it, that should tell you something.


ok but how do they know you haven't bought that content?


Gotta go deeper and scan the bank transactions going through your phone. Why not just watch your screen and build a model of everything you do. It's a slippery slope ... of extrapolation.


> Gotta go deeper and scan the bank transactions going through your phone.

Cite piracy as the reason, but get better conversion tracking for those sweet ad clicks. We all know which companies would be drooling at the prospect of this.

Sadly this isn't hypothetical; card companies have colluded with the data hoarders in the past for this exact purpose.


> Why not just watch your screen and build a model of everything you do.

https://www.consumerreports.org/privacy/how-to-turn-off-smar...


This kind of reporting is of course standard practice in the banking industry and has been going on for 20+ years.


Well clearly the burden of proof now lies with you. Have fun with our fully automated appeal process. If you get over 10,000 retweets we may reconsider.


The idea is to force you to buy the same content in multiple platforms. Bought content would have DRM; anything stripped of DRM is presumably pirated content.


In the future, everything is streamed. You don't own the content.


...If people are willing to pay for that model. Not everybody will - this is absolutely certain. So there is still some space for physical media to stick around for the rest.


Are you sure? There are already music albums that you can't buy in stores.

The free market serves only the masses. Everybody else can only hope to buy what they want.


I am sure I will never, ever use streaming if it means my cultural consumption is linked to my identity.

If some goods or services become unavailable, I will be forced to do without.


Children being born now will come into a world with no concept of either privacy or ownership.


Children being born these days will probably have parents to teach them those concepts, and will be exposed to past culture, unless civilization fully collapses. They will understand privacy and ownership naturally, as they do not need to see those concepts around them to understand them and "want" them.


The presumption with that line of thought is that "people will WANT the album at all" and/or "WANT at any price". These are implicit assumptions you are failing to acknowledge/mention.

And they are NOT generally valid.

For example, I stopped listening to broadcast radio in the 1990s - I've never missed it and I've done without quite happily.

I stopped watching any of the mainstream TV networks in the 2000s - I've never missed it and I've done without quite happily.

I've never used a music streaming service to this day - I've never missed it and I've done without quite happily.

No one piece of media is SO VALUABLE that it's irreplaceable or necessary to have access to. NONE.

I'm not a slave to whatever emotional beast would DEMAND I listen to crappy music or music only available from streaming. It can always be substituted either with music from other sources or living without it entirely.

If Justin Bieber was wiped from existence, my life would not change one iota. Same goes for pretty much ALL other "artists" today - and that's made easier because most are pure garbage without any musical talent anyway. Want to put up barriers to entry preventing me from getting garbage? OK. Thanks, I guess?

It will turn out like the Saints football tickets - they can't even give them away.

https://www.thegatewaypundit.com/2021/08/get-woke-go-broke-n...


Yes, there will be people who want the album. Or at least the song. For example, people who heard a song in a club and used a music fingerprinting service like Shazam to find it.


>And they are NOT generally valid.

You make this statement, but then you just give a bunch of examples where it is specifically not valid for you.


Audio Hijack to the rescue.


Or are the author/creator of that content!


As if they would care for such details.


authors/creators of content don't exist; content springs fully formed from centers of large corporations, especially if that corporation is named Disney.


DRM to the "rescue"


Who says the government cannot come up with a twisted rationale of mandatory compliance to serve the unique IP industry, which is struggling in the face of a harsh pandemic and, more seriously, the threat posed by those egregious hackers and thieves who steal and consume content without paying for it. How dare they! They are killing the entire artist community. Or: "In the face of unprecedented attacks on our domestic soil by foreign alien enemies, we are forced to implement an assailant monitoring programme, which I assure you will not be used for any other purpose. The programme is going to be strictly for intended purposes only. That said, while partnering with the industry, we have realized the immense potential to build an inclusive and healthy competitive environment in the market and will help the industry leaders root out evil."


Not the authorities perhaps, but it is already happening to some degree: Windows Defender sending “samples” of unknown binaries to the cloud to analyze them. I am extremely bothered by this and try to disable this “feature” as much as possible. As always with Windows, you have to aggressively tweak system settings to permanently disable the constant reminders: “Oops, we have detected suboptimal settings, please turn on every privacy-invading feature for your convenience”. We can only hope MS doesn’t abuse the samples for other purposes, but given their and other big tech companies’ track records, we can assume there are additional parties interested in the submissions.


If you are tired of fighting with your own computer, consider switching to Linux. It's been working flawlessly for years for me.


I certainly consider it every now and then. I already use it for my main desktop. And I hear good things about Proton, but I really like the ease of use of Windows for everything gaming related. Trying to figure out how a certain game needs to be started on Linux reminds me too much of my day job.


Not all games work, but many on Steam do, and most run well out of the box. It will only improve, as Steam is committed to making Linux the new PC OS for gaming.


> I do not see anyone who understands how evil this is ever forgiving them.

I'm not so optimistic.

How many people among Apple's users actually understand how evil this is, and among those, how many do really, actually care? People seem fine enough with Facebook's data vacuuming, why would they protest against Apple's "non-intrusive" scheme? They "don't hate children" and, of course, "have nothing to hide".

The issue, as has been brought up in one form or another in the numerous threads on the subject, is that people like their comfort (using smartphones) and there really isn't that much of a choice.

> The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.

And therein lies the rub. Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you.


> Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you

I think they were being sarcastic, maybe not. Either way, you're right. And this is why WE need to be doing this so that it becomes a viable option.


This is the attitude anyone who cares about this stuff needs to adopt. Especially those with engineering, design, or documentation skills.

Now is the time to get the open alternatives shored up, and to start creating a viable and concise pathway and ecosystem that we can recommend to “non-techies” who are looking for an alternative.

That can only happen if we start using this stuff now, and commit to donating some time and effort to take the deficiencies we find in our own use and make them better.

It’s fine to vent, and I hope to see people continue to call big tech out. But putting on a more positive mindset, this is actually an amazing opportunity we have to start creating the change we want to see.


1. Ditch Apple

2. Buy the best that exists today (ie System76)

3. Give monthly critical feedback every step of the way to open source hw/sw companies and their competitors about what you really need in the next version

4. Buy a Better world

Ask HN: Do you use Purism, PinePhone, or Fairphone? https://news.ycombinator.com/item?id=28164208

Ask HN: Do you use a Linux-first laptop? (System76, Librem, Dell XPS Dev Ed) https://news.ycombinator.com/item?id=28216287


I successfully transitioned my company from macOS to Arch on Apple hardware. There are two computers left with Catalina for some specific workflows, in isolation from the outside world.

System76 machines are cool, but when the time comes we will build custom PCs for work. The "aura" and "coolness" of Apple is gone.

I admit that my decision to use only Apple computers in the office was irrational and tainted by emotional perception from the past. I almost pulled the trigger on an M1 Air and was planning to upgrade the office MacBook Pros to M1X. Not anymore.


Wow congrats, getting Macs to run Linux sounds like a superhuman feat.


No, it's really easy. I had Linux running on a 2006 iMac, and I am as far from a Linux expert as you can get.


Ah yes! My understanding is older Macs are easier. The newer T2-enabled Macs make it hard. For example, Wi-Fi, the keyboard, and the trackpad don't work out of the box, lots of configuration is needed, and some things (e.g. the microphone) simply don't work at all.


Ah ok, makes sense. This was my first mac, things have definitely changed since then...


> That can only happen if we start using this stuff now, and commit to donating some time and effort to take the deficiencies we find in our own use and make them better.

If, and I doubt it will happen in any reasonable time, you can replace all of Apple's ecosystem, including buying/renting of DRM media, I would switch. Until then, switching is quite a big step back.

The problem is not lack of software so much as lack of industry support.


The industry is what got us here in the first place... we need to do this for ourselves. By us, for us.


Yeah, great pep talk, but go convince the various license and copyright holders to allow you to create a store to sell media on Linux. This just isn't going to happen. Which means we're done here.


I feel it's a bit unreasonable to expect a non-technical user to even start to comprehend this issue.

Many of them know that the photos, videos, and music from their old iPhone will be available on their new iPhone after they sign in. But do they really understand what happened in between to enable that? Should they even need to know? That's what Apple is banking on.

It is unrealistic to expect even technically equipped Apple fans to call out Apple's latest hypocrisy and move away from the ecosystem. They didn't do it earlier, they didn't do it when it came to light that Apple knew that its contractors exploited child labor [1], and they won't do it now.

[1] https://www.businessinsider.in/tech/news/apple-knew-a-suppli...


Well, you could categorise them as not understanding how evil this is. They are not comfortable, but they may not see it as evil.


"get them pulled into the legal system once it is flagged, where the process itself is the punishment"

This is the real threat here. Anyone can have data flagged at any time, by accident or maliciously. Like how any video can be flagged for copyright infringement and the creator is 'punished by the process' regardless of guilt/innocence. A possible fix would be to have severe financial punishments for every false claim (let's say a million bucks per instance). Imagine how carefully the system would be designed if that were the case, versus the case where there is no punishment for false claims.


Supporting and advocating for “Right to Repair” laws was never more crucial, because this is not going to stop at iPhone or mac.


Just to extend my comment in response to how this problem will spread:

By doing surveillance on types of images, Apple is in effect implementing anti-virus - for ideas. That's only a bit hyperbolic, as the perceptual hash for a viral meme can be searched on, just like the material they're using as a pretext for it.

I could even see them announcing it at a launch. We should be concerned that the company has skipped its Black Mirror stage and jumped right into its Universal Paperclips endgame.

(I'm also appreciating the irony of people like me being angry about Apple announcing they're going to implement a version of what Google has already been technically able to do for the last decade, and what Microsoft has probably been doing in secret since even before then.)


Apple already disallows lots of ideas.

You can't make these apps for iOS (NSFW!) https://sexgames.citor3.com/

And arguably, if you've tried apps like those in actual VR, their non-VR equivalents don't come close in impact. But since Apple doesn't carry porn in the App Store and doesn't allow other stores, Apple is effectively banning thoughts.


You can’t autocomplete "shit" and several other swear or politically incorrect words via the iOS keyboard.


Why do people accept being treated like children?

iPhone=Disneyland, Linux=Real world


>Why do people accept being treated like children?

Because some cultures have a religious-inherited "sin-avoidance" problem, and e.g. try to ban alcohol, disney-fy public discussion, lose it if a tit is accidentally shown, and are "shocked" if somebody utters "fuck" in a late night show.

(It doesn't help that many of the "adult" population behave like children and throw a fit if their morality as to the above is "challenged").

Now, put these cultures, for historical reasons, in charge of technology platforms...

All of these, and more, turn into some state duty (in turn imposed on corporations) to handle moderation and control content on behalf of adult audiences (e.g. FCC rules, the Hays Code, Apple's ban of political, sex, etc. apps, and so on).

This leads to the child-ification of the adult population, where the "adults" in charge know better and protect the masses.


Let me translate your cynic-speak:

___ begin translation ___

some (presumably christian) cultures consider it important to promote a societal standard of morality that encourages self-control as the basis for sustainably safeguarding individual liberty while being (increasingly more) tolerant of aberrant ways of self-expression.

As with every culture, its members defend their way of life and morals.

Put these cultures in control of technology platforms and you will see that they will attempt to strike the right balance between personal freedoms and public responsibility and even failing at that sometimes under pressure from various interests (commercial, the self-interest of spy agencies who want to make their job easier, post-modernist denial of christian cultural norms of morality and personal freedom, far left activism that seeks to dox, cancel, and censor all opposition)*.

___ end translation ___

*I'm pointing out some of the overtly anti-christian or worldview agnostic interests at play to highlight the highly oversimplified anti-traditionalist/anti-conservative/anti-christian slant of the parent comment.

Maybe, just maybe, we can have a nuanced conversation without all the finger pointing and scapegoating, and recognize that there's some legitimacy to each of these values, that they sometimes conflict with each other, and that figuring out the right balance is challenging.


There's more factors here, of course.

Linux phones are only for the most masochistic of techies.

Average Joe, who is uninformed about tech, will just assume you get what you pay for, and by that measure, iPhones must be the best.

I chose Android, which comes with its own problems, but I prefer the way it works, and literally everything about Apple rubs me the wrong way.


>I could even see them announcing it at a launch. We should be concerned that the company has skipped its Black Mirror stage and jumped right into its Universal Paperclips endgame.

Well, what Apple did is squarely Black Mirror stage, nowhere near Universal Paperclips.


> Apple is in effect implementing anti-virus - for ideas.

Should we call that anti-dissent? Or thought police?


> This distinction of changing the de facto ownership of your device and data is the real inflection point.

So the ability to store child porn is what constitutes "de facto ownership" in your mind?

But why would they "use this tech to hunt down bitcoin owners"? They could just scan emails or photos directly. Doing it by way of neural hashes and vouchers seems like an absurdly complicated detour when they already own the OS and all the most commonly used apps.


De facto ownership means your personal property doesn’t get searched for evidence of a crime without reasonable cause


That’s not how it typically works though. Your car has to be tested regularly, at least in Europe. Your bag is searched when you travel, in an infinitely more intrusive way.

Also, your photos are only scanned on their way to iCloud. Turn off iCloud Photos and nothing will be scanned.

Besides, the laws around child porn have always been different from other laws, they have preemptively scanned for that for as long as I can remember, without probable cause.


You will really own nothing and you will be “happy”. Soma here. Soma there. Soma, soma everywhere.


> The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.

This is good news.


>So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?

What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, then shouldn't we assume they have already been corrupted? If so, why did we trust them with full control of the ecosystem?


In years previous, take the San Bernardino shooter for instance, Apple argued in a court of law that creating backdoors or reversible encryption was insecure and also subject to exploits by malicious actors, and thus not reasonable and "unreasonably burdensome". As well, they made the argument that compelling them to write backdoors also violated the First Amendment.

It was most likely a winning strategy that the FBI actively avoided getting rulings on and found a workaround.

What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to create a FISA warrant and NSL to mandate hits on anything. The argument that it's got to be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing to get the public to accept it.

All of that can easily be ordered to be bypassed. So it can be a scan, single hit for x, report.

I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are

A) betting on it being a vocal minority that resorts to action (which is entirely possible, especially given the alternatives and technical hurdles to get to a suitable alternative)

B) Being pressured by governments now. (also entirely possible given their history with the FBI and previous investigations).

[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...

[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...


> What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to create a FISA warrant and NSL to mandate hits on anything. The argument that it's got to be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing to get the public to accept it.

Why would they make things even more complicated with limited access, when they can already access everything in the cloud? Let’s leave out the argument about expanding the scan to the whole device. If that is what happens, then people will really start discarding their phones.


Well, for one, scanning on-device lets them expand the amount of stuff they search for without an impact on their servers.

We can all assume they will eventually start scanning for more than just photos about to be sent to iCloud. It can easily and _silently_ be expanded to any file on the phone.


They can already do every imaginable thing silently right now. iOS is not exactly an open-source system.


Except that right now they don't have a plausible reason to be scanning things, and any indication of something like that happening without prior expectation would be an even bigger deal than this is. Setting the expectation that this is acceptable is how you hide overstepping and abuse.

Just because my neighbor physically can run out and attack me every time he sees me exit my house isn't a valid defense of him running out and verbally abusing and threatening me every time I leave, nor is it a valid excuse not to worry about it escalating to that.


We were talking about silent things here, so there is no prior expectation for them. Silently expanding the scan to every file, for example, would still count as overstepping in a similar way for most people, because that is abuse, and usually there is zero tolerance for it. Apple has avoided the marks of abuse pretty well in the past.

But anyway, Photos' Spotlight indexing, the Files app, and Siri are already scanning your files and getting metadata. Metadata is even stored in iCloud to keep the sync process working. There are excuses if you want to make them.


> Except that right now they don't have a plausible reason to be scanning things, and any indication of something like that happening without prior expectation would be an even bigger deal than this is. Setting the expectation that this is acceptable is how you hide overstepping and abuse.

How do we know Apple isn't doing this right now? How do we know if and when they do? Are people keeping track of everything the phone sends back to Apple's servers? Is it even possible any more?

Considering Apple doesn't let you have full access to the device, the phone could do anything, encrypt the message and send it. The only way I know would be by monitoring the traffic off-device on the network all the time, which means only while on Wifi. And that wouldn't give you content, only metadata, as by then it's encrypted and you don't have the key.


Because they don't have access to everything in the cloud. You don't have to use iCloud, or Siri, or Spotlight.

This was specifically addressed in the San Bernardino and other cases. Apple gave the FBI everything in the cloud. The FBI was looking for everything on the device.

What this change does is add a method, without an opt-out option, for them to scan for anything on the device. Be it a string of text/keywords, or certain pictures of a place with certain metadata, etc.


This is just speculation. The current technical implementation limits the scan to images about to be uploaded to iCloud, which can be opted out of. If you don’t trust that, you can’t trust their devices right now either.


That seems like a reach.

>The current technical implementation limits the scan to images about to be uploaded to iCloud, which can be opted out of.

That is conflating policy with a technical limitation. Their changes negate the technical discussion at this point.

Their POLICY is that it will only scan for images to be uploaded. They no longer have a *legal* argument to not comply with government requests for device scanning of any data now, since the framework is now included.

That is a big change in that regard. Whereas in the past there was a layer of trust that Apple would hold governments accountable and push back on behalf of a user's privacy (and there is a very tangible history there), this implementation creates a gaping hole in that argument.


Actually, it is not just POLICY. This scanning is built very deep into the iCloud upload process. They would need a huge revamp of the system to change that, and the design seems intentional precisely because of this kind of speculation. So we are in the same discussion whether this is implemented or not.


None of the tech documents point to this being the case. In fact, in many of the articles I have read, it's quite the opposite, including the peer-reviewed paper that outlined the dangers of such a program in its conclusions. [1][2]

Do you have any sources here to the contrary?

[1] https://www.washingtonpost.com/opinions/2021/08/19/apple-csa...

[2] https://www.schneier.com/blog/archives/2021/08/more-on-apple...


Their threat model[1] states:

> This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.

and

> Apple’s CSAM detection is a hybrid on-device/server pipeline. While the first phase of the NeuralHash matching process runs on device, its output – a set of safety vouchers – can only be interpreted by the second phase running on Apple’s iCloud Photos servers, and only if a given account exceeds the threshold of matches.

We should also take account the way how blinding the hash works from CSAM paper[2]:

> However, the blinding step using the server-side secret is not possible on device because it is unknown to the device. The goal is to run the final step on the server and finish the process on server. This ensures the device doesn’t know the result of the match, but it can encode the result of the on-device match process before uploading to the server.

What this means is that the whole process is tied strictly to a specific endpoint on the server. To be able to match other files from the device on the server, those files also have to be uploaded to the server (the PSI implementation forces it). And based on the pipeline description, upload of other files should not be possible. However, if it is, and they suddenly change policy to scan all files on your device, those files will end up in the same iCloud library as your other files; you will notice them, and you can't opt out of that with the current protocol. So they would have to modify the whole protocol to sync only those images that are actually meant to be synced while still scanning all the files (which are then impossible to match on the server side because of how the PSI protocol works). If they create some other endpoint for files that are not supposed to end up in iCloud, they need to store them in the cloud anyway because of the PSI protocol; otherwise they have no way to detect matches.

It sounds like this is pretty far from being just a policy change away.

Many people have succumbed to populism because it benefits them, and it takes some knowledge and time to really understand the whole system, so I am not surprised that many keep saying it is just a policy change away. Either way, we must either trust everything they say, or we can't trust a single feature they put on the devices.

[1]: https://www.apple.com/child-safety/pdf/Security_Threat_Model...

[2]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
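
To make the shape of that pipeline a bit more concrete, here is a minimal toy sketch in Python of threshold-gated hash matching. This is emphatically not Apple's NeuralHash/PSI protocol (there is no blinding here, the hash function is a placeholder, and the threshold value is made up); it only illustrates the on-device voucher / server-side threshold split described above.

    import hashlib

    MATCH_THRESHOLD = 30  # hypothetical value, chosen only for illustration

    def toy_perceptual_hash(image_bytes):
        # Placeholder: a cryptographic hash is not a perceptual hash, but it
        # keeps the example self-contained.
        return hashlib.sha256(image_bytes).hexdigest()

    def make_voucher(image_bytes, known_hashes):
        # On-device step: emit one voucher per image being uploaded. In the
        # real design the device cannot read the match result; here it is in
        # the clear purely for readability.
        h = toy_perceptual_hash(image_bytes)
        return {"hash": h, "match": h in known_hashes}

    def server_threshold_check(vouchers):
        # Server-side step: nothing is actionable until the number of
        # matching vouchers for an account crosses the threshold.
        return sum(1 for v in vouchers if v["match"]) >= MATCH_THRESHOLD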


I just want to say thanks for the links and taking the time to explain it. I think it’s pretty logical. I see your viewpoint and I think I need to take some more time to consider my stance (again…).


It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of boundaries between something they own and something someone is renting or letting you use for free in exchange for data mining.


Like I said, Apple controls the hardware, software, and services. They already control your property.


There's a substantial difference - both in theory and in practice - between Apple being capable of making your device do things you don't want it to do, and them actually doing it.

Saying that just because they could have, we ought to be okay with them actually doing it is nonsense. If you apply that line of thought to a non-updatable product, it becomes pretty clear.

Pick basically anything man-made around you - your shoes, your couch, whatever. That could have plenty of awful things in it. It could be spying on you, it could be poisoning you, whatever.

Just because the manufacturer could have done something terrible, doesn't mean we're okay with them actually doing it. The mere fact Apple can do these things after purchase doesn't make it any more acceptable for them to do so.


> Saying that just because they could have, we ought to be okay with them actually doing it is nonsense. If you apply that line of thought to a non-updatable product, it becomes pretty clear.

That is not the argument I made. The argument was that if they could be corrupted we should assume they are already corrupted. And since they already have control over the device an already corrupted Apple could have been spying on you already. It isn’t excusing their behavior. It is pointing out the naïveté of the current outrage.


Perhaps you are right that the outrage should have come much earlier. I myself have avoided Apple for a long time. However, it's better late than never. I'm glad people are finally waking up.

See also: https://stallman.org/apple.html and https://www.fsf.org/campaigns/apple.


The concept of ownership you are asserting is but one of many historical principles of ownership. There are, however, other concepts of ownership that conflict with what you are asserting.

https://www.econtalk.org/michael-heller-and-james-salzman-on...

I don't think there is a good faith argument that Apple is invading or controlling anything you own. All that's happening is you agree to run the algorithm in exchange for using iCloud Photos. That's just a contract: a mutual, voluntary exchange.


Contract implies meeting of the minds. I'd like to see the process by which I can line out or alter the terms of the agreement please.

I'll wait.

This is the other thing "digital natives" don't get, nor want to. Negotiation is normal. Ya know something else they don't get? Selling something with the damn manual, and enough system documentation to actually be able to sit down and learn something. Drives me nuts.


Of course they won't let you alter the terms of the agreement, but everyone always has the option (barring some sort of malicious state-enforced intervention) to walk away and divest themselves from platforms and ecosystems that don't respect their privacy or otherwise act in ways they do not agree with. Or at least, if it's going to take a long time to get there, one ought to start thinking about how to ensure that escape hatch is available if their entire life's data depends on someone else continuing to provide access to their "cloud" services based on potentially arbitrary rules and changing conditions they may never know about or be able to audit for themselves.

If there were enough people out there to stand up and make a difference by going somewhere else and hitting these companies where it hurts (stop giving them money and personal data to mine), then maybe it would start to make a difference.


I agree with the premise of what the parent is saying. How is it legal that you can enter into a contract that gives Apple a perpetual right to your private (Or your company’s) data, without clearly requiring you to consent, with a witness or notary, in plain and understandable terms at the point of purchase?

If your data is located in a cloud you might make the argument that Apple is the owner of the system. But if Apple is truly accessing data from your personal device or “server” without explicit authorization, it violates a whole host of computer crime laws.


> But if Apple is truly accessing data from your personal device or “server” without explicit authorization

If you're uploading the photos to iCloud Photos, you clearly have given authorisation to Apple - in the sense of accepting the iCloud T&Cs and saying yes to "store my photos in iCloud Photos" for them to access the data in those photos, else they wouldn't be able to upload them.


Easy. EULAs are not contracts.


Among the "historical principles of ownership" are those from the communist countries, where the individual humans had the legal right to own only things belonging to a very short list and nothing else.

However, the USA has claimed for decades that such restrictions on the rights of ownership are an abomination.

Even if we accept that this is just a modification of a contract between the "owner" of a device and Apple, if Apple had acted in good faith, they should have offered the following: if you do not agree to let Apple run programs on your own device for their benefit, something never mentioned when you "bought" the device, then Apple fully refunds everything you paid for the device and other Apple services, so that you are able to get an alternative device.

As it is now, you either accept that Apple changes their "contract" at any time as they like, or you incur a serious financial loss if you do not accept and you want an alternative.

This certainly isn't a "mutual, voluntary exchange".


The problem is that the companies make alternatives illegal, obfuscate the legal terms, put themselves in the position of least resistance, and force you to opt in.


Apple is renting the phone to you for $1000 down and $0 a month (unless you actually are financing). Therefore, they are the landlord and, given notice, can change the property as they see fit.


This is demonstrably not true. If you rent a home and then burn it down, you are going to be held liable to the owner of the home. In the case of your phone, no one, including Apple, cares if you buy it and then immediately smash it on the ground and destroy it.

Apple controls the software that runs on it but there is nothing that stops you from modifying or hacking it to your heart's content if you are able to, just as they are not obligated to make that an easy task for you.


>Apple controls the software that runs on it but there is nothing that stops you from modifying or hacking it to your heart's content

Nothing except all of Apple's attempts to make that difficult and a bad op sec decision. Oh, and let's not forget the series of lawsuits attempting to make jailbreaking considered illegal. Luckily they failed there, but if they could make modifying their software illegal, make no mistake that they would.

They don't own the hardware they sell you in the same way a landlord owns a home because they have transferred all physical equity to the purchaser. However, Apple's model really stretches the definition of "ownership". Would you say you own Adobe Acrobat because you paid for it, or would you say you own a license to use it? Buying Apple means you own the hardware and license the software that makes that hardware be anything other than a paperweight. It's not a very attractive idea. Kudos to their marketing department.


> Nothing except all of Apple's attempts to make that difficult and a bad op sec decision.

No one said it had to be easy or advisable.

> Would you say you own Adobe Acrobat because you paid for it, or would you say you own a license to use it?

Any software that I run that I didn't write myself is subject to whatever license the people who wrote it defined for it. Even the MIT License places requirements on you for you to be allowed to use the software. Exceptions to these copyright protections have been made which extend to jailbreaking iOS devices, which requires modifying copyrighted code.

> Buying Apple means you own the hardware and license the software that makes that hardware be anything other than a paperweight.

All hardware is paperweight without software.


> No one said it had to be easy or advisable.

Now you're making a pedantic point about a technicality instead of what's happening in real life.

> Any software that I run that I didn't write myself is subject to whatever license the people who wrote it defined for it. Even the MIT License places requirements on you for you to be allowed to use the software. Exceptions to these copyright protections have been made which extend to jailbreaking iOS devices, which requires modifying copyrighted code.

The MIT license doesn't require you to allow anyone else to scan your private data and doesn't allow the licensor to change the terms after you've already started using the software.

> All hardware is paperweight without software.

If you buy a Dell and you don't like the Dell crapware, you can remove it and the device still works just as well (if not better). If it came with Microsoft Windows and you don't like the Windows license, you can install Linux or OpenBSD. The hardware is still useful even if you don't like the license for the software it came with.

If you don't like Apple's software licensing terms, your iPhone is a paperweight.


> Now you're making a pedantic point about a technicality instead of what's happening in real life.

In real life people are looking for escalation of privilege exploits that enable them to exploit iOS to allow for installation of arbitrary software on it. This is what jailbreaking is.

> The MIT license doesn't require you to allow anyone else to scan your private data and doesn't allow the licensor to change the terms after you've already started using the software.

At what point did I ever state any of this or even imply this? I am simply stating that licenses affect all the software we run and places restrictions from the creators of said software on the users of it. This has nothing to do with Apple surveilling its users with its new tech.

> If you buy a Dell and you don't like the Dell crapware, you can remove it and the device still works just as well (if not better). If it came with Microsoft Windows and you don't like the Windows license, you can install Linux or OpenBSD. The hardware is still useful even if you don't like the license for the software it came with.

Hypothetically it is possible to run whatever software you want on an iPhone, including installing another OS. In reality this translates to people are jailbreaking devices. As has been mentioned, people are allowed to hack their own iPhones and it's protected by DMCA exemptions.

But if you're going to accuse me of being pedantic about technicalities instead of real life, how about this: In real life almost no one gives a shit about running arbitrary code on their devices and just use it to get access to the applications that are readily available in official app stores.

> If you don't like Apple's software licensing terms, your iPhone is a paperweight.

You can dislike their software licensing terms and still use your iPhone. I dislike the things that Apple is proposing with regards to CSAM detection but that doesn't mean I can't use my phone.


> In real life people are looking for escalation of privilege exploits that enable them to exploit iOS to allow for installation of arbitrary software on it. This is what jailbreaking is.

"Unjust imprisonment is fine because you can hire a black ops team to break you out."

So you jailbreak your iPhone. Then an iOS update comes out patching a security vulnerability. If you install it, it removes your jailbreak (or bricks your phone). If you don't, your device has an unpatched security vulnerability.

And at any given time there may not be a jailbreak for the current version of iOS.

This is not a reasonable state of affairs.

> At what point did I ever state any of this or even imply this? I am simply stating that licenses affect all the software we run and places restrictions from the creators of said software on the users of it. This has nothing to do with Apple surveilling its users with its new tech.

The problem is that Apple is imposing license restrictions you don't want. Your response was that all licenses impose restrictions. That ignores the important distinction between restrictions you actually care about and restrictions that don't really affect you.

> Hypothetically it is possible to run whatever software you want on an iPhone, including installing another OS.

Hypothetically you can make your own iPhone out of sand and crude oil. In practice no third party operating systems for iPhones exist because Apple doesn't document their hardware and so there are no drivers for third party operating systems.

> In real life almost no one gives a shit about running arbitrary code on their devices and just use it to get access to the applications that are readily available in official app stores.

In real life most people unjustly imprisoned by a government don't have the wherewithal to break out of prison. That doesn't mean they like being incarcerated, or having Apple scan their devices.

What it means is that they're structurally bound into a position where their true preferences can't be expressed. Which is the problem.

> You can dislike their software licensing terms and still use your iPhone.

Yes, exactly. But you can't refuse to accept their software licensing terms and still use your iPhone, which means that your choice is between having something imposed on you that you dislike, or your iPhone is a brick because you can't in practice use it under any other terms.


>All hardware is paperweight without software.

You miss the point. I can buy an x86 machine and run Windows or a FOSS OS or any number of unix clones or hell even write my own OS. From the outset I can say I own the hardware.

You can't say the same for Apple hardware. Even if the act of jailbreaking as a specific case is not considered illegal, you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.


> You miss the point. I can buy an x86 machine and run Windows or a FOSS OS or any number of unix clones or hell even write my own OS. From the outset I can say I own the hardware.

You are more than welcome to use an x86 machine as your cell phone, but I don't think most people would choose to. If we're talking about comparable hardware, even M1 Macs allow you to run alternative operating systems [1], so this isn't a valid point.

You could write your own OS on a desktop computer, and it'd be a significantly easier process than doing so for an iPhone, which has a locked bootloader, but that doesn't mean that you can't. Just that it's tremendously difficult and a low-value proposition. Privilege escalation on a jailbroken iPhone is typically about as much as people want. Why would they buy an iPhone over a device with an unlocked bootloader otherwise?

> You can't say the same for Apple hardware. Even if the act of jailbreaking as a specific case is not considered illegal, you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.

What laws do I have to break to pwn an Apple device enough to run another OS on it? Jailbreaking [2] is protected by a DMCA exception.

[1] https://asahilinux.org/

[2] https://en.wikipedia.org/wiki/Jailbreaking_(iOS)#United_Stat...


> you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.

Can’t think of a single one, elaborate please?


I think the OP meant that as an analogy: in order for you to use their software, you pay $1000 upfront for the hardware. So you can look at it as a one-time payment/rent to use their environment. Since you need to upgrade iPhones quite often, I guess renting isn't a bad analogy.

> but there is nothing that stops you from modifying or hacking it to your heart's content if you are able to.

Are you sure? I haven't read the terms, but that might be quite against their rules. Rules that you probably adhere to by using their product, but I'm not a legal expert.


Their rules cover their continued services. When you buy an iPhone, you are free to use whatever tools you’d like to modify / hack / break / enhance / etc the device.

The terms govern your interaction w/ Apple. So, for example, if you crack open the case and try to re-wire the board, the terms say your warranty no longer applies. If you modify the software, they can ban you from interacting with their servers. And if you start offering to sell modified iPhones to other people, they can come after you for damaging their business.

That’s what owning the phone means. You can do what you want with the phone you bought, but they aren’t required to support your efforts or allow you to use the services they’re actively running.


You are allowed to modify iOS Software in this case because it is an exemption to the DMCA.

https://en.wikipedia.org/wiki/Jailbreaking_(iOS)#United_Stat...


Good to know, thanks!


I agree we are all doomed, but I don’t agree it has much to do with being a digital native or not. My boomer grandparents, my gen X parents, and my millennial self, we are all affected by this. And gen Z (the first generation of digital natives), and whatever comes after gen Z, is not to blame for that. Reducing it to a generational thing is silly.


I think the point was that the digital natives and the next generation of digital natives coming will not know any different and will thus tacitly accept it.


I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you do to something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.

The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.


This is very well put.


Because that door hasn’t been opened yet. “Scan every photo on users devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.

It’s just how life and politics work.


The door has been opened for quite some time. What do you think Spotlight is? It scans and indexes all your data.

What's prevented the government from saying "hey if you see Osama Bin Laden in a spotlight scan, you need to send us all that guys data."

The answer is, Apple can just say FU. And that's exactly what will happen here. In particular, the US DOJ needs to stay in Apple's good graces here and not be overly aggressive. If DOJ pulls any funny business, that's a pretty good reason for Apple to just say "OK, we're picking up our toys and going home. You get nothing now and we're turning on E2EE."


I'm not a security professional by any means, but this has been my line of thinking on this whole debate for quite a while. It's pretty silly considering what has been made public about the clandestine operations of alphabet agencies (if you were paying attention to the right channels [1], there was good reason to believe that the 4th Amendment was a joke to the Feds long before Snowden's leaks), especially combined with the existence and complete opaqueness of the secret FISA Court. It's kinda crazy to me that all these technologists, and especially those on *hacker*news, really believe that you have any sort of privacy from the US government, which has demonstrated it can act with complete impunity in most parts of the world for decades. I say especially people here because they should know how just a handful of rogue actors in any given organization could subvert any sort of veil of privacy. I'm not an expert by any means, but it makes complete sense to me that privacy in any large organization is a very delicate thing to maintain when your adversary is as sophisticated and belligerent as the US security and intelligence apparatus appears to be. Maybe I'm just not privy to something, but it seems like if the US national security apparatus wants to do something on our or allies' soil, they'll find a way.

[1] https://www.pbs.org/wgbh/frontline/film/homefront/ - aired 15-5-07 and covered the notorious AT&T Room 641A


I’m sorry, but there is a world of difference between locally indexing files for local search and tagging files as contraband so that they can be reported to the government.


Technically speaking, no there isn’t. It’s just a little bit of metadata you’re sticking on an already long tail of metadata that they generate as they index.


Most importantly, people are arguing that there is new technical risk, so a difference in intent or beneficiaries is not relevant for that argument.


Being technically similar is irrelevant. It’s completely different in principle, and that’s what matters.

A great deal many things are technically similar, but vastly different in principle. And those distinctions are important.


I’m not sure which side you’re arguing here.

The biggest concern about Apple’s system is that it’s very easy to add new items to a hash list. That is an argument about the technical similarity of scanning for CSAM and scanning for other things like classified documents (for example).

But there is a vast difference in principle. Pretty much everyone wants to stop child abuse. But many people—including major news organizations—believe citizens should sometimes have the opportunity to view classified documents.

Different categories of things to scan for will be different in principle, even if the technical approach is similar. This difference in principle is what Apple leans on when they say they will oppose any request to expand their system beyond CSAM.


The biggest concern about Apple’s system is that they are showing all governments and everyone that it’s fine and good to scan for whatever on my device and report me to the government if they see fit, despite refusing for years prior to implement backdoors or give the FBI access to someone’s device.

They essentially invalidated all those claims and I can’t see how they’ll now be able to argue back if the US or the Chinas come to Apple saying they have to have more surveillance in their devices.


DOJ can pressure Visa, Mastercard and Amex to stop processing payments for Apple. Due to how the international payments systems work, that's a global sanction, even if Apple had no footprint in US.

And before you claim that's absurd and impossible, there is precedent for US doing just that.[0]

EDIT: There is also an earlier precedent, UIGEA - https://en.wikipedia.org/wiki/Unlawful_Internet_Gambling_Enf...

0: https://www.cnet.com/tech/services-and-software/credit-card-... (yes, the blockade was lifted later but my point was that the nuclear option is available)


> DOJ can pressure Visa, Mastercard and Amex to stop processing payments for Apple.

I think they'll find there's a world of difference in public support for "people leaking classified documents" vs "the people who make you and your family's phones, tablets, laptops, and watches".


This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".

How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?


It’s not a non sequitur. The comment is engaging with a series of rhetorical questions that imagine a slippery slope by observing that very little has changed about the trust model between iPhone users and their devices. If you are convinced Apple is slipping, then it is worthwhile to be able to answer how their position today is different than it was last month. That is of course a different question than whether their position last month was acceptable, and maybe people are realizing it was not.

As a concrete example, if you think the proposal introduces new technical risks, then if Apple announces they made a mistake and will instead scan entirely on the server, you may be satisfied. However, I’d argue that since no new technical risk has been introduced, your conclusions should not change.

I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control has shifted the Overton window more than what was actually proposed. Politicians who are none the wiser probably believe that’s what Apple actually built, even though it’s not.


I disagree - it's a distraction from the larger issue at hand.

> I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control

That's a strawman - few if any are arguing that the system will read all of your files out of the gate.

>since no new technical risk has been introduced,

This assumption doesn't reflect reality. Introducing a brand-new system built specifically for client-side scanning absolutely adds technical risk, if nothing else by the sheer fact that it's adding another attack vector on your phone. Not to mention that all it would take is a change in policy and a few trivial updates (a new event trigger, directory configs, etc.) for this system to scan any file on your device (see the sketch below).
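
To make the "few trivial updates" point concrete, here is a minimal sketch in Swift. Every name in it is invented for illustration; nothing here is Apple's actual API or configuration format. The point is only that once a client-side scanning engine ships, the gap between "scan iCloud-bound photos" and "scan everything" can live in policy data rather than in architecture:

  struct ScanPolicy {
      var monitoredPaths: [String]   // which directories feed the hasher
      var scanOnUploadOnly: Bool     // scan only content queued for iCloud?
  }

  // Roughly what Apple describes today: photos on their way to iCloud Photos.
  let announced = ScanPolicy(
      monitoredPaths: ["/var/mobile/Media/DCIM"],
      scanOnUploadOnly: true
  )

  // The feared "trivial update": same engine, wider net, no upload required.
  let expanded = ScanPolicy(
      monitoredPaths: ["/var/mobile"],
      scanOnUploadOnly: false
  )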


What is the larger issue? The entire chain starts off with a prediction of what will happen in the future. It sounds like the trend is the proposed larger issue.


What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Simple: Money.

Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).

That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.

So far #2 hasn't happened. Politicians had no frame of reference to point to and say "your competitors are doing that, you should too".

But now that #1 occurred, it will normalize this nonsense and pave the way for #2.


> Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

> No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

Government does not care one bit about how much it costs or whether it is even possible. They demand the data with an ultimatum: deliver it as we requested by our deadline, or we send in our IT people to take it. Sorry (not sorry) if it takes your whole company down while we plug our own servers into your datacenter to take your data.


Doesn't work if the data of interest is not there for the taking. And a judge will not compel beyond what they consider reasonable. Having the feature already in place dramatically shifts the bar.


Their response to such demands has not been we are technically incapable of doing what’s requested. The demand from the FBI in the San Bernardino case was a very small change to passcode retry constants, because the terrorist’s device did not have a Secure Element.


Wasn't it much more than that? The FBI demanded that Apple hand over the signing key; they would do the rest themselves. Basically the ability to add arbitrary code and unlock any device at the time.

https://en.m.wikipedia.org/wiki/FBI–Apple_encryption_dispute


No. The reference you provide characterizes it correctly: the FBI wanted Apple to create and sign a one-off build of iOS. The specific request was to allow automated passcode input and remove the retry backoffs and auto-erase constants.

This is how Apple described the request: “ The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.”

Or in their FAQ:

“Is it technically possible to do what the government has ordered?

Yes, it is certainly possible to create an entirely new operating system to undermine our security features as the government wants.”


You are right. I'm confusing it with something else, or with fake news.

Edit: it was speculation by the court, a plan B: https://www.theguardian.com/technology/2016/mar/11/fbi-could...


The politics of it is very different, and that's where the danger lies:

https://news.ycombinator.com/item?id=28239506

I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.


The reason we're focused on the state of it now is that we can switch at any time - especially if those barriers are shown to be ineffective or are removed at some point.


The real threat here is legal, not technical. Think mandatory on-device scanning as condition of access to the hardware market.


That just ties back into "be afraid of what it could become", and isn't dependent on Apple making this system - Congress could have forced PhotoDNA to be shipped with phones since the inception of PhotoDNA in 2011[0].

0: https://doi.org/10.1016/S0262-4079(11)60791-4


Of course it's not dependent on it being Apple. It could just as well have been Google.

And the difference is that Apple gave them what they needed to mandate this while still claiming that they preserve privacy.


There is a fairly large difference, the first being that it would massively damage Apple's brand if they started scanning people's phones without permission.

But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can shrug their shoulders and say they had no choice.


Buried in the EULA, you give consent.


Why would Apple start shrugging now when they've been fighting the FBI in court?


One reason is that they weren't under antitrust scrutiny in 2016 when they fought the government in court.

Their incentives have changed - they now have the real looming threat of being broken up by governments, so it is now in their interest to comply with anything else governments ask them to do.


If you believe this is true, why didn’t Apple also launch in the EU where antitrust scrutiny and case law are both more strongly against Apple?


How do you think the lawyers will be able to prove Apple is a monopoly when Android exists?


The mere existence of a competitor doesn't matter in competition law.

Other oil companies existed, but the government still broke up Standard Oil.

Other browsers existed, but the government still made Microsoft make certain APIs available to other browser makers.


This is a false equivalence. Standard Oil was at the top of its market; Apple is clearly not. In the USA they may well be, but I doubt the margin is very high. https://images.app.goo.gl/hLr1zAGs7Wq5TG46A A lawyer will say that Apple, though in the lead as a brand, is still second to Android in terms of market share, no matter how fragmented the Android market is.

This might be the same in other western countries as well. But overall Android would still have more devices running it than iOS.


Again, competition law does not focus on the existence of competitors (true monopolies exist only in high school microeconomics).

If a company has durable (even if not a literal monopoly) market power due to its anticompetitive behavior and exclusionary conduct, that is grounds for taking antitrust action.

It's hard to switch to a different smartphone, due to anticompetitive actions taken by Apple such as their app store monopoly and making it harder to distribute web apps - switching to Android means losing all the apps you've purchased.

It is a lot easier to switch to a different brand of kerosene fuel, so antitrust action against Apple is justified despite them having a lower market share than Standard Oil did.

I'd suggest reading a textbook on US antitrust law and the court decisions that have led us to where we are, because this is a large topic that goes well beyond market share.


It's doubtful the focus would be there; it'll be on how locked down their ecosystem is: the App Store, APIs, proprietary ports, etc.


> They could have already scanned our files because they already have full control over the entire ecosystem

They did do it in emails since 2019: https://www.indiatoday.in/technology/news/story/apple-has-be...


> They could have already scanned our files because they already have full control over the entire ecosystem.

Apple barely submits any CSAM reports[0]:

> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.

0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...


At one point this will be proven and we'll go back to regular digital cameras or even polaroids.


Nothing except Apple saying you could trust them. People were stupid enough to accept that and now even the trust is gone.


That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.


…in the same way any existing feature of iOS that makes device data available to Apple (eg iCloud Backup) “owns” the endpoint, no? What’s to stop a malicious Apple from turning on iCloud Backup for all its users and hoovering up your Signal messages database and iCloud Keychain?


Nothing. iOS even defaults autoupdate to on, so Apple could do this without your interaction today.


> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple began on-device scanning of content, I'd imagine eventually someone would notice the suspicious activity and investigate.

With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

As for iCloud, if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning / data mining. But on-device scanning is a back door for e2e encryption: even on-device encryption with keys only you control is thwarted.


> no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

This seems like the easiest thing out of the lot to verify.

The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.

If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.

None of these seem particularly easy to disguise.

Your concern is completely understandable if you are starting from the premise that Apple are scanning photos then uploading matches. I think that’s how a lot of people are assuming this works, but that’s not correct. Apple designed the system in a very different way that is integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.

Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.
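
To make that concrete, here is a rough Swift sketch of the shape of the design as publicly described. The type and function names are invented, and this is not Apple's code; it only illustrates why the voucher is hard to decouple from the iCloud Photos upload path:

  import Foundation

  struct SafetyVoucher {
      // Per the published design, only readable server-side after a threshold of matches.
      let payload: Data
  }

  struct CloudPhotoUpload {
      let image: Data
      let voucher: SafetyVoucher
  }

  func makeVoucher(for image: Data) -> SafetyVoucher {
      // Stand-in for NeuralHash plus the PSI protocol; details omitted.
      SafetyVoucher(payload: Data())
  }

  // Vouchers are only produced here, for images that are being uploaded anyway.
  func enqueueForICloudPhotos(_ image: Data) -> CloudPhotoUpload {
      CloudPhotoUpload(image: image, voucher: makeVoucher(for: image))
  }

Scanning anything outside iCloud Photos would require a new upload path for the extra images or bare vouchers, which is exactly the kind of observable change described in options a) to c) above.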


How do you think Apple will increase the scope of what’s scanned without every person with Ghidra skills not noticing?


If the exchange with Apple is encrypted / interleaved with other traffic to iCloud, how would you know that there aren't new classes of scanning being done?

I'll be very surprised if similar tech is not lobbied for as a backstop to catch DRM-free media files played on devices we "own".

And it seems far more probable than not that police will demand this capability be used to help address more crimes. The problem here is that "crimes" can mean speaking out against an oppressive regime, or being targeted for having the wrong political views (think McCarthyism in the United States, or the US-backed murder of a million people in Indonesia for affiliating with the "wrong" political party). History is awash with political abuse of "out groups", perpetrated by everyone from tin-pot dictators to the presidents and PMs of major world powers.

And it sets the precedent that e2e encryption is not an excuse for a provider to withhold private customer data from the authorities; a back door can be installed: "Just do what Apple did."


This is a very well-written post. Ever since this program has been announced I have struggled with talking about the implications succinctly.

Online, I never know if an interlocutor is even arguing in good faith, but even in person it's difficult to balance covering all the ways the claimed safeguards are meaningless, how the benefits don't really make sense, and how this is markedly different from other infringements on privacy, against the need to be concise and to explain that the real problems aren't just theoretical, because similar invasions of privacy are already killing actual people around the world.

Anyway, I think the only practical way this could resolve well is if Apple saw a precipitous decline in its iCloud brand; then it could be argued that they had to abandon this plan for purely business reasons. A serious movement to abandon Apple services ($17.5B revenue in 2021 Q3) might empower the people within Apple who opposed this reckless plan from the beginning.


i still think it's more about dmca 2.0 than any save-the-children bullshit


Today it’s CP, next year it will be copyright violations…


CP first, then terrorist propaganda, then copyright violations, then foreign propaganda, then misinformation... That seems to be how the cycle goes.



Australia has already shown what the end-game is, with its "The Assistance and Access Act 2018" [1]. It's not illegal to have end-to-end encryption, but it's illegal to deny access to the ends of the encrypted pipe.

As an aside, Australia has just implemented the next step: the "Surveillance Legislation Amendment (Identify and Disrupt) Bill 2021" [2], which makes it legal to hack your device to access the ends of the pipe. Useful if the ends of the pipe are not controlled by a malleable corporation.

[1] https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...

[2] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...


This actually sounds like the right way to go. Individual, warrant based access is comparable to wiretapping in a way that Apple's dragnet approach is not.


Don't forget the #datbill as well. [1]

Australia is a bad joke at this point. I thought nanny state was bad but we're firmly moving into Stasi territory now.

Even the government's obvious incompetence is looking like not enough protection in the face of all these overreaches.

[1] https://mobile.twitter.com/efa_oz/status/1430674903548661767


This is so disgusting.


Ah, this old bogeyman.

The media drastically overreacted to that act, to the point where the Department of Home Affairs now has an entire page dedicated to addressing the false reporting [0].

The TL;DR is that the act doesn't allow the government to introduce mass surveillance. Section 317ZG [1] expressly forbids any law enforcement request from _having the effect_ of introducing any systemic vulnerability or weakness and _explicitly_ calls out new decryption capabilities as under that umbrella. Your claim that a company can't deny access to the ends of an e2e-encrypted pipe is false.

And yes, that new act exists. The government will be able to hack into your devices and take over your accounts _with a warrant_, just like they can break into your house or take money from your bank account _with a warrant_.

[0]: https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...

[1]: http://classic.austlii.edu.au/au/legis/cth/consol_act/ta1997...


This is a reduction in privacy. Do what you want on your servers, but hands off my phone.

Also the whole "Apple is planning to encrypt iCloud Photos end-to-end anyway" thing is just fanfiction. I'll believe it when they announce it.


> This is a reduction in privacy. Do what you want on your servers, but hands off my phone.

Apple devices might not be the smartest purchase if the concept of owning your hardware is important to you.


> Apple devices might not be the smartest purchase if the concept of owning your hardware is important to you.

Maybe a handful of HN users are aware of that, but the majority of users think that their property belongs to them.

It also goes against what Apple marketing says about privacy and your data. I wouldn't fault most consumers for not understanding that Apple's PR doesn't reflect reality.


Actually, the fact that you don't really own your Apple hardware supports Apple's privacy argument, since in that case it doesn't really matter where those images are scanned, as long as the end result is the same.


Not owning anything == privacy now?


As you are well aware, the market is not very competitive and there aren't dozens of vendors to pick from.

"Use something else" (or even more laughably, "start your own") is not a reasonable argument anymore.


> As you are well aware, the market is not very competitive and there aren't dozens of vendors to pick from.

Where I come from a dozen means 12. Even just LineageOS has support for phones from 30 different vendors.

https://wiki.lineageos.org/devices/

And LineageOS is far from the only way to start taking ownership of a device. And let's be real here: Pretty much anything will beat Apple on that particular spectrum.


LineageOS is not enough to replace iOS or 'Google Android' completely these days. Without Google services you won't be able to use many banking or booking apps.

Many people are not prepared to take this convenience hit: when you switch, you empirically lose some functionality, while the privacy advantage remains 'intangible' until you personally become a victim of the lack of it (which is absolutely worth avoiding, even if still very rare).


There are plenty of phones, even Linux ones or something totally different (like Sailfish). It just seems that you are not ready to make compromises.


I am well aware that other phones exist. But I used the word 'market' for a reason, because defining the market[1] is the first step in any analysis under competition law.

[1] https://en.wikipedia.org/wiki/Relevant_market


Some people have bought into the whole Apple ecosystem, and it's hard to extract yourself; getting a usable phone like a OnePlus gets you thinking about foreign extraction of your data, which is a bitter pill to swallow. Now it's clear that all phones will suck, and will suck even more with each future rev.

I use some Apple devices but not the ecosystem for exactly the reason of switching.


I've been a long-time Apple user; prior to my latest work laptop I'd only bought MBPs. However, given the ever-increasing pain of getting around macOS security to do basic dev work (e.g. getting real gdb rather than lldb working), I've gone back to a Linux laptop.

Given this 'CSAM-scanning-next-up-political-dissent!' stuff from Apple, I'll be taking a much more serious look at the PinePhone, Purism's phone, etc. Processor/RAM/screen specs are plateauing, so open-source hardware and software are becoming feasible without completely outdated hardware.


I just picked up a Pixel 3a for about $90 on eBay and flashed CalyxOS on it in about 10 minutes. Really easy to get started, and I'm impressed with the functionality and polish of this open-source mobile OS.


I have to admit that the Apple ecosystem works really well. I have been fighting with different Linux distributions and other devices for years; I even wrote my own client for Chromecast. But something like AirPlay 2 just works, with high quality.


I have a modest suggestion in the spirit of Apple's move.

As we know, there are people in the world who are running meth labs or creating explosives for terrorists in their homes. In order to safeguard the public, we shall have a detachment of dogs which will sniff everyone's houses every once in a while. When they sense something bad they'll alert their handlers and there'll be a manual inspection before reporting to police.

There's no risk to privacy here - dogs being dogs, they can't tell their handlers what they sense. We can also show the training publicly so people can verify the iDogs are trained to sense only drugs or explosives. So it's all even more secure than Apple's iPhone scanning! What say you?


I understand you're making a reductio ad absurdum argument here, but this is actually very similar to what LEO often tries to do today (e.g. searches based on what is smelled / seen inside your car at a traffic stop) and actually iDog might be constitutional.

The constitutional standard for a warrant search is "probable cause", and for a warrantless search you generally also need exigent circumstances. Assuming that a judge is sufficiently satisfied with the iDog's nose, and the iDog was sniffing somewhere public like the sidewalk when it found the meth smell, you could likely establish both probable cause (iDog smells meth) and exigent circumstances (meth labs often blow up, meaning there's emergent danger that cannot risk waiting for a warrant).

That's not to excuse Apple, just to provide a fun backstory on the things law enforcement gets to do in this country.

Another one that was nearly deemed constitutional: in Kyllo v United States, LEOs used thermal imaging to find an Oregon man's house was radiating a high amount of heat indicative of intense grow lights, which they used as probable cause to search the home for an illegal pot growing operation. This was only found unconstitutional by a 5-4 decision in the supreme court. If it were found constitutional, you can imagine we'd have helicopters flying overhead thermal imaging for pot operations today.


Upvoted.

That said, I do feel you miss the genius of the iDog proposal. As far as I understand, an officer might sometimes be able to use his dog's nose if it happens during a procedure (which might include the dog searching if there's a warrant), but he can't create the circumstances deliberately. 'I was doing something proper and then the dog started jumping' might be admissible, but if an officer started walking the dog around hoping to catch people, opinion might be different.

We suggest regularly scanning every household in the nation in a deliberate process. I was just proofreading five different papers proving the system is perfect if we can trust the dogs (of course we can, only monsters and terrorists don't trust dogs).


> if we can trust the dogs (of course we can, only monsters and terrorists don't trust dogs)

https://news.ycombinator.com/item?id=3418016


Agreed, I have nothing to hide & love dogs.


And hate terrorists!


damn, what if the dog gets excited and barks because I'm dry-aging some of my (fully legally hunted!) wild game?

or because the handler accidentally stepped on the dog's tail?


Don't worry, these are well-trained, well-bred and very well-fed iDogs. 'Not eating when not fed by the handler' is part of the basic training. Also, there's a manual verification step where the handlers search your property before reporting to the police.

The chances of an error are less than one in a billion. It's worth it to beat the drug dealers and terrorists.


That's why we have the manual human reviewers, you see.


The article mentions the slippery slope and "what happens in a year or two when..." scenarios. The article even calls it a cliff. But doesn't expand on timelines of concern.

As it currently stands, this concept would be sitting in plain sight waiting eternally for any lawmaker anywhere.

In your country, either side of the political spectrum - with a majority in lawmaking - can simply tap Apple on the shoulder and potentially turn ALL those devices against you.

Guns won't help when information technology is used against you, when you are separated from society in a manner where people dare not risk their own livelihood for fear of being similarly marked.

And if Apple goes ahead with this, this risk is sitting there for the rest of your life just waiting for [that one politician that represents everything you hate] to use it against you.

Maybe that politician hasn't been born yet. But they will come. Don't let this Pandora's Box sit waiting for them.


It's the same reason free speech is treated as something akin to sacred even for your worst enemies: those who start taking away the bad people's speech are themselves always one political actor away from having their own taken away.


A powerful piece...

For the privacy-minded Apple users among us (I mean, that's who they marketed to, yeah?), I'd recommend turning off automatic software updates... For as long as it makes sense to. I hope they reverse their decision, but I'm already looking for alternatives. I'm certainly not buying another Apple device, even though I'm about due.

They really lost a lot of fans with this, myself included.


Yeah turned it off and iCloud too!

I was all in on Apple. Now got a System76 laptop on the way. Transitioning off iPhone to Linux will be tough but something new to explore.


In the essay the "i" in the headline is lower case, which is significant and chilling. It's a homunculus of Apple's new direction: the meaning changed from "me" to "panopticon".


Yes. Our title caser doesn't understand such nuances, but we've corrected it now.


Why stop at photos you take?

Since we're doing this on device we can just turn the camera on every few minutes and ask an ai if the camera sees something interesting.

If it sees that you're in trouble it can start streaming to the authorities.

We will finally be safe.

By the way, if your phone is off or left at home we will know you're in trouble and send assistance right away.

I wish I could say /s.


I don't know... I have a really hard time getting too upset about this. I'm a big proponent of privacy and have always been a Snowden supporter. And while "protecting children" is a trope in politics, I think everyone with an iPhone knows they're giving up some privacy to own one. It's constantly tracking their location and sending other data to Apple.

This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.

I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.


Maybe I'm completely paranoid here, but given that actual sex offenders commonly seek out ways to be near children, what happens if one or more of them end up in Apple's image vetting team?

They'd be completely anonymous and fully covered, with an endless pipeline of naked kids images being delivered to them.

The idea that if you take a picture of your kid in the bath, it just happens to match a CSAM fingerprint and then gets silently transmitted to anonymous reviewers for "review" is terrifying.


This is a disgusting thought, but hear me out. Perhaps this might actually be a good job to give to a paedophile. Their classifications would probably have a better false-positive rate than those of someone who is disgusted by the images, and it would all but eliminate any concern about an employee suffering psychological trauma.


> and it would all but eliminate any concern about an employee suffering psychological trauma.

I doubt this. If they were all images that this person happened to be into, maybe... But even then, I think it would likely make their addiction to child porn worse, which is its own psychological problem that is probably worse for society than the trauma suffered by current employees. What happens when they leave that job, having become used to seeing hundreds of CP images a day?

Not to mention that some of the material the scanner would surface would probably be horrific and violent. Looking at that kind of thing all day would probably have similar psychiatric effects on pedophiles and non-pedophiles. In the worst case, it might cause some pedophiles to start to prefer the worse images out of boredom from seeing so much CP.

Overall, I'd say this would just be a bad avenue to go down.


I thought we agreed that playing violent video games doesn't make people more violent. Isn't this the same thing?


It's not the same thing. The argument would have to be: violent video games lead to more violent video games, which, anecdotally, I would probably say they do.


The concern is that child pornography leads to more children in danger. This is akin to violent video games lead to violence.

We don't ban child pornography because it leads to more viewing of child pornography. We ban it because we recognise that demand for CSAM breeds supply of new CSAM. Also because it is—by definition—the product of child abuse.


Your terrifying idea mischaracterizes the nature of false positives. Any photo in your library is equally liable to be a false positive as any other; the perceptual hash is not looking for similar images by the metric of what you find similar (content). That’s also the underlying idea behind why people have been able to turn arbitrary images into adversarial false positives.


So that picture of my driver's license I took for an ID check or that sensitive work document I scanned with my phone are just as likely to be sent? Great.


The image would need to be vaguely similar in terms of gross shapes and arrangement. It's exceedingly unlikely that any CSAM would ever be remotely similar to an ID card or a sheet of paper.

If there are ever going to be any "natural" matches to any CSAM hashes, it's probably going to be a photograph of people who are coincidentally in a similar pose, at a nearly identical angle, with strikingly similar shading.
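
For intuition on why a "natural" match hinges on gross layout rather than subject matter, here is a toy average hash over an 8x8 grayscale grid in Swift. This is not NeuralHash or anything Apple uses; it is the simplest perceptual hash there is, shown only to illustrate the idea:

  // Each cell becomes one bit: brighter than the image's mean, or not.
  // Two photos "collide" when their coarse light/dark layout lines up,
  // regardless of what the photos actually depict.
  func averageHash(_ grid: [[Double]]) -> UInt64 {   // expects an 8x8 grid of 0...1 values
      let flat = grid.flatMap { $0 }
      let mean = flat.reduce(0, +) / Double(flat.count)
      var bits: UInt64 = 0
      for (i, v) in flat.enumerated() where v > mean {
          bits |= UInt64(1) << i
      }
      return bits
  }

  // Small Hamming distance means "perceptually similar" under this (very crude) metric.
  func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
      (a ^ b).nonzeroBitCount
  }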


In the myriad of articles about this system's many issues, there have been comments from people who have worked with the NCMEC upstream database noting that it's filled with mundane photos, empty rooms, etc. I think it was in one of the hackerfactor article discussions.

This entire system is ripe for false positives AND adversarial attacks.


I've no doubt the totality of the database contains a lot of photos, but only photos tagged as A1, A2, B1, or B2 would be considered illegal to possess. And then only the absolute worst of the worst (images categorised as "A1") are being included in the hash set on iOS. The category definitions are:

  A = prepubescent minor
  B = pubescent minor
  1 = sex act
  2 = "lascivious exhibition"

The categories are described in further detail (ugh) in this PDF, page 22: https://www.prosecutingattorneys.org/wp-content/uploads/Pres...


The NCMEC database is large and graded to distinguish types of photos. There’s evidence in the false positive calculations that Apple is only using a subset, presumably the one where photos are graded as depicting active abuse.

It’s not reasonable to dispute the 1 in 1e12 false positive claim on mere speculation.


>It’s not reasonable to dispute the 1 in 1e12 false positive claim on mere speculation.

It's entirely reasonable. Have you seen https://thishashcollisionisnotporn.com/ ?

Extraordinary claims require extraordinary evidence.


Collision attacks make for a fun tech demo, but I've yet to hear anyone suggest any plausible scenario where they could be used against Apple's implementation. It would require absurdly elaborate, Oceans Eleven style espionage to achieve any outcome whatsoever. And it would be immediately apparent to anyone involved that a collision attack was involved.

It would be far easier (and far more effective) to just acquire child porn, break into your victim's house, stash physical prints under their mattress, and then contact the police.

Furthermore, the website includes numerous misleading statements about Apple's system, or makes critical omissions on the description of Apple's system. Whatever side you're on, misleading arguments should be dismissed for what they are.


This is apples to oranges. The whole thread was about random false positives and not adversarial ones.


If it's that easy to generate a false positive then I believe it will be more common to accidentally have one.

Once again, extraordinary claims require extraordinary evidence.


The ease of adversarial collisions has no relationship to the probability of natural collisions.

It's entirely possible to make a cryptographic hash algorithm that has an exceptionally low probability of natural collisions but where adversarial collisions are trivial.

It's also possible to create a cryptographic hash algorithm where occasional natural collisions are expected, but adversarial collisions require brute force.


The chance that any pictures from your library are revealed at all is at most one in one trillion (mod you not storing CSAM or being attacked by someone trying to plant evidence on you). Contrast this to a server side scanning system where every photo in your library will be accessed with unknown false positive characteristics.
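
For anyone curious how a figure like "one in a trillion" can fall out of a per-image false-match rate combined with a reporting threshold, here is a back-of-envelope Swift calculation. The numbers used (a 20,000-photo library, a one-in-a-million per-image false match rate, a threshold of 30 matches) are illustrative assumptions, not Apple's published internals:

  import Foundation

  // P(X >= t) for X ~ Binomial(n, p), via log-factorials to avoid underflow.
  func logChoose(_ n: Int, _ k: Int) -> Double {
      lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
  }

  func probAtLeast(_ t: Int, n: Int, p: Double) -> Double {
      (t...n).reduce(0.0) { sum, k in
          sum + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
      }
  }

  // Assumed inputs: 20,000 photos, 1e-6 per-image false match, threshold of 30 matches.
  print(probAtLeast(30, n: 20_000, p: 1e-6))
  // Prints a vanishingly small probability, many orders of magnitude below 1e-12.

Of course, the bound is only as good as the assumed per-image rate, which is exactly what the adversarial-collision debate above is about.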


They can do that on their servers, not on my phone.


"just making sure the files you upload to them aren't illegal"

the problem lies in this sentence. 1) this happens on device before they're uploaded, which is a monumental shift for a company that claims to be pro-privacy 2) they're now saying they're willing to surveil photos for governments, the reason is sorta irrelevant. they're opening pandora's box - are they going to start scanning files on behalf of the RIAA or other copyright stuff now?


I'm curious could you unpack what this means for me? "big proponent of privacy and have always been a Snowden supporter"

Given that, my assumption is this would click for you, but as you said it doesn't. What does being a big proponent mean to you? How do you support Snowden? What's important to you about privacy? Curious to hear your logic, I bet there are tons of people who have the same concerns (or lack of).


"just"

And who gets to decide what is illegal?


Laws?


Which laws?

I'm uncomfortable enough with US laws or Australian laws deciding what images are "illegal" to have on my phone.

But China also have laws. And Iraq. And Saudi Arabia.

And Afghanistan had one set of laws last month, but new different laws since then.

Multinational Corps may like to think they've lifted themselves above petty regional political and legal pressure from elected governments and regional law enforcement. But Apple is 100% capable of being destroyed by either of China or the USA. Anyone who thinks otherwise is fooling themselves. If you shut down Apples manufacturing in China, or head office and server farms in the US, they'd be finished.


The laws of the country you live in. If you don’t like those laws then work to change them, don’t depend on a foreign tech company to decide for you.

If I wanted to live under US laws and cultural expectations I’d move there.


> If I wanted to live under US laws and cultural expectations I’d move there.

If I didn't want to be poor, I'd just be rich instead!


Yes, and that's the problem. If you think that the laws protect you, an average iPhone user, you are sadly mistaken. Laws are dictated by the highest bidder, and by those in power. If they want to go after gays, or Jews, or Blacks, or Muslims, or immigrants, or political activists, or whoever, do you think the laws are going to protect or remain neutral toward those groups?


I've not met any laws. Where do they live?


Thank you for expressing this opinion. I know it’s not a popular one; but I’m 100% with you.


Time has proven otherwise. All censorship systems start with protecting the kids, and grow to eventually encompass all undesirable content…


How have Google’s, Facebook’s, or Microsoft’s CSAM scanning grown?


Facebook filters and censors content on a massive scale. They even block private messages with certain keywords or domains


To be fair, Google and Facebook both do plenty of content scanning, but you’re right that it likely doesn’t use the CSAM pipelines. Apple’s pipeline is probably even harder to repurpose since it’s designed very closely for this use case.


Unknown re private scanning; however, Australia and the UK have rigorous internet censorship regimes that block copyright infringement and gambling, using exactly the same censorship infrastructure that was built to block CSAM in the early 00s.


Lol, Australia's internet filter is just a DNS block. It couldn't really be any less rigorous. Not that I'm complaining...


[flagged]


The site guidelines ask you not to post like this. Could you please review and follow them? https://news.ycombinator.com/newsguidelines.html

Also, it would be good if you'd stop posting unsubstantive comments generally.


I read it. I just disagree with it.


I stopped work on a memo app. Was piggybacking on Apple's branding around privacy. Am going to wait a year to see how this shakes out. Super disappointed.

[Edit]

Here is/was the privacy statement https://www.deepmuse.com/privacy - I'm kinda embarrassed.


IMHO this is really another continuation of the "you will own nothing and be happy" trend that has been around for a while, but companies have started to really push in the last few years. Slowly eroding ownership and normalising mass surveillance is their goal, so they can continue to extract more $$$ out of you.



When Apple sells yet again another record amount of iPhones next quarter which device should we move to?

Fundamentally this illustrates that software has become too inherently intrusive. What’s the solution tho that could ever be mainstream?

The other issue is that software has become too complicated and too many (potentially) bad things are happening in the background. How can the layperson fight back?


GrapheneOS on a Pixel 4 has been a dream, and Elementary OS on an XPS 13 is similarly great.

The WebUSB installer for GrapheneOS is a game changer; it made the process incredibly easy. Also, it seems most apps work fine without Play Services.

Synology Photos is a great local iCloud Photos replacement.

I made the switch this past weekend. Aside from the impact on my wallet and hassle of needing to sell my Apple equipment, it was surprisingly painless.


I think this kind of defeatism really feeds the public's lax attitude toward privacy.

Yes, the iPhone and Apple products are very popular. And they will probably continue to grow. Does that mean we just accept anything they do, antithetical to one of their core promises to their customers?

Or do we make a big deal about it so everyone sees what's happening and what the implications are?


I've been making a big stink about it. I switched to Signal for my iMessage buddies and that seems to be sticking.

The more people seen visibly taking proactive steps to create privacy, the better.


Do you mean that both the state and corporations will make you a generous gift of privacy? Right when it goes against their best interest of grabbing more power and profit?

Nope.

If you want privacy, like any other right, you will need to fight for it. Rights can only be gained by a fight; whatever is given is a privilege, which is often taken back as easily as it is granted.

So be prepared to (continue to) fight for your rights: in courts, in Congress, etc, but also by choosing less convenient, less featureful, more expensive devices and software which does not violate the rights you care about. And no, the majority of the consumers won't care until you show some signs of winning.


that's exactly correct. We have to accept discomfort NOW (to our convenience, wallets) so that everyone can benefit later.

We cannot give up.


I was very very close to ditching my Android device for an Apple device because it seemed like Apple was on the side of privacy.

I don't feel that way anymore, and I'm watching FOSS projects like the PinePhone with a lot of interest.


I made the switch from Android to Apple a few months ago for exactly this reason, and boy do I feel like I’ve been bait-and-switched.


Is it impossible for you to return the device because you don't agree with this change?


Same here, exactly.


i actually had an ipad picked out and in the basket ready to buy! although i have been thinking of buying one for a good few months now, i've just been hesitating a lot because i was not sure if i could handle how restrictive ios is compared to android.

the thing that really has me torn is that there is nothing else like it for simple/fun/creative music making apps (samplr, reason compact etc), which is the main reason i was going to buy one. at the same time i don't want to be part of apple's figures next quarter


The solution is privacy-safe open standards that decouple corporations’ proprietary hold over phone network tech


Apple is the only device I know of doing on-device scanning.


On-device scanning vs. in-the-cloud scanning is a distinction without a difference.


What? No way. One is owned by the vendor, the other is owned by me. The physical device is mine. I don’t consent to a search.


Yes you do, when you turn on use of their servers.

Think of it as a pat down in the hallway before you can enter the building.

No difference.

Only done when you use their service


Have you considered reading the article? It makes the difference clear as day.


Have you considered that I disagree with the article's assertions?

Outright fabrications and misstatements aren’t my problem. The article has them in spades.


For now...


> How can the layperson fight back?

Simply not using a smartphone is fine, they aren't that great and there doesn't need to be an alternative.

It feels like we've been trained as consumers to the point where saying no isn't realistic anymore; there always has to be something else to buy that represents me better, etc.


Software must be open source by law. Apple has shown closed software is too dangerous.


Stop keeping your needs of a computer locked behind service agreements.


I think iPhone is superior and would hate to leave it. Although I barely do much with my phone outside 2FA, browsing, and texting. A switch won't be too bad in that regard.

I do love my Macbooks though. So while these privacy invasions make me angry I'm not willing to drop Apple all together. I've been meaning to keep most of my sensitive information on a usually disconnected Linux machine anyways. I'll keep using my Macbook for development.


I see this Apple move as a warning. I have lived part of my life under a communist regime.

For me my Apple addiction ends here. There is no "magic" left in their products, only "bait & switch" dark patterns.

No hardware or UX will lure me again to suppress my instincts.

This is the beginning of a global, politically and financially motivated race for public control. Apple is just providing the spark for the fire. Imagine a future in which your beloved Face ID is tied to everything, and your beloved iDevices, Teslas, and home appliances are scanning and reporting, scanning and reporting. There is no middle ground in this for me. No benefits or conveniences are that important. FOSS and public oversight of software must be demanded by law.

I posted this earlier without getting any reactions: https://docplayer.net/1287799-Fourth-amendment-search-and-th...


I don’t understand why this outrage seems so US-centric. This is the same Apple that hands over all your iCloud data (photos and otherwise) to the CCP if you happen to live in China. And they’ve done this openly for the last several years.

What am I missing? Isn’t that a much much much much worse thing for Apple to do? Why are we only suddenly suspicious of Apple’s privacy claims with this matter?


As an American I couldn’t care less that Chinese people’s data is handed over to Chinese people’s government, especially considering the alternative would be that Chinese people’s data is handed over to a US entity and by association the US government.

Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests. *

Considering this fact, it is pretty one-sided to see this as some sort of inconceivable act. However, if you look at it from the other side: would we want all American user data (assuming Russia had a company with such pervasive penetration into American lives as Apple has globally) to be sitting on Russian servers, subject to arbitrary Russian laws?

So if you only consider American interests, it's inconceivable for us to give up such power and control over other sovereign nations, but perhaps other countries don't care about American interests the way we do.

All Apple did in China was comply with local laws to stay in business there. What Apple is doing in the US is not mandated by law (as far as I know).

From the American side, Apple has marketed itself as privacy-focused, even fighting the FBI publicly at the risk of negative publicity. This about-face is unexpected, but it also betrays those of us who invested in the Apple product line expecting them to continue the standard of privacy and security that was marketed. Chinese people probably never expected this level of privacy to begin with, but we did and we can.

* iCloud messages backups can be decrypted


This to me is a surprising attitude. As an American whose outlook is generally framed by American values, it’s very upsetting to think of how privacy and freedoms are systematically impinged upon in so many parts of the world. If we can be upset about invasion of privacy in one country, why would those principles change at geopolitical borders?


The imposition of our own values onto others is how wars start. If you think that our values are encoded in laws, and accept that different sovereignties have different laws, then you should be able to accept that different people have different values and standards. Maybe to the EU we are barbarians because our consumer privacy laws are so lacking...

Those principles are not universal because different cultures have different values, beliefs, standards, and situations. If you've been taught otherwise then you've been brainwashed by a propaganda machine designed to motivate you to support and fight wars of value imposition like the Iraq or Afghanistan wars.

Plus, no country was around to teach the Americans how to build a society back then, and you could say we turned out fine enough.

We are big proponents of democracy but not only do we acknowledge that democracy is not a perfect system to begin with (something we are taught in schools), but our version of democracy isn't even a perfect execution of it, much like how China isn't really a communist utopia. We are more like a capitalist democracy, where effectively the wealthy can leverage more of a vote (through advertising, propaganda, PACs, etc). So if all our systems are flawed, who are we really to impose our flawed values. The answer is that the real motivation for imposing our values isn't some belief in fundamental ideals or values, but rather for American interests. We didn't setup a puppet bureaucrat in Afghanistan because of democracy, we just wanted a friendly government in the Middle East. The CIA operating in your country doesn't give you relief! No, they are not there to help YOU or your people.


Thank you for this thoughtful response! I think I see what you mean how values are not necessarily universal and that imposing them across cultures can go very very wrong. You’re totally right that we don’t have to look past the last several weeks’ world news for examples.

As a “value” democracy gets tricky. What does it mean exactly? Even within the US it takes on many contradictory forms. And why does democracy matter? Is it a worthy value simply because it’s the least bad political system? That’s not very universal as a reason, and surely not a good reason to make war.

Where I suppose I diverge is that I think freedom of expression (and, inextricably, guarantees of privacy) actually are universal values that cross whatever kind of boundary. What has nationality to do with it? Call me a myopic American but its hard for me to accept the idea that freedom of speech could be a culturally specific value.

Now should we enforce values with coercion? Generally I think you and I agree that we shouldn’t. But information technology (and, to a great extent, strong cryptography) provides an enormous (peaceful) opportunity to durably promote those values at an anthropocene scale.

Maybe nobody externally taught these values in the formation of the US, but it’s maybe also worth remarking that the formation greatly benefited from an unusually free press for that time period.


I agree with you that freedom of speech is a fairly important individual right. If you look at the reasoning behind its status it's probably because it's one of the early forms of power an individual could wield, much like how treasured gun rights are in America as well. It goes hand in hand with a free press.

That's certainly something the Chinese people deserve to have, but it's also not our fight, especially if the people of China are less interested in it than foreign spectators seem to be. Since their economy and standard of living is rising and more or less doing fairly well, it raises questions of motivations when we really want them to do something for them under the guise of for us. If we value our democracy or freedom of speech and press more than the average Chinese citizen does, then something is probably up... and something is in fact up, because wars are usually fought under the pretense of righteousness. But in reality almost all wars are fought in the interest of self-preservation and self-interest. US foreign policy is pretty much bound to be purely in the interest of its citizens much like how a corporation is bound to make a profit. If we actually promote free speech and democracy, it'll be due to self-interest, because for the simple reason that a politician can't really justify that they did something for the good of other non-constituents and hope to be re-elected.

Free speech is a great ideal, but it's also a tool used by the enemy for disinformation, destabilization, and defamation. This is similar to how encryption is great, but also empowers criminals just as it empowers resistance to tyranny. Some peoples will choose to have less free speech for more stability, and that is their choice. If the EU started fervently advocating for more consumer privacy protections in the US when we actual Americans don't care about it too much, we'd probably raise an eyebrow too on their actual motivations on matters of domestic concern.

Personally, from a righteousness point of view, if we really wanted to help people we'd take them into our borders and integrate them into productive members of our own society. Anything else is illegitimate. If we truly believe China is going the wrong way then we should take in their citizens who agree with us, and give them the opportunity to shine here. Humans are a resource, not a liability. If we can't take them in, then it's also not our place to tell them how to go about their lives. If we really want people to adopt our values, we should integrate them. And for the most part the US does a pretty good job of this as we're probably one of the most multi-cultural societies on this planet.


> Maybe to the EU we are barbarians because our consumer privacy laws are so lacking...

Yup. Well, that, plus your third-world health-care(lessness) system, and your unbridled appetite for guns to kill each other with.

Sorry if this comes as a surprise to you, but: Honestly, you are barbarians.


I'm glad people like you weren't in charge in 1861 in the USA.

China is a slave country controlled by dictators against the will of the people, not just "a different culture".


Governments ultimately draw their authority from their ability to keep their citizens from overthrowing them... China is doing a good enough job right now, even if you don't think so, and even if there are some bad things happening there. But the reality is that even if they were to submit to internationally audited and monitored elections overnight, it'd still be subject to the flaws of a democratic system (private interest groups, big-money influence, etc.) that would taint the ideal of a fair election. A democratic election is ultimately still a proxy for power distribution. You'll most likely end up with more of the same, but with fewer excuses to point fingers at when they commit unpopular acts (for example they may still imprison Uighurs, expand in the Indian Ocean, and threaten Taiwan).

Rather than making such strong remarks telling Chinese people what they are, you can just go there and ask them yourself instead of armchair posturing.


You could “just go there” and start asking questions.

I wouldn’t though, lest you be arrested like Michael Spavor.

https://www.bbc.co.uk/news/world-asia-china-58168587


Not saying it’s ok what they did but the article does reveal the real political reason behind this:

Critics have accused China of treating both Spavor and Kovrig as political bargaining chips, held as part of what is known as "hostage diplomacy".

Wasn’t so much the asking questions as it was a diplomat used as a bargaining chip since a Huawei exec was also captured.


So what you're saying is that the CCP took the two Michaels hostage for no other reason than to be political bargaining chips?

...that's exactly why I wouldn't travel to China whilst Xi is power.


Yes. And the US is a plutocracy that often acts against the will of its people. It is just a different culture.

If it helps. I like US better.


As a non-American, consider that many of us live in countries with better policies than yours, and we’d prefer not to have your ideas imposed on us by fiat, thanks.

So I’m quite happy that Apple has to follow local rules, and I respect Chinese citizens enough to believe that they can advocate for the change they want to see, over time.


Because the Chinese people value different things than the American people and they're governed by different norms and nobody nominated Americans to be privacy commander in chief.

I think gun laws in the US are terrible but I recognize the US has a different history and it's not my place to tell them how to live. Simple as that.


It’s difficult to say what people value when they don’t have the freedom to express that.

It’s not like anyone in China can vote their leaders out or even say anything negative publicly.


China actually has both actual elections (at the local level) and it definitely has negative public opinion. The Chinese are, if anything, some of the most ardent shitposters if you've actually ever used social media in China. Of course some of it is censored but most of it isn't, deliberately so because it's pretty much the only way to figure out what grievances people have, and technically because you can't stop a billion people with the ability to make memes and puns from getting around anything.[1] There's even people who make that stuff available for English speaking audiences[2]

This notion that you somehow can't figure out what the Chinese want, as if you couldn't talk to them, which you literally can, you could just visit China actually, just shows that the entire discourse happens at some sort of meme level.

[1]https://qz.com/2014939/chinese-internet-users-reject-beijing...

[2]https://www.whatsonweibo.com/


That isn’t so just because you say it is and it seems like you have no idea what it is actually like to live in China. A lot of the people pushing back against anti-China sentiments are actually former Chinese citizens… and if you look at the anti-China lobby that consists of ex-Chinese you end up with the cult Falun Gong and ultra right-wing Epoch Times. That should already tell you a lot about how unrealistic your sentiments are and who is really pushing this narrative that Chinese people are somehow all slaves who do not know any better.

If you get a bunch of people who are completely disconnected from a constituency deciding what's best for them, you end up with the former British colony of America, or the former democratic Afghanistan. As it's clear you don't know the actual sentiments of Chinese people, it's a bit ludicrous to suggest that you know what's good for them better than they do.

Nobody fought the American war of independence on behalf of Americans. It’s arrogant to think that others need us to liberate them from some imagined subjugation.


I have not said I know what the sentiments of Chinese people are or what's good for them.

You're very obviously using that as a straw man argument to avoid discussing democracy and democratic mandate.


> Chinese people probably never expected this level of privacy to begin with, but we did and we can.

Taiwanese users’ data is also backed up to China. This was confirmed to me by Apple Support in China. Whether you believe Taiwan is part of China or not, I can assure you that users in Taiwan do expect this level of privacy.

What we are seeing is a slow deterioration of user privacy across the Apple ecosystem, not just in the US. So even if you don’t care about Chinese users’ data, it does show what Apple management as a whole thinks of your data.


Tic-toc


> Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests.

Most seem to forget that, with this upcoming feature, this is not possible anymore. Apple can’t decrypt your images by request anymore. (Read up on Apple’s PSI system.)

There is also strong evidence that the same is coming for backups. In the iOS 15 beta, there is a backup recovery option via an authentication key.


They can’t decrypt the safety vouchers, which contain low resolution versions of your image until the conditions are met … which makes no sense as they have access to the cleartext full resolution image right there.

Unless of course this is a precursor to an E2E encrypted iCloud wherein Apple does not have the ability to decrypt your images server side. I don’t see how this design makes sense unless that’s the next step


> … which makes no sense as they have access to the cleartext full resolution image right there

They have no access. It is end-to-end encrypted with the new system. That is the whole point of on-device scanning. It is not the next step; it is already there.


It works if the photos are encrypted such that Apple can’t access them. Hence the wonky design.

They haven’t announced that photos will be E2E encrypted.


They did… just read the specs. There are many PDFs at the bottom. They truly failed at the PR.

https://www.apple.com/child-safety/


That’s not my interpretation of the first party descriptions of the CSAM system. Clearly it works if photos are encrypted; nothing implies they’re using that capability yet.

If that were the case, this would not merely be a PR mess but PR malpractice of an entirely other, novel kind.


The system is complicated, especially the PSI paper. But it is clear that it is designed to run on encrypted images, though there is a choice to run it without encryption.


So it doesn't mention that anything is actually E2E encrypted right now, does it?


It does if you try to understand the system. The on-device CSAM scan won’t work at all without that encryption part.


Of course it will. It's only the vouchers that are encrypted.


Apple didn’t wake up one day and decide to do this on a lark.

They’re being proactive, probably in a minimalist form, to anticipate regulatory powers on what is unarguably the largest or second largest platform used for illegal porn.

If FB screeners have PTSD and are killing themselves over what they have to see every day, imagine what is on iCloud and iPhones. Right now, nobody is required to filter that content, while social media is. The alternative to “sure, you tell us what is illegal and we’ll scan for it” is “We’re the govt and we want to see everyone’s photos for the children.”

Sure, the latter may still happen, but probably later rather than sooner now. I’m surprised it has taken this long.


There was a great comment by joe_the_user on hn responding to this before: https://news.ycombinator.com/item?id=28261573

>Government by threatened legislation is much worse than government by actual legislation. Legislation is public, Legislation can be opposed, legislation can be reviewed by the court and so-forth. Allowing yourself (and your users) to controlled by threats of legislation is allowing democracy to be discarded.


It's my understanding that Apple simply can't operate in China without playing by those rules. So really, the onus is on the CCP.

In this case, is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.


I agree the question of whether Apple was compelled by whatever government (or if they did this voluntarily) has implications on the ethics of these decisions. They may genuinely have no choice.

But I don’t see how it affects the question of whether Apple’s privacy assertions are trustworthy.


>> They may genuinely have no choice.

There is a choice. Don't comply and have the CCP make you stop selling there.

Apple has no principles that can't be tossed aside in exchange for a large market - in other words a lot of money. This should not be unexpected.


The CCP would make Apple stop manufacturing in China, which would effectively mean that they couldn't sell anything anywhere. Apple does have very deep pockets, but that would be an existential threat.


Well put. Perhaps it’s cynical, I think that’s true of any corporation of Apple’s size: their only true principle is to maximize shareholder value.

The best we can do as privacy-concerned mere-mortals is to take our business where those profit incentives align with our values. Apple’s put a lot of work into advertising that their profits are aligned with values of privacy, but some signs say otherwise.


You shouldn't expect privacy from a third party. You'll be disappointed by definition.


Maybe similar pressure was placed on them here and we just aren't privy to it.


ding ding we have a correct answer


This is a claim which should require extraordinary evidence, since Apple has very publicly resisted pressure to build technology at the government’s behest in the past.


It’s now common knowledge (as mentioned in the article) that Apple refrained from adding a feature at the government’s behest in the past. It’s a fine line between not adding a feature they don’t like and adding a feature they do.


This claim is often repeated but the only source for it I’ve found is reporting in Reuters citing anonymous sources. The user experience challenges of end to end encryption are immense, especially since iPhone is many users’ only iCloud client, and I find it hard to believe Apple was moments away from announcing a solution to them but the FBI pressured them out of it. That is not the extraordinary evidence such a claim needs. For example, Bloomberg reported an even better sourced story about supply chain compromises to Apple’s cloud services which has been more or less entirely debunked.

In addition, the emphasis on the on-device portion of this scanning project is evidence that Apple views losing access to iCloud data as part of its roadmap.


The Bloomberg claims were explicitly denied by Apple and several other companies. To the best of my knowledge, Apple has never publicly denied the Reuters reporting, and explicitly declined to comment when given the chance by Reuters. It’s certainly one thing to extend the benefit of the doubt to a company in a dispute with a reputable news agency; it’s entirely another thing to take issue with the claim when even the affected company won’t do so.


So you are suggesting that FBI pressure is the primary reason Apple did not pursue plans to end to end encrypt iCloud Backups on the basis of one news article and lack of comment from Apple?

(There is other counter-evidence: Apple rarely comments on speculation. In an interview with the WSJ, Apple’s answer to why now was that they figured out how to do known CSAM detection in a way they felt met their privacy requirements; the omission is at least slightly informative, if you think FBI pressure is critical. People more familiar with the legal context have also argued that it would jeopardize the program for there to be evidence that Apple is doing this work in response to FBI pressure as suggested. Finally, Tim Cook offered a more straightforward explanation and vision for iCloud end to end encryption in an interview with a German newspaper:

SPIEGEL ONLINE: Is the data as secure on your iCloud online service as on the devices?

COOK: Our users have a key there, and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back. It is difficult to estimate when we will change this practice. But I think that in the future it will be controlled like the devices. We will therefore no longer have a key for this in the future. )

If so, what information would change your mind? How confident are you that this is the full story?


I have specific reasons to believe that Apple has been subject to legal pressure. But if you didn’t believe six anonymous sources in a story by a reputable reporter, you’re not going to believe my secondhand reports either. Skepticism is fine: stubborn unfounded skepticism in the absence of a direct confirmatory statement from Apple isn’t possible to argue with.

Apple being legally pressured is not the full story. It is absolutely true that they have been pressured by the FBI and others, and simultaneously that they also have real concerns about user experience with lost backups. If you read the Reuters story, it doesn’t draw a straight line from the FBI to the backup situation, it just points out that legal pressure is a factor in Apple’s reasoning. Apple spent a lot of money building an E2EE key vault based on HSMs several years ago, and it’s also fairly obvious that they had bigger plans than securing passwords and browser history. Yet they have not made full E2EE backup available even as an option for advanced users, despite the fact that even Android now supports E2EE backups. And prior to enabling E2EE backups (one assumes that’s coming this year) they paused to build exactly the on-device scanning system that law enforcement has been exhorting cryptographers to build since William Barr’s letter in 2018. It does not take a great deal of imagination to see the pattern, but obviously only Tim Cook can prove it to your satisfaction.

ETA: Just to take this a step beyond “someone is arguing on HN”: this argument matters because I think we all intuitively understand how dangerous this system would be in a world where Apple’s engineering is responsive to government pressure. Your skepticism makes perfect sense if you want to believe this system is secure. I wish I could live in a world where I was able to share that skepticism, it would be a more relaxing place.


I’m not sure why it’s obvious to you that Tim Cook must personally whisper into my ears otherwise. The FBI and every other intelligence agency is probably pressuring Apple all the time. Elsewhere in the thread, I even say that I think law enforcement pressure is one reason Messenger has not turned on E2E by default. I understand how this works.

What you haven’t convinced me of is whether Apple’s priorities are being driven by the pressure. Apple can believe keeping known CSAM off their services is important, and just because someone else agrees doesn’t mean the outside party was critical or the cause of the decision. We live in a society where there are lots of non-government reasons to not be the world’s #1 CSAM host, especially as the famously anti-porn company.

To what extent Apple’s intentions are sincere or coerced is important to suss out because it changes the likelihood that Apple, in the long term, will build different features that endanger its users. I agree that the platform vendor is “intuitively” a source of risk, but I don’t think what they’ve announced is any more (technically) dangerous than anything else my device already did. Even if Apple is outright lying about the contents of the hash database and what their human reviewers will flag, they could’ve been outright lying about whether they slurp my iCloud Photo Library straight out of iCloud with the keys they escrow. Besides that, there is no other possibly untoward behavior that I can’t verify locally. In fact, if Apple built iCloud scanning, I’d be at least as concerned about future features, because there I have no audit rights.

I don’t “want to believe” the system is secure - I have the tools to confirm that the system exposes me to no risk that I’m not already comfortable with as an (for sake of argument) iCloud Photo Library user, and almost all of the other risks are hypothetical. I’m even open to believing that the other risks are more probable today than a month ago, but the evidence isn’t very strong. Some evidence that would change my mind: any information about NCMEC being compromised by nation states and Apple ignoring that evidence, any evidence from Apple sources stating that they worked with the FBI on this system design, any evidence that Apple is expanding the system beyond CSAM.

Which brings me back to a question you never answered: how confident are you that the system presages generalized full device content scanning, and what evidence would change your mind?


I never said the system presages full-device content scanning. All I’ve said (including in this NYT op-ed [0]) is that it enables full-device scanning. Apple’s decision to condition scanning on a toggle switch is a policy decision and not a technical restriction as it was in the past with server-side scanning. Server-side scanning cannot scan data you don’t upload, nor can it scan E2EE files. Most people agree that Apple will likely enable E2EE for iCloud in the reasonably-near future, so this isn’t some radical hypothetical — and the new system is manifestly different from server-side scanning in such a regime.

Regarding which content governments want Apple to scan for, we already have some idea of that. The original open letter from US AG William Barr and peers in 2018 [1] that started this debate (and more than arguably led to Apple’s announcement of this system) does not only reference CSAM. It also references terrorist content and “foreign adversaries’ attempts to undermine democratic values and institutions.” A number of providers already scan for “extremist content” [2], so while I can’t prove statements about Apple’s intentions in the future I can only point you to the working systems operating today as evidence that such applications exist and are being used. Governments have asked for these, will continue to ask for them, and Apple has already partially capitulated by building this client-side CSAM system. That should be an important data point, but you have to be open to considering such evidence as an indication of risk rather than intentionally rejecting it and demanding proof of the worst future outcomes.

Apple has also made an opinionated decision not only to scan shared photos, but also to scan entire photo libraries that are not shared with other users. This isn’t entirely without precedent, but it’s a specific deployment decision that is inconsistent with existing deployments at other providers such as Dropbox [3] where scanning is (allegedly, according to scanning advocates) not done on upload, but on sharing. Law enforcement and advocates have consistently asked for broader scanning access, including unshared files. Apple’s deployment responds to that request in a way that their existing detection systems (and many industry standard systems) did not. Apple could easily have restricted their scans to shared albums and photos as a means to block distribution of CSAM: they did not. This is yet another difference.

I’m not sure how to respond to your requests for certainty and proof around future actions that might be taken by a secretive company. This demand for an unobtainable standard of evidence seems like an excellent way to “win” an argument on HN, but it is not an effective or reasonable standard to apply to an unprecedented new system that will instantly affect ~1 billion customers of the most popular device manufacturer in the world. There is context here that you are missing, and I think suggesting more reasonable standards of evidence would be more convincing than your demands for unobtainable proof of Apple’s future intentions.

[0] https://www.google.com/amp/s/www.nytimes.com/2021/08/11/opin...

[1] https://www.justice.gov/opa/press-release/file/1207081/downl...

[2] https://www.google.com/amp/s/amp.theguardian.com/technology/...

[3] see p.8: https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/6593...


> Apple’s decision to condition scanning on a toggle switch is a policy decision and not a technical restriction as it was in the past with server-side scanning.

This is not a meaningful distinction. There are many security and privacy protections of iOS that are equivalently "policy" decisions: letting iCloud Backups be turned off; not sending a copy of your device passcode to Apple servers; not MITMing iMessage which has no key transparency; not using existing Photos intelligence to detect terrorist content etc. In technical terms, there are many paths to full device scanning, and some of those paths were well trodden even a month ago (iCloud Backup and Spotlight, for starters, and Photos Intelligence as a direct comparison).

Making this claim also requires showing that the likelihood of Apple making one of many undesirable policy decisions has changed.

> Regarding which content governments want Apple to scan for, we already have some idea of that.

I asked about what Apple will scan for, not what governments want them to scan for. Again, I see a pattern in your argument where you state what the government wants and then don't state how that desideratum translates into what Apple builds. The latter is entirely the source of ambiguity for me.

> Apple’s deployment responds to that request in a way that their existing detection systems (and industry standard systems) did not.

I could see that. If that's what Apple had built, would you have a different take on the system? It seems like no -- most of the risks you care about are unchanged since you are operating in a world of "policy decision" equivalence classes.

> I’m not sure how to respond to your requests for certainty and proof around future actions that might be taken by a secretive company.

You seem to misunderstand what I said. I'm asking for an estimate of your certainty, not absolute certainty. Furthermore, I'm asking you to provide examples of what information would change your mind for the same reason you keep repeatedly calling me stubborn and accusing me of bad faith: without it, I have no idea whether we are engaged in discussion or a shouting match.


Therein lies a problem. Most people would agree that a good predictor of what people will do is what they have done in the past. If you read through some of the stories (those of Snowden come to mind) and some declassified information from the past few decades, a pattern emerges.

There is no evidence, either because it does not exist or because it is hidden. The best we have is inference and whistleblowers.

That said, I genuinely think we are not being tinfoil enough these days. And that is based only on what we know (or at least what the average citizen should know) was already done in the past.


The PRISM revelations, to this day, are very ambiguous about their implications for cooperation. When they came out, most involved companies flat out denied cooperation. The types of data the NSA claimed to get were available by tapping into network backbones. Unless you are aware of a theory or evidence I’m not, I think it’s just as likely that the program described in the leaked slides involved unilateral or covert intrusion by the NSA rather than cooperation.

It is reasonable to be conservative about data stored in someone else’s cloud, and there is undeniable value to end to end encryption that gives you control over who can access it. That said, especially if you read Apple’s letter in response to the PRISM allegations, Apple’s behavior seems quite consistent and sincere over time: https://www.apple.com/apples-commitment-to-customer-privacy/.

I don’t think it’s likely they designed this feature under pressure from the government or with the intention to expand it to local data on your device.


Ok. If that is not the reason, then the question becomes what is the real reason.

Some analysts seem to think Apple should be getting into advertising business, which would partially explain some of the proposed updates. Naturally, if that were the case, it would render Apple's commitment to privacy about as useful as T-Mobile's. Then again, I might be giving Apple too much crap. Most companies don't even pretend to care.

https://www.marketwatch.com/story/this-could-be-apples-next-...


I think Apple (i) genuinely believes scanning on device is better for privacy because it lets users (theoretically) confirm the behavior of the system and (ii) is learning from others’ mistakes of deploying scanning server side and having that become a blocker to moving to end to end encryption (e.g. Facebook Messenger). My guess is they announce expansions to end to end iCloud behavior soon. Features like this are building blocks to controlling who escrows your iCloud private keys: https://gadgettendency.com/apple-allowed-to-bequeath-and-inh...


Interesting. That did not occur to me. Thank you for sharing this. Let's see where this goes.


You’re welcome, and I’m happy to share alternative theories. I think some of the issues of control that have been raised are legitimate. For example, you brought up the idea that Apple is doing this as part of a move to advertising, perhaps because advertisers want to be reassured their ads don’t appear next to distasteful content. I think there’s an interesting idea there: lots of companies are trying to move towards “on-device” ads; are they really private? Is privacy the right framework to evaluate them, or is it a different type of control?


Perhaps. However, Apple has since released two security updates to iOS but has not patched the iMessage flaw that allows Pegasus software to spy on thousands (perhaps millions) of iPhones.

What are they waiting for? Hmm, perhaps getting something else in place first.


You are saying known CSAM detection for iCloud Photo Library will launch before the Pegasus 0-days are fixed? The two are entirely different. If Apple was working on behalf of the government, they could’ve already shipped over the contents of iCloud for all the users targeted.

In any case, I’d happily take the other side of the bet at even odds that the security issue is patched after the child safety program goes live.


> since Apple has very publicly resisted pressure to build technology at the government’s behest in the past.

And they've also not done that. When Jobs died Apple promptly bent the knee and joined up to PRISM.

Very publicly resisted? Nope. They went along, very quietly (it was Yahoo that tried to fight back). We only found out about Apple having joined up thanks to the man that wrote this article.

How many human rights atrocities does Apple get to partake in before their credibility is shot, such that the burden is put on them instead, of proving - in such circumstances as this one - that they're not committing more atrocities?


> is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.

You're being downvoted but it's a critical issue.

If Apple is currently being compelled to do this, it likely means the US Government has a massive new privacy obliterating program underway and Apple probably isn't the only tech giant joining the human rights violation parade. It's important to find out if that's going on. We can be certain they didn't stop with PRISM.

If it turns out to be the case, that Apple has joined up to another vast human rights violating program (they already did it at least once before, remember), the US needs to move forward toward Nuremberg-style trials for all involved Apple management and all involved Apple employees (and not only them). That's the only way it stops.

Such human rights violations should not be allowed to continue. How many tech employees at these companies got away with extraordinary human rights violations related to PRISM? Employees at these companies were partly responsible for, and critical to, making it happen. Who are these enablers? Why aren't they in prison? Why is this so rarely discussed on HN? (yeah we all know why)

HN is pretty amusing about this topic. Privacy is a human right? Yeah? Also universally HN: but let's not talk about the people actually responsible for the human rights violations; let's not talk about all the techies being paid princely sums to commit human rights atrocities. Let's not talk about prison sentences for what they've done to their fellow humans. Let's not hold tech employees responsible.


The Jan 6 commission is using the riots as a pretext to collect and inspect private speech/communications from a huge list of people who had absolutely nothing to do with January 6 [0] but were politically active for Trump or his campaign or just posted memes on Twitter etc.

The massive list of people they're demanding records for is shocking. Privacy doesn't mean anything to the current establishment; if people could just take off their partisan blinders for two seconds they would realize this, and we could probably form a plea to Congress as a unified voice.

Give this CSAM system another few years and they won't even have to subpoena for most of the private communication they're already going after today.

https://www.forbes.com/sites/andrewsolender/2021/08/25/jan-6...


I imagine because most of us care more about US policies in general than Chinese ones, because most of us live in the US. If fixing China's lack of free communication were on the table, I'm sure we'd mostly be for it, but that's a whole other thing that ultimately goes back to their government.


I mean, that’s also bad? But the CCP is not going to budge on this, and it doesn’t affect me as much as an American, so I feel like I can be upset about both and more upset about the one that affects me directly.


Because most Chinese keep silent about their government?


People care more about what happens to them than others and people care more about what happens where they are than elsewhere.


As far as is documented, the behavior of iCloud does not change, just the operator. In particular, the difference is that end to end encrypted data in iCloud remains that way, so saying all iCloud data is handed over is incorrect.

In fact, iMessage is the only end to end encrypted messaging service operating in the country (for example).


It’s my understanding that the keys used in that “end-to-end” encryption are also under the control of the operator [1], so from a privacy perspective it is the same as handing over that data in plaintext.

[1] https://www.nytimes.com/2021/05/17/technology/apple-china-ce...


It’s an incorrect reading of the article. The HSMs in the data center are operated by the Chinese company so any CloudKit data escrowed by Apple could be accessed, but end to end encryption keys are synced through iCloud Keychain which uses a different protocol with device secrets.


Do you have any sources for that? I ask because the article I linked specifically states that Apple was forced to discard the entire encryption system it uses elsewhere. It’s also hard to understand why a government would insist on this sort of data custody without the benefit of plaintext access.


The key generation routine for iCloud Keychain is shipped in iOS and tangled with your device passcode. Chinese iPhones have the same iOS builds as iPhones everywhere, so if some backdoor code was present to have them generate iCloud Keychain keys differently, someone would have found it.

Here’s what I think the article is trying to describe:

1. It is known that Apple houses Chinese iCloud user data in Chinese servers. Apple has said so: https://www.cnet.com/tech/services-and-software/apple-ceo-ti...

2. China refuses to support Thales’s HSMs, so Apple had to build their own, presumably based on the secure element: https://twitter.com/matthew_d_green/status/13943950780100526...

I think #2 is what the article characterizes as “discarding entire encryption system.” However, the encryption of iCloud Keychain isn’t dependent on HSMs in the same way the rest of iCloud data is.

As a result, E2E encrypted iCloud data for Chinese users is probably still safe in China. Given physical access and non-standard HSMs, non E2E encrypted data in iCloud probably is not.

It will be very interesting to track the consequences if and when iCloud moves more data into E2E encryption, since the majority of synced data is not: https://support.apple.com/en-us/HT202303
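
To make the “tangled with your device passcode” point concrete, here is a generic, hypothetical sketch of passcode-based key derivation in Python. This is not Apple's actual iCloud Keychain routine, just the standard pattern such designs resemble; it illustrates why an operator who only ever holds ciphertext and a salt cannot recover the plaintext:

    # Generic illustration only -- NOT Apple's iCloud Keychain scheme.
    # A key is derived on-device from the user's passcode plus a salt;
    # the storage operator only ever receives ciphertext and the salt.
    import hashlib, os

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # Slow KDF so short passcodes can't be brute-forced cheaply
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)

    salt = os.urandom(16)              # stored alongside the ciphertext
    key = derive_key("123456", salt)   # exists only on the device
    # 'key' would then wrap/unwrap the actual keychain secrets locally.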


Apple put a lot of money into designing that HSM to lock themselves out. The Chinese government can’t access iCloud Keychain as plaintext.

https://blog.cryptographyengineering.com/2016/08/13/is-apple...


> the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.

A very overdramatic sentence. It is a bit scary to realise that only now do people think this boundary is being crossed. It was crossed a very long time ago. In the first years of Android, the owners were the product, not the phone. Privacy features in the past years might have improved this a little.

Google’s massive success across many services is based on how phones and their software were collecting data for them. User interfaces are just illusions for non-technical people; they might give you a sense of control.

Now that Apple does not trust us with CSAM material, the end is near. There are arguments for both sides, and many are taking sides to just get attention.

However, you can only solve this problem with politics.


This whole switch to Linux is not a solution.

Privacy advocates need to be like second amendment activists. We need to use their playbook. They raise a big stink about anything, no matter how big or small, that could curtail their rights. No number of Sandy Hook events will result in meaningful changes in laws.

Pushing everyone to Linux will eventually lead to all hardware falling under some national security law, with hardware allowed to be imported only if it restricts which OSs can be installed on it and keeps its boot loader locked.

Free market has no impact here, the masses don't care. And privacy supporters are too logical to whip up any type of movement.

Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.


> Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.

They have already come up with good reasons.

"Every time you use encryption, you're protecting someone who needs to use it to stay alive." -- Bruce Schneier

"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." -- Edward Snowden

Surveillance harms journalism and activism, making the government too powerful and unaccountable. If only activists and journalists try to have privacy, it will be much easier to target them. Everyone should have privacy to protect them. It’s sort of like how freedom of speech is necessary not just for journalists, but for everyone, even if you have nothing to say.

> Pushing everyone to Linux will eventually lead to all hardware falling under some national security law

I don't see any connection here. Linux is already used on servers everywhere and nothing has happened.


The problem is that the NRA is not big because of the people. It is big because of gun companies. That's why their budget is 20x larger than the EFF's.

Privacy doesn't move units like gun rights move guns. Until this changes, privacy is a lost cause in the digital world.


Dude can write, too.

Good job, ES.

Apple prob going to about-face like OnlyPorn

But it should set off some soul searching in the tech community at least about the consolidation of power


Sigh.

You poor fools.

I guess us poor fools?

I never thought the day would come when Apple news would so strongly bring to mind, as a knee-jerk reaction, one short part of a Hungarian poem that is a single long sentence, indeed called One Sentence On Tyranny. It is a poem many of us know by heart, to remember what was, even if many back in Hungary forgot. I fail to convey my emotions here in just a few words, but I am incredibly saddened.

Anyway, I checked a few translations; they lose some of the power of the original, but let me try. The first two lines are only for context:

[...] you would like to look, but you can only see

what Tyranny conjured up for you

already forest fire surrounds you

fanned into flame from a matchstick

you threw down without stamping it out

Oh yes, nothing new under the Sun. And it might be too late now.

These things have been so much on my mind because I saw an anti-masker protest in Vancouver peacefully escorted by police. My mind melted. I remember, remember all too well, that it was only 35 years ago when Hungarian police broke up a protest with batons -- it's called the Battle of the Elizabeth Bridge to this day. It was a very one-sided battle, mind you. And while I don't remember it myself, my parents do remember when they broke protests up with tanks....


This is what happens when a businessman takes over a founder-led company. "Soulless" and "liars" are the only words that come to my mind.


I wonder if any current or future President might revisit the idea of granting Snowden a pardon. He is still viewed unfavorably by a fair amount of the US population, but it seems like that's changing with time.


Around here, people view him favorably but are wary because his choices make him look like an agent for Russia. I think it would change only after a pardon.


So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud?

Isn't this essentially what might happen with so-called ChatControl in Europe?

https://www.patrick-breyer.de/en/posts/message-screening/?la...


I was surprised to not see an RSS link there, but apparently a feed does exist:

https://edwardsnowden.substack.com/feed


Are there any decent open source alternatives out there we can contribute to?

A quick search revealed that there is the Librem 5, a Linux-based smartphone. I would love to hear more about it.

https://en.m.wikipedia.org/wiki/Librem_5


There's also the PinePhone.

https://www.pine64.org/pinephone/


I think 1TB[^] would be enough to hold most people's privacy concerns.

[^]: https://www.macrumors.com/2021/08/17/iphone-13-third-week-se...


My question after reading this article is, what can I do as a current iPhone user? Is there a reasonable alternative phone out there that can run the same apps but that isn't effectively controlled by Apple or Google?


Simple: don't store your photos on Apple's property (iCloud). That's the only time this fingerprinting will affect you. If you store photos locally, then no type of fingerprinting will happen.

While you're at it, don't store them on anyone's servers, because they all fingerprint for the same exact reasons. There's no service out there that doesn't do this at some level.
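
For anyone unclear on what “fingerprinting” means in practice, here is a toy Python illustration of a perceptual hash (a simple average hash). This is emphatically not Apple's NeuralHash or Microsoft's PhotoDNA, just the general idea: reduce an image to a short hash that survives resizing and recompression, then compare it against a list of known hashes.

    # Toy perceptual hash (aHash) -- illustration only, not NeuralHash/PhotoDNA.
    from PIL import Image

    def average_hash(path, size=8):
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = "".join("1" if p > avg else "0" for p in pixels)
        return int(bits, 2)

    def hamming(h1, h2, bits=64):
        return bin((h1 ^ h2) & ((1 << bits) - 1)).count("1")

    # A "match" is usually a small Hamming distance to a blocklisted hash,
    # not exact equality, e.g.:
    # if hamming(average_hash("photo.jpg"), known_hash) <= 5: flag_for_review()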


Sorry, I should have been more clear in my post. I meant to say that, if I no longer want to support Apple (or Google) because of their behavior, are there any reasonable smartphone options available? It seems to me that this is the only way I can protest against what Apple is doing.


The problem here isn't the hardware. The problem is the location you're choosing to store your photos.

If you want to protest what they're doing, then stop storing photos on iCloud which means you need to spend less on iCloud storage. That's the most realistic option.


What would happen if in a new paradigm shift multi-nationals decided to use their enormous lobbying power to push back on the government in favor of their users for once instead of only lobbying to screw them over?


Great to see someone influential framing it this way. At the end of the day, what it's there for doesn't matter and unfortunately way too many of us stumble on this mistake in reasoning.


Are there any self-hosted equivalents to iCloud Photos? (e.g. automated backup/sync. Does the camera app have a way to save to other places, etc?)


I think your best bet might be NextCloud. You can either host it yourself on a storage VPS or find a provider who provides managed NextCloud (or however they repackage it.) Be aware there are many more providers than listed on the NextCloud website and you'd do well to search about. If you're in Europe, take a look at Hetzner's "Storage Share" as an example of what's possible and prices for high-quality hosting, but like I said there's a million operators out there doing this and you can get it dirt cheap if you want.

I don't know how it works on an iPhone but on Android NextCloud detects when save pictures in a different directory due to some app and asks me if I want it to track that directory. It was already configured by default to track stuff taken with the default camera app.
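
If it helps, the upload side of NextCloud is plain WebDAV, so even a rough script can push a camera-roll folder to your own server. A minimal sketch in Python, assuming the standard NextCloud WebDAV path; the host, username, and folder names here are hypothetical, and you'd want an app password rather than your account password:

    # Minimal sketch: upload local photos to a self-hosted NextCloud via WebDAV.
    import os
    import requests

    BASE = "https://cloud.example.com/remote.php/dav/files/alice"  # hypothetical instance/user
    AUTH = ("alice", os.environ["NC_APP_PASSWORD"])              # app password from env

    def upload(local_path, remote_dir="Photos"):
        name = os.path.basename(local_path)
        with open(local_path, "rb") as f:
            r = requests.put(f"{BASE}/{remote_dir}/{name}", data=f, auth=AUTH)
        r.raise_for_status()

    for fname in os.listdir("camera_roll"):
        if fname.lower().endswith((".jpg", ".jpeg", ".png", ".heic")):
            upload(os.path.join("camera_roll", fname))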


Synology disk station software does this, but that's a significant investment. If you open the DS file software it defaults to syncing your photos. I don't run mine public, so I've never checked how their VPN/cloud sync thing works.

If I want to share a photo from my phone I use syncthing hosted on one of my VMs on a server I bought and built, but that I don't have physical access to easily (I'll never see it, probably). At home to share a photo I either use mattermost to get a public link to the stored image or ssh to the same box as syncthing runs on. I also host mattermost, on a different VM on a different server in the same datacenter.

I don't like apps seeing my stuff so I just don't use stuff like imgur or whatever.


OwnCloud


A little melodramatic, but the guy is a talented writer.


This is your chance to break free. Please consider switching to an opensource OS. Our freedom depends on it.


If you hear a loud bang, that will be the RMS smug-o-meter exploding.


Edward Snowden just went full Stallman. Take a good idea and ride it over the cliff of sanity.

Of course privacy can't be absolute, we live in a society. Be realistic. Focus on evil things. If you think Apple have an evil plan, sure, but most people who object don't even think that.

And for a tech spy, he seems to not understand tech. "What if some evil regime wants Apple to scan for anti-government propaganda?" Well then they won't be using the CSAM system, that's for sure; they can just scan the images directly, either on iCloud or on device. Co-opting the CSAM scanner is probably the most impractical way imaginable to spy on Uighur separatists.


Let me see: who will I trust more about "security" or "privacy"?

Some guy on Hacker News, which is full of well-paid Apple employees, or a renowned spy?

Hard choice, right?


I'm not saying you should "trust" me, or anyone. Just consider the facts:

A)

- Apple have complete control over the hardware, the OS and all the most popular apps, including Photos and Mail.

- They also have complete control over iCloud, which is not end-to-end encrypted

- They can and do scan your photos and emails so that they can classify photos, find possible appointments, emails etc, and now they even OCR your photos.

B)

- They are now building a very high-profile, limited, and locked-in system that relies on hashes, external databases, a large number of matches, human review, etc.

Do you really think they would use B if they, the FBI, the Chinese government, or whoever wanted to spy on users? For all we know they are already spying; it would be completely trivial to do so. Clearly system B is a complete red herring when it comes to spying. They don't need it.


The point is not to lose yourself in the technical implementation.

The design of the system is corrupt, and the implementation is done in a way that hides this behind layers of complication and false "privacy" for "normal" users.

For me and many other people, there is a difference between knowing that Apple is the only responsible party in a contract, and adding a third-party private corporation (funded by the DOJ with more than 30 million dollars) which will "provide" the hashes and which, because of the sensitive nature of the hashed material, nobody will be able to audit.

Nobody in their right mind believed Apple about "privacy". The difference here is that this is an intrusion on my property, based on the assumption that I am guilty until proven innocent.

A lot of normal people without technical knowledge can see that this is a big problem.


Ignore the technical implementation then; it's still obvious that they have always been able to quietly spy on exactly everything you do on the phone. There's definitely no need for this crazy CSAM tech if spying is what they want to enable.


IBM gleefully cooperated with the Nazis. It won’t be long until Apple is using this framework to alert the PRC about Chinese dissidents, just to stay in their market.

iOS 15 effectively is the point where Apple kicks off the holocaust that they’re going to be responsible for. I hope they enjoy their place in history because they’re earning it. I’ve loved my iPhones, but there’s a warm place in hell waiting for all Apple employees involved in this endeavor.

Apple is an enemy more threatening to mankind’s freedoms than Al Qaeda ever could’ve been.


i(lluminati) phone.

How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?

George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS or Facebook or digital phones.

We are the slow-boiled frog as the most expansive totalitarian infrastructure in history is built up.


> How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?

> George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS or Facebook or digital phones.

Because we're not living in 1984's dystopia where the government oppresses us, we're living in Brave New World's dystopia where we choose to oppress ourselves.


Phones are not mandatory, if you care about privacy throw your phone in the bin (better, someone else's bin).


are we close enough for the government to scan anything passing its road?


Apple has, through side channels, leaked that iCloud is the largest open host of CSAM among big tech. It's the only large provider hosting images that doesn't automatically scan them. The only difference is that Apple wants to do it while leaving your photos in the cloud encrypted. This isn't rational; it's an anti-Apple culture-war position.


This entire argument is based on the premise that Apple only scans photos that the user has requested to be uploaded to iCloud, and will continue to do so.

I don't think many people believe that anymore. It's not even about doubting Apple's goodwill, necessarily - but about doubting that, once the system is in place and normalized, governments won't mandate it and extend its scope by legislative fiat.


Are the photos in iCloud actually encrypted, though? As far as I’m aware, a government agency can subpoena them already. I’m not sure I agree with the slippery-slope argument, but I’m still failing to see how Apple’s current security model prevents them from performing the hashing on iCloud servers and avoiding all this drama.


And there's no reason they can't scan the images on their servers.


Can you point to any examples of these leaks?

(Edit: Thanks for the links.)


It’s not a leak. It’s an observation from evidence that came out during discovery in the Apple vs. Epic trial that’s being reported now that it’s useful context: https://www.forbes.com/sites/johnkoetsier/2021/08/19/apple-e...


https://www.theverge.com/22611236/epic-v-apple-emails-projec...

#71, from the Epic v. Apple anti-trust trial discovery.


That's referring to iMessage. Predators grooming children is not something the on-device photo library scanning is going to fix.

They could let users report such messages to Apple, but that requires having a team of humans to review iMessage reports.


The CSAM perceptual hash scanning is not the only new thing; there's also an on-device machine learning algorithm used by the Messages app to identify "sexually explicit photos" as well:

"The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages." -- https://www.apple.com/child-safety/

While that's billed as "available for Family accounts in iCloud", so _probably_ opt-in, it's another piece of software I never asked for and don't want on the phone I bought.


I'm well aware, and that's a good feature. I do think they should go one step further and give minors the option to report abusive messages (e.g. sextortion).


They could pay $10,000,000 total per year to 100 people and it'd be a rounding error with their nearly $2,000,000,000,000 market cap.



I'd much prefer they scan their shared albums on the server.



