Hacker News

>In short, Apple has built a CSAM detector that sits at the doorway between your device and iCloud. If you don’t sync photos with iCloud, the detector never runs.

If you don't want your photos scanned, turn off iCloud. You can disagree about whether or not that's anti-consumer, but you cannot argue they are changing anything about your device or its EULA (they are changing the EULA of a service).




That's today.

Next will come scanning of your phone's content. "Hey, if you never send email or chat, the scanner never runs."

And keyboard logging. "For the children".

Encrypt end-to-end all you want, it doesn't matter if your device is running spyware.


Sure, if theoretical new policies come into effect that fit the description of the parent, then there will be policies that fit the description of the parent.

We are in agreement.


I am curious how that works for iMessage. In the example screenshots they shared, they show their scan running on iMessage too. Is there a way to disable that? Disabling iCloud would also be an option if they allowed third-party backup/sync apps to run background tasks.


>Is there a way to disable that?

It's part of the Communication Safety feature in Messages, which is opt-in.

https://www.apple.com/child-safety/


You are right. Though that’s opt in “for now”.


> I am curious how that works for iMessage.

I don't know the details here, but you can disable iMessage and just send plain SMS, which in theory does not send the data to Apple. But then, part of the point of using iMessage is that the message contents are not sent to your carrier and have some encryption in place. I would guess your best bet is to find another phone manufacturer and use a different messaging service that provides better security if you want to opt out.


According to John Gruber this does work on SMS, but is a feature of the app, not the service:

> It’s also worth pointing out that it’s a feature of the Messages app, not the iMessage service. For one thing, this means it applies to images sent or received via SMS, not just iMessage. But more importantly, it changes nothing about the end-to-end encryption inherent to the iMessage protocol. The image processing to detect sexually explicit images happens before (for sending) or after (for receiving) the endpoints. It seems like a good feature with few downsides. (The EFF disagrees.)

https://daringfireball.net/2021/08/apple_child_safety_initia...
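Gruber's point (app-layer scanning doesn't touch the transport's end-to-end encryption) can be sketched in a few lines. This is an illustrative toy, not Apple's implementation: `classify_image` is a hypothetical stand-in for an on-device classifier, and the byte check is a placeholder, not a real model.

```python
def classify_image(image_bytes: bytes) -> bool:
    """Hypothetical stand-in for an on-device image classifier.
    A toy byte check, not a real model; returns True if flagged."""
    return b"explicit" in image_bytes

def send_image(image_bytes: bytes, encrypt) -> bytes:
    # The scan runs in the app, BEFORE encryption, so the transport
    # (iMessage in Gruber's description) still only ever carries
    # ciphertext; the protocol's end-to-end encryption is unchanged.
    if classify_image(image_bytes):
        print("warning shown to user before sending")
    return encrypt(image_bytes)
```

Because the check sits at the endpoint rather than in the protocol, the same logic applies to images arriving via SMS, which is why it's a feature of the app rather than the iMessage service.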


You may be confusing CSAM detection for iCloud Photo Library with a new parental control feature for under 13s in the messaging app.


Ah, so the CSAM detection doesn’t apply to incoming iMessages? (For now at least, I guess; who knows what Apple decides to do once this gets adopted.)


No, they are different things. CSAM detection looks for specific images from a database before upload to iCloud photo library: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

The child protection part can be enabled for under-13s if they’re in a family account. If enabled, the Messages app will try to detect sexually explicit images being sent or received and warn the child; it can also notify the parents.
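The distinction above is that CSAM detection matches fingerprints against a fixed database only at the iCloud upload boundary. A minimal sketch of that idea, with heavy caveats: Apple's actual system uses NeuralHash (a perceptual hash) plus private set intersection and a match threshold; this toy uses an ordinary cryptographic hash and a hypothetical database, purely to show the "compare against known fingerprints, flag only on match, only before upload" shape.

```python
import hashlib

# Hypothetical database of flagged fingerprints (illustrative only).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system hashes visual
    # features so near-duplicates still match; SHA-256 of raw bytes
    # matches only exact copies.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_before_upload(image_bytes: bytes) -> bool:
    # The check runs only at the upload boundary: photos that never
    # sync to iCloud are never compared, per the thread above.
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```

Note this matches *specific known images*, unlike the Messages parental-control feature, which classifies image content on-device.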



