> I would say that it is a back door, because it would work even if iCloud Photos were E2E encrypted.
Backdoor is defined by the Oxford Dictionary as "a feature or defect of a computer system that allows surreptitious unauthorized access to data."
The system in question requires you to upload the data to iCloud Photos for the tickets to be meaningful and actionable. Both your phone and iCloud services have EULAs that call out and allow for such scanning to take place, and Apple has publicly described how the system works, including its capabilities and limitations. To the extent that people see this as a change in policy (IIRC the actual license agreement language changed over a year ago), Apple has also described how to stop using the iCloud Photos service.
One less standard usage of the term is not about unauthorized access but specifically about covert surveillance (e.g. the "Clipper chip") - but I would argue that the Clipper chip was a case where the surveillance features were deliberately not being talked about, hence it still counts as "unauthorized access".
But with a definition that covers broad surveillance instead of unauthorized access, it would still be difficult to classify this as a back door. Such surveillance arguments would only pertain to the person's phone and not to information the user chose to release to external services like iCloud Photos.
To your original point, it would still work if iCloud Photos did not have the key escrow, albeit with less data available to be turned over to law enforcement. However, iCloud Photos being an external system would still mean this is an intentional and (presumably) desired feature on the part of the actual system owner (Apple).
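For what it's worth, here is a toy model of why the tickets only become meaningful server-side. This is a simplified sketch, not Apple's actual protocol (which uses private set intersection and threshold secret sharing); the names, the threshold value, and the visible `matched` flag are all illustrative assumptions made so the sketch stays short:

```python
import hashlib
import os

THRESHOLD = 30  # hypothetical match count before anything is actionable

# Stand-in for the hash list of known images (illustrative only)
KNOWN_HASHES = {hashlib.sha256(b"example-known-image").hexdigest()}

def make_voucher(image_bytes: bytes) -> dict:
    """Client side: attach an opaque voucher to every iCloud Photos upload.

    In the published design neither the client nor the server learns the
    match result from a single voucher; the boolean is exposed here only
    to keep the sketch simple.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {
        "matched": digest in KNOWN_HASHES,
        "encrypted_derivative": os.urandom(16).hex(),  # stand-in payload
    }

def server_can_act(vouchers: list[dict]) -> bool:
    """Server side: tickets are only actionable once uploads cross the threshold."""
    return sum(v["matched"] for v in vouchers) >= THRESHOLD
```

The point of the real cryptography is that neither side can evaluate `matched` on its own: the client can't learn whether a photo matched, and the server can't decrypt any derivative until the threshold is crossed - which is why photos that never get uploaded never produce anything actionable.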
I see. My interpretation doesn't hold up given your definitions of back door.
I bet the authorities would be happy with a surveillance mechanism disclosed in the EULA, though. Even if such a system is not technically a back door, I am opposed to it and would prefer Apple to oppose it.
Edit: I just noticed that you had already clarified your argument in other replies. I am sorry to make you repeat it.
It has proven very difficult to oppose laws meant to deter child abuse and exploitation.
Note that while the EFF's mission statement is about defending civil liberties, they posted two detailed articles about Apple's system without talking about the questionable parts of the underlying CSAM laws. There was nothing about how the laws negatively impact civil liberties or what the EFF might champion there.
The problem is that the laws themselves are somewhat uniquely abusable and overreaching, but they are meant to help reduce a really grotesque problem - and scaling back provisions like detection and reporting would leave them less effective against the underlying societal issue.
Apple has basically been fighting this for the ten years since the introduction of iCloud Photos, saying they didn't have a way to detect CSAM without impacting the privacy of the rest of their users. PhotoDNA was already deployed at Microsoft and being adopted by third parties like Facebook when iCloud Photos launched.
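For context, PhotoDNA-style scanning is conceptually just matching uploads against a curated list of known-image hashes. A minimal sketch, with sha256 as a stand-in (the real PhotoDNA hash is a proprietary perceptual hash that survives resizing and recompression, and the names here are hypothetical):

```python
import hashlib

# Hypothetical hash list supplied by a clearinghouse (illustrative only)
known_hashes: set[str] = set()

def scan_upload(image_bytes: bytes) -> bool:
    """Flag an upload if its hash appears in the known-image list."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes
```

The key difference from Apple's proposal is where this runs: PhotoDNA-style systems run server-side against plaintext uploads, which is exactly the privacy cost Apple said it wanted to avoid.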
Now it appears that Apple was working for a significant portion of that time toward building a system that _did_ attempt to strike a balance between social/regulatory responsibility and privacy.
But such a system has to prop technical mechanisms and legal policies against one another to make up for the shortcomings of each, which makes it very complex and nuanced.