I think this is the first time they have mentioned that you will be able to compare the hash of the database on your device with a hash published in their KB article. They also detailed that the database is only the intersection of hash lists from two child safety organizations under separate governmental jurisdictions.
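For what it's worth, the user-side check would presumably boil down to hashing the on-device blob and comparing digests. A minimal sketch in Python, where the database path and the published value are placeholder assumptions (Apple's KB article would specify the real location, format, and digest):

```python
import hashlib

# Both values are hypothetical placeholders: Apple hasn't documented the
# on-device path, and the real digest would come from the KB article.
DB_PATH = "/path/to/on-device-csam-database.blob"
PUBLISHED_DIGEST = "0" * 64  # stand-in for the published SHA-256 hex digest

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large blobs never sit fully in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    local = sha256_of_file(DB_PATH)
    print("database matches published hash" if local == PUBLISHED_DIGEST
          else f"MISMATCH: {local}")
```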
My immediate thought is that this could still be poisoned by Five Eyes participants, and that it does not preclude state actors from forcing Apple to replicate this functionality for other purposes (which would leave the CSAM database's integrity intact, thus not triggering the tripwire). The two-jurisdiction intersection doesn't help much against the first concern, since Five Eyes members could coordinate submissions across both source organizations.
> this could still be poisoned by Five Eyes participants, and that it does not preclude state actors from forcing Apple to replicate this functionality for other purposes
The thing is, if this is your threat model, you're already screwed. Apple has said they comply with the laws of the jurisdictions where they operate. The state can pass whatever surveillance laws it wants, and I do believe Apple has shown they'll fight to an extent, but at the end of the day they're not going to shut down the company to protect you. This all seems orthogonal to the CSAM scanning.
Additionally, as laid out in the report, the human review process means that even if there is somehow a match that isn't CSAM, nothing gets reported until a reviewer has verified it.
the opportunity being to add general-purpose functions to photo-viewing apps that inject a little entropy into every image (for this specific purpose), rotating its hash and rendering the dual databases useless (rough sketch below)
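To make the idea concrete, a rough sketch in Python with Pillow and NumPy, where the noise amplitude is a made-up knob and the whole premise is unproven: perceptual hashes like NeuralHash are specifically designed to survive small perturbations, so it's an open question whether any subtle amount of noise actually works.

```python
import numpy as np
from PIL import Image

def add_entropy(src: str, dst: str, amplitude: int = 2) -> None:
    """Inject low-amplitude random noise into every pixel of an image.

    Whether this actually rotates a perceptual hash is the open question:
    hashes like NeuralHash are built to survive small perturbations, so
    the amplitude needed in practice (if any works) is unknown.
    """
    img = np.asarray(Image.open(src).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-amplitude, amplitude + 1,
                              size=img.shape, dtype=np.int16)
    # Clip back into the valid pixel range before converting to 8-bit.
    out = np.clip(img + noise, 0, 255).astype(np.uint8)
    # PNG is lossless, so the injected noise survives the save;
    # re-encoding as JPEG could smooth much of it back out.
    Image.fromarray(out).save(dst, format="PNG")

# Usage: add_entropy("photo.jpg", "photo_noised.png")
```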
monetization, I guess, being to hope for sponsors on GitHub, since this could likely just be a nested dependency that many apps import; a standalone app built for this specific purpose might not last long in app stores.