It would be great to have information about how much work this took. The only thing I've seen mentioning timescale was the leaked Apple/NCMEC memo:
> Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years.
So this doesn't sound like a feature where all the puzzle pieces were in place and they just needed NeuralHash.
Even if you casually “put aside” NeuralHash as you suggest, the amount of research and testing that has happened to ship the system they've described is not trivial.
I stand by the idea that this was never a point release away.
> Apple could have already implemented a secret image tag for "terrorist material" or "drug material" that is attached automatically to images
Apple cannot even consistently tag cats and dogs. There is no way it was ready to ship a feature that tags drug or terrorist material in a way that generates few enough false positives that agencies won't just turn it off.
I do agree with you that we have no way to know what's running on closed-source devices (or even open-source ones, unless we personally audit the whole process from dust to device).
For me, though, the argument "you can't ever really know what's running on your device, so why care about a contentious new thing you've just learned will definitely be running on it" is not compelling.
I might swallow 10 spiders a year in my sleep, but if someone offers to feed me one I think it's fair to decline.