Lots of issues come with privatized ML though:
- It's pretty close to impossible for a consumer to judge whether the methods used are actually privacy-preserving or just lip service. It's just too technical.
- It's much harder to implement than non-private learning.
- Governments will likely not be able to regulate at the level of technical detail needed to permit privacy-preserving learning while ruling out non-private learning.
- You complain about using the consumer's CPU/electricity, but that's often very helpful for privacy. The private alternative is taking DP data off the device, in which case you need to collect a lot more data for the same privacy level (see the sketch after this list).
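
To make that last point concrete, here's a minimal sketch (my own illustration, not anything from a specific product) of local vs. central differential privacy using the Laplace mechanism. In local DP every device perturbs its own value before it leaves the device, so the aggregate noise shrinks only like 1/sqrt(n); in central DP the curator adds noise once to the aggregate, so it shrinks like 1/n. The data and epsilon below are made up for illustration.

```python
import numpy as np

def local_dp_report(value: float, epsilon: float, rng: np.random.Generator) -> float:
    """Laplace mechanism applied on-device to a value in [0, 1] (sensitivity 1)."""
    return value + rng.laplace(scale=1.0 / epsilon)

def central_dp_mean(values: np.ndarray, epsilon: float, rng: np.random.Generator) -> float:
    """Laplace mechanism applied once to the mean (sensitivity 1/n)."""
    n = len(values)
    return values.mean() + rng.laplace(scale=1.0 / (n * epsilon))

rng = np.random.default_rng(0)
true_values = rng.binomial(1, 0.3, size=10_000).astype(float)  # hypothetical on-device bits
eps = 1.0

local_estimate = np.mean([local_dp_report(v, eps, rng) for v in true_values])
central_estimate = central_dp_mean(true_values, eps, rng)

print(f"true mean          : {true_values.mean():.4f}")
print(f"local-DP estimate  : {local_estimate:.4f}  (error ~ 1/(eps*sqrt(n)))")
print(f"central-DP estimate: {central_estimate:.4f}  (error ~ 1/(eps*n))")
```

So for the same epsilon, the local (off-device-friendly) approach needs far more reports to match the accuracy of on-device or centrally-curated computation, which is the trade-off behind burning the consumer's CPU instead.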