
Where did you get the idea that it's way better than OpenAI's? Aren't they both proprietary?



Without the "so we can spy on you" part.


But they won't even make good on that: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

There's your sorry truth right there. It's only a matter of time until we get another headline like it.


There's a difference between being forced to compromise user security and doing it willingly in the name of vague "AI safety" concerns.

Furthermore, most governments don't like the "march in with a warrant and demand information" approach, because it's noisy: people might move their data out of a given cloud if they know there are spooks inside. More importantly, it creates a paper trail, which they don't want. So intelligence agencies put a lot of effort into covertly compromising cloud servers instead.

Looking at Apple's blog post regarding Private Cloud Compute[0], they've basically taken every security precaution they could to prevent covert compromise of their servers. They also have some fancy attestation machinery that, most notably, creates a paper trail whenever the software changes. Once again, spooks absolutely hate this. It's technically possible for Apple to subvert this scheme, but that would require coordination across several different business units at Apple, which, again, creates a paper trail. Spooks would much rather exploit a vulnerability than demand code signing keys that would leave evidence of cooperation.
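To make that concrete, here's a rough sketch (in Swift, using CryptoKit) of the kind of client-side check such an attestation scheme implies. This is not Apple's actual API; AttestationBundle, the vendor key, and the published log are assumptions for illustration:

    import CryptoKit
    import Foundation

    // Hypothetical sketch, NOT Apple's real API: the device only talks to a
    // node whose software measurement is vendor-signed AND present in a
    // public append-only log.
    struct AttestationBundle {
        let measurement: Data                        // hash of the booted software image
        let signature: P256.Signing.ECDSASignature   // vendor's endorsement of that image
    }

    func isTrustworthy(_ bundle: AttestationBundle,
                       vendorKey: P256.Signing.PublicKey,
                       publishedLog: Set<Data>) -> Bool {
        // 1. The signature proves the vendor endorsed this exact image.
        guard vendorKey.isValidSignature(bundle.signature, for: bundle.measurement) else {
            return false
        }
        // 2. Log membership is the paper trail: a covertly modified image is
        //    either absent from the log (clients refuse it) or present in it
        //    (public evidence that the software changed).
        return publishedLog.contains(bundle.measurement)
    }

The point is that both checks have to pass: subverting the scheme means either forging a signature or leaving a public record.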

To be clear: no, this isn't end-to-end. You can't currently do end-to-end encrypted cloud compute[1]. But it's still Apple putting a lot of money into a significant improvement in cloud privacy and transparency. OpenAI, in contrast, does not give two flying fucks about your data privacy, and makes building an AI Panopticon one of its deliberate, expressly stated design goals. Their safety team, at least by their own admission, cannot operate without total knowledge of everything their models get prompted with, so that they can implement reactive controls for specific exploits.

[0] https://security.apple.com/blog/private-cloud-compute/

[1] Homomorphic encryption is not theoretically impossible, but it imposes performance penalties severe enough to negate the advantage of using a cloud service in the first place. I suspect they at least gave it some thought, though.
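For a feel of what homomorphic encryption does and why it's slow, here's a toy of the simplest (additive-only) flavor, Paillier, with absurdly small parameters (p = 11, q = 13): multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a server can add numbers it cannot read. Real deployments use 2048-bit moduli, and the fully homomorphic schemes that general compute would need are heavier still. Everything below is illustrative, not any production scheme:

    // Toy Paillier cryptosystem; tiny fixed parameters so it fits in UInt64.
    let p: UInt64 = 11, q: UInt64 = 13
    let n = p * q              // 143
    let n2 = n * n             // 20449
    let g = n + 1              // standard choice g = n + 1
    let lambda: UInt64 = 60    // lcm(p-1, q-1)
    let mu: UInt64 = 31        // inverse of L(g^lambda mod n^2) modulo n

    func modPow(_ base: UInt64, _ exp: UInt64, _ m: UInt64) -> UInt64 {
        var result: UInt64 = 1, b = base % m, e = exp
        while e > 0 {
            if e & 1 == 1 { result = result * b % m }
            b = b * b % m
            e >>= 1
        }
        return result
    }

    // L(x) = (x - 1) / n, defined for x ≡ 1 (mod n)
    func L(_ x: UInt64) -> UInt64 { (x - 1) / n }

    // Enc(m) = g^m * r^n mod n^2, with r coprime to n (fixed here for
    // brevity; real Paillier picks r at random per encryption)
    func encrypt(_ m: UInt64, r: UInt64) -> UInt64 {
        modPow(g, m, n2) * modPow(r, n, n2) % n2
    }

    // Dec(c) = L(c^lambda mod n^2) * mu mod n
    func decrypt(_ c: UInt64) -> UInt64 {
        L(modPow(c, lambda, n2)) * mu % n
    }

    let c1 = encrypt(5, r: 7)
    let c2 = encrypt(12, r: 5)
    print(decrypt(c1 * c2 % n2))   // 17: Enc(5) * Enc(12) decrypts to 5 + 12

Note that even this toy spends a modular exponentiation per ciphertext just to add two numbers; scaling that to the matrix multiplications of real workloads over real key sizes is the performance wall.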


The article did say Apple was compelled to supply the data. Not sure what your point is.


Apple isn’t collecting data from their customers.

Edit: to feed back into their AI training.


Apple has an ad business. They have been fooling users for years, and in a recent class action lawsuit they claimed the right to collect user data. If you don't use Google because you think they might track you, Apple is the reason you believe that.


Apple, like everyone else, necessarily collects some customer data. (You need someone's address to ship them their order.)

The more useful questions are whether they're using it for other purposes without opt-in, or accidentally leaking it.


Be careful with the wordplay here. Apple isn't; OpenAI is not Apple.



