
Oh no, don't get me wrong. I like the privacy features, it's already way better than OpenAI's "we make it proprietary so we can spy on you" approach.

What I don't like is the hypocrisy that basically every AI company has engaged in, where copying my shit is OK but copying theirs is not. The Internet is not public domain, as much as Eric Bauman and every AI research team would say otherwise. Even if you don't like copyright[0], you should care about copyleft, because denying valuable creative work to the proprietary world is how you get them to concede. If you can shove that work into an AI and get the benefits of that knowledge without the licensing requirement, then copyleft is useless as a tactic to get the proprietary world to bend the knee.

[0] And I don't.

My opinion is that individual copyright ownership is a bad deal for most artists and we need collective negotiation instead. Even the most copyright-respecting, 'ethical' AI boils down to Adobe dropping a EULA roofie in the Adobe Stock Contributor Agreement that lets them pay you pennies.



Where did you get the idea that it's way better than OpenAI's? Aren't they both proprietary?


Without the "so we can spy on you" part.


But they won't even make good on that: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

There's your bleeding truth right there, sorry. It's only a matter of time until we get another headline like it.


There's a difference between being forced to compromise user security and doing it willingly in the name of vague "AI safety" concerns.

Furthermore, most governments don't like the "march in with a warrant and demand information" approach, because it's noisy. People might move their data out of a given cloud if they know there are spooks inside. More importantly, it creates a paper trail, which they don't want. So intelligence agencies put a lot of effort into covertly compromising cloud servers instead.

Looking at Apple's blog post regarding Private Cloud Compute[0], they've basically taken every security precaution they could to prevent covert compromise of their servers. They also have some fancy attestation machinery that, most notably, creates a paper trail whenever the software changes. Once again, spooks absolutely hate this. It's technically possible for Apple to subvert the scheme, but that would require coordination across several different business units at Apple, which, again, creates a paper trail. Spooks would much rather exploit a vulnerability than demand code-signing keys in a way that leaves evidence of cooperation.
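To make the attestation point concrete, here's a toy sketch (not Apple's actual PCC protocol; all names are made up) of the core idea: clients only talk to servers whose attested software measurement appears in a public, append-only transparency log, so any software change necessarily leaves a record.

```python
# Conceptual sketch of attestation + transparency logging.
# Hypothetical names; the real PCC design is far more involved.
import hashlib

# Stand-in for a public, append-only transparency log of approved
# software release measurements.
transparency_log = set()

def publish_release(image_bytes: bytes) -> str:
    """Operator publishes the hash of each software release to the log."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    transparency_log.add(digest)
    return digest

def client_accepts(attested_digest: str) -> bool:
    """Client refuses any server whose attested measurement isn't logged."""
    return attested_digest in transparency_log

known = publish_release(b"pcc-node-image-v1")
assert client_accepts(known)  # logged release: accepted
# A covertly modified image was never published to the log, so clients reject it:
assert not client_accepts(hashlib.sha256(b"covertly-modified-image").hexdigest())
```

The paper-trail property falls out of the log being public and append-only: pushing backdoored software to clients requires publishing its hash first, which is exactly the evidence spooks don't want to create.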

To be clear: no, this isn't end-to-end. You can't currently do end-to-end encrypted cloud compute[1]. But it's still Apple putting lots of money into a significant improvement in terms of privacy and transparency regarding cloud services. OpenAI in contrast does not give two flying fucks about your data privacy, and makes building an AI Panopticon one of their deliberate, expressly stated design goals. Their safety team, at least by their own admission, cannot operate without total knowledge of everything their models get prompted with so they can implement reactive controls for specific exploits.

[0] https://security.apple.com/blog/private-cloud-compute/

[1] Homomorphic encryption is not theoretically impossible, but it imposes performance penalties severe enough to negate the advantage of offloading the work to a cloud service in the first place. I suspect they at least gave it some thought, though.


The article did say Apple was compelled to supply the data. Not sure what your point is.


Apple isn’t collecting data from their customers.

Edit: to feed back into their AI training.


Apple has an ad business. They've been fooling users for years, and in a recent class-action lawsuit they claimed they have the right to collect user data. If you don't use Google because you think it might track you, Apple is the reason you think that.


Apple necessarily collects data from customers for mandatory reasons, like everyone else. (Like, you need someone's address to ship them their order.)

The more useful questions are whether they're using it for other purposes without opt-in, and whether they're accidentally leaking it.


Be careful with the wordplay here. Apple isn't. OpenAI is not Apple.


> then copyleft is useless as a tactic to get the proprietary world to bend the knee

I have bad news




