
My assumption has been that Google, Amazon, and Microsoft run the heavy-duty AI in the cloud when possible, benefiting from huge scale and easier updates. Maybe that assumption is wrong?

If it's right, is Apple adopting a more decentralized model, with AI (or more AI) running locally? Could that compete with cloud-based AI's advantages? Obviously it would be better for offline usage, for responsiveness when transmission to the cloud is a significant part of the latency, and for confidentiality.




Google's been working on distributed training as well.
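The "distributed training" here is what Google calls federated learning: each phone trains on its own data and uploads only a model update, and the server combines the updates by a weighted average. A minimal sketch of that server-side averaging step (client updates and dataset sizes are made-up numbers for illustration):

```python
# Hypothetical sketch of federated averaging: the server combines
# per-client model updates, weighting each client by how much local
# data it trained on. All values below are invented for illustration.

def federated_average(client_updates, client_sizes):
    """Weighted average of per-client parameter vectors."""
    total = sum(client_sizes)
    n_params = len(client_updates[0])
    averaged = [0.0] * n_params
    for update, size in zip(client_updates, client_sizes):
        for i, w in enumerate(update):
            averaged[i] += w * size / total
    return averaged

# Three simulated clients; the third has twice as much local data,
# so its update counts twice as much in the average.
updates = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
sizes = [10, 10, 20]
new_weights = federated_average(updates, sizes)
print(new_weights)
```

The point of the weighting is that no raw user data leaves the device; only the aggregated update reaches Google's servers.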


Why? What is the benefit to Google?

Also, are they doing training for the local user or for Google's 'general' systems or for both?


So they can do on-device AI/ML with TensorFlow Lite, using specialized neural-network DSPs, as discussed during the Google I/O 2017 keynote.

https://youtu.be/Y2VF8tmLFHw?t=1h22m8s




