
I used to give Apple the ol' eye roll for that as well. Then, once I got a MacBook myself and dove into running machine learning models on it, I realised the RAM setup is pretty unique.

Essentially, Apple Silicon (M1 and later, as far as I'm aware) uses unified memory: the RAM sits so close to the CPU and GPU that it can effectively be used as VRAM. That means a 32GB MacBook can run very large networks (e.g. LLMs) on-device. Nvidia GPUs with that much VRAM (although they are clearly faster at GPU tasks) can already cost as much as an expensive MacBook on their own.
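To put rough numbers on "incredibly large": a quick back-of-the-envelope sketch (plain Python, approximate figures only; parameter counts and precisions are illustrative assumptions, and real usage adds overhead for activations and the KV cache) shows which model sizes plausibly fit in 32GB of unified memory:

```python
def model_mem_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold a model's weights, in GiB.

    bytes_per_param: 2 for fp16/bf16, 1 for int8, 0.5 for 4-bit quantized.
    """
    return params_billions * 1e9 * bytes_per_param / 2**30

# Illustrative examples (weights only, ignoring activation/KV-cache overhead):
print(f"7B  fp16 : {model_mem_gb(7, 2):.1f} GiB")    # comfortably fits in 32GB
print(f"13B fp16 : {model_mem_gb(13, 2):.1f} GiB")   # tight but feasible
print(f"70B fp16 : {model_mem_gb(70, 2):.1f} GiB")   # does not fit
print(f"70B 4-bit: {model_mem_gb(70, 0.5):.1f} GiB") # fits again with quantization
```

Because the GPU addresses the same pool as the CPU, nearly all of that 32GB is usable for weights, whereas a discrete GPU is capped by its own VRAM regardless of system RAM.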


