
> after running LLMs with 128 GB RAM on the M3 Max,

These are monumentally different. You cannot practically use your computer as an LLM server; it's more of a novelty.

I'm not even sure why people mention these things. It's possible, but no one actually does this outside of testing purposes.

It falsely equates Nvidia GPUs with Apple CPUs, and the winner of that false equivalence is Apple.
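
For context, a run like the one quoted above is typically done with llama.cpp and its Metal backend; the sketch below uses the llama-cpp-python binding, and the model file and parameters are placeholders rather than anything from the original post.

    # Minimal sketch of a local LLM run on Apple Silicon via llama.cpp's
    # Python binding; the GGUF path below is a hypothetical placeholder.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-70b.Q4_K_M.gguf",  # hypothetical quantized model file
        n_gpu_layers=-1,  # offload all layers to the Metal backend
        n_ctx=4096,       # context window; larger windows use more unified memory
    )

    out = llm("Summarize unified memory in one sentence.", max_tokens=64)
    print(out["choices"][0]["text"])

Whether that counts as practical use or a novelty is exactly the disagreement above.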
