
It's simply bizarre that you would ask that question when the research to figure it all out is trivially accessible. Everyone thought "unified memory" would be a boon when it was advertised, but Apple never delivered a CUDA alternative. They deprecated OpenCL and pushed developers toward Metal compute shaders instead of a proper GPGPU layer. If you're an Apple dev, the mere existence of CoreML ought to be the white flag that makes you realize Apple hardware was never made for general GPU compute.
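To make the contrast concrete, here's a minimal sketch of what GPU compute looks like through Metal in Swift: the kernel source is a string compiled at runtime, dispatched through a command queue. The kernel name double_values and the sample values are purely illustrative, not from any real codebase; compare this ceremony with CUDA's single-source model, where device code lives alongside host code.

    import Metal

    // Illustrative Metal compute sketch: double an array on the GPU.
    // Kernel name "double_values" and the inputs are made up for the example.
    let source = """
    #include <metal_stdlib>
    using namespace metal;
    kernel void double_values(device float *data [[buffer(0)]],
                              uint id [[thread_position_in_grid]]) {
        data[id] *= 2.0;
    }
    """

    let device = MTLCreateSystemDefaultDevice()!  // assumes a Metal-capable GPU
    let library = try! device.makeLibrary(source: source, options: nil)
    let pipeline = try! device.makeComputePipelineState(
        function: library.makeFunction(name: "double_values")!)

    let input: [Float] = [1, 2, 3, 4]
    // .storageModeShared: one allocation visible to both CPU and GPU (unified memory)
    let buffer = device.makeBuffer(bytes: input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    let queue = device.makeCommandQueue()!
    let cmd = queue.makeCommandBuffer()!
    let encoder = cmd.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    encoder.setBuffer(buffer, offset: 0, index: 0)
    encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
    encoder.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()

    let out = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    print((0..<input.count).map { out[$0] })  // [2.0, 4.0, 6.0, 8.0]

It works, but it's a graphics API pressed into compute duty: no device-side libraries on the scale of cuBLAS/cuDNN, no ecosystem parity, which is exactly the gap the comment above is pointing at.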

Again, I'm not accusing you of bad faith. I'm just saying that asking such a basic, easily Googled question is indistinguishable from flamebait. There are so many signals that should suggest to you that Apple hardware is far from optimized for AI workloads. You can look at it from the software angle, where Apple offers no accessible GPGPU primitives. You can look at it from the hardware angle, where Apple cannot beat the performance-per-watt of desktop or datacenter Nvidia hardware. You can look at it from the practical angle, where effectively nobody is using Apple Silicon for cost-effective inference or training. Every scrap of salient evidence suggests that Apple just doesn't care about AI and that the industry cannot be bothered to do Apple's dirty work for it. Hell, even a passing familiarity with the fate of Xserve should tell you everything you need to know about Apple competing in markets it can't manipulate.

> funny in my circles people are talking about unions, AI compute exacerbating climate change, and AI being used to disenfranchise and make more precarious the tech working class.

Sounds like your circles aren't focused on technology, but on popular culture and Twitter topics. Unionization, the "cost" of the cloud, and fictional AI-dominated futures were barely cutting-edge in the 90s, let alone today.



