
Not sure what you're saying is changing on the server -- people are going to go from not using OpenCL to not using Metal.

Everything is CUDA. Everything depends on the shitty unstable software designed by a hardware company (Nvidia). This sucks and I hope someone can disrupt it, but Apple has no influence in the field of GPU computing.




When did Apple ever have influence in the field of GPU computing?

And I work in data science, and nobody uses their own laptop when you have AWS.


I didn't say they did. My claim is that Apple deprecating OpenCL is a straightforward and uninteresting thing; it's a company that has no influence on GPU computing getting out of the business of a technology that also has no influence on GPU computing.

I work in data science too, and who cares about laptops. Desktop computers with GPUs, SSDs, and a lot of RAM are what you need. You can thoroughly bling out the hardware and the entire computer will still cost less than your monthly AWS bill to access a GPU. (This is all getting pretty irrelevant to Apple, though, who doesn't make such computers.)


Whatever criticisms can justifiably be levelled against Nvidia, having a bad software stack isn't one of them.


Most of the field of machine learning is irreproducible right now because you have no choice but to use CUDA, yet you can't promise that it will run the same on anyone else's computer, or that it will still run six months from now.
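
To make that concrete, here's a rough sketch of the kind of seed-pinning and flag-setting people resort to; this assumes PyTorch on an Nvidia GPU, and even doing all of it only buys determinism on one machine with one driver and one CUDA/cuDNN stack, which is exactly the problem:

    # Sketch: trying to force deterministic behavior in PyTorch on CUDA.
    # Results can still differ across GPU models, driver versions,
    # and CUDA/cuDNN releases.
    import os
    import random
    import numpy as np
    import torch

    # Required (before any CUDA work) for some deterministic cuBLAS ops
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

    # Pin every RNG you can reach
    random.seed(0)
    np.random.seed(0)
    torch.manual_seed(0)

    torch.backends.cudnn.deterministic = True   # pick deterministic cuDNN kernels
    torch.backends.cudnn.benchmark = False      # disable autotuning, which varies run to run
    torch.use_deterministic_algorithms(True)    # raise on ops with no deterministic implementation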



