Hacker News

There are plenty of graphics researchers not in NVIDIA's pocket cooking up stuff that doesn't require vendor-specific features, so I'm not worried that graphics research is being suppressed, if that's your theory.

Intel and AMD are both big players with a vested interest in promoting the capabilities of CPUs and of non-NVIDIA GPUs, since they sell both. They're big and well-capitalized, so if they wanted to, they could operate large graphics research teams. (Whether they actually do is unclear to me, and it's obviously not a 'snap your fingers and you have a big graphics research team' situation, but they have the resources.)

In some cases, if you see CUDA being used for a demo or a research project, it's just because the alternative stacks kinda suck, not because only NVIDIA hardware is capable of doing the thing. Researchers aren't necessarily concerned with shipping on 99% of consumer machines, so they can reach for the most convenient approach even if it's vendor-locked.

I won't be surprised if we see some researchers start targeting Apple's platform-locked Metal API in the future, since Apple has compelling unified memory offerings with capacities exceeding everybody else's.



