
I'm not sure how it'll ever make sense unless you need a lot of customizations or care a lot about data leaks.

For small shops and everyone else, it'll probably be cost-neutral to keep paying OpenAI, Google, etc. directly rather than paying some cloud provider to host an at-best on-par model at equivalent prices.




> unless you need a lot of customizations or care a lot about data leaks

And both of those needs are very common. "Customization" here can be as simple as specializing the LLM on local material so it gives domain-specific answers.
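
For concreteness, a rough sketch (not from the thread) of the simplest form of "specializing the LLM on local material": retrieve the most relevant internal documents and prepend them to the prompt. The documents, file names, and prompt wording below are made-up placeholders.

    # Sketch: ground the model in local material by retrieving the most
    # relevant internal documents and prepending them to the prompt.
    # LOCAL_DOCS and the prompt wording are hypothetical placeholders.

    LOCAL_DOCS = {
        "onboarding.md": "New hires request VPN access through the internal IT portal.",
        "expense-policy.md": "Travel expenses above 500 USD need manager approval.",
    }

    def retrieve(question: str, k: int = 2) -> list[str]:
        """Rank local documents by naive keyword overlap with the question."""
        q_words = set(question.lower().split())
        return sorted(
            LOCAL_DOCS.values(),
            key=lambda text: len(q_words & set(text.lower().split())),
            reverse=True,
        )[:k]

    def build_prompt(question: str) -> str:
        """Build a prompt that answers from local material, not generic web knowledge."""
        context = "\n\n".join(retrieve(question))
        return (
            "Answer using only this internal material:\n"
            f"{context}\n\nQuestion: {question}"
        )

    if __name__ == "__main__":
        # The resulting prompt would be sent to whichever model you host or rent.
        print(build_prompt("How do I get travel expenses approved?"))

In practice you'd swap the keyword overlap for embeddings or fine-tune the model outright, but the point is the same: the value comes from the local data, not the hosting.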



